About Me

Nicolas Burrus

Email: nicolas@burrus.name

I hold a PhD in computer vision with a strong interest in real-time 3D tracking and mapping. Between 2009 and 2012 I was a postdoc at Carlos III University of Madrid, working on 3D object reconstruction and tracking for robotics.

The success of RGBDemo led me to cofound ManCTL in 2011. After some initial R&D projects we got selected to become a member of the Microsoft Kinect Accelerator powered by Techstars in 2012, and developed Skanect, a real-time 3D scanning software compatible with low-cost RGB-D cameras. Watch our first prototype in action in 2011!

In 2013 we joined forces with Occipital to better explore the possibilities of depth sensing on mobile devices. We participated in the launch of Structure Sensor, the first depth sensor for mobile devices, which became the #6 most funded project on Kickstarter at the time.

Here are some of the projects I’ve participated in at Occipital:

  • Launch of Structure Sensor: worked with our small team on the 3D reconstruction stack and wrote the first version of the Objective-C API for the Structure SDK to expose it to developers.
  • Calibrator, an iOS app to calibrate the iOS color camera with Structure Sensor, using feature matching between the sensor’s IR camera and the color camera.
  • Kept improving our real-time RGB-D tracking, 3D reconstruction, and texturing on iOS.

  • Unbounded positional tracking using RGB-D for CES 2015.

  • Mixed reality demo for iOS at CES 2016, combining RGB-D tracking, 3D reconstruction, and physics through an integration with the SceneKit (and later Unity) engines.

  • I did not get to work on medical apps myself (although ManCTL actually started with an R&D project for foot orthotics), but I’ve been very proud to see the many medical uses of the Structure SDK and our live 3D reconstruction.

  • We adapted Bridge Engine to launch a VR headset for iPhone, optimized for low latency and leveraging visual-inertial sensor fusion for pose prediction.

  • Launch of Canvas (scan your home), which leveraged our work on real-time, unbounded, large-scale SLAM for mobile.

  • Positional tracking for AR/VR with a single camera and an IMU, ported from mobile to Windows/PC. A depth sensor is not always required anymore :)

  • TapMeasure: led a small team to build this iOS app in a very short time, leveraging ARKit to take 3D measurements.
  • Positional tracking for AR/VR extended to stereo cameras, with more room perception, for CES 2018.