
I'm very excited about the sensification of computing: an idea where smart things sense the environment, are connected, and then become an extension of the user. As an Intel Software evangelist (and a self-confessed hobbyist coder), I'm fortunate enough to get to experiment with these possibilities using Intel's cutting-edge technology. Previously, I had written about 3D-scanning the world around you in real time and putting a virtual object in it; in this tutorial, however, I'd like to explain how to do the converse, by 3D-scanning yourself and then adding that scan to the virtual world.

In this tutorial, we will:

- Scan your head using a RealSense camera and the SDK.
- Convert the 3D file from OBJ format to PLY format (for editing in Blender).
- Convert the vertex colors to a UV texture map.
- Edit the 3D mesh to give it a lower vertex and poly count.

We'll end up with a 3D, colored mesh of your head, ready to use in Unity. (In the next tutorial, we'll see how to actually get this into Unity, and what you could use it for in a game.)
What you'll need:

- An Intel RealSense camera (or a computer with an integrated RealSense camera).
- A computer with an Intel Core 4th-gen processor (or better).
- The RealSense SDK (which is free to download).

After installing the SDK, go to the Charm Bar (in Windows 8) or the Start Menu (in Windows 7 or 10) and search for RealSense SDK Sample Browser. Find it in the list, right-click it, and select Run as Administrator. In the Sample Browser you will find a section called Common Samples; these are samples that work on both the front-facing and rear-facing RealSense cameras. Select Run from the 3D Scan (C++) section of the sample browser. Put your head in front of the camera and you will see it in the view of the scanner.
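If you're curious what the 3D Scan (C++) sample is doing behind the Sample Browser's Run button, the flow can be reproduced with the SDK's PXCSenseManager and PXC3DScan interfaces. The sketch below is a rough outline based on the R2-era Windows RealSense SDK, not code taken from the sample itself; the exact enum values, frame count, and the head.obj filename are assumptions, so check them against the headers that ship with your SDK version.

```cpp
// Minimal sketch of a 3D scan with the Intel RealSense SDK (Windows, PXC* API).
// Assumes the R2-era SDK headers; verify names against your installed version.
#include <iostream>
#include "pxcsensemanager.h"
#include "pxc3dscan.h"

int main()
{
    // The SenseManager coordinates the camera pipeline.
    PXCSenseManager* sm = PXCSenseManager::CreateInstance();
    if (!sm) { std::cerr << "RealSense SDK not available\n"; return 1; }

    // Enable the 3D scanning module and start the pipeline.
    sm->Enable3DScan();
    PXC3DScan* scanner = sm->Query3DScan();
    if (sm->Init() < PXC_STATUS_NO_ERROR) { sm->Release(); return 1; }

    // Grab frames while you hold your head in front of the camera and turn
    // it slowly; the scanning module accumulates the mesh internally.
    // 600 frames is an arbitrary illustrative value.
    for (int frame = 0; frame < 600; ++frame) {
        if (sm->AcquireFrame(true) < PXC_STATUS_NO_ERROR) break;
        sm->ReleaseFrame();
    }

    // Write the accumulated scan to disk as an OBJ with per-vertex colors,
    // which is the same kind of output the 3D Scan sample produces.
    scanner->Reconstruct(PXC3DScan::OBJ, L"head.obj");

    sm->Release();
    return 0;
}
```

For this tutorial, though, the Sample Browser route is all you need; the sketch above is only to show roughly what the sample does on your behalf.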
