Very happy to get my hands on this brand new tech from Autodesk Labs called Project Photofly. This technology stitches a series of 2D photographs together to create a photo-realistic, high-density polygon model with textures, using cloud computing for the heavy grunt work via the Photo Scene Editor GUI. Here is the result of my first attempt with Project Photofly, using a clay Buddha sculpture of mine.
Here are the photos I took to create the model. I ran out of light before I could take any top-down photos, which caused problems in the stitch later.
So what can you use this cool new tech for? For one, digital set extensions, props and background elements can now all be made simply by taking photographs. Yes, you can take photos of a set and recreate it in 3D: check out this great example.
Clever people have also figured out this is a great way to create models for games using Photofly + ZBrush. Here is a very detailed tutorial series covering the step-by-step process of transferring the hi-res Photofly mesh to a low-res, game-friendly mesh.
Now, you can use the above tutorial to generate VFX-ready meshes as well. Why? Because the mesh that comes straight from Photofly, while very high in detail, is not production ready: the tris are often in many disconnected islands and the UVs are pretty random. So for any production use it's highly recommended that you retopologise the mesh and transfer the details. Tools like 3Dcoat, Maya / Mudbox and ZBrush will all do this for you.
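To make the "disconnected islands" problem concrete, here is a minimal Python sketch (my own illustration, not part of Photofly or any Autodesk tool) that counts how many separate shells a triangle mesh contains. It treats the mesh as a plain list of vertex-index triples and runs union-find over shared vertices; a clean production mesh should report a single island, while a raw scan often reports dozens.

```python
# Hypothetical helper: count disconnected "islands" in a triangle mesh.
# A mesh here is just a list of (a, b, c) vertex-index triples; two
# triangles belong to the same island if they share a vertex.

def count_islands(triangles):
    """Return the number of connected components among the triangles."""
    parent = {}

    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving for speed
            v = parent[v]
        return v

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for a, b, c in triangles:
        union(a, b)
        union(b, c)

    return len({find(v) for v in parent})

# Two separate quads (two triangles each) -> two islands
mesh = [(0, 1, 2), (0, 2, 3),   # shell A
        (4, 5, 6), (4, 6, 7)]   # shell B
print(count_islands(mesh))  # 2
```

If a scanned mesh reports more than one island, that is your cue to retopologise before doing any UV or detail-transfer work.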
If you are keen to find out more, here is a little demo I created showing the workflow from Photofly to Maya / Mental Ray. Enjoy! I think we should be seeing some new production-ready tools from Autodesk using this tech soon, not only for modelling but also for 3D tracking (Matchmover update please!).
Update: A little FYI if you have issues connecting to the cloud ( http://labs.blogs.com/its_alive_in_the_lab/2011/08/steps-for-identifying-network-issues-with-project-photofly.html )