When I first saw Photosynth my reaction was "Here is a gimmick looking for a real world application." Today it is just what I need for teaching a forensics class about photographing crime and accident scenes. With Synth you can clearly establish a 360-degree view of the scene, which helps put more detailed photos into overall perspective, visually establish relative positions, and greatly enhance the written documentation that accompanies an investigation.
Another use, whether for a grandchild, an object at home, or an item at a crime scene, is to do a 360-degree synth around just that person or object to help document it.
It's a good idea, but I would like them to use video to create the 3D space. Sure, it would take a lot more computing power because of the number of frames, but the synths would be much better and more realistic. From personal experience, the interface could be improved, as could the way it puts the photos together. Right now it just seems to show the photo taken at the angle closest to the one you're trying to view, rather than creating a true 3D viewing space by pushing parts of the pictures back into their proper places using the reference points it develops (which, because they're visible, resemble the actual 3D view Microsoft claims). Otherwise I like the concept and could see it really going somewhere, such as creating ultra-realistic games and digital sets for cinematographers, which would remove the need for 3D modeling for impossible camera angles in movies.