The project aims to develop a low-cost system for the three-dimensional (3D) modeling of scenes and objects, starting from the acquisition of real subjects. 3D models will be extracted from the analysis and processing of the raw data measured by the sensors, and then used in different applications. The system will cover all the modeling stages, from sensor design and development, and the related image generation (optical and range), to the graphic rendering and synthetic reproduction of the observed objects and scenes. The system, and the techniques developed for it, will be versatile enough to handle different types of data (i.e., independent of the specific acquisition sensor), and will address the reconstruction of both single objects (simple or complex, such as the human face) and whole scenes.
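As a rough illustration of the first modeling stage, a range image acquired by a sensor can be back-projected into a 3D point cloud using a pinhole camera model. The sketch below assumes hypothetical intrinsic parameters (focal lengths `fx`, `fy` and principal point `cx`, `cy`); the actual sensor model developed in the project may differ.

```python
import numpy as np

def range_image_to_points(depth, fx, fy, cx, cy):
    """Back-project a range (depth) image into a 3D point cloud
    using a pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Intrinsics here are placeholders, not the project's actual sensor."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) array and drop invalid (zero-depth) pixels
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

# Example: a tiny 2x2 synthetic range image with one invalid pixel
depth = np.array([[1.0, 2.0],
                  [0.0, 1.5]])
pts = range_image_to_points(depth, fx=500.0, fy=500.0, cx=0.5, cy=0.5)
print(pts.shape)  # (3, 3): three valid pixels, each a 3D point
```

A real pipeline would then register point clouds from multiple views and fit a surface model, but the back-projection step is the same regardless of the specific range sensor, which is the kind of sensor independence the project targets.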


The objective of this project is to develop augmented reality technologies that improve supervisory control of remote vehicles for eventual application to on-orbit satellite servicing by free-flying servicing robots.
A video-based augmented reality interface for remote supervisory control of free-flying space robots will be designed. The system will analyze the video image to locate the vehicle and produce virtual feedback (both visual and acoustic) to augment the real feedback.
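Once the vehicle has been located, overlaying virtual feedback on the video amounts to projecting 3D points (e.g., the estimated vehicle position) into image coordinates given the camera pose. A minimal sketch, assuming a standard pinhole projection with hypothetical intrinsics and an identity camera pose (the project's actual tracking and rendering pipeline is not specified here):

```python
import numpy as np

def project_point(p_world, R, t, fx, fy, cx, cy):
    """Project a 3D world point into pixel coordinates given the
    camera rotation R, translation t, and pinhole intrinsics."""
    p_cam = R @ p_world + t          # world frame -> camera frame
    u = fx * p_cam[0] / p_cam[2] + cx  # perspective divide + intrinsics
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v

# Camera at the origin looking down +Z (identity pose, placeholder intrinsics)
R = np.eye(3)
t = np.zeros(3)
u, v = project_point(np.array([0.1, -0.05, 2.0]), R, t,
                     fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(u, v)  # 350.0 225.0: where the overlay would be drawn
```

The returned pixel coordinates would drive the placement of the visual overlay on the live video, while the same pose estimate could trigger the acoustic cues.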



More information and other projects in which the VIPS lab is involved can be found here.

For more information:


Last Revision: 5 February 2014