TY - GEN
T1 - Synthesizing reality for realistic physical behavior of virtual objects in augmented reality applications for smart-phones
AU - Amar, Nir
AU - Raviv, Meir
AU - David, Barak
AU - Chernoguz, Oleg
AU - El-Sana, Jihad
PY - 2013/10/7
Y1 - 2013/10/7
N2 - This paper presents a framework for augmented reality applications that runs on smart mobile phones and enables realistic physical behavior of virtual objects in the real world. The mobile phone used is equipped with two cameras and provides stereo images and live video. The two images are used to reconstruct a 3D representation of the real world, which is good enough to enable physical interaction between the virtual and real-world objects, but not fine enough to synthesize the real-world view for back projection. The visual synthesis of the real-world view is done through the video stream. The constructed 3D representation is registered in the real-world view and used to place the virtual objects, determine their physical behavior, and detect collisions with objects in the real-world view. The 3D reconstruction is not performed at each frame, but applied when necessary based on the positions of the dynamic objects. Pose estimation is determined based on the movement of the mobile phone (gyroscope and accelerometer) and the viewed images. A physics engine, which utilizes the gravity vector obtained from the accelerometer sensor of the mobile device, is integrated into our framework. The physics engine equips virtual objects with realistic physical behavior.
AB - This paper presents a framework for augmented reality applications that runs on smart mobile phones and enables realistic physical behavior of virtual objects in the real world. The mobile phone used is equipped with two cameras and provides stereo images and live video. The two images are used to reconstruct a 3D representation of the real world, which is good enough to enable physical interaction between the virtual and real-world objects, but not fine enough to synthesize the real-world view for back projection. The visual synthesis of the real-world view is done through the video stream. The constructed 3D representation is registered in the real-world view and used to place the virtual objects, determine their physical behavior, and detect collisions with objects in the real-world view. The 3D reconstruction is not performed at each frame, but applied when necessary based on the positions of the dynamic objects. Pose estimation is determined based on the movement of the mobile phone (gyroscope and accelerometer) and the viewed images. A physics engine, which utilizes the gravity vector obtained from the accelerometer sensor of the mobile device, is integrated into our framework. The physics engine equips virtual objects with realistic physical behavior.
KW - I.2.10 [Computing Methodologies]: Vision and Scene Understanding-3D/stereo scene analysis
UR - http://www.scopus.com/inward/record.url?scp=84884882691&partnerID=8YFLogxK
U2 - 10.1109/VR.2013.6549393
DO - 10.1109/VR.2013.6549393
M3 - Conference contribution
AN - SCOPUS:84884882691
SN - 9781467347952
T3 - Proceedings - IEEE Virtual Reality
SP - 123
EP - 124
BT - IEEE Virtual Reality Conference 2013, VR 2013 - Proceedings
T2 - 20th IEEE Virtual Reality Conference, VR 2013
Y2 - 16 March 2013 through 20 March 2013
ER -