Estimating scale using depth from focus for mobile augmented reality.

Čopič Pucihar, Klen and Coulton, Paul (2011) Estimating scale using depth from focus for mobile augmented reality. In: Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems. ACM, pp. 1-6. ISBN 978-1-4503-0670-6

PDF (Estimating_Scale_using_Depth_From_Focus_for_Mobile_Augmented_Reality_DRAFT.pdf)

Download (2MB)


Whilst there has been considerable progress in augmented reality over recent years, it has principally been related to either marker-based or a priori mapped systems, which limits its opportunity for wide-scale deployment. Recent advances in markerless systems that require no a priori information, using techniques borrowed from robotic vision, are now finding their way into mobile augmented reality and are producing exciting results. However, unlike marker-based and a priori tracking systems, these techniques are independent of scale, which is a vital component in ensuring that augmented objects are contextually sensitive to the environment they are projected upon. In this paper we address the problem of scale by adapting a Depth From Focus (DFF) technique, which has previously been limited to high-end cameras, to a commercial mobile phone. The results clearly show that the technique is viable and, with the ever-improving quality of camera phone optics, adds considerably to the enhancement of mobile augmented reality solutions. Further, as it simply requires a platform with an auto-focusing camera, the solution is applicable to other AR platforms.
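The core idea behind Depth From Focus is that an object is sharpest in the frame captured when the lens is focused at its true distance, so sweeping the auto-focus motor and scoring sharpness at each known focus distance recovers depth, and hence metric scale. The following sketch illustrates this under simple assumptions (it is not the paper's implementation): frames are grayscale arrays, the lens focus distance for each frame is known, and sharpness is measured with a common variance-of-Laplacian score.

```python
import numpy as np

def focus_measure(img):
    """Variance of the Laplacian: a standard sharpness score.
    Sharp (in-focus) images have strong high-frequency content,
    so the Laplacian response varies much more than for blurred ones."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def depth_from_focus(frames, focus_distances_m):
    """Given frames captured during a focus sweep, each at a known
    lens focus distance (metres), return the distance whose frame
    scores sharpest -- an estimate of the object's depth."""
    scores = [focus_measure(f) for f in frames]
    return focus_distances_m[int(np.argmax(scores))]
```

In practice the score would be computed over a region of interest around the tracked object rather than the whole frame, and the discrete sweep would be refined (e.g. by interpolating the score peak), but the principle is as above.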

Item Type:
Contribution in Book/Report/Proceedings
Uncontrolled Keywords:
computing, communications and ICT; QA75 Electronic computers. Computer science
ID Code:
Deposited By:
Deposited On:
15 Mar 2011 08:45
Last Modified:
16 Jul 2024 02:38