BazEkon - Main Library of the Cracow University of Economics

Author
Koniarski Konrad (Polish Academy of Science)
Title
Augmented Reality Using Optical Flow
Source
Annals of Computer Science and Information Systems, 2015, vol. 5, pp. 841-847, figures, bibliography: 24 items.
Keywords
Algorithms
Notes
summ.
Abstract
The paper deals with the application of the Lucas-Kanade optical flow algorithm to the development of an augmented reality (AR) system. Merging a live view of the physical real world with context-related computer-rendered images to create a mixed image is a challenging problem: a virtual object has to be placed in the correct pose and position in real time and in the correct perspective, and the occlusion problem also needs to be taken into consideration. In the paper, a computer-vision-based method for AR systems built on fiducial marker matching is proposed. For simplicity, a black square was used as the marker. The method consists of two main steps. The initial step uses the Hough transform to detect the marker's initial position and to select the marker points to be tracked. In the second step, these selected points are tracked in each image frame using the Lucas-Kanade optical flow method. The positions of the selected points are used to calculate the pose and position of a virtual object. Unlike existing methods, the proposed system uses optical flow to increase speed performance. Examples of AR applications using the proposed algorithm are provided and discussed. (original abstract)
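The tracking stage described in the abstract can be sketched with a single Lucas-Kanade update: for each tracked point, the displacement between two frames is the least-squares solution of the normal equations built from spatial and temporal image derivatives over a small window. The sketch below is a minimal illustration, not the paper's implementation: the synthetic quadratic image, the shift (0.4, 0.2), the window radius, and the function names are all illustrative assumptions (a quadratic image is chosen because its central differences are exact, so the single LK step recovers the shift exactly).

```python
def intensity(x, y):
    # Synthetic smooth "frame": a quadratic surface. Central
    # differences of a quadratic are exact, so one LK step
    # recovers the true shift without iteration.
    return x * x + x * y + y * y


def lk_displacement(dx_true=0.4, dy_true=0.2, r=3):
    """One Lucas-Kanade step for a single point at the origin.

    Accumulates the 2x2 normal equations A d = b over a
    (2r+1) x (2r+1) window and solves them in closed form.
    """
    sxx = sxy = syy = bx = by = 0.0
    for y in range(-r, r + 1):
        for x in range(-r, r + 1):
            # Spatial gradients of frame 1 by central differences.
            ix = (intensity(x + 1, y) - intensity(x - 1, y)) / 2.0
            iy = (intensity(x, y + 1) - intensity(x, y - 1)) / 2.0
            # Temporal derivative: frame 2 is frame 1 shifted by
            # the (unknown to the solver) true displacement.
            it = intensity(x - dx_true, y - dy_true) - intensity(x, y)
            sxx += ix * ix
            sxy += ix * iy
            syy += iy * iy
            bx -= ix * it
            by -= iy * it
    # A = [[sxx, sxy], [sxy, syy]] is invertible when the window
    # contains gradients in more than one direction (texture).
    det = sxx * syy - sxy * sxy
    dx = (syy * bx - sxy * by) / det
    dy = (sxx * by - sxy * bx) / det
    return dx, dy


print(lk_displacement())  # recovers approximately (0.4, 0.2)
```

In a real AR pipeline this per-window solve is applied iteratively and pyramidally to handle larger motions; the OpenCV library cited in the paper's bibliography provides such an implementation (`calcOpticalFlowPyrLK`).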
Full text
Show
Bibliography
  1. Chari V, Singh J M, Narayanan P J, (2008) Augmented Reality using Over-Segmentation, National Conference on Computer Vision Pattern Recognition Image Processing and Graphics.
  2. Lucas B D, Kanade T (1981) An Iterative Image Registration Technique with an Application to Stereo Vision, Proceedings of IJCAI '81, 674-679.
  3. Kato H, Billinghurst M (1999) Marker tracking and HMD calibration for a video-based augmented reality conferencing system, Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99), 85-94.
  4. Malik S, Roth G, McDonald C (2002) Robust Corner Tracking for Real-Time Augmented Reality, Vision Interface 2002, Calgary, Alberta, Canada, May 2002, 399-406. (National Research Council of Canada, Report No 45860).
  5. Koniarski K (2011) Image features detection methods in multiframe analysis (in Polish), in: Techniki informacyjne: teoria i zastosowania, ed. J. Hołubiec, 1(13), 68-82.
  6. Rublee E, Rabaud V, Konolige K, Bradski G (2011) ORB: An efficient alternative to SIFT or SURF, 2011 IEEE International Conference on Computer Vision (ICCV), 2564-2571.
  7. Klein G, Murray D (2009) Parallel Tracking and Mapping on a Camera Phone, 8th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2009), 83-86.
  8. Zhang Z (2000) A Flexible New Technique for Camera Calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence, 22(11), 1330-1334.
  9. Bouguet J Y (2001) Pyramidal implementation of the affine Lucas Kanade feature tracker description of the algorithm, Intel Corporation, 5.
  10. Sun D, Roth S, Lewis J P, (2008) Learning Optical Flow, Computer Vision - ECCV 2008, Springer, 83-97.
  11. Computer Vision Library: OpenCV (2012) http://opencv.willowgarage.com, version 2.4.
  12. Beauchemin S S, Barron J L (1995) The computation of optical flow, ACM Computing Surveys (CSUR), 27(3), 433-466.
  13. Baker S, Scharstein D, Lewis J P, Roth S, Black M, Szeliski R (2011) A Database and Evaluation Methodology for Optical Flow, International Journal of Computer Vision, 92(1), 1-31.
  14. Herakleous K, Poullis C H (2013) Improving augmented reality applications with optical flow, IEEE International Conference on Image Processing 2013, 3403-3406.
  15. Li H, Qi M, Wu Y (2012) A Real-Time Registration Method of Augmented Reality based on SURF and Optical Flow, Journal of Theoretical and Applied Information Technology, 42(2), 281-286.
  16. Ji J, Chen G, Sun L (2011) A novel Hough transform method for line detection by enhancing accumulator array, Pattern Recognition Letters, 32(11), 1503-1510.
  17. Hirzer M (2008) Marker detection for augmented reality applications, Institute for Computer Graphics and Vision, Graz University of Technology, Technical Report ICG-TR-08/05.
  18. Furht B (ed.) (2011) Handbook of Augmented Reality, Springer Science+Business Media, New York.
  19. Lee T, Hollerer T (2009) Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality, IEEE Transactions on Visualization and Computer Graphics, 15(3), 355-368.
  20. Gedik O S, Alatan A A (2013) 3-D Rigid Body Tracking Using Vision and Depth Sensors, IEEE Transactions on Cybernetics, 43(5), 1395-1405.
  21. Cheok A D, Qiu Y, Xu K, Kumar G K (2007) Combined Wireless Hardware and Real-Time Computer Vision Interface for Tangible Mixed Reality, IEEE Transactions on Industrial Electronics, 54(4), 2174-2189.
  22. Chen Z, Li X (2010) Markerless Tracking based on Natural Feature for Augmented Reality, IEEE International Conference on Educational and Information Technology (ICEIT 2010), 2, 126-129.
  23. Fiala M (2010) Designing highly reliable fiducial markers, IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(7), 1317-1324.
  24. Demuynck O, Menendez J M (2013) Magic Cards: A New Augmented Reality Approach, IEEE Computer Graphics and Applications, 33(1), 12-19.
Cited by
Show
ISSN
2300-5963
Language
eng
URI / DOI
http://dx.doi.org/10.15439/2015F202