Research Article | Regular Papers
Open Access

High-Precision Vision Localization System for Autonomous Guided Vehicles in Dusty Industrial Environments

Xingjie Liu, Guolei Wang, and Ken Chen
NAVIGATION: Journal of the Institute of Navigation, March 2022, 69(1), navi.502; DOI: https://doi.org/10.33012/navi.502
All authors: State Key Laboratory of Tribology, Tsinghua University, Beijing, China
Correspondence: Guolei Wang, [email protected]

REFERENCES

1. Agarwal, S., Mierle, K., & others. (n.d.). Ceres Solver. http://ceres-solver.org
2. Ahmadinia, M., Alinejad-Rokny, H., & Ahangarikiasari, H. (2014). Data aggregation in wireless sensor networks based on environmental similarity: A learning automata approach. Journal of Networks, 9(10), 2567. https://doi.org/10.4304/jnw.9.10.2567-2573
3. Ahmadinia, M., Meybodi, M. R., Esnaashari, M., & Alinejad-Rokny, H. (2013). Energy-efficient and multi-stage clustering algorithm in wireless sensor networks using cellular learning automata. IETE Journal of Research, 59(6), 774–782. https://doi.org/10.4103/0377-2063.126958
4. An, X., Zhao, S., Cui, X., Shi, Q., & Lu, M. (2020). Distributed multi-antenna positioning for automatic-guided vehicle. Sensors, 20(4), 1155. https://doi.org/10.3390/s20041155
5. Arthur, J., Bowman, R., & Straw, R. (2008). Robotic laser coating removal system (Tech. Rep.). Air Force Research Lab, Wright-Patterson AFB, OH. https://apps.dtic.mil/sti/citations/ADA608206
6. Bai, M., Huang, Y., Chen, B., Yang, L., & Zhang, Y. (2020). A novel mixture distributions-based robust Kalman filter for cooperative localization. IEEE Sensors Journal, 20(24), 14994–15006. https://doi.org/10.1109/JSEN.2020.3012153
7. Bianchi, V., Ciampolini, P., & De Munari, I. (2019). RSSI-based indoor localization and identification for ZigBee wireless sensor networks in smart homes. IEEE Transactions on Instrumentation and Measurement, 68(2), 566–575. https://doi.org/10.1109/TIM.2018.2851675
8. Boutteau, R., Rossi, R., Qin, L., Merriaux, P., & Savatier, X. (2020). A vision-based system for robot localization in large industrial environments. Journal of Intelligent & Robotic Systems, 99(2), 359–370. https://doi.org/10.1007/s10846-019-01114-x
9. Bradski, G. (2000). The OpenCV library. Dr. Dobb's Journal of Software Tools.
10. Chen, C., Wang, B., & Ye, Q.-T. (2004). Application of automated guided vehicle (AGV) based on inductive guidance for newsprint rolls transportation system. Journal of Donghua University, 21(2), 88–92. https://doi.org/10.3969/j.issn.1672-5220.2004.02.017
11. Chen, Z., Wang, G., & Hua, X. (2019). High-precise monocular positioning with infrared LED visual target. 2019 IEEE International Conference on Real-time Computing and Robotics (RCAR), Irkutsk, Russia. https://doi.org/10.1109/RCAR47638.2019.9044142
12. Cho, H., Kim, E. K., & Kim, S. (2018). Indoor SLAM application using geometric and ICP matching methods based on line features. Robotics and Autonomous Systems, 100, 206–224. https://doi.org/10.1016/j.robot.2017.11.011
13. He, C., Tang, C., & Yu, C. (2020). A federated derivative cubature Kalman filter for IMU-UWB indoor positioning. Sensors, 20(12), 3514. https://doi.org/10.3390/s20123514
14. Hofacker, S. A. (1993). Large aircraft robotic paint stripping (LARPS) system. SAE Transactions, 102, 11–21. http://www.jstor.org/stable/44739952
15. Holmes, S., Klein, G., & Murray, D. W. (2008). A square root unscented Kalman filter for visual monoSLAM. 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA. https://doi.org/10.1109/ROBOT.2008.4543780
16. Hu, X., Luo, Z., & Jiang, W. (2020). AGV localization system based on ultra-wideband and vision guidance. Electronics, 9(3), 448. https://doi.org/10.3390/electronics9030448
17. Jazwinski, A. (2007). Stochastic processes and filtering theory. Dover Publications.
18. Julier, S. J., & Uhlmann, J. K. (1997). New extension of the Kalman filter to nonlinear systems. In I. Kadar (Ed.), Signal Processing, Sensor Fusion, and Target Recognition VI (Vol. 3068, pp. 182–193). https://doi.org/10.1117/12.280797
19. Kalman, R. E. (1960). A new approach to linear filtering and prediction problems. Journal of Basic Engineering, 82(1), 35–45. https://doi.org/10.1115/1.3662552
20. Karkus, P., Cai, S., & Hsu, D. (2021). Differentiable SLAM-net: Learning particle SLAM for visual navigation. Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2815–2825.
21. Lee, J., Hyun, C.-H., & Park, M. (2013). A vision-based automated guided vehicle system with marker recognition for indoor use. Sensors, 13(8), 10052–10073. https://doi.org/10.3390/s130810052
22. Lepetit, V., Moreno-Noguer, F., & Fua, P. (2009). EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision, 81. https://doi.org/10.1007/s11263-008-0152-6
23. Li, B., Hao, Z., & Dang, X. (2019). An indoor location algorithm based on Kalman filter fusion of ultra-wide band and inertial measurement unit. AIP Advances, 9(8). https://doi.org/10.1063/1.5117341
24. Li, S., Xu, B., Wang, L., & Razzaqi, A. A. (2020). Improved maximum correntropy cubature Kalman filter for cooperative localization. IEEE Sensors Journal, 20(22), 13585–13595. https://doi.org/10.1109/JSEN.2020.3006026
25. Mur-Artal, R., & Tardós, J. D. (2017). ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras. IEEE Transactions on Robotics, 33(5), 1255–1262. https://doi.org/10.1109/TRO.2017.2705103
26. Nguyen, T. H., Nguyen, T.-M., & Xie, L. (2021). Range-focused fusion of camera-IMU-UWB for accurate and drift-reduced localization. IEEE Robotics and Automation Letters, 6(2), 1678–1685. https://doi.org/10.1109/LRA.2021.3057838
27. Nørgaard, M., Poulsen, N. K., & Ravn, O. (2000). New developments in state estimation for nonlinear systems. Automatica, 36(11), 1627–1638. https://doi.org/10.1016/S0005-1098(00)00089-3
28. Ortiz-Fernandez, L. E., Cabrera-Avila, E. V., da Silva, B. M., & Gonçalves, L. M. (2021). Smart artificial markers for accurate visual mapping and localization. Sensors, 21(2), 625. https://doi.org/10.3390/s21020625
29. Ronzoni, D., Olmi, R., Secchi, C., & Fantuzzi, C. (2011). AGV global localization using indistinguishable artificial landmarks. 2011 IEEE International Conference on Robotics and Automation, Shanghai, China. https://doi.org/10.1109/ICRA.2011.5979759
30. Schulze, L., & Wullner, A. (2006). The approach of automated guided vehicle systems. 2006 IEEE International Conference on Service Operations and Logistics, and Informatics, Shanghai, China. https://doi.org/10.1109/SOLI.2006.328941
31. Shule, W., Almansa, C. M., Queralta, J. P., Zou, Z., & Westerlund, T. (2020). UWB-based localization for multi-UAV systems and collaborative heterogeneous multi-robot systems. Procedia Computer Science, 175, 357–364. https://doi.org/10.1016/j.procs.2020.07.051
32. Song, Y., Nuske, S., & Scherer, S. (2017). A multi-sensor fusion MAV state estimation from long-range stereo, IMU, GPS and barometric sensors. Sensors, 17(1), 11. https://doi.org/10.3390/s17010011
33. Sorkine-Hornung, O., & Rabinovich, M. (2017). Least squares rigid motion using SVD.
34. Tang, J., Zhu, W., & Bi, Y. (2020). A computer vision-based navigation and localization method for station-moving aircraft transport platform with dual cameras. Sensors, 20(1), 279. https://doi.org/10.3390/s20010279
35. Tang, S., Tang, C., Huang, R., Zhu, S., & Tan, P. (2021). Learning camera localization via dense scene matching. Proc. of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 1831–1841.
36. Tao, B., Wu, H., Gong, Z., Yin, Z., & Ding, H. (2021). An RFID-based mobile robot localization method combining phase difference and readability. IEEE Transactions on Automation Science and Engineering, 18(3), 1406–1416. https://doi.org/10.1109/TASE.2020.3006724
37. Wu, X., Sun, C., Zou, T., Xiao, H., Wang, L., & Zhai, J. (2019). Intelligent path recognition against image noises for vision guidance of automated guided vehicles in a complex workspace. Applied Sciences, 9(19), 4108. https://doi.org/10.3390/app9194108
38. Zhang, L., Chen, Z., Cui, W., Li, B., Chen, C., Cao, Z., & Gao, K. (2020). WiFi-based indoor robot positioning using deep fuzzy forests. IEEE Internet of Things Journal, 7(11), 10773–10781. https://doi.org/10.1109/JIOT.2020.2986685
39. Zhuang, Y., Wang, Q., Shi, M., Cao, P., Qi, L., & Yang, J. (2019). Low-power centimeter-level localization for indoor mobile robots based on ensemble Kalman smoother using received signal strength. IEEE Internet of Things Journal, 6(4), 6513–6522. https://doi.org/10.1109/JIOT.2019.2907707
40. Zou, Q., Sun, Q., Chen, L., Nie, B., & Li, Q. (2021). A comparative analysis of lidar SLAM-based indoor navigation for autonomous vehicles. IEEE Transactions on Intelligent Transportation Systems, 1–15. https://doi.org/10.1109/TITS.2021.3063477
Keywords

  • autonomous guided vehicle
  • dusty spraying environment
  • LED array target
  • vision localization

Unless otherwise noted, NAVIGATION content is licensed under a Creative Commons CC BY 4.0 License.

© 2025 The Institute of Navigation, Inc.
