Research Article | Original Article
Open Access

3D Vision Aided GNSS Real-Time Kinematic Positioning for Autonomous Systems in Urban Canyons

Weisong Wen, Xiwei Bai, and Li-Ta Hsu
NAVIGATION: Journal of the Institute of Navigation September 2023, 70 (3) navi.590; DOI: https://doi.org/10.33012/navi.590
Department of Aeronautical and Aviation Engineering, The Hong Kong Polytechnic University, Hong Kong
For correspondence: [email protected]

Keywords

  • 3D vision
  • autonomous system
  • GNSS-RTK
  • NLOS
  • urban canyons

Unless otherwise noted, NAVIGATION content is licensed under a Creative Commons CC BY 4.0 License.

© 2025 The Institute of Navigation, Inc.
