Research Article | Regular Papers
Open Access

Cooperative Localization for GNSS-Denied Subterranean Navigation: A UAV–UGV Team Approach

David Akhihiero, Uthman Olawoye, Shounak Das, and Jason Gross
NAVIGATION: Journal of the Institute of Navigation December 2024, 71 (4) navi.677; DOI: https://doi.org/10.33012/navi.677
1 Department of Mechanical, Materials and Aerospace Engineering, West Virginia University, West Virginia, U.S.A.

For correspondence: [email protected]

Abstract

This paper presents a cooperative navigation architecture in a global navigation satellite system (GNSS)-denied subterranean environment using an unmanned ground vehicle (UGV) and unmanned aerial vehicle (UAV) team. The main focus of this design is to prolong the UAV mission time by reducing the UAV payload, sensing, and computational elements. To accomplish this, the UGV handles the mapping of the environment, its own state estimation, and the state estimation of the UAV using the UAV’s proprioceptive sensors, a three-dimensional lidar, and an ultra-wideband ranging radio that communicates with a similar radio on the UAV. The UAV is assumed to be instrumented with an inertial measurement unit, stereo camera, and laser altimeter, and the data from these instruments are shared with the UGV over a local network for use in UAV state estimation. This paper presents the architecture for localization of a UAV/UGV team and realizes the implementation using two different nonlinear state estimators. Details and a comparison between an extended Kalman filter and an incremental factor graph optimization implementation are provided. The performance of the presented algorithms is analyzed via experiments conducted in a motion-capture facility.

Keywords
  • cooperative navigation
  • GNSS-denied
  • mapping
  • state estimation
  • subterranean environment

1 INTRODUCTION

Robotic systems have many potential applications in subterranean environments, such as exploration and mapping (Dang et al., 2019), search and rescue (Zhao et al., 2017), infrastructure inspection and maintenance (Martinez Rocamora Jr et al., 2023; Szrek et al., 2020), mining and excavation (Marshall et al., 2016), and hazardous material handling (Dekkata et al., 2020). These environments present significant challenges for autonomous navigation, as global navigation satellite system (GNSS) signals are often degraded or absent, lighting is poor, the terrain is typically rough, uneven, and unstructured, and pollutants such as dust and smoke, which can degrade sensor performance, are often encountered (Khattak et al., 2020; Papachristos et al., 2019).

Autonomous navigation in these environments is greatly enhanced when a robot can map and localize itself within the map. Several works have demonstrated that visual–inertial (VI) localization methods can be effectively used in subterranean environments. For example, Li et al. (2020) used a novel VI localization algorithm for unmanned aerial vehicle (UAV) localization in dynamic underground environments. Their results showed that their proposed method can improve localization accuracy by more than 67% in these environments as compared with monocular VI methods. Further, Weiss et al. (2012) proposed an onboard navigation algorithm for micro aerial vehicles integrating a single camera and an inertial measurement unit (IMU), enabling real-time speed estimation and full self-calibration within an extended Kalman filter (EKF) framework and thus facilitating robust six-degree-of-freedom pose estimation and rapid speed control. In addition, Shen et al. (2015) introduced a monocular VI navigation system for autonomous quad-rotors, utilizing an off-the-shelf camera and an IMU; this approach enables robust state estimation for executing trajectories at high speeds and angles alongside innovative methods for motion estimation without initialization and real-time scale determination without encountering degeneracy. Kimera VI odometry (VIO) is a popular VIO package that uses pre-integrated IMU data and random sample consensus for outlier rejection on the front-end and a fixed-lag smoother on the back-end optimization (Rosinol et al., 2020). While VI methods can yield high localization accuracies, the perceptually degraded nature of subterranean environments makes mapping difficult. Consequently, although the long, featureless corridors in such environments can cause drift in lidar-based mapping, three-dimensional (3D) lidar mapping is often the better choice for simultaneous localization and mapping (SLAM) (Ren et al., 2019).

Several authors have realized the benefit of heterogeneous robots for various applications, such as construction monitoring (Asadi et al., 2020), stone mine exploration and mapping (Martinez Rocamora Jr et al., 2023), and target detection (Minaeian et al., 2015), amongst others. These applications typically leverage the higher-fidelity sensing of an unmanned ground vehicle (UGV) and the greater mobility of a UAV in harsh terrains to improve overall situational awareness and operations. These systems can also enhance the overall localization. For example, Sivaneri and Gross (2017) and, more recently, Wang et al. (2021) demonstrated the benefits of UAVs for assisting UGV localization in GNSS-degraded environments, such as urban canyons. These cooperative UAV/UGV localization solutions with ultra-wideband (UWB) ranging have been demonstrated in field tests in areas such as GNSS-degraded urban environments and provide an alternative solution when GNSS alone is not sufficient (Xianjia et al., 2021). Another line of research has considered cooperative mapping with heterogeneous teams. For example, Hood et al. (2017) considered indoor exploration in which a UAV followed a UGV to provide a bird's-eye view while each platform performed SLAM, and cooperative localization was employed to refine state estimations. Similarly, Kim et al. (2019) showed the benefit of having a UAV develop an initial model to facilitate route planning for the UGV to collect data for a more detailed map. Li et al. (2021) considered the application of smart cities and used UAV and UGV mapping to develop a 3D occupancy grid for traversability mapping.

Our work considers approaches for cases in which GNSS is completely denied. Here, the UAV and UGV work together so that the heterogeneous team has different sensing and computational capabilities. In particular, to reduce its payload and complexity, the UAV does not have a mapping capability; thus, cooperative localization via teaming is required, where the UAV and UGV benefit from their partner’s strength.

This paper builds on our previous works (De Petrillo et al., 2021, 2023; Gross et al., 2019) with multiple significant differences. First, in all of our prior papers, the presented UAV localization algorithm was constrained to the assumption that either the UAV or the UGV is in motion, but not both at the same time. In particular, while the UAV was in motion and the UAV localization was estimated, the UGV was assumed to be static. This assumption was made to simplify the problem by reducing the observation models needed for ranging and visual-sensor-based tracking. In this paper, we remove this constraint and estimate the UAV location when both vehicles are in motion. Second, in this paper, in addition to presenting an error-state EKF localization formulation, we also re-formulate the problem using factor graph optimization (FGO) and present a comparison between using an EKF or incremental FGO for the nonlinear estimator in UAV localization. The contributions of this paper include a novel UAV–UGV teaming and relative localization formulation and an experimental comparison between two popular nonlinear estimation approaches, namely, an error-state EKF and incremental FGO. In particular, insights are drawn from experiments conducted in a large motion-capture facility, which provides localization ground truths for both the UAV and the UGV.

2 SYSTEM DESIGN

Several sensing modalities were used to estimate the UAV state, as shown in Table 1. First, in addition to the IMU in the UAV's flight controller, the UAV had a laser altimeter mounted as part of the sensor payload. Second, a UWB radio (DWM-1001) was mounted on the UAV, and a companion UWB radio was mounted on the UGV to provide ranging measurements between the UAV and UGV. A stereo camera connected to the UAV's onboard computer was used for velocity estimation.

TABLE 1. Summary of Sensors

Figure 1 presents a photograph of the UAV with its sensors. The UAV platform was a Tarot 650 drone equipped with a single-board computer and a minimal sensor suite. A SECO Intel Celeron N3350 single-board computer coupled with a PixHawk 4 autopilot with an integrated IMU served as the UAV's onboard computer and autopilot. A LIDAR-Lite single-beam laser altimeter was integrated into the bottom of the UAV, which also carries a UWB radio to provide a direct ranging measurement to the UGV. An Intel RealSense T265 camera was incorporated into the UAV for use as a body-frame velocity sensor. To facilitate communication over a local network, the UAV and UGV were Wi-Fi-enabled. As a result, a single instance of the robot operating system (ROS) core running on the UGV's computer allowed both the UAV and UGV to access sensor messages. With the IMU and laser-altimeter data shared from the autopilot, it was possible to control the UAV from the UGV and to support relative localization.

FIGURE 1. Quad-rotor UAV for experimental evaluation (West Virginia University [WVU] photo)

The UGV used for the experiments in this work was equipped with an array of sensors to support 3D SLAM and to estimate the location of a multi-rotor UAV. In particular, as shown in Figure 2, a Clearpath Robotics Husky UGV platform (Clearpath, 2023) served as the main drive chassis for the UGV. The UGV was equipped with a custom computer box and sensor mounting stack constructed from a T-bar as well as a 200-m-range Ouster OS1 64-channel 3D lidar with a 45° vertical field of view (+22.5° to −22.5°) and 360° horizontal field of view. The UGV was also equipped with a tactical-grade micro-electromechanical system (MEMS) IMU (Analog Devices ADIS 16488) to support 3D lidar–inertial SLAM. The main computer was equipped with an i7-7700 processor, and the ROS framework (Quigley et al., 2009) served as the foundation for all software running on the UGV.

FIGURE 2. Sensor setup for a subterranean UGV called Badger (WVU photo)

3 FORMULATION OF THE NAVIGATION ALGORITHM

For our UAV/UGV collaboration, deviating from our past related work (De Petrillo et al., 2021, 2023; Gross et al., 2019), both the UAV and UGV are assumed to potentially be in motion. While the UGV navigates and maps the environment, the UAV provides an aerial viewpoint ideal for locating potential targets. The UAV can use the map generated by the UGV for path planning. The UGV generates a 3D SLAM solution as it traverses the environment using the lidar–inertial odometry package FAST-LIO (Xu & Zhang, 2021). Other state-of-the-art SLAM packages such as LOAM (Zhang & Singh, 2014) and LIO-SAM (Shan et al., 2020) could also potentially have been used without issue. However, the primary focus of this paper is UAV localization within a local frame of reference, with no contributions to UGV SLAM technology.

In this work, to expand upon our prior work and offer additional improvements, we implement and compare two different nonlinear state estimation frameworks, namely, an error-state EKF and an incremental FGO. Although the EKF is easier to implement and is usually more computationally efficient, it often requires more extensive tuning and can easily be derailed by degraded measurements (Hu et al., 2024; Wen et al., 2021). Compared with the EKF, FGO has a few advantages. First, unlike a standard Kalman filter, which iterates only once for each state, the FGO uses multiple iterations to minimize cost. Second, in contrast to the single linearization step of the standard Kalman filter, the FGO also linearizes the nonlinear measurement model at each iteration step for each state. Additionally, it has been demonstrated that factor graphs more effectively utilize the temporal correlation between earlier and more recent epochs, which has been attributed to the batch nature of the estimation method (Das et al., 2021). Given that some measurement updates that depend on the UGV localization will be error-prone because of errors in the UGV pose estimate and that the lidar position update can be an outlier as well as highly nonlinear because of the short-distance ranging measurements, we expect that a comparison will help determine which approach is better suited for our application. The benefits of FGO are discussed further in a recent tutorial article (Taylor & Gross, 2024). Figures 3 and 5 show summaries of these two approaches. To predict the navigation state, the UAV's onboard inertial navigation is integrated, and four different asynchronous measurement sources are available for sensor fusion within the EKF/incremental FGO to determine the UAV's full state.

FIGURE 3. Block diagram of the GNSS-denied UAV–UGV navigation system

FIGURE 4. Flow chart for object tracking and detection with 3D lidar data

FIGURE 5. Incremental FGO for the GNSS-denied UAV–UGV navigation system

3.1 Error-State EKF Algorithm

Using a formulation similar to that in our previous work (Gross et al., 2019) and building upon it to allow UGV motion, the attitude, position, and velocity of the UAV in a local north–east–down (NED) navigation frame, with its origin centered at the UGV’s starting location, comprise the state vector of interest, as shown in Equation (1), where we have the following:

  • (ϕbn,θbn,ψbn) represents the UAV attitude,

  • (vN, vE, vD) represents the UAV velocity, and

  • (rN, rE, rD) represents the UAV position.

As shown in Equation (2), the error-state EKF estimates the errors of the state vector of interest as small deviations (δ), as well as the IMU accelerometer and rate gyroscope biases (ba and bg), where γbn represents the small-angle attitude error:

$$\hat{x} = \left[\phi_b^n, \theta_b^n, \psi_b^n, v_N, v_E, v_D, r_N, r_E, r_D\right]^T \tag{1}$$

$$\delta\hat{x} = \left[\gamma_{b(1)}^n, \gamma_{b(2)}^n, \gamma_{b(3)}^n, \delta v_N, \delta v_E, \delta v_D, \delta r_N, \delta r_E, \delta r_D, b_{a_x}, b_{a_y}, b_{a_z}, b_{g_x}, b_{g_y}, b_{g_z}\right]^T \tag{2}$$

The states and error states in Equations (1) and (2) were predicted within the error-state EKF at a high rate (50 Hz) using inertial navigation and were then updated as new measurements became available at different rates. The available measurements included data from a laser altimeter on the UAV publishing at 10 Hz, lidar-based tracking of the UAV from the UGV at 10 Hz, stereo-camera-based measurements of the UAV’s body-axis velocity at 40 Hz, and UWB ranging measurements between the UAV and UGV publishing at 10 Hz.
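To make this data flow concrete, the following is a minimal Python sketch of how a 50-Hz prediction loop can interleave these asynchronous updates. The class, method, and queue names are our own illustrative assumptions, not the authors' implementation; only the rates (50/10/10/40/10 Hz) come from the description above.

```python
# Sketch of a 50-Hz prediction loop with asynchronous measurement updates.
# Method bodies are elided; they correspond to the equations cited below.
import heapq

class ErrorStateEKF:
    def predict(self, imu_sample, dt): ...           # Eqs. (3)-(9)
    def update_altimeter(self, z): ...               # Eqs. (10)-(11)
    def update_lidar_position(self, z): ...          # Eqs. (12)-(14)
    def update_uwb_range(self, z, ugv_pose): ...     # Eqs. (15)-(17)
    def update_velocity(self, z): ...                # Eqs. (18)-(19)

def run(ekf, imu_stream, measurement_queue):
    """Propagate at the IMU rate; apply each measurement when its stamp is reached.
    measurement_queue is a heap of (stamp, kind, z, aux) tuples."""
    for imu_sample in imu_stream:                    # 50 Hz
        ekf.predict(imu_sample, dt=0.02)
        while measurement_queue and measurement_queue[0][0] <= imu_sample.stamp:
            stamp, kind, z, aux = heapq.heappop(measurement_queue)
            if kind == "alt":
                ekf.update_altimeter(z)              # 10 Hz
            elif kind == "lidar":
                ekf.update_lidar_position(z)         # 10 Hz
            elif kind == "uwb":
                ekf.update_uwb_range(z, aux)         # 10 Hz
            elif kind == "vel":
                ekf.update_velocity(z)               # 40 Hz
```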

By describing the nonlinear state updates and linearized error-state transition model for the inertial navigation system (INS), as well as the observation models and observation sensitivity matrices for each measurement update source, the subsections below detail the specific formulations adopted for each of the primary navigation information sources used for fusion in the EKF.

3.1.1 UAV Inertial Navigation

Because a MEMS IMU is used, the INS models were simplified by omitting the effects of the Earth's rotation and craft-rate terms (e.g., assuming that the associated partial derivatives are zero), and we applied the NED navigation frame described by Groves (2015). The localization accuracy is not significantly impacted by these contributions because our UAV had relatively low velocities in our local frame, and these signals were imperceptible relative to the stochastic errors of an IMU of this grade. Further, these models require knowledge of the platform location on the surface of the Earth, which is a limiting assumption for a GNSS-denied application. The UAV attitude is predicted using Equation (3):

$$\hat{C}_{b,k+1|k}^{\,n} = \hat{C}_{b,k|k}^{\,n}\left(I_{3\times3} + \Omega_{ib}\,\delta t\right) + \epsilon_{ATT} \tag{3}$$

where $\epsilon_{ATT}$ is the attitude integration process noise, $C_b^n$ is the rotation matrix from the body frame to the navigation frame, $\Omega_{ib}$ is the skew-symmetric matrix with terms defined by the IMU rate gyroscopes ($p$: roll rate, $q$: pitch rate, $r$: yaw rate), as shown in Equation (4) (Groves, 2015), and $\delta t$ is the IMU sampling time:

$$\Omega_{ib} = \begin{bmatrix} 0 & -r & q \\ r & 0 & -p \\ -q & p & 0 \end{bmatrix} \tag{4}$$

The velocity update then converts the specific force measurements from the IMU’s body frame to a local navigation frame, accounts for the acceleration due to gravity, and integrates over time, disregarding Coriolis terms as a further simplification, as shown in Equation (5):

$$\begin{bmatrix} \hat{v}_N \\ \hat{v}_E \\ \hat{v}_D \end{bmatrix}_{k+1|k} = \begin{bmatrix} \hat{v}_N \\ \hat{v}_E \\ \hat{v}_D \end{bmatrix}_{k|k} + \left(\hat{C}_{b,k+1|k}^{\,n}\begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix}\right)\delta t + \epsilon_{VEL} \tag{5}$$

where $\epsilon_{VEL}$ is the velocity integration process noise. The position estimates are updated by simply integrating the velocity estimates, using trapezoidal integration to reduce integration errors over the integration interval:

$$\begin{bmatrix} \hat{r}_N \\ \hat{r}_E \\ \hat{r}_D \end{bmatrix}_{k+1|k} = \begin{bmatrix} \hat{r}_N \\ \hat{r}_E \\ \hat{r}_D \end{bmatrix}_{k|k} + \left(\begin{bmatrix} \hat{v}_N \\ \hat{v}_E \\ \hat{v}_D \end{bmatrix}_{k|k} + \begin{bmatrix} \hat{v}_N \\ \hat{v}_E \\ \hat{v}_D \end{bmatrix}_{k+1|k}\right)\frac{\delta t}{2} + \epsilon_{POS} \tag{6}$$

where $\epsilon_{POS}$ denotes the noise in the assumed position integration process. The IMU sensor biases are all updated assuming random-walk bias dynamics, as in Equation (7):

$$b_{k+1|k} = b_{k|k} + \epsilon_{bias} \tag{7}$$

Note that for various classes of IMUs, a more rigorous approach for bias and noise stochastic modeling could have been applied by characterizing the bias noise and bias instability characteristics of each sensor via Allan variance analysis (El-Sheimy et al., 2007). In particular, the parameters of a first-order Gauss–Markov stochastic model that represents the non-infinite time correlation of the bias dynamics would be more accurate than the infinite time correlation utilized for the random walk in our formulation. However, given the short time span of our experimental UAV flights, we elected to use a simple random walk formulation. Equation (8) provides the linearized error-state dynamics:

$$F_{INS} = \begin{bmatrix}
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & \hat{C}_{b,k+1|k}^{\,n} \\
-\left(\hat{C}_b^n \begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix}\right)^{\!\Lambda} & 0_{3\times3} & 0_{3\times3} & \hat{C}_{b,k+1|k}^{\,n} & 0_{3\times3} \\
0_{3\times3} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3}
\end{bmatrix} \tag{8}$$

where $(\cdot)^{\Lambda}$ denotes a vector's skew-symmetric matrix. Equation (9) provides the error-state transition matrix expanded to first order:

$$\Phi_{INS} = I_{15\times15} + F_{INS}\,\delta t \tag{9}$$
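The prediction equations above translate directly into code. Below is a minimal numpy sketch of Equations (3)–(9) under the simplifications stated above (no Earth-rate or Coriolis terms); the function and variable names are ours, not the authors' implementation.

```python
# INS prediction sketch. C is the 3x3 body-to-NED rotation estimate,
# v and r are NED velocity and position 3-vectors, dt = 0.02 s at 50 Hz.
import numpy as np

def skew(u):
    """Skew-symmetric matrix of a 3-vector (the Lambda operator in Eq. (8))."""
    return np.array([[0.0, -u[2], u[1]],
                     [u[2], 0.0, -u[0]],
                     [-u[1], u[0], 0.0]])

def ins_predict(C, v, r, gyro, accel, dt, g=9.81):
    """First-order attitude/velocity/position propagation, Eqs. (3), (5), (6)."""
    Omega = skew(gyro)                              # Eq. (4), gyro = [p, q, r]
    C_new = C @ (np.eye(3) + Omega * dt)            # Eq. (3)
    a_n = C_new @ accel + np.array([0.0, 0.0, g])   # specific force to NED + gravity
    v_new = v + a_n * dt                            # Eq. (5)
    r_new = r + 0.5 * (v + v_new) * dt              # Eq. (6), trapezoidal integration
    return C_new, v_new, r_new

def error_state_transition(C_new, accel, dt):
    """Phi = I + F*dt for the 15-state error model, Eqs. (8)-(9)."""
    F = np.zeros((15, 15))
    F[0:3, 12:15] = C_new                  # attitude error driven by gyro bias
    F[3:6, 0:3] = -skew(C_new @ accel)     # velocity error driven by attitude error
    F[3:6, 9:12] = C_new                   # velocity error driven by accel bias
    F[6:9, 3:6] = np.eye(3)                # position error driven by velocity error
    return np.eye(15) + F * dt             # Eq. (9)
```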

3.1.2 Laser-Altimeter Measurement Update

A local Wi-Fi network was used to transmit data from a laser altimeter mounted on the bottom of the UAV, close to its center, to the UGV. Wi-Fi communication latency is assumed to be negligible for this application, as our test environment is relatively small. We acknowledge that this is a limitation of the current work but leave this issue to future work. When the UAV has zero pitch and roll attitude, altimeter measurements are taken to equal the height of the UAV above the ground, assuming that the environment has a flat, level surface below the UAV and UGV. This assumption is valid for our test environment; however, a changing topography would lead to systematic errors. This measurement scales with the orientation of the UAV, as shown in Equation (10), whose observation model is provided in Equation (11) under the assumption that the pitch and roll angles are relatively small (i.e., no singularities):

$$\hat{z}_{ALT} = \frac{r_D}{\cos\theta\,\cos\phi} + \epsilon_{ALT} \tag{10}$$

$$H_{ALT} = \begin{bmatrix} 0_{1\times8} & \dfrac{-1}{\cos\theta\,\cos\phi} & 0_{1\times6} \end{bmatrix} \tag{11}$$

It was assumed that the measurement error of the altimeter measurement was $\epsilon_{ALT} \sim \mathcal{N}(0, \sigma^2)$ with $\sigma = 0.1\,\mathrm{m}$.
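As an illustration, the altimeter case of Equations (10)–(11) can be sketched in numpy on top of a generic error-state update helper. The helper, the variable names, and the simplified innovation handling are our assumptions; the state ordering follows Equation (2), and the sign conventions follow the reconstructed equations above.

```python
# Generic error-state EKF measurement update, then the altimeter case.
# x_err is the 15-element error state and P its covariance.
import numpy as np

def ekf_update(x_err, P, H, innovation, R):
    """Standard EKF update: returns the corrected error state and covariance."""
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_err = x_err + K @ (innovation - H @ x_err)
    P = (np.eye(P.shape[0]) - K @ H) @ P
    return x_err, P

def altimeter_update(x_err, P, z_alt, r_D, theta, phi, sigma=0.1):
    """Altimeter update per Eqs. (10)-(11); r_D is the predicted down position."""
    c = np.cos(theta) * np.cos(phi)
    z_pred = r_D / c                           # Eq. (10)
    H = np.zeros((1, 15))
    H[0, 8] = -1.0 / c                         # Eq. (11): only the down position
    R = np.array([[sigma**2]])
    return ekf_update(x_err, P, H, np.array([z_alt - z_pred]), R)
```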

3.1.3 Lidar UAV Position Measurement Update

The lidar sensor generates a point cloud at a rate of approximately 10 Hz, $P = \{p_i \in \mathbb{R}^3, i = 1, 2, \ldots\}$, where each point $p_i = (x_i, y_i, z_i)$ represents the relative location from the sensor reference frame to the surface at which the point was sampled. The 3D lidar data are used to constrain the relative position estimate between the UAV and UGV. The collection of these points, $P$, provides a sparse representation of the area around the sensor. As a result, identifying the points in the point cloud that were reflected from the UAV provides the UAV's precise position. The rolling-shutter effect of the 10-Hz spinning lidar sensor was not considered in this work.

Figure 4 shows the algorithm for detecting the UAV within the point cloud. A torus is formed from the point cloud using the UWB and altimeter measurements. In this way, many potential false UAV detections are removed, and the computation required to process the point cloud is reduced. Using the open-source Point Cloud Library (Rusu & Cousins, 2011), a Euclidean clustering algorithm (Cao et al., 2022; Wei et al., 2019) is applied to the segmented point cloud. This clustering algorithm can potentially identify several clusters in each point cloud. Therefore, to further identify which candidate clusters can represent the UAV, the candidate clusters are assessed against a set of heuristics that serve to reduce the potential for false positives (i.e., a simple decision tree).
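A minimal numpy sketch of the torus gating step follows, assuming the point cloud is an N×3 array in the lidar frame; the tolerance values and height convention are illustrative assumptions, not the tuned thresholds used in the experiments.

```python
# Torus gating sketch: keep only points whose range roughly matches the UWB
# measurement and whose height roughly matches the altimeter reading.
import numpy as np

def torus_gate(points, uwb_range, uav_height, dr=0.5, dz=0.3):
    """points: Nx3 array in the lidar frame; returns the segmented subset."""
    rng = np.linalg.norm(points, axis=1)               # 3D range to each point
    keep = (np.abs(rng - uwb_range) < dr) & \
           (np.abs(points[:, 2] - uav_height) < dz)    # height band from altimeter
    return points[keep]
```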

The following heuristics were utilized to identify a cluster as a UAV:

  • A minimum volume constraint regarding the extremities-defined bounding box of the cluster,

  • Constraints on the shape of the bounding box that encapsulates the cluster, for example, specifying the permitted variation in the length, width, and height of the bounding box,

  • A requirement that the point cloud height be within the manually adjustable threshold of the UAV onboard laser altimeter, and

  • A constraint that the detected cluster be close to the predicted UAV position, assuming a reasonable UAV velocity between detections.

Empirically tuning the thresholds for these heuristics allowed us to reduce the number of clusters incorrectly reported as UAV locations, based on visual inspection. These thresholds will need to be updated for UAVs of a different size and shape. Once a particular cluster satisfies the set of heuristics, its centroid is reported as a UAV position estimate in the sensor frame of the UGV lidar, $[\hat{r}_X, \hat{r}_Y, \hat{r}_Z]^T$. The centroid of the cluster is only an approximation of the UAV position. However, given the shape of the drone, this approximation was considered acceptable for this work, and the measurement error covariance for this measurement update was used to compensate for this uncertainty. Using the transformation matrix $T$ shown in Equation (12), where $(\phi, \theta, \psi, lidar_X, lidar_Y, lidar_Z)$ is the lidar's pose in the sensor frame, the lidar position measurement of the UAV is transformed to the local NED frame and fused within the error-state EKF:

$$T = \begin{bmatrix}
c(\psi)c(\theta) & -s(\psi)c(\phi) + c(\psi)s(\theta)s(\phi) & s(\psi)s(\phi) + c(\psi)s(\theta)c(\phi) & lidar_X \\
-s(\psi)c(\theta) & -c(\psi)c(\phi) - s(\psi)s(\theta)s(\phi) & c(\psi)s(\phi) - s(\psi)s(\theta)c(\phi) & -lidar_Y \\
s(\theta) & -c(\theta)s(\phi) & -c(\theta)c(\phi) & -lidar_Z \\
0 & 0 & 0 & 1
\end{bmatrix} \tag{12}$$

$$\hat{z}_{lidar} = \left(T\begin{bmatrix} \hat{r}_X \\ \hat{r}_Y \\ \hat{r}_Z \\ 1 \end{bmatrix}\right)_{[1:3]} = \begin{bmatrix} r_N \\ r_E \\ r_D \end{bmatrix} + \epsilon_{lidar} \tag{13}$$

$$H_{lidar} = \begin{bmatrix} 0_{3\times6} & -I_{3\times3} & 0_{3\times6} \end{bmatrix} \tag{14}$$

It was assumed that the lidar position update measurement error was $\epsilon_{lidar} \sim \mathcal{N}(0_{3\times1}, \sigma^2 I_{3\times3})$ with $\sigma = 0.2\,\mathrm{m}$.
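For illustration, the centroid-to-NED transformation and position update of Equations (12)–(14) can be sketched as follows, reusing the `ekf_update()` helper sketched in Section 3.1.2; again, the names and structure are our assumptions.

```python
# Lidar position update sketch, Eqs. (12)-(14): transform the detected cluster
# centroid from the lidar frame to NED via the homogeneous matrix T, then fuse.
import numpy as np

def lidar_position_update(x_err, P, T, centroid, r_ned_pred, sigma=0.2):
    """centroid: UAV detection [r_X, r_Y, r_Z] in the UGV lidar sensor frame."""
    z = (T @ np.append(centroid, 1.0))[:3]         # Eq. (13): NED-frame position
    H = np.zeros((3, 15))
    H[:, 6:9] = -np.eye(3)                         # Eq. (14)
    R = sigma**2 * np.eye(3)
    return ekf_update(x_err, P, H, z - r_ned_pred, R)
```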

3.1.4 UWB Ranging Radio Measurement Update

A 10-Hz ranging measurement is provided by the pair of ranging radios on the UAV and UGV. This range is used to aid in segmenting the lidar point cloud, as discussed in Section 3.1.3. In the error-state EKF, this measurement is also fused as an observation. Equation (15) models the range measurement, Equation (16) models the predicted measurement, and Equation (17) provides the observation sensitivity matrix. Accounting for the different frames of the UGV, the UGV pose is used, with the range between the UAV and UGV, to update the UAV pose:

$$\hat{z}_{UWB} = \rho_{UGV}^{UAV} + \epsilon_{UWB} \tag{15}$$

$$\hat{z}_{INS} = \sqrt{(\hat{r}_N - UGV_X)^2 + (\hat{r}_E + UGV_Y)^2 + (\hat{r}_D + UGV_Z)^2} \tag{16}$$

$$H_{UWB} = \begin{bmatrix} 0_{1\times6} & -\dfrac{\hat{r}_N - UGV_X}{\hat{z}_{INS}} & -\dfrac{\hat{r}_E + UGV_Y}{\hat{z}_{INS}} & -\dfrac{\hat{r}_D + UGV_Z}{\hat{z}_{INS}} & 0_{1\times6} \end{bmatrix} \tag{17}$$

Here, it is assumed that the measurement error of the UWB radio is $\epsilon_{UWB} \sim \mathcal{N}(0, \sigma^2)$ with $\sigma = 0.1\,\mathrm{m}$. It is important to note that, in our application, we used UWB radios for short-range line-of-sight ranging observations with relatively slow dynamics. A more detailed measurement error model and dynamic calibration procedure would likely be required for applications with faster dynamics and/or non-line-of-sight ranging (Denis et al., 2003).
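A short sketch of this range update, following the sign conventions of the reconstructed Equations (16)–(17) and again reusing the `ekf_update()` helper (our assumed structure):

```python
# UWB range update sketch, Eqs. (15)-(17): predicted range from the current
# NED position estimate and the UGV SLAM position, with a unit-vector Jacobian.
import numpy as np

def uwb_update(x_err, P, z_range, r_ned, ugv_xyz, sigma=0.1):
    d = np.array([r_ned[0] - ugv_xyz[0],
                  r_ned[1] + ugv_xyz[1],
                  r_ned[2] + ugv_xyz[2]])
    z_pred = np.linalg.norm(d)                 # Eq. (16)
    H = np.zeros((1, 15))
    H[0, 6:9] = -d / z_pred                    # Eq. (17)
    R = np.array([[sigma**2]])
    return ekf_update(x_err, P, H, np.array([z_range - z_pred]), R)
```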

3.1.5 UAV Velocity Update

An Intel RealSense T265 camera provided UAV velocity measurements directly in the body frame, which were then fused in the error-state EKF. The models for the velocity measurement and the observation matrix are shown in Equations (18) and (19), respectively:

$$\hat{z}_{VEL} = \begin{bmatrix} v_N \\ v_E \\ v_D \end{bmatrix} + \epsilon_{VEL} \tag{18}$$

$$H_{VEL} = \begin{bmatrix} 0_{3\times3} & -I_{3\times3} & 0_{3\times9} \end{bmatrix} \tag{19}$$

It is assumed that the measurement error of the velocity measurement is $\epsilon_{VEL} \sim \mathcal{N}(0_{3\times1}, \sigma^2 I_{3\times3})$ with $\sigma = 0.01\,\mathrm{m/s}$.
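The velocity update follows the same pattern as the previous sketches; a brief sketch using the same assumed helper:

```python
# Velocity update sketch, Eqs. (18)-(19): H selects the velocity error block.
import numpy as np

def velocity_update(x_err, P, z_vel, v_ned_pred, sigma=0.01):
    H = np.zeros((3, 15))
    H[:, 3:6] = -np.eye(3)                     # Eq. (19)
    return ekf_update(x_err, P, H, z_vel - v_ned_pred, sigma**2 * np.eye(3))
```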

3.2 Incremental Factor Graph Algorithm

FGO (Dellaert & Kaess, 2017) is an estimation framework that utilizes the local structure of measurements and states. For example, range measurements depend only on the UGV and UAV state nodes corresponding to a specific time, and visual odometry measurements depend only on UAV nodes $i$ and $i+1$. This locality helps factorize the joint distribution of all states of the trajectory into a product of local factors $\Psi$. Because the marginal distribution of measurements is independent of the states, the posterior of the states is proportional to the product of the factors. Maximizing this product leads to the maximum a posteriori (MAP) estimate of the states:

$$X^{MAP} = \underset{X}{\operatorname{argmax}}\; p(X \mid Z) = \underset{X}{\operatorname{argmax}} \prod_i \Psi_i \tag{20}$$

This maximization problem can be converted into a minimization problem by using a negative logarithm, resulting in a batch nonlinear least-squares problem that can be efficiently solved by existing open-source solvers such as GTSAM (Dellaert, 2012).
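For Gaussian factors, this conversion is explicit. As a brief sketch (our notation), writing each factor as $\Psi_i \propto \exp\!\left(-\tfrac{1}{2}\left\|h_i(X) - z_i\right\|_{\Sigma_i}^2\right)$ gives

$$\underset{X}{\operatorname{argmax}} \prod_i \Psi_i = \underset{X}{\operatorname{argmin}} \left(-\sum_i \log \Psi_i\right) = \underset{X}{\operatorname{argmin}} \sum_i \tfrac{1}{2}\left\|h_i(X) - z_i\right\|_{\Sigma_i}^2$$

which is the nonlinear least-squares form passed to the solver.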

The UAV state estimation problem was formulated as an FGO that incrementally smooths the state trajectory over time. The optimization estimates a navigation state Xi comprising the drone’s 3D pose, velocity, and IMU biases at each time step i. The factor graph structure incorporates inertial and visual–range sensor measurements. Initially, the prior factor for the first state node is based on the provided initialization parameters. For subsequent states, prior factors come from IMU propagation or lidar and UWB ranging measurements, when available. At each time step, a new IMU factor is added between consecutive state nodes to constrain the motion predicted by pre-integrating inertial measurements (Forster et al., 2015). Additional measurement factors are incorporated to correct the current state estimate when new visual or range observations are received.

These measurement factors include a prior factor (Ψprior) initialized with velocity prior information obtained from in-built visual odometry from the T265 camera and a position prior from lidar segmentation achieved by the UGV, as shown in Section 3.1.3. Measurements from the altimeter are used as the prior for the elevation factor (Ψelevation), which constrains the z-position at all states. An additional constraint is utilized by adding a range factor (Ψrange) between the UAV and the UGV, whose state is added as a node in the graph with its prior set from the FAST-LIO result. The prior of this factor is obtained from the UWB.

After adding new factors, the incremental FGO computes the MAP estimate of the total state trajectory. Incrementally smoothing the trajectory with new constraints improves the state estimates over time. The optimization integrates IMU propagation with intermittent visual and range updates for accurate and robust state estimation. The formulation is shown in Equation (21):

$$X = \underset{X}{\operatorname{argmin}} \sum_i \left( \left\|X_i - X_0\right\|_{\Sigma_p}^2 + \left\|Z_i - h(X_i)\right\|_{\Sigma_{el}}^2 + \left\|X_i - X_{UGV}\right\|_{\Sigma_{range}}^2 \right) + \sum_j \left\|X_j - f(X_{j-1})\right\|_{\Sigma_{IMU}}^2 \tag{21}$$
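For readers who want a concrete starting point, the following is a minimal Python sketch of this kind of incremental FGO using GTSAM's iSAM2 (Dellaert, 2012). The key layout, noise sigmas, and the use of a GPSFactor to apply the lidar position prior are our illustrative assumptions (the elevation factor, a custom unary factor in the formulation above, and the initial prior factors are omitted for brevity); this is not the authors' exact implementation.

```python
# Incremental FGO sketch with GTSAM/iSAM2. Assumed key layout: X = Pose3,
# V = velocity, B = IMU bias. ugv_key is an assumed Pose3 node for the UGV,
# whose prior would come from the FAST-LIO result per the text.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import B, V, X

isam = gtsam.ISAM2()
params = gtsam.PreintegrationParams.MakeSharedD(9.81)   # NED: gravity along +z
params.setAccelerometerCovariance(np.eye(3) * 1e-3)     # assumed noise densities
params.setGyroscopeCovariance(np.eye(3) * 1e-4)
params.setIntegrationCovariance(np.eye(3) * 1e-6)
pim = gtsam.PreintegratedImuMeasurements(params, gtsam.imuBias.ConstantBias())

def add_epoch(k, imu_samples, prev_state, prev_bias,
              lidar_ned=None, uwb_range=None, ugv_key=None):
    """Add state node k with an IMU factor plus any available measurement factors."""
    graph = gtsam.NonlinearFactorGraph()
    values = gtsam.Values()
    for acc, gyro, dt in imu_samples:                   # pre-integration (Forster et al., 2015)
        pim.integrateMeasurement(acc, gyro, dt)
    graph.add(gtsam.ImuFactor(X(k - 1), V(k - 1), X(k), V(k), B(k - 1), pim))
    graph.add(gtsam.BetweenFactorConstantBias(          # random-walk bias evolution
        B(k - 1), B(k), gtsam.imuBias.ConstantBias(),
        gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)))
    if lidar_ned is not None:                           # UGV lidar detection of the UAV
        graph.add(gtsam.GPSFactor(X(k), gtsam.Point3(*lidar_ned),
                                  gtsam.noiseModel.Isotropic.Sigma(3, 0.2)))
    if uwb_range is not None:                           # UWB range factor to the UGV node
        graph.add(gtsam.RangeFactorPose3(
            ugv_key, X(k), uwb_range, gtsam.noiseModel.Isotropic.Sigma(1, 0.1)))
    pred = pim.predict(prev_state, prev_bias)           # initial guess from IMU
    values.insert(X(k), pred.pose())
    values.insert(V(k), pred.velocity())
    values.insert(B(k), prev_bias)
    isam.update(graph, values)                          # incremental MAP update
    est = isam.calculateEstimate()
    pim.resetIntegrationAndSetBias(est.atConstantBias(B(k)))
    return gtsam.NavState(est.atPose3(X(k)), est.atVector(V(k))), est.atConstantBias(B(k))
```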

4 EXPERIMENTAL SETUP

Multiple experimental field tests consisting of simultaneous, autonomous UAV flights and UGV traversals were conducted to assess the performance of the proposed GNSS-denied UAV–UGV navigation system. These tests were conducted inside a large motion-capture facility, shown in Figure 6, whose dimensions were approximately 10 m by 5 m. The room was outfitted with 30 VICON Vantage V5 high-resolution cameras, each with a resolution of 5 megapixels, to capture motion data. Feedback from the VICON motion-capture cameras was used for real-time control. This feedback had a high frame rate of approximately 400 frames per second, allowing precise motion tracking.

FIGURE 6. Motion-capture facility with the UAV and UGV navigating autonomously during the experimental test (WVU photo)

The trajectory waypoints traversed by both the UGV and UAV, shown in Figure 11, were pre-planned before the experiment commenced. Further, the real-time estimate of the UAV pose used for waypoint following was obtained from the VICON motion-tracking system. These pose estimates were obtained by attaching six reflective markers to the body of the UAV and calculating the geometric center of the marker constellation. The flight controller onboard the UAV used the VICON pose data to fly the UAV across the planned waypoints autonomously. Therefore, the UAV position estimates from our EKF and FGO solutions were not used in a closed loop, and the experiments serve only to evaluate the accuracy of the localization approach, not its use for UAV control. Unlike the UAV, the UGV used the VICON motion data only in post-processing to assess the efficacy of the motion controller, which uses pose estimates from the implemented FAST-LIO algorithm (Xu & Zhang, 2021).

To reduce the potential of accidental collisions between the vehicles, the UAV was flown above the UGV at a fixed altitude of 2 m, ensuring that both vehicles were operating in the same test environment. During each test, the UGV followed the predetermined driving trajectory in an 8-shaped pattern across the floor. This specific trajectory was chosen so that the UAV would come into the field of view of the lidar sensor mounted on the UGV platform multiple times during the experiment. With a total test time of approximately 150 s, all of the data collected during the two experimental tests were then post-processed for analysis by a replay of the data stored in ROS bag files.

5 RESULTS

5.1 SLAM Solution

The accuracy of the UGV pose estimation achieved by the FAST-LIO SLAM algorithm (Xu & Zhang, 2021) used by the UGV is examined first. A 3D position root-mean square (RMS) error of 0.4 m was obtained from the UGV pose estimation. Figure 7 compares the estimated UGV position to the ground truth determined by our motion-capture system. The error estimates for the SLAM solution are highlighted in Table 2. The accuracy of the UAV localization estimation is directly affected by the accuracy of the UGV pose estimation. Inaccuracies in the UGV pose lead to erroneous measurement updates because the lidar and UWB measurement updates rely on accurate UGV localization. The RMS position and attitude errors in Table 2 are reasonably small and were deemed sufficient for this work. While improving UGV SLAM is not the focus of this work, we acknowledge that other implementations, such as LIO-SAM (Shan et al., 2020; You et al., 2023) or R-LIO (Chen et al., 2022), have been compared with the adopted FAST-LIO formulation and have demonstrated potential for performance increases. There was a notable peak in the 3D position error, approaching almost 1 m at one point in the experiment. This event would result in an extremely erroneous lidar and UWB measurement update for the UAV in the period around that time in the current implementation. Future work will consider how the UAV information can be fed back to improve the UGV SLAM.

FIGURE 7. UGV SLAM position estimation compared with the ground truth position

TABLE 2. SLAM Error Estimates of the UGV State

TABLE 3. Runtimes for FGO in Experiments 1 and 2

5.2 UAV Localization Results

The results of our EKF and incremental FGO are presented in this section. Experimental testing was conducted to determine how well the estimation algorithms performed; the results for two flight trajectories are outlined below.

To improve UAV pose estimation, we systematically assessed the effects of adding various updates to the filter by post-processing the recorded data and excluding or including various sensor information in each of the estimation algorithms. At first, only inertial navigation and lidar position updates were used. Figure 8 compares the ground truth from the motion-capture solution and lidar position updates for one experiment, and Figure 9 shows a snapshot of a lidar position detection from a point cloud (blue box). However, when only lidar updates with inertial navigation were used, the resulting filter estimate was poor, owing to drift during the periods in which there were few lidar updates. Additionally, frame transformation errors and SLAM localization errors contributed to errors in the lidar position measurements. Because the east and down positions of the UGV, as well as the roll and pitch attitudes, have the largest errors, the lidar position measurement errors are largest for the east and down axes.

FIGURE 8. Lidar position measurement updates compared with the ground truth position

FIGURE 9. Example of lidar position detection (blue box) and EKF estimate (red box) in a 3D point cloud (WVU data)

As shown in Figure 8, the UAV is not consistently tracked during the entire experiment. This result arises because the UAV was outside the lidar’s field of view for portions of the experiments and because the heuristics used to identify the UAV, as detailed above, are conservative to avoid falsely identifying another object as the UAV. To improve UAV detection via lidar, research is being conducted in parallel (Olawoye & Gross, 2023) to complement the scope of this paper.

In addition to lidar position updates, the analysis was expanded to include UWB updates. The results show that the UWB updates reduced the 3D localization RMS error to approximately 12 m. The error was further reduced by including the altimeter updates, which, together with the UAV velocity updates, brought the error to less than 1 m. Figure 10 compares the estimated NED position from our filter/incremental FGO with the ground truth for the first experiment, Figure 11 compares the estimate with a 3D plot for the first experiment, and Figure 12 compares the estimated attitude from our filter/incremental FGO with the ground truth for the first experiment. Figures 13 and 14 show comparison plots for the second experiment. For the second experiment, despite its shorter duration, we observe consistent trends, as reflected in the comparable patterns of RMS errors observed for both trajectories. These results indicate that the incremental FGO solution outperforms the EKF, particularly when there are fewer measurement updates. The error estimates for the FGO solution are significantly smaller than those of the EKF up until the addition of the velocity updates. When lidar, UWB, and altimeter updates are included, the EKF has position errors that are up to five-fold larger than the FGO errors, and for most of the experiment, the FGO solution is smoother. The FGO solution also appears to recover more quickly from poor measurement updates. With all of the measurement updates, both approaches had similar performances, with the EKF performing better in some metrics. Considering the simplicity and lower computational overhead of the error-state EKF compared with the FGO, with sufficient measurement updates and more careful tuning, the EKF is probably the more practical approach. Yet, the incremental FGO approach is a better choice when measurement updates are limited.

FIGURE 10. EKF/FGO position compared with the ground truth position (Experiment 1)

FIGURE 11. 3D plot of EKF/FGO position compared with the ground truth position (Experiment 1)

FIGURE 12. EKF/FGO attitude compared with the ground truth attitude (Experiment 1)

FIGURE 13. EKF/FGO position compared with the ground truth position (Experiment 2)

FIGURE 14. EKF/FGO attitude compared with the ground truth attitude (Experiment 2)

Tables 4 and 6 show the RMS errors for the two flight trajectories using different combinations of measurement updates. In the second experiment, the UAV was outside the UGV lidar’s field of view for almost the entire trajectory; thus, there was an insignificant number of lidar updates. Estimation for this experiment was conducted with the other measurement updates, with the RMS error shown in Table 6. For the first trajectory, lidar was also only able to offer updates during the short period in which the UAV was in the UGV lidar’s field of view; we feel that comparing these results with those for the second flight (with no lidar) provides insight into the importance of these updates. As shown in both tables, the error decreased with each additional update. The UWB updates provided the most significant improvement on the NED axes for both the EKF and the FGO. The FGO appears to better use the UWB updates to improve its attitude estimation, as shown by the reduction in the roll and pitch RMS errors of the FGO. The addition of the altimeter updates provided only a slight improvement for most states, but a significant improvement in the down-axis estimate. This result is due to the fact that the altimeter information primarily constrains only one state (position in the down-axis). Adding velocity updates significantly improved performance for all states of interest because velocity information constrains the position in all axes. Tables 5 and 7 show the maximum error for each axis across both flight trajectories. The maximum error follows trends similar to those of the RMS errors. The EKF attitude estimates were much better in the second experiment than in the first, likely because the flight experiment was shorter, such that the gyroscope bias did not lead to as much attitude error as in the first experiment. Smaller attitude errors translate to smaller position errors, as attitude and position are strongly correlated in inertial navigation. This trend can be seen in the “IMU Only” column of Table 6, where the position error is much smaller than the corresponding field in Table 4. The benefit of lidar updates can be seen by comparing the FGO position RMS errors for the two trajectories. The FGO position errors are notably larger in the second experiment than in the first for all measurement update scenarios. Tables 4 and 6 show that lidar updates benefit both estimation methods, but the FGO, again, seems better able to utilize the information from lidar updates simply because it serves as a batch estimator over the entire time history rather than acting as a sequential estimator.

TABLE 4. EKF/FGO RMS Error (Experiment 1)

TABLE 5. EKF/FGO Maximum Error (Experiment 1)

TABLE 6. EKF/FGO RMS Error (Experiment 2)

TABLE 7. EKF/FGO Maximum Error (Experiment 2)

Our analysis further investigates the runtime of the FGO algorithm for both flight trajectories, unveiling insights into its computational characteristics across varying graph sizes. Notably, we observe a consistent linear growth in computational time as the size of the FGO increases, as shown in Figure 15. This linear relationship underscores the scalability of FGO for this particular problem formulation, indicating that its computational complexity increases proportionally with the size of the graph. This result is expected, as the computational overhead of propagating messages over dense or large graphs amplifies the runtime. At the same time, the order in which messages are passed can also impact convergence speed and overall runtime performance. This observed relationship is consistent with other runs in our scenario, as shown in Table 3.

FIGURE 15. Graph size versus runtime for the FGO algorithm

6 CONCLUSION

In this work, an approach for cooperative localization of a UAV using information from a UGV in GNSS-denied subterranean environments was designed using multiple nonlinear estimators, implemented using real sensors and communication, and assessed experimentally. Experiments were conducted in a motion-capture room to assess the system’s performance while both the UAV and UGV were in autonomous motion to estimate the UAV pose. A teaming arrangement such as this would enable the UAV to be deployed by a UGV to facilitate an overall mission goal such as search and rescue.

The UGV, equipped with a 64-channel lidar and tactical-grade IMU, kept track of its pose using the FAST-LIO SLAM algorithm (Xu & Zhang, 2021), with a 3D RMS error of less than 0.5 m. For the UAV pose estimation, we employed an error-state EKF and an incremental FGO, integrating data from IMU, lidar, UWB, altimeter, and velocity measurements. Measurements from the flight controller IMU were noisy and led to considerable INS errors when left unaided. To mitigate this issue, using a higher-grade IMU on the flight controller, similar to the UGV IMU, would be preferable for future applications. However, other measurement updates compensated for the drift of the UAV IMU. The lidar position measurement updates were sometimes biased because of errors in the UGV pose. While our setup demonstrated efficient real-time data transmission between the UAV-mounted laser altimeter and the UGV through Wi-Fi, it is essential to recognize this limitation for applications in which latency is a critical concern, particularly in larger environments or settings with potential for significant delays in communication. Future work could consider incorporating more realistic communication signals, such as 5G, which would have the added benefit of being useful for ranging and communications. By carefully tuning the filter’s system and measurement noise models, the effect of these offsets was reduced in the localization solution. In future works, the pose estimate of the UGV could be improved by using updates from the UAV, such as range measurements, and by taking advantage of pseudo-measurement constraints of the UGV. This approach would make the algorithm truly collaborative. Additional sensor updates progressively improved the UAV pose accuracy, with the FGO approach outperforming the error-state EKF by providing smoother and more accurate pose estimates.

HOW TO CITE THIS ARTICLE

Akhihiero, D., Olawoye, U., Das, S., & Gross, J. (2024). Cooperative localization for GNSS-denied subterranean navigation: A UAV–UGV team approach. NAVIGATION, 71(4). https://doi.org/10.33012/navi.677

CONFLICT OF INTEREST

The authors declare no conflicts of interest.

ACKNOWLEDGMENTS

This research was supported in part by an academic grant from the National Geospatial-Intelligence Agency (NGA) (Award No. HM0476-18-1-2000, Project Title: Autonomous Navigation of Small UAV/UGV Teams in Underground Tunnels). Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NGA, Department of Defense, or US government. This work has been approved for public release (NGA-U-2024-01756).

This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

REFERENCES

1. Asadi, K., Suresh, A. K., Ender, A., Gotad, S., Maniyar, S., Anand, S., Noghabaei, M., Han, K., Lobaton, E., & Wu, T. (2020). An integrated UGV-UAV system for construction site data collection. Automation in Construction, 112, 103068. https://doi.org/10.1016/j.autcon.2019.103068
2. Cao, Y., Wang, Y., Xue, Y., Zhang, H., & Lao, Y. (2022). FEC: Fast Euclidean clustering for point cloud segmentation. Drones, 6(11), 325. https://doi.org/10.3390/drones6110325
3. Chen, K., Zhan, K., Pang, F., Yang, X., & Zhang, D. (2022). R-LIO: Rotating LiDAR inertial odometry and mapping. Sustainability, 14(17), 10833. https://doi.org/10.3390/su141710833
4. Clearpath. (2023). Husky UGV. Retrieved August 16, 2023, from https://www.clearpathrobotics.com/wp-content/uploads/2013/02/HUSKY_A200_UGV_2013_TEASER_email.pdf
5. Dang, T., Mascarich, F., Khattak, S., Papachristos, C., & Alexis, K. (2019). Graph-based path planning for autonomous robotic exploration in subterranean environments. Proc. of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3105–3112. https://doi.org/10.1109/IROS40897.2019.8968151
6. Das, S., Watson, R., & Gross, J. (2021). Review of factor graphs for robust GNSS applications. arXiv. https://doi.org/10.48550/arXiv.2112.07794
7. De Petrillo, M., Beard, J., Gu, Y., & Gross, J. N. (2021). Search planning of a UAV/UGV team with localization uncertainty in a subterranean environment. IEEE Aerospace and Electronic Systems Magazine, 36(6), 6–16. https://doi.org/10.1109/MAES.2021.3065041
8. De Petrillo, M., Ross, D., & Gross, J. N. (2023). Gaussian process regression for learning environment impacts on localization accuracy of a UAV with respect to UGV for search planning. Proc. of the 2023 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, 260–271. https://doi.org/10.1109/PLANS53410.2023.10139936
9. Dekkata, S. C., Okore-Hanson, T., Yi, S., Hamoush, S., Seong, Y., & Plummer, J. (2020). Autonomous navigation and control of UGVs in nuclear power plants-20381. Proc. of the 46th Annual Waste Management Conference (WM2020), Phoenix, AZ. https://www.osti.gov/biblio/23028007
10. Dellaert, F. (2012). Factor graphs and GTSAM: A hands-on introduction. Georgia Institute of Technology, Tech. Rep. http://hdl.handle.net/1853/45226
11. Dellaert, F., & Kaess, M. (2017). Factor graphs for robot perception. Foundations and Trends in Robotics, 6(1–2), 1–139. https://doi.org/10.1561/2300000043
12. Denis, B., Keignart, J., & Daniele, N. (2003). Impact of NLOS propagation upon ranging precision in UWB systems. Proc. of the IEEE Conference on Ultra Wideband Systems and Technologies, 2003, Reston, VA, 379–383. https://doi.org/10.1109/UWBST.2003.1267868
13. El-Sheimy, N., Hou, H., & Niu, X. (2007). Analysis and modeling of inertial sensors using Allan variance. IEEE Transactions on Instrumentation and Measurement, 57(1), 140–149. https://doi.org/10.1109/TIM.2007.908635
14. Forster, C., Carlone, L., Dellaert, F., & Scaramuzza, D. (2015). IMU preintegration on manifold for efficient visual-inertial maximum-a-posteriori estimation. Robotics: Science and Systems XI. https://www.roboticsproceedings.org/rss11/p06.pdf
15. Gross, J., De Petrillo, M., Beard, J., Nichols, H., Swiger, T., Watson, R., Kirk, C., Kilic, C., Hikes, J., Upton, E., Ross, D., Russell, M., Gu, Y., & Griffin, C. (2019). Field-testing of a UAV-UGV team for GNSS-denied navigation in subterranean environments. Proc. of the 32nd International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS+ 2019), Miami, FL, 2112–2124. https://doi.org/10.33012/2019.16912
16. Groves, P. D. (2015). Principles of GNSS, inertial, and multisensor integrated navigation systems [book review]. IEEE Aerospace and Electronic Systems Magazine, 30(2), 26–27. https://doi.org/10.1109/MAES.2014.14110
17. Hood, S., Benson, K., Hamod, P., Madison, D., O'Kane, J. M., & Rekleitis, I. (2017). Bird's eye view: Cooperative exploration by UGV and UAV. Proc. of the 2017 International Conference on Unmanned Aircraft Systems (ICUAS), Miami, FL, 247–255. https://doi.org/10.1109/ICUAS.2017.7991513
18. Hu, W., Uwineza, J.-B., & Farrell, J. A. (2024). Outlier accommodation for GNSS precise point positioning using risk-averse state estimation. arXiv. https://doi.org/10.48550/arXiv.2402.01860
19. Khattak, S., Nguyen, H., Mascarich, F., Dang, T., & Alexis, K. (2020). Complementary multimodal sensor fusion for resilient robot pose estimation in subterranean environments. Proc. of the 2020 International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece, 1024–1029. https://doi.org/10.1109/ICUAS48674.2020.9213865
20. Kim, P., Price, L. C., Park, J., & Cho, Y. K. (2019). UAV-UGV cooperative 3D environmental mapping. Proc. of the ASCE International Conference on Computing in Civil Engineering 2019, Atlanta, GA, 384–392. https://doi.org/10.1061/9780784482438.049
21. Li, D., Yang, W., Shi, X., Guo, D., Long, Q., Qiao, F., & Wei, Q. (2020). A visual-inertial localization method for unmanned aerial vehicle in underground tunnel dynamic environments. IEEE Access, 8, 76809–76822. https://doi.org/10.1109/ACCESS.2020.2989480
22. Li, J., Cheng, Y., Zhou, J., Chen, J., Liu, Z., Hu, S., & Leung, V. C. (2021). Energy-efficient ground traversability mapping based on UAV-UGV collaborative system. IEEE Transactions on Green Communications and Networking, 6(1), 69–78. https://doi.org/10.1109/TGCN.2021.3107291
23. Marshall, J. A., Bonchis, A., Nebot, E., & Scheding, S. (2016). Robotics in mining. In Springer handbook of robotics, 1549–1576. Springer. https://doi.org/10.1007/978-3-319-32552-1_59
24. Martinez Rocamora Jr., B., Lima, R. R., Samarakoon, K., Rathjen, J., Gross, J. N., & Pereira, G. A. (2023). Oxpecker: A tethered UAV for inspection of stone-mine pillars. Drones, 7(2), 73. https://doi.org/10.3390/drones7020073
25. Minaeian, S., Liu, J., & Son, Y.-J. (2015). Vision-based target detection and localization via a team of cooperative UAV and UGVs. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 46(7), 1005–1016. https://doi.org/10.1109/TSMC.2015.2491878
26. Olawoye, U., & Gross, J. N. (2023). UAV position estimation using a LiDAR-based 3D object detection method. Proc. of the 2023 IEEE/ION Position, Location and Navigation Symposium (PLANS), Monterey, CA, 46–51. https://doi.org/10.1109/PLANS53410.2023.10139979
27. Papachristos, C., Khattak, S., Mascarich, F., Dang, T., & Alexis, K. (2019). Autonomous aerial robotic exploration of subterranean environments relying on morphology-aware path planning. Proc. of the 2019 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, 299–305. https://doi.org/10.1109/ICUAS.2019.8797885
28. Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Wheeler, R., Ng, A. Y., & Berger, E. (2009). ROS: An open-source robot operating system. ICRA Workshop on Open Source Software. https://robotics.stanford.edu/~ang/papers/icraoss09-ROS.pdf
29. Ren, Z., Wang, L., & Bi, L. (2019). Robust GICP-based 3D LiDAR SLAM for underground mining environment. Sensors, 19(13), 2915. https://doi.org/10.3390/s19132915
30. Rosinol, A., Abate, M., Chang, Y., & Carlone, L. (2020). Kimera: An open-source library for real-time metric-semantic localization and mapping. Proc. of the IEEE International Conference on Robotics and Automation (ICRA), Paris, France. https://doi.org/10.1109/ICRA40945.2020.9196885
31. Rusu, R. B., & Cousins, S. (2011). 3D is here: Point Cloud Library (PCL). Proc. of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 1–4. https://doi.org/10.1109/ICRA.2011.5980567
32. Shan, T., Englot, B., Meyers, D., Wang, W., Ratti, C., & Rus, D. (2020). LIO-SAM: Tightly-coupled LiDAR inertial odometry via smoothing and mapping. Proc. of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, 5135–5142. https://doi.org/10.1109/IROS45743.2020.9341176
33. Shen, S., Mulgaonkar, Y., Michael, N., & Kumar, V. (2015). Initialization-free monocular visual-inertial state estimation with application to autonomous MAVs. In Experimental Robotics: The 14th International Symposium on Experimental Robotics, 211–227. https://doi.org/10.1007/978-3-319-23778-7_15
34. Sivaneri, V. O., & Gross, J. N. (2017). UGV-to-UAV cooperative ranging for robust navigation in GNSS-challenged environments. Aerospace Science and Technology, 71, 245–255. https://doi.org/10.1016/j.ast.2017.09.024
35. Szrek, J., Wodecki, J., Błażej, R., & Zimroz, R. (2020). An inspection robot for belt conveyor maintenance in underground mine—infrared thermography for overheated idlers detection. Applied Sciences, 10(14), 4984. https://doi.org/10.3390/app10144984
36. Taylor, C., & Gross, J. (2024). Factor graphs for navigation applications: A tutorial. NAVIGATION, 71(3). https://doi.org/10.33012/navi.653
37. Wang, D., Lian, B., & Tang, C. (2021). UGV-UAV robust cooperative positioning algorithm with object detection. IET Intelligent Transport Systems, 15(7), 851–862. https://doi.org/10.1049/itr2.12063
38. Wei, S., Niu, D., Li, Q., Chen, X., & Liu, J. (2019). A 3D vehicle recognition system based on point cloud library. Proc. of the 2019 Chinese Control Conference (CCC), Guangzhou, China, 7023–7027. https://doi.org/10.23919/ChiCC.2019.8865898
39. Weiss, S., Achtelik, M. W., Lynen, S., Chli, M., & Siegwart, R. (2012). Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments. Proc. of the 2012 IEEE International Conference on Robotics and Automation, St. Paul, MN, 957–964. https://doi.org/10.1109/ICRA.2012.6225147
40. Wen, W., Pfeifer, T., Bai, X., & Hsu, L.-T. (2021). Factor graph optimization for GNSS/INS integration: A comparison with the extended Kalman filter. NAVIGATION, 68(2), 315–331. https://doi.org/10.1002/navi.421
41. Xianjia, Y., Qingqing, L., Queralta, J. P., Heikkonen, J., & Westerlund, T. (2021). Cooperative UWB-based localization for outdoors positioning and navigation of UAVs aided by ground robots. Proc. of the 2021 IEEE International Conference on Autonomous Systems (ICAS), Montreal, QC, Canada, 1–5. https://doi.org/10.1109/ICAS49788.2021.9551177
42. Xu, W., & Zhang, F. (2021). FAST-LIO: A fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter. IEEE Robotics and Automation Letters, 6(2), 3317–3324. https://doi.org/10.1109/LRA.2021.3064227
43. You, B., Zhong, G., Chen, C., Li, J., & Ma, E. (2023). A simultaneous localization and mapping system using the iterative error state Kalman filter judgment algorithm for global navigation satellite system. Sensors, 23(13), 6000. https://doi.org/10.3390/s23136000
44. Zhang, J., & Singh, S. (2014). LOAM: LiDAR odometry and mapping in real-time. Robotics: Science and Systems, 2(9), 1–9. https://doi.org/10.15607/RSS.2014.X.007
45. Zhao, J., Gao, J., Zhao, F., & Liu, Y. (2017). A search-and-rescue robot system for remotely sensing the underground coal mine environment. Sensors, 17(10), 2426. https://doi.org/10.3390/s17102426