Summary: | Considering the signal degradation that radio-based indoor positioning systems suffer from environmental factors, and the rising popularity of IP (Internet Protocol) cameras in cities, a novel fusion of inertial measurement units (IMUs) with external IP cameras for determining the positions of moving users in indoor environments is presented. The approach uses a fine-tuned Faster R-CNN (Region Convolutional Neural Network) to detect users in images captured by the cameras, and acquires visual measurements, including the ranges and angles of users with respect to the cameras, based on the proposed monocular vision relative measuring (MVRM) method. The final positions are determined by integrating the positions predicted by each user's IMU with the visual measurements using an extended Kalman filter (EKF). The experimental results show that the ranging accuracy is affected by both the height errors of the bounding boxes detected by Faster R-CNN and the measuring distance, whereas the heading accuracy is affected only by the horizontal biases of the bounding boxes. In our tests, indoor obstacles, including stationary obstacles and a pedestrian, degraded the ranging accuracy more than the heading accuracy, and the pedestrian increased the heading errors more than the stationary obstacles did. We conducted positioning tests with a single user and a single external camera in five indoor scenarios to evaluate the performance. The fused IMU/MVRM solution significantly reduces positioning errors and outperforms both the pure MVRM solution and an ultra-wideband (UWB) solution in dense multipath scenarios.
|
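The IMU/MVRM fusion described in the summary can be illustrated with a single EKF measurement update that corrects an IMU-predicted 2-D position using a camera range/bearing observation. The state layout, camera position, and noise covariances below are illustrative assumptions for a minimal sketch, not the paper's actual filter design (which also includes the IMU prediction step and further states):

```python
import numpy as np

def ekf_update(x_pred, P_pred, z, cam_pos, R):
    """One EKF update: fuse an IMU-predicted position (x_pred, P_pred)
    with a camera measurement z = [range, bearing] (illustrative only)."""
    dx, dy = x_pred[0] - cam_pos[0], x_pred[1] - cam_pos[1]
    r = np.hypot(dx, dy)
    # Predicted measurement: range and bearing from the camera to the user.
    h = np.array([r, np.arctan2(dy, dx)])
    # Jacobian of the measurement model w.r.t. the position state.
    H = np.array([[dx / r,      dy / r],
                  [-dy / r**2,  dx / r**2]])
    y = z - h                                       # innovation
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi     # wrap bearing residual
    S = H @ P_pred @ H.T + R                        # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical numbers: the IMU predicts the user at (4.0, 3.1) m while the
# camera at the origin observes range 5.0 m and bearing atan2(3, 4),
# consistent with a true position of (4.0, 3.0) m.
x, P = ekf_update(np.array([4.0, 3.1]),
                  np.eye(2) * 0.5,
                  np.array([5.0, np.arctan2(3.0, 4.0)]),
                  np.array([0.0, 0.0]),
                  np.diag([0.1, 0.01]))
```

The update pulls the predicted position toward the camera measurement and shrinks the covariance, which is the mechanism by which the visual measurements bound the drift of the pure IMU prediction.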