Abstract
Environmental mapping is a key step for mobile robots to perform tasks autonomously and reliably. In recent years, simultaneous localization and mapping (SLAM), in both its visual and laser-based forms, has attracted wide interest. Unfortunately, these technologies are not yet widely deployed, limited by computational complexity, heavy data processing, and the need for very low, predictable latency. This paper adopts edge computing (Shi and Dustdar in Computer 49(5):78–81, 2016) as a solution to accelerate the computation and completes the following work. First, we design an inertial navigation system for a mobile robot in which all sensors are connected at the edge layer of an edge-computing framework, and we process the accelerometer, electronic compass, and gyroscope data. A Kalman-filter data-fusion algorithm integrates these measurements and filters the random drift errors introduced by the gyroscope and the electronic compass; the robot's state is then determined by computing the corresponding attitude angle and position. Second, a low-cost distance sensor measures depth, and the data are uploaded to another fog node for computation. The 3D point coordinates are then projected onto a two-dimensional plane, from which feature points are extracted to build the feature map. Third, extended Kalman filter (EKF) SLAM is used to achieve simultaneous localization and mapping. Finally, experiments validate the method and demonstrate its feasibility. The main improvements of this article are as follows. First, a multi-sensor data-fusion algorithm reduces the positioning error. Second, low-cost distance sensors measure the depth of the environment, reducing cost. Third, translating the three-dimensional depth information into a flat two-dimensional projection reduces the computational load and computing time. Fourth, the computation is distributed across different layers and concentrated on an edge-enabled platform to decrease latency and redundancy.
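As a rough illustration of the third improvement, collapsing the 3D depth cloud into a 2D projection before feature extraction, the idea can be sketched as follows. This is not the authors' implementation; the function name, height band, and 5 cm grid resolution are assumptions chosen for the example.

```python
import numpy as np

def project_to_plane(points_3d, grid_res=0.05, z_min=0.1, z_max=1.5):
    """Project 3D depth points onto the horizontal (x, y) plane.

    Points whose height falls inside [z_min, z_max] are treated as
    obstacles; their (x, y) coordinates are snapped to a 2D grid so that
    downstream feature extraction and the EKF-SLAM update operate on a
    much smaller, planar point set.
    """
    pts = np.asarray(points_3d, dtype=float)
    # Keep only points at obstacle height (drops floor and ceiling hits).
    mask = (pts[:, 2] >= z_min) & (pts[:, 2] <= z_max)
    xy = pts[mask, :2]
    # Quantize to the grid and deduplicate: one cell = one 2D feature point.
    cells = np.unique(np.round(xy / grid_res).astype(int), axis=0)
    return cells * grid_res

# Example: a wall slice at x = 1.0 m plus a floor point that is filtered out.
cloud = [[1.0, 0.00, 0.5],
         [1.0, 0.02, 0.6],   # falls in the same 5 cm cell as the point above
         [1.0, 0.30, 0.7],
         [2.0, 0.00, 0.0]]   # floor point, below z_min
print(project_to_plane(cloud))
```

The payoff is that the map update cost now scales with the number of occupied 2D grid cells rather than with the raw number of depth pixels, which is the load reduction the abstract claims.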
References
Song H, Rawat D, Jeschke S, Brecher C (2016) Cyber-physical systems: foundations, principles and applications. Academic Press, Boston, pp 1–514. ISBN 978-0-12-803801-7
Kim A, Eustice RM (2013) Real-time visual SLAM for autonomous underwater hull inspection using visual saliency. IEEE Trans Robot 29(3):719–733
Wang R et al (2018) A formal model-based design method for robotic systems. IEEE Syst J 99:1–12
Jeschke S, Brecher C, Song H, Rawat D (2017) Industrial internet of things: cybermanufacturing systems. Springer, Cham, pp 1–715. ISBN 978-3-319-42558-0
Folkesson J, Christensen HI (2007) Closing the loop with graphical SLAM. IEEE Trans Robot 23(4):731–741
Zhang L et al (2015) A fast robot identification and mapping algorithm based on kinect sensor. Sensors 15(8):19937–19967
Bailey T, Durrant-Whyte H (2006) Simultaneous localization and mapping (SLAM): Part II. IEEE Robot Autom Mag 13(3):108–117
Diosi A, Kleeman L (2005) Laser scan matching in polar coordinates with application to SLAM. In: 2005 IEEE/RSJ international conference on intelligent robots and systems. pp 3317–3322
Lionis GS, Kyriakopoulos KJ (2002) A laser scanner based mobile robot SLAM algorithm with improved convergence properties. In: 2002 IEEE/RSJ international conference on intelligent robots and systems, pp 582–587
Yang J, Zhao W, Ji C, Jiang B, Zheng Z, Song H (2019) The aircraft tracking based on fully convolutional network and Kalman filter. IET Image Process. https://doi.org/10.1049/iet-ipr.2018.5022
Fu S, Liu H, Gao L, et al (2007) SLAM for mobile robots using laser range finder and monocular vision. In: M2VIP 2007 14th international conference on mechatronics and machine vision in practice, pp 91–96
Cao Y et al (2018) Mobile edge computing for big-data-enabled electric vehicle charging. IEEE Commun Mag 56(3):150–156
Liu J et al (2017) A scalable and quick-response software defined vehicular network assisted by mobile edge computing. IEEE Commun Mag 55(7):94–100
Ziyi S et al (2017) Toward architectural and protocol-level foundation for end-to-end trustworthiness in Cloud/Fog computing. IEEE Trans Big Data. https://doi.org/10.1109/TBDATA.2017.2705418
DiFilippo NM, Jouaneh MK (2015) Characterization of different microsoft Kinect sensor models. IEEE Sens J 15(8):4554–4564
Shi W, Cao J, Zhang Q et al (2016) Edge computing: vision and challenges. IEEE Internet Things J 3(5):637–646
Camplani M, Mantecon T, Salgado L (2013) Depth-color fusion strategy for 3D scene modeling with Kinect. IEEE Trans Cybern 43(6):1560–1571
Durrant-Whyte H, Bailey T (2006) Simultaneous localization and mapping (SLAM). IEEE Robot Autom Mag 13(2):99–110
Burgard W, Stachniss C, Grisetti G (2009) A comparison of SLAM algorithms based on a graph of relations. In: 2009 IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 2089–2095
Ila V, Porta JM, Andrade-Cetto J (2010) Information-based compact pose SLAM. IEEE Trans Robot 26(1):78–93
Lee K, Ryu SH, et al (2014) Analysis of the reference coordinate system used in the EKF-based SLAM. In: Ubiquitous robots and ambient intelligence (URAI), pp 33–38
Piniés P, Lupton T, Sukkarieh S, et al (2007) Inertial aiding of inverse depth SLAM using a monocular camera. In: 2007 IEEE international conference on robotics and automation, pp 2797–2802
Liu W, Wang T, Zhang Y (2014) A relative map approach for efficient EKF-SLAM. In: 2014 IEEE Chinese guidance, navigation and control conference (CGNCC), pp 2646–2650
Guivant J, Nebot E, Baiker S (2000) Autonomous navigation and map building using laser range sensors in outdoor applications. J Robot Syst 17(10):565–583
Zeng WJ, Zhang TD, Jiang DP (2010) Analysis of data association methods of SLAM. Syst Eng Electr 32(4):860–864
Xiao Q, Wu Y, Fu H, Zhang Y (2015) Two-stage robust extended Kalman filter in autonomous navigation for the powered descent phase of Mars EDL. IET Signal Process 9(3):277–287
Welch G, Bishop G (2006) An introduction to the Kalman filter. University of North Carolina, Chapel Hill
Shi W, Dustdar S (2016) The promise of edge computing. Computer 49(5):78–81
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China under Grants U1713212, 61572330, and 61602319, in part by the Natural Science Foundation of SZU under Grant 2016048 and in part by the Technology Planning Project from Guangdong Province, China, under Grant 2014B010118005.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Li, Jq., Zhang, Yf., Chen, Zz. et al. A novel edge-enabled SLAM solution using projected depth image information. Neural Comput & Applic 32, 15369–15381 (2020). https://doi.org/10.1007/s00521-019-04156-2