
Browsing by Subject "Visual Odometry"


  • Joswig, Niclas (2021)
    Simultaneous Localization and Mapping (SLAM) research is gaining a lot of traction as the available computational power and the demand for autonomous vehicles increase. A SLAM system solves the problem of localizing itself during movement (Visual Odometry) and, at the same time, creating a 3D map of its surroundings. Both tasks can be solved with expensive and bulky hardware such as LiDARs and IMUs, but the subarea of visual SLAM aims at replacing those costly sensors with, ultimately, inexpensive monocular cameras. In this work I applied the current state of the art in end-to-end deep learning-based SLAM to a novel dataset comprising images recorded from cameras mounted on an indoor crane from the Konecranes CXT family. One major aspect that is unique about our proposed dataset is the camera angle, which resembles a classical bird's-eye view towards the ground. This orientation change, together with a novel scene structure, has a large impact on the subtask of mapping the environment, which in this work is done through monocular depth prediction. Furthermore, I assess which properties of the given industrial environments have the biggest impact on the system's performance, in order to identify future research opportunities for improvement. The main performance impairments I examined, which are characteristic of most types of industrial premises, are non-Lambertian surfaces, occlusion, and texture-sparse areas along the ground and walls.
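
    For readers unfamiliar with the Visual Odometry subtask mentioned in the abstract, the sketch below illustrates it in its classical feature-based form: estimating the relative camera rotation and (scale-ambiguous) translation between two consecutive monocular frames with OpenCV. This is only an illustrative baseline, not the end-to-end deep learning pipeline applied in the thesis, and the camera intrinsics matrix K is an assumed input.

        # Minimal classical Visual Odometry sketch for one monocular frame pair.
        # Illustrative only; the thesis itself uses an end-to-end learned SLAM system.
        import cv2
        import numpy as np

        def relative_pose(img1, img2, K):
            """Estimate rotation R and scale-ambiguous translation t between two frames."""
            orb = cv2.ORB_create(2000)
            kp1, des1 = orb.detectAndCompute(img1, None)
            kp2, des2 = orb.detectAndCompute(img2, None)

            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

            pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
            pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

            # Essential matrix with RANSAC; monocular VO recovers translation only up to scale.
            E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
            _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
            return R, t

    In texture-sparse, occluded, or non-Lambertian industrial scenes like those described in the abstract, such feature matching degrades quickly, which is one motivation for the learned, depth-prediction-based mapping approach studied in the thesis.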