Humans are very good at navigating in nearly any environment. Using our eyes and the vestibular system, the orientation sensor in our inner ears, we can quickly create a mental map of wherever we are. Today's robots, however, are not very good at navigating new spaces without information from an external source, such as a human operator or GPS; GPS does not function indoors and is accurate to only a few meters.
At Wyss Zurich, engineers, scientists and programmers are developing an add-on device for robots, Zurich Eye, which imitates the human eyes and inner ear. Zurich Eye will enable machines to navigate independently in any space. The device contains cameras and an inertial measurement unit (IMU), which provides orientation. As it is moved through a space, Zurich Eye uses these sensors to create a digital map. It can then use its onboard processor and software to analyze the data and enable robots to perform specific tasks. Zurich Eye will be modular and rugged, allowing it to be integrated easily into other systems. The precise maps created by Zurich Eye can also be transmitted to other devices and used to guide other machines or people.
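The core idea of combining an IMU with motion estimates can be illustrated with a simple sketch. Zurich Eye's actual software is not described here, and real visual-inertial systems fuse camera landmarks with IMU data in far more sophisticated ways; the hypothetical `dead_reckon` function below shows only the inertial half of the problem, integrating turn rate and speed into a 2D path of the kind a digital map could be built from.

```python
import math

def dead_reckon(samples, dt):
    """Integrate (yaw_rate, speed) samples into a 2D path.

    Illustrative sketch only: a real visual-inertial device would
    correct this drifting inertial estimate with camera observations.
    yaw_rate is in radians per second, speed in meters per second,
    dt is the time step in seconds.
    """
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for yaw_rate, speed in samples:
        heading += yaw_rate * dt            # integrate angular velocity
        x += speed * math.cos(heading) * dt  # advance along heading
        y += speed * math.sin(heading) * dt
        path.append((x, y))
    return path
```

Because small IMU errors accumulate at every step, pure dead reckoning drifts over time; this is precisely why fusing in camera data, as Zurich Eye does, matters.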
With its affordable components, the Zurich Eye technology could be used in a wide range of fields. In particular, it enables machines to perform difficult tasks in places that are unsafe or impossible for humans to access. For instance, it could be used for the automation of manufacturing processes. Another potential field of application is civil engineering, where the device could be used to very rapidly generate a precise set of 3D plans. Zurich Eye could also be used in the automotive industry, in the fast-growing areas of driver assistance and autonomous vehicles.