About

SNIFFBOT: Sniffing Dangerous Gases with Immersive Robots

General information

This page presents the results of the scientific project SNIFFBOT to the general public and aims to draw attention to its work. The project is carried out entirely at the Technische Universität Dresden (TUD).

The following professorships are involved:

  • Prof. Dr. rer. nat. Uwe Aßmann, Chair of Software Technology, Faculty of Computer Science
  • Prof. Dr. phil. nat. habil. Ronald Tetzlaff, Chair of Fundamentals of Electrical Engineering, Faculty of Electrical and Computer Engineering
  • Prof. Dr.-Ing. Steffen Ihlenfeldt, Chair of Machine Tools Development and Adaptive Controls, Faculty of Mechanical Science and Engineering
  • Prof. Dr.-Ing. Diana Göhringer, Chair of Adaptive Dynamic Systems, Faculty of Computer Science
  • Prof. Dr.-Ing. habil. Leon Urbas, Chair of Process Control Systems Engineering, Faculty of Electrical and Computer Engineering
  • Prof. Dr. Gianaurelio Cuniberti, Chair of Materials Science and Nanotechnology, Faculty of Mechanical Science and Engineering
  • PD Dr.-Ing. habil. Waltenegus Dargie, Chair of Computer Networks, Faculty of Computer Science

Project funding

This project has been supported by the German Federal State of Saxony as part of the “SNIFFBOT: Sniffing Dangerous Gases with Immersive Robots” project under grant agreement number 100369691.

Motivation

In environments where hazardous gases may be present, the deployment of people should be avoided as far as possible. Such life-threatening situations occur not only in chemical production plants, but also during accidents, disasters and, of course, the repair of war damage. To adequately protect people from entering such dangerous environments, these areas should instead be explored by autonomous vehicles. If hazardous gases are present, the vehicles should be able to locate, seal or completely eliminate the gas source(s).

Overall objective of the project

The goal of the SNIFFBOT project is to develop methods and technologies for gas-sniffing robots (“sniff-bots”). Sniff-bots are semi-autonomous, remote-controlled robots that can detect toxic gases and allow remote operators to immerse themselves in the robot’s environment and eliminate the dangerous gas sources. To this end, drones and ground robots are to be equipped with modern bio- and micro-sensors so that they can move around in dangerous regions and carry out appropriate work. For the necessary communication, the robots and drones form a dynamic mesh network that enables the bidirectional transmission of commands and data, thus allowing the swarm to act autonomously.
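
The bidirectional mesh communication described above can be illustrated with a minimal sketch. This is not the project's actual protocol; the node names and the simple controlled-flooding scheme are illustrative assumptions only, showing how both sensor data and commands can traverse the same symmetric links.

```python
from collections import deque

class Node:
    """One robot or drone in the mesh (illustrative, not the SNIFFBOT API)."""
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # nodes directly reachable by radio
        self.seen = set()     # message IDs already handled (duplicate suppression)
        self.inbox = []       # payloads delivered to this node

    def link(self, other):
        # Radio links are symmetric, so commands and data flow both ways.
        self.neighbors.append(other)
        other.neighbors.append(self)

def flood(origin, msg_id, payload):
    """Deliver a message to every reachable node by controlled flooding."""
    queue = deque([origin])
    origin.seen.add(msg_id)
    while queue:
        node = queue.popleft()
        node.inbox.append(payload)
        for nb in node.neighbors:
            if msg_id not in nb.seen:   # each node rebroadcasts at most once
                nb.seen.add(msg_id)
                queue.append(nb)

# Hypothetical topology: base station <-> ground robot <-> drone
base, rover, drone = Node("base"), Node("rover"), Node("drone")
base.link(rover)
rover.link(drone)

flood(drone, "m1", "gas detected")    # sensor data travels drone -> base
flood(base, "m2", "inspect source")   # command travels base -> drone
```

Because each node records which message IDs it has already seen, the flood terminates even in topologies with cycles, while still reaching every connected node in both directions.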
If a sniff-bot finds toxic gas, a human operator should “immerse” into the robot’s environment to inspect the area and, if possible, seal or remove the source of the toxic gas remotely. The system supports different types of drones and sensors and offers experts different immersive views through and onto the swarm. For this purpose, three types of interaction are planned in the project: robots with the physical environment, robots with each other or with the drones, and robots with humans, i.e. remote operators or teams in the dangerous situation. The views are augmented by task-related simulations and by updates from integrated models of the environment and the situation, and thus support contextualization, integration and evaluation of the situation and the options for action.
At the same time, the goal is to develop adaptive and self-organizing software for sniff-bots, because different application scenarios demand completely different tasks. The project therefore combines sensor technology with robotic immersion and software technology for application-specific robot tasks.
The following goals are explicitly addressed by the SNIFFBOT project:
1. Development of robust bio- and micro-sensor technology that can be used in harsh environments (e.g. extreme weather conditions).
2. Development of communication protocols for building energy-efficient and fail-safe 3D networks, i.e. networks that enable robot-robot, drone-drone, robot-drone, robot-sensor and sensor-sensor communication.
3. Development of self-distributing and self-organizing immersive software.
4. Development of a lightweight and adaptable simulation environment to model the movement of drones and robots.
5. Development of lightweight middleware to seamlessly coordinate the tasks, data flow and status of the different entities.