Sensor fusion means combining data from multiple sensors into a coherent overview of the world. In waste processing this means that, for example, weight measurements, haptic sensors and NIR imagery are combined into a comprehensive view of the situation.
In the picking process, sensor fusion combines all sensor data – including historical data – into an analysis of whether an item is wanted and of the optimal way to grip it. ZenRobotics Brain can also, for example, apply machine learning to combine the data of two physical sensors into a so-called virtual sensor. The readings of such a virtual sensor would be impossible to obtain traditionally, i.e. by using the data of the two physical sensors separately. The result is a more diverse and accurate analysis of waste than has been possible before.
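As a minimal illustration of the virtual-sensor idea (not ZenRobotics code – the sensors, thresholds and material classes here are hypothetical), consider fusing a scale reading (mass) with a 3D camera reading (volume) into a density estimate. Neither physical sensor can measure density on its own, yet the fused value is directly useful for deciding whether an item is a wanted material:

```python
def virtual_density_sensor(mass_kg, volume_m3):
    """Fuse two physical sensor readings (mass and volume) into a
    'virtual sensor' reading: material density in kg/m^3."""
    if volume_m3 <= 0:
        raise ValueError("volume must be positive")
    return mass_kg / volume_m3

def classify(density_kg_m3):
    """Map the virtual-sensor reading to a hypothetical material class
    using illustrative density thresholds."""
    if density_kg_m3 < 600:
        return "wood"
    if density_kg_m3 < 2000:
        return "plastic/mixed"
    return "metal/mineral"

# A light, bulky item fuses to a low density and is classified as wood;
# a small, heavy item fuses to a high density and is classified as metal.
light_item = classify(virtual_density_sensor(0.9, 0.002))   # density 450
heavy_item = classify(virtual_density_sensor(7.8, 0.001))   # density 7800
```

In a real system the fusion step would typically be a learned model over many sensor channels rather than a simple ratio, but the principle is the same: the combined signal carries information that no single sensor provides.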