Do Androids Dream of Electric Sheep? is the title of the Philip K. Dick science-fiction novel on which the film Blade Runner was based, and the idea that robots could actually have senses and visualise the world around them has come a step closer.
► AI used to create robots that can ‘see’ clear objects

► Systems that enable robots to ‘touch’ and ‘smell’ also in advanced stages
Researchers at the Massachusetts Institute of Technology (MIT) have developed a system that links visual stimuli to human touch sensations; researchers at Carnegie Mellon University have used AI to create robots that can ‘see’ transparent objects and even pick them up; and a French developer, Aryballe, has come up with a system that can ‘smell’ gas leaks or burning. A US firm, Analytical Flavor Systems Inc., has meanwhile created Gastrograph AI, a platform that can recognise hundreds of different flavours.
While these technologies are still some way off giving robots the ability to have human-like experiences, or to imagine or even ‘dream’, they illustrate the rapid progress being made with the help of the latest AI, neural network and IoT technologies. These advances will undoubtedly see new and exciting products developed and brought to market over the next few years.