The British company DeepMind works on the unpredictable behavior of AI technologies, Facebook implemented artificial intelligence to help users buy products, computer vision helps to recognize smokers at petrol stations, and hackers can control Alexa.
Read more about this news in our digest.
DeepMind researches the unpredictable behavior of AI
The British company DeepMind has started studying the unpredictable behavior of AI. Researches have shown that neural networks can harm themselves and people in case they optimize their work improperly.
Researchers experimented with CoastRunners game, where players have to compete in the boat race. While playing the game, the neural network understood that it received more scores for crashes with game targets than for finishing the race. As a result, it stopped the participation and started colliding with targets.
DeepMind's representatives say that currently AI is a black box, as the motives of its behavior are often hardly traceable. This results in the lack of trust in artificial intelligence solutions. To make AI more secure and efficient, it is important to control system activity, which includes monitoring and enforcement.
AI will help Facebook to sell goods
Facebook added an AI-based update for Marketplace. Now the service can analyze the uploaded image, define the category of the good, and even recommend the price.
Launched in 2016, Marketplace is available in the USA, Europe, and Russia. Goods are sorted by categories and the geolocation of the seller.
Thanks to AI, the service will get simpler for buyers: Facebook plans to introduce the option allowing users to upload a photo and receive recommendations as to where to buy the depicted product and how much it costs.
Developers state that Marketplace can also offer interior items basing on the photo of the room and choose corresponding goods.
The new AI system will define smokers at petrol stations
Shell oil and gas company and Microsoft developed a system that can recognize a smoking person and prevent fire at petrol stations. Computer vision can detect a cigarette, lighter, or smoke and give an alert that will trigger the alarm system.
Currently, the system is being tested at two petrol stations in Singapore and Thailand. Microsoft is planning to spread the use of the new technology among other petrol stations.
Hackers can control voice assistants
German scientists found out that voice assistants could be hacked using noise hidden in audio files. Voice assistants can perceive commands inaudible to the human ear.
Professor Thorsten Holz called this type of the hacking attack “psychoacoustic hiding”. A person would hear sounds of an application, advertisement, or music, whereas the virtual assistant would “hear”, recognize, and fulfil the command. In such a way, hackers can take hold of private information, turn off the alarm system and cameras.
This is not the first case when vulnerabilities of voice assistants are detected. In 2017, Chinese scientists of Zhejiang University conducted a research and revealed that Alexa could similarly perceive commands. Amazon developers promised to study the problem thoroughly, but the possibility of hacking still exists.
There have been no incidents of such hacking attacks, but scientists recommend using the PIN code to prevent voice assistants from making unauthorized purchases and having access to banks.
You'll know more than your colleagues and business rivals.