MIT Proposes to Teach AI Language Like a Baby

The machine must observe people on its own, “listen” to their conversations, and build a vocabulary
02 November 2018

MIT has developed a parser that helps artificial intelligence learn language. Its distinguishing feature is learning through observation, the way children do, Engadget reports.

The approach relies not on explicit definitions of words and concepts but on “weak supervision” and passive learning. The machine must observe people on its own, “listen” to their conversations, and build a vocabulary, much as children learn to speak by listening and picking up words.

The researchers expect this approach to simplify vocabulary acquisition and allow programs and robots to perceive and respond to human speech more accurately.

In conversation, people often speak in sentence fragments and break the rules of grammar. Analyzing words on the fly should improve the performance of AI systems and parsers. Because the parser does not depend on a fixed context, it lets robots understand implicitly worded commands.

The analyzer could also shed light on how children learn language, which would benefit not only robot developers but also specialists who work with children.

MIT trained the network underlying the parser passively: the neural networks were shown videos along with text descriptions of the actions in them, and the system correlated the two, linking words to objects and actions. The researchers used 400 videos.
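The idea of weak supervision can be illustrated with a toy sketch: the learner never sees a word labeled with its meaning, only pairs of (caption, objects visible in the scene), and it associates a word with an object when the two co-occur consistently. The captions, object sets, and the 0.5 consistency threshold below are invented for illustration; this is not MIT's actual model.

```python
from collections import defaultdict

def learn_word_object_links(episodes):
    """Weakly supervised vocabulary sketch: no word is ever labeled
    directly; each episode pairs a caption with the objects seen."""
    cooc = defaultdict(lambda: defaultdict(int))  # word -> object -> count
    word_freq = defaultdict(int)
    for caption, objects in episodes:
        for word in caption.lower().split():
            word_freq[word] += 1
            for obj in objects:
                cooc[word][obj] += 1
    vocab = {}
    for word, counts in cooc.items():
        obj, n = max(counts.items(), key=lambda kv: kv[1])
        # Keep only words that appear with one object most of the time;
        # function words like "the" co-occur with everything and drop out.
        if n / word_freq[word] > 0.5:
            vocab[word] = obj
    return vocab

episodes = [
    ("the ball rolls", {"ball", "floor"}),
    ("pick up the ball", {"ball", "hand"}),
    ("the cup falls", {"cup", "floor"}),
    ("fill the cup", {"cup", "water"}),
]
vocab = learn_word_object_links(episodes)
print(vocab["ball"])  # -> ball
print(vocab["cup"])   # -> cup
```

Note how "the", which co-occurs equally with every object, never crosses the threshold and so acquires no meaning, while content words lock onto the objects they reliably accompany.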

The scientists say the technology scales easily and can be used wherever voice control or communication with AI is needed.

Intel Presents Neural Compute Stick 2

The Neural Compute Stick 2 is a self-contained neural network accelerator on a USB stick
15 November 2018

At a conference in Beijing, Intel introduced the Neural Compute Stick 2, a device that simplifies the development of smart software for edge devices: not only network equipment but also IoT systems, video cameras, industrial robots, medical systems, and drones. The solution is aimed primarily at projects that use computer vision.

The Neural Compute Stick 2 is a self-contained neural network accelerator on a USB stick. It should speed up and simplify software development for edge devices by offloading most of the required computation to the specialized Intel Movidius Myriad X processor, whose Neural Compute Engine handles high-speed deep-learning inference.

The first Neural Compute Stick was created by Movidius, which Intel acquired in 2016. The second version is eight times faster than the first and works on Linux. The device connects over USB to a PC, laptop, or edge device.

Intel says the NCS 2 makes it possible to quickly build, tune, and test deep-learning neural network prototypes. Neither cloud computation nor even an Internet connection is required.

The module is already on sale for $99. Even before sales began, some developers had access to the Intel NCS 2 and used it to build projects such as Clean Water AI, which uses machine vision with a microscope to detect harmful bacteria in water; BlueScan AI, which scans skin for signs of melanoma; and ASL Classification, which translates sign language into text in real time.

Intel worked with Microsoft on the Movidius Myriad X VPU, as announced at a Developer Day conference in March 2018. The AI platform is expected to appear in upcoming Windows updates.