Samsung Creates DRAM for Mobile AI Applications

The LPDDR5 chip is built on a 10-nanometer process technology and has a capacity of 8 Gb
20 July 2018

Samsung has completed the development and testing of a new memory chip based on the LPDDR5 standard. The chip is built on a 10-nanometer process technology and has a capacity of 8 Gb. LPDDR5 DRAM is intended for smartphones with 5G support, automotive on-board electronics, and AI solutions built on mobile platforms.

One of the innovations of LPDDR5 is improved energy efficiency. According to the company, the chip consumes 30% less power than the previous generation of LPDDR4X memory chips. The number of memory banks (subsections of the DRAM cell array) has been doubled from 8 to 16 to provide high speed at low power consumption. In active mode, the module dynamically lowers its operating voltage to match the processing speed of the application.

The new chip also implements an advanced "deep sleep mode" that cuts power consumption to roughly half the standby level of LPDDR4X memory modules.

The new LPDDR5 chip raises the data transfer rate to 6400 Mbit/s, 1.5 times higher than that of the latest LPDDR4X chips. According to Samsung, the module can transfer 51.2 GB of data, the equivalent of 14 Full HD films of 3.7 GB each, in just one second.
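The 51.2 GB/s figure follows directly from the per-pin data rate and the width of the memory interface. Below is a minimal sketch of that arithmetic in Python, assuming a 64-bit (x64) package interface, which the article does not state:

```python
# Back-of-the-envelope check of the 51.2 GB/s figure.
# Assumption: a 64-bit (x64) LPDDR5 package interface; the bus width is not
# given in the article.

data_rate_per_pin_mbps = 6400      # Mbit/s per pin, as reported
bus_width_bits = 64                # assumed interface width

# Total bandwidth: per-pin rate times pin count, converted from Mbit to GB.
total_gb_per_s = data_rate_per_pin_mbps * bus_width_bits / 8 / 1000
print(f"Peak bandwidth: {total_gb_per_s:.1f} GB/s")        # -> 51.2 GB/s

film_size_gb = 3.7                 # Full HD film size used in the article
print(f"Films per second: {total_gb_per_s / film_size_gb:.1f}")  # -> ~13.8
```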

Mass production of the chip is scheduled to begin as soon as the LPDDR5 specification is approved.

Oracle Opens GraphPipe Source Code

GraphPipe is a tool that simplifies serving machine learning models
17 August 2018

Oracle has open-sourced GraphPipe, a tool that simplifies serving machine learning models. It supports models built with the TensorFlow, MXNet, Caffe2 and PyTorch libraries, and is aimed at deployments on IoT devices, custom web services, and corporate AI platforms.

The tool eliminates the need for developers to write custom serving APIs, removes the confusion of working with multiple frameworks, and avoids memory copying during deserialization. The developers hope that GraphPipe will become a standard tool for deploying models.
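For illustration, here is a minimal sketch of querying a model behind a GraphPipe server with the project's Python client; the server address, input shape, and dtype are placeholder assumptions rather than details from the article:

```python
# Minimal sketch: send a tensor to a running GraphPipe model server and read
# back the prediction. Assumes a GraphPipe server is already listening at the
# address below and that the model accepts a float32 tensor of this shape --
# both are placeholder assumptions.

import numpy as np
from graphpipe import remote  # Python client from the GraphPipe project

# Placeholder input; adjust shape and dtype to whatever the served model expects.
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

# remote.execute serializes the array, posts it to the server, and returns the
# model's output as a NumPy array.
predictions = remote.execute("http://127.0.0.1:9000", batch)
print(predictions.shape)
```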

GraphPipe is free and available on GitHub. It builds on existing open-source AI tools: for example, the project ships reference model servers for the TensorFlow framework and for networks in the Open Neural Network Exchange (ONNX) format for portable neural networks.

In September 2017, Microsoft introduced its own tools for working with machine learning. At the same time, the company released Visual Studio Code utilities for building models based on the CNTK and Keras frameworks.