Yesterday, Google began seeding a new developer beta of Android Oreo, version 8.1, to developers.
The most interesting addition in 8.1 is the Neural Networks API. As Google describes it, the Android Neural Networks API (NNAPI) is an Android C API designed for running computationally intensive machine learning operations on mobile devices. NNAPI is designed to provide a base layer of functionality for higher-level machine learning frameworks (such as TensorFlow Lite, Caffe2, or others) that build and train neural networks. The API is available on all devices running Android 8.1 (API level 27) or higher.
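To give a feel for the shape of the API, here is a minimal sketch that builds, compiles, and runs a one-operation model adding two float tensors. It assumes the NDK's `<android/NeuralNetworks.h>` header on API level 27+, and omits error handling (every call below actually returns a status code you should check):

```c
// Sketch of the NNAPI C API: build, compile, and run a model that adds
// two 1x2 float32 tensors. Requires the Android NDK, API level 27+.
#include <android/NeuralNetworks.h>
#include <stdint.h>
#include <stddef.h>

void run_add_example(float out[2]) {
    ANeuralNetworksModel* model = NULL;
    ANeuralNetworksModel_create(&model);

    // Operand types: a 1x2 float32 tensor and an int32 scalar.
    uint32_t dims[2] = {1, 2};
    ANeuralNetworksOperandType tensorType = {
        .type = ANEURALNETWORKS_TENSOR_FLOAT32,
        .dimensionCount = 2, .dimensions = dims,
        .scale = 0.0f, .zeroPoint = 0,
    };
    ANeuralNetworksOperandType scalarType = {
        .type = ANEURALNETWORKS_INT32,
        .dimensionCount = 0, .dimensions = NULL,
        .scale = 0.0f, .zeroPoint = 0,
    };
    ANeuralNetworksModel_addOperand(model, &tensorType); // 0: input a
    ANeuralNetworksModel_addOperand(model, &tensorType); // 1: input b
    ANeuralNetworksModel_addOperand(model, &scalarType); // 2: fused activation
    ANeuralNetworksModel_addOperand(model, &tensorType); // 3: output

    // ADD takes a fused-activation operand; use none.
    int32_t noActivation = ANEURALNETWORKS_FUSED_NONE;
    ANeuralNetworksModel_setOperandValue(model, 2, &noActivation,
                                         sizeof(noActivation));

    uint32_t addInputs[3] = {0, 1, 2};
    uint32_t addOutputs[1] = {3};
    ANeuralNetworksModel_addOperation(model, ANEURALNETWORKS_ADD,
                                      3, addInputs, 1, addOutputs);

    uint32_t modelInputs[2] = {0, 1};
    uint32_t modelOutputs[1] = {3};
    ANeuralNetworksModel_identifyInputsAndOutputs(model, 2, modelInputs,
                                                  1, modelOutputs);
    ANeuralNetworksModel_finish(model);

    // Compile; the runtime picks the best available processor.
    ANeuralNetworksCompilation* compilation = NULL;
    ANeuralNetworksCompilation_create(model, &compilation);
    ANeuralNetworksCompilation_finish(compilation);

    // Run one inference asynchronously and wait for the result.
    float a[2] = {1.0f, 2.0f}, b[2] = {3.0f, 4.0f};
    ANeuralNetworksExecution* execution = NULL;
    ANeuralNetworksExecution_create(compilation, &execution);
    ANeuralNetworksExecution_setInput(execution, 0, NULL, a, sizeof(a));
    ANeuralNetworksExecution_setInput(execution, 1, NULL, b, sizeof(b));
    ANeuralNetworksExecution_setOutput(execution, 0, NULL, out,
                                       2 * sizeof(float));

    ANeuralNetworksEvent* event = NULL;
    ANeuralNetworksExecution_startCompute(execution, &event);
    ANeuralNetworksEvent_wait(event); // out now holds {4.0f, 6.0f}

    ANeuralNetworksEvent_free(event);
    ANeuralNetworksExecution_free(execution);
    ANeuralNetworksCompilation_free(compilation);
    ANeuralNetworksModel_free(model);
}
```

In practice you would rarely write this by hand; frameworks like TensorFlow Lite generate these calls for you, but the model → compilation → execution flow above is the core of the API.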
On-device inferencing has many benefits:
- Latency: You don’t need to send a request over a network connection and wait for a response. This can be critical for video applications that process successive frames coming from a camera.
- Availability: The application runs even when outside of network coverage.
- Speed: New hardware dedicated to neural network processing provides significantly faster computation than a general-purpose CPU alone.
- Privacy: The data does not leave the device.
- Cost: No server farm is needed when all the computations are performed on the device.
*Android Neural Networks API architecture*
There are also trade-offs that a developer should keep in mind:
- System utilization: Evaluating neural networks involves a lot of computation, which can increase battery power usage. You should consider monitoring battery health if this is a concern for your app, especially for long-running computations.
- Application size: Pay attention to the size of your models. Models may take up multiple megabytes of space. If bundling large models in your APK would unduly impact your users, you may want to consider downloading the models after app installation, using smaller models, or running your computations in the cloud. NNAPI does not provide functionality for running models in the cloud.
These features can be very useful in scenarios like image classification.
You can learn more in the official NNAPI documentation.