Neural Network Used for Cooking Recipes

One of the AI's best cooking tips is to preheat the oven to 3500 °C for 8 minutes
06 August 2018

The creator of the AIweirdness blog decided to write a cookbook using a neural network. She used a framework from GitHub and published the results.

Despite her earlier attempts, Janelle Shane decided to start from scratch, using the textgenrnn framework to generate recipes. The final training setup looked like this:

  • framework: textgenrnn, long text mode;
  • memory: 40 characters (default);
  • duration: about 15 hours on an NVIDIA Tesla K80 GPU (via Google Cloud);
  • temperature: 0.6.
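The temperature value in the list above controls how randomly a character-level model picks its next character. As a rough, framework-independent sketch (not Shane's actual code, and not textgenrnn's internals), temperature sampling can be written in plain Python:

```python
import math
import random

def temperature_probs(logits, temperature):
    """Softmax over raw scores divided by a temperature.

    Temperatures below 1.0 (such as the 0.6 used here) sharpen the
    distribution, so the model picks likelier characters and makes
    fewer wild guesses; temperatures above 1.0 flatten it.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_with_temperature(logits, temperature=0.6):
    """Draw one index from the temperature-adjusted distribution."""
    probs = temperature_probs(logits, temperature)
    r = random.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

At temperature 0.6 the output distribution is noticeably sharper than plain softmax, which helps explain why the later runs produced more coherent (if still odd) recipes.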

The latest results were better than before, but absurd ingredient combinations still appear:

  • 1 long granules sugar;
  • 1 Spanish water;
  • 1 cup of cream cheese seeds.

And the recipe names resemble computer-generated product descriptions on AliExpress. More details can be found on the AIweirdness blog.

Initially, to teach the AI to write recipes, Janelle used 30,000 ready-made recipes collected from various sources. This attempt was not successful: with a memory only a few words long and no selection strategy, the model produced output that was bizarre or simply impossible. For example:

  • 4.5 kg of broccoli dried in a clay pan;
  • half a pint of spicy pieces;
  • 42 cups of milk;
  • Preheat the oven to 3500 °C for 8 minutes.

These examples from the generated cooking tips illustrate the ineffectiveness of the original method:

  • mix honey, liquid water of the toes, salt and 3 tablespoons of olive oil;
  • throw a frying pan;
  • tear off part of the pan.

In June 2018, an AI was taught to create memes. A month later, in July 2018, researchers attempted to improve a model for recognizing speech accents.

DeepMind to Develop Neural Arithmetic Logic Units

According to the researchers, the new architecture allows neural networks to perform number-related tasks more efficiently
16 August 2018

A team of researchers from DeepMind has developed a new architecture that allows neural networks to perform number-related tasks more efficiently. It adds a module in which basic mathematical operations are described. The module was named the Neural Arithmetic Logic Unit (NALU).

Scientists have noticed that neural networks are rarely able to generalize concepts beyond the data set on which they were trained. For example, when working with numbers, models fail to extrapolate to quantities larger than those seen in training. After studying the problem, the researchers found that it also extends to other arithmetic functions.

When standard neural architectures are trained to count to a number, they often struggle to count to a higher one. We explored this limitation and found that it extends to other arithmetic functions as well, leading to our hypothesis that neural networks learn numbers similar to how they learn words, as a finite vocabulary. This prevents them from properly extrapolating functions requiring previously unseen (higher) numbers. Our objective was to propose a new architecture which could perform better extrapolation.

Andrew Trask

Lead researcher, NALU

The NALU architecture predefines a set of basic, potentially useful mathematical functions (addition, subtraction, multiplication and division). The neural network then learns where these functions are best applied, rather than deriving them from scratch.
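The design described above can be sketched as a forward pass in NumPy. This is an illustrative reconstruction of the published NALU idea with handcrafted (not learned) parameters; the variable names are assumptions, not DeepMind's code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    """Forward pass of a Neural Arithmetic Logic Unit (sketch).

    The additive path handles addition/subtraction; the multiplicative
    path operates in log space to handle multiplication/division; a
    learned gate g mixes the two paths.
    """
    # tanh * sigmoid pushes the effective weights towards {-1, 0, 1}.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    a = x @ W                                # additive path
    m = np.exp(np.log(np.abs(x) + eps) @ W)  # multiplicative path
    g = sigmoid(x @ G)                       # gate between the paths
    return g * a + (1.0 - g) * m

# Handcrafted parameters: large W_hat and M_hat drive W towards
# all-ones, so the additive path sums its inputs; a strongly positive
# gate selects that path.
W_hat = np.full((2, 1), 10.0)
M_hat = np.full((2, 1), 10.0)
G = np.full((2, 1), 10.0)
x = np.array([[3.0, 4.0]])
y = nalu_forward(x, W_hat, M_hat, G)  # ≈ [[7.0]], i.e. 3 + 4
```

In practice W_hat, M_hat and G would all be learned; the tanh·sigmoid constraint biasing effective weights towards -1, 0 and 1 is what lets the unit extrapolate exact arithmetic well beyond the numbers seen in training.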

Tests showed that neural networks with the new architecture can learn tasks such as tracking time intervals, performing arithmetic on numbers shown in images, counting objects in a picture, and executing computer code.

In March 2018, DeepMind introduced a new paradigm for learning AI models. Unlike standard methods, it does not require a large set of input data: the algorithm learns to perform tasks independently, gradually mastering the necessary skills.