Dialpad to Raise $50m to Expand Conversational AI

The funds will be used to strengthen its AI services and expand them to additional products, including the enterprise video chat service UberConference
17 July 2018

Tech company Dialpad has announced the closure of a $50 million funding round. The funds will be used to strengthen its conversational AI services and expand them to additional products, including the enterprise video chat service UberConference, VentureBeat reports.

Dialpad has integrated VoiceAI, technology from its acquisition of TalkIQ, into UberConference, its Dialpad phone-replacement product, and its offerings for call center customer service agents. VoiceAI can:

  • provide coaching tips
  • determine whether the person on the other end of a phone or video call is happy with what they’re hearing
  • automatically generate action items from meetings and speech-to-text transcripts.

Evolving from a product standpoint, we’ll be adding multiparty video to it [UberConference] shortly. Beyond that, we will be adding the same AI pieces to it that are in Dialpad and Dialpad Call Center. Having a unified artificial intelligence experience lets a business have much better visibility into how the business is being run, how they’re talking about their products, how their sales and support reps are providing support.
 

Craig Walker

CEO, Dialpad

Earlier Walker founded GrandCentral, a VoIP company acquired by Google in 2007 that would become Google Voice.

Iconiq Capital, Andreessen Horowitz, Amasia, Scale Ventures, and Section 32 took part in the investment. It follows a $17 million round last September. Since its launch in 2011, Dialpad has raised $120 million.

Funding will also be used to grow the company’s headcount by 100 employees. At the moment, it has 275 employees. 

DeepMind to Develop Neural Arithmetic Logic Units

According to researchers, new architecture allows neural networks to perform number-related tasks more efficiently
16 August 2018

A team of researchers from DeepMind has developed a new architecture that allows neural networks to perform number-related tasks more efficiently. It involves the creation of a module with the basic mathematical operations described in it. The module was named Neural Arithmetic Logic Unit (NALU).

The researchers observed that neural networks rarely generalize concepts beyond the data set on which they were trained. For example, when working with numbers, models fail to extrapolate to quantities larger than those seen during training. After studying the problem, the researchers found that this limitation also extends to other arithmetic functions.

When standard neural architectures are trained to count to a number, they often struggle to count to a higher one. We explored this limitation and found that it extends to other arithmetic functions as well, leading to our hypothesis that neural networks learn numbers similar to how they learn words, as a finite vocabulary. This prevents them from properly extrapolating functions requiring previously unseen (higher) numbers. Our objective was to propose a new architecture which could perform better extrapolation.
 

Andrew Trask

Lead researcher, NALU

The NALU architecture predefines a set of basic, potentially useful mathematical operations (addition, subtraction, multiplication, and division). The neural network then learns where these operations are best applied, rather than having to rediscover them from scratch.
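The idea above can be sketched in a few lines of NumPy. This is a simplified, illustrative implementation based on the published description of the NALU (not DeepMind's code): a Neural Accumulator (NAC) whose effective weights are pushed toward {-1, 0, 1} handles addition and subtraction, a second NAC operating in log-space handles multiplication and division, and a learned gate mixes the two paths. All variable names are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nac(x, W_hat, M_hat):
    # Neural Accumulator: tanh(W_hat) * sigmoid(M_hat) biases the
    # effective weights toward -1, 0, or 1, so outputs are additions
    # and subtractions of inputs rather than arbitrary rescalings.
    W = np.tanh(W_hat) * sigmoid(M_hat)
    return x @ W

def nalu(x, W_hat, M_hat, G, eps=1e-7):
    # Additive path: plain NAC (addition/subtraction).
    a = nac(x, W_hat, M_hat)
    # Multiplicative path: NAC in log-space, exponentiated back,
    # which turns sums of logs into products and quotients.
    m = np.exp(nac(np.log(np.abs(x) + eps), W_hat, M_hat))
    # Learned gate chooses between the two paths per output unit.
    g = sigmoid(x @ G)
    return g * a + (1.0 - g) * m

# With weights saturated so the gate picks the additive path and the
# NAC weights are effectively [1, 1], the cell computes x1 + x2:
W_hat = np.full((2, 1), 10.0)  # tanh(10) ~ 1
M_hat = np.full((2, 1), 10.0)  # sigmoid(10) ~ 1
G = np.full((2, 1), 10.0)      # gate ~ 1 -> additive path
print(nalu(np.array([[3.0, 4.0]]), W_hat, M_hat, G))  # ~ [[7.0]]
```

Because the weights are constrained toward exact -1/0/1 values rather than learned free-form, a trained NAC applies the *same* addition to inputs of any magnitude, which is what lets the architecture extrapolate beyond the training range.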

Tests showed that neural networks with the new architecture can learn tasks such as tracking time intervals, performing arithmetic on images of numbers, counting objects in a picture, and executing computer code. 

In March 2018, DeepMind introduced a new paradigm for learning AI models. Unlike standard methods, it does not require a large set of input data: the algorithm learns to perform tasks independently, gradually mastering the necessary skills.