
[Infographic] Seven Language Challenges Google BERT Resolves

What language challenges does Google BERT, the latest Google algorithm, solve? It addresses the following seven language nuances:

Named Entity Determination

The information extraction task of finding entities in unstructured text and classifying them into categories such as names, locations, currencies, quantities and time expressions (more commonly called named entity recognition).
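A minimal sketch of how a BERT-based model can be used for this, via the Hugging Face `transformers` pipeline. The library and the `dslim/bert-base-NER` checkpoint are assumptions for illustration, not something the infographic specifies:

```python
from transformers import pipeline

# BERT fine-tuned for named entity recognition; "simple" aggregation
# merges word-piece tokens back into whole entity spans.
ner = pipeline("ner",
               model="dslim/bert-base-NER",
               aggregation_strategy="simple")

for entity in ner("Sundar Pichai announced BERT at Google in October 2019."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
# e.g. PER Sundar Pichai ... ORG Google ...
```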

Textual Entailment (Next Sentence Prediction)

The ability to predict, given a pair of sentences, whether the second sentence logically follows or naturally continues the first.
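BERT is pre-trained with exactly this objective, so its next-sentence-prediction head can be queried directly. A minimal sketch, assuming the Hugging Face `transformers` library and the public `bert-base-uncased` checkpoint:

```python
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

sentence_a = "The man went to the store."
sentence_b = "He bought a gallon of milk."

inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Index 0 = "sentence B follows sentence A", index 1 = "it does not".
probs = torch.softmax(logits, dim=-1)
print(f"P(is next sentence) = {probs[0, 0].item():.3f}")
```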

Coreference Resolution

The ability to determine which real-world entity a noun or pronoun in a phrase refers to. E.g. in "John gave his dog a bone", "his" refers to "John".

Question Answering

Systems designed to automatically answer questions posed by humans in natural language.
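A minimal sketch of extractive question answering with a BERT model fine-tuned on the SQuAD dataset, using the Hugging Face pipeline. The `deepset/bert-base-cased-squad2` checkpoint is an illustrative assumption:

```python
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

result = qa(question="What does BERT stand for?",
            context="BERT stands for Bidirectional Encoder "
                    "Representations from Transformers.")
print(result["answer"], round(result["score"], 3))
```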

Word Sense Disambiguation

A process that aims to identify which sense of an ambiguous word is being used in a given phrase.
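BERT does not ship a dedicated word-sense-disambiguation head, but because its embeddings are contextual, each occurrence of an ambiguous word gets a sense-dependent vector. A minimal sketch (model name and example sentences are assumptions for illustration):

```python
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def embedding_of(sentence, word):
    """Return the contextual embedding of `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]   # (tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

river   = embedding_of("he sat on the bank of the river .", "bank")
money_1 = embedding_of("she deposited cash at the bank .", "bank")
money_2 = embedding_of("the bank approved the loan .", "bank")

cos = torch.nn.CosineSimilarity(dim=0)
print("river vs money :", cos(river, money_1).item())   # noticeably lower
print("money vs money :", cos(money_1, money_2).item())  # noticeably higher
```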

Automatic Summarization

The ability of machines to summarize long texts using mathematical and statistical methods.
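Abstractive summarization is usually handled by encoder-decoder models rather than plain BERT, which is encoder-only and is more often used for extractive summarization. A minimal sketch using the Hugging Face summarization pipeline with a BART checkpoint chosen purely for illustration:

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "Google announced that BERT, a deep bidirectional Transformer model, "
    "is now used to better understand search queries. The update affects "
    "roughly one in ten English queries and helps Google interpret "
    "prepositions and word order that previously confused the ranking system."
)
summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```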

Polysemy Resolution

A process that aims to pick out the correct meaning of a word or phrase that can mean several different things depending on the context it is used in, e.g. organ, crane, table.
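A small illustration rather than a full polysemy-resolution system: BERT's masked-language head conditions on the whole sentence, so the same polysemous word ("crane") leads to very different completions depending on which sense the context implies. The checkpoint and sentences are assumptions for illustration:

```python
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")

for sentence in [
    "The crane lifted the heavy [MASK] onto the truck.",        # machine sense
    "The crane waded through the shallow [MASK] hunting fish.",  # bird sense
]:
    top = unmasker(sentence)[0]
    print(f"{sentence} -> {top['token_str']} ({top['score']:.2f})")
```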

Learn more about Google and search algorithms with us!
