Winter is coming

BERT is a natural language model built by Google. It was trained using a corpus of more than three billion words of English text¹. If a human could read at 200 words a minute, 12 hours a day, every day, it would take a little less than 60 years for them to read this much text, once. BERT has more than a third of a billion parameters.
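
As a rough back-of-the-envelope check of that reading-time figure (a sketch, assuming the same round numbers used above: three billion words, 200 words a minute, 12 hours a day):

    # Rough check of the "a little less than 60 years" claim.
    words = 3_000_000_000                  # assumed corpus size, in words
    words_per_day = 200 * 60 * 12          # 200 wpm x 60 min x 12 hours = 144,000 words/day
    years = words / words_per_day / 365    # days of reading converted to years
    print(f"about {years:.0f} years")      # about 57 years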

This is nothing like how humans learn a language, which they do in a few years with a relatively minute amount of data.

The current AI hype cycle is built on two pillars:

  • the ability to throw truly vast amounts of training data at enormous computing systems, to achieve rather limited benefits;
  • the willingness of people involved in this programme to make absurd claims about it, just as similar lies were told during every previous AI hype cycle.

Another AI winter is coming.


  1. Most of this text was the English-language Wikipedia.

