What should we expect from GPT-3?

post by avturchin · 2019-03-21T14:28:37.702Z · LW · GW · 1 comment

This is a question post.


When will it appear? (My guess is 2020.)

Will it be created by OpenAI, and will it be publicly announced? (My guess is that it will not be publicly known until 2021, but other companies may release open versions before then.)

How much data will be used for training, and what type of data? (My guess is 400 GB of text plus accompanying images, but no audio or video.)

What will it be able to do? (My guess: translation, image generation from text, and text generation from images, at roughly 70 percent of human performance.)

How many parameters will the model have? (My guess is 100 billion to a trillion; a rough memory estimate is sketched after these questions.)

How much compute will be used for training? (No idea.)
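As a back-of-envelope check on the parameter guess above, here is a minimal sketch of the memory such a model would need just to store its weights, assuming each parameter is a standard 16-bit or 32-bit float (the byte counts are illustrative assumptions, not anything stated in the post):

```python
# Rough memory footprint implied by a given parameter count.
# Assumes weights stored as 16-bit (2-byte) or 32-bit (4-byte)
# floats; actual training would add optimizer state on top.

def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Gigabytes needed to hold n_params parameters."""
    return n_params * bytes_per_param / 1e9

for n_params in (100e9, 1e12):  # the guessed 100B-to-1T range
    fp16 = weight_memory_gb(n_params, 2)
    fp32 = weight_memory_gb(n_params, 4)
    print(f"{n_params:.0e} params: ~{fp16:.0f} GB (fp16), ~{fp32:.0f} GB (fp32)")
```

Even at the low end of that range, the weights alone (~200 GB in fp16) would not fit on a single contemporary GPU, which suggests training and serving would require splitting the model across many devices.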

Answers

answer by avturchin · 2019-11-12T10:16:34.595Z · LW(p) · GW(p)

In October 2019, Google trained a model (T5) on 750 GB of training data, with 11 billion parameters (vs. 40 GB and 1.5B parameters for GPT-2, eight months earlier).
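To make that jump concrete, a quick sketch of the scale-up factors implied by the figures quoted above (just the arithmetic, nothing beyond the quoted numbers):

```python
# Scale-up from GPT-2 (Feb 2019) to Google's T5 (Oct 2019),
# using the figures quoted above.

gpt2_data_gb, gpt2_params = 40, 1.5e9
t5_data_gb, t5_params = 750, 11e9

print(f"training data: {t5_data_gb / gpt2_data_gb:.1f}x more")  # ~18.8x
print(f"parameters:    {t5_params / gpt2_params:.1f}x more")    # ~7.3x
```

Roughly a 19x increase in data and a 7x increase in parameters within eight months, which suggests the 100-billion-parameter guess above is within reach of one more scaling step of that size.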

1 comment


comment by Daniel Kokotajlo (daniel-kokotajlo) · 2020-06-01T13:35:14.243Z · LW(p) · GW(p)

Looks like the results are in!