post by [deleted]

This is a link post for



comment by ChristianKl · 2019-08-05T13:58:09.861Z

How exactly would you implement autocompletion using GPT-2? This usage is new to me.

comment by gwern · 2019-08-05T16:29:00.706Z

In theory? You just generate a few random samples with the current text as the prefix and display them. In practice, there are already tools that do this: Talk to Transformer does autocomplete. Even better, IMO, is Deep TabNine for programming languages, trained on GitHub.
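
For concreteness, here's a minimal sketch of the "sample a few continuations of the current prefix" idea, assuming the Hugging Face `transformers` library; the model name, sampling parameters, and the `autocomplete` helper are illustrative choices, not what any of the tools above actually use:

```python
# Minimal GPT-2 autocompletion sketch (assumes: pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def autocomplete(prefix: str, n_suggestions: int = 3, max_new_tokens: int = 20):
    """Sample a few random continuations of `prefix` and return just the new text."""
    inputs = tokenizer(prefix, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,                       # random sampling, not greedy decoding
        top_k=50,                             # illustrative sampling parameter
        max_new_tokens=max_new_tokens,
        num_return_sequences=n_suggestions,   # several candidate completions
        pad_token_id=tokenizer.eos_token_id,
    )
    # Each output sequence includes the prefix tokens; strip them off so
    # only the suggested completion remains.
    prefix_len = inputs["input_ids"].shape[1]
    return [tokenizer.decode(seq[prefix_len:], skip_special_tokens=True)
            for seq in outputs]

print(autocomplete("The quickest way to learn a language is"))
```

A real autocomplete UI would call something like this on every pause in typing and show the candidates in a dropdown; the sampling itself is the whole trick.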