Posts

Central, Hong Kong – ACX Meetups Everywhere Fall 2023 2023-08-25T23:23:34.521Z

Comments

Comment by batterseapower on Hong Kong – ACX Meetups Everywhere Spring 2024 · 2024-04-04T12:17:23.289Z · LW · GW

We have 8 RSVPs right now. More are welcome :-)

Comment by batterseapower on Hong Kong – ACX Meetups Everywhere Spring 2024 · 2024-04-01T04:47:48.097Z · LW · GW

Last time we had about 12 people - hope we can get similar numbers for this one :-) - Max

Comment by batterseapower on Are we in an AI overhang? · 2020-07-28T21:34:12.250Z · LW · GW

Isn't GPT-3 already almost at the theoretical limit of the scaling law from the paper? This is what nostalgebraist argues in his blog and Colab notebook. You also get this result if you just compare the 3.14E23 FLOP (i.e. ~3.6k PFLOPS-days) cost of training GPT-3 from the Lambda Labs estimate to the ~10k PFLOPS-days limit from the paper.
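The unit conversion behind the ~3.6k figure can be sanity-checked directly (assuming 1 PFLOPS-day = 10^15 FLOP/s sustained for 86,400 seconds):

```python
# Convert GPT-3's estimated training cost from total FLOP to PFLOPS-days.
# Assumption: 1 PFLOPS-day = 1e15 FLOP/s * 86,400 s = 8.64e19 FLOP.
PFLOPS = 1e15
SECONDS_PER_DAY = 86_400

gpt3_flop = 3.14e23  # Lambda Labs training-cost estimate
pflops_days = gpt3_flop / (PFLOPS * SECONDS_PER_DAY)
print(f"{pflops_days:.0f} PFLOPS-days")  # ~3634, i.e. roughly 3.6k
```

This lands comfortably below the ~10k PFLOPS-days limit quoted from the paper, which is the comparison being made above.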

(Of course, this doesn't imply that the post is wrong. I'm sure it's possible to train a radically larger GPT right now. It's just that the relevant bound is the availability of data, not of compute.)