Announcing ILIAD — Theoretical AI Alignment Conference

post by Nora_Ammann, Alexander Gietelink Oldenziel (alexander-gietelink-oldenziel) · 2024-06-05T09:37:39.546Z · LW · GW · 18 comments

Contents

  ***Apply to attend by June 30!***
  About ILIAD
  Program and Unconference Format
  Financial Support
18 comments

We are pleased to announce ILIAD — a 5-day conference bringing together 100+ researchers to build strong scientific foundations for AI alignment.

***Apply to attend by June 30!***

See our website here. For any questions, email iliadconference@gmail.com.

About ILIAD

ILIAD is a 100+ person conference about alignment with a mathematical focus. The theme is ecumenical, yet the goal is nothing less than finding the True Names of AI alignment.

Participants may be interested in all of the tracks, in just one or two, or in none at all. The unconference format means participants have maximum freedom to direct their own time and energy.

Program and Unconference Format

ILIAD will feature an unconference format, meaning that participants can propose and lead their own sessions. We believe this is the best way to release the latent creative energies of everyone attending.

That said, freedom can be scary! If taking charge of your own learning sounds terrifying, rest assured there will be plenty of organized sessions as well. We will also run topic-specific workshop tracks.

Financial Support

Financial support for accommodation and travel is available on a needs basis. Lighthaven has capacity to accommodate a portion of participants on-site. Note that these rooms are shared.

18 comments

Comments sorted by top scores.

comment by Chris_Leong · 2024-06-05T13:47:51.732Z · LW(p) · GW(p)

How are applications processed? Sometimes applications are processed on a rolling basis, so it's important to submit as soon as possible. Other times, you just need to apply by the date, so if you're about to post something big, it makes sense to hold off on your application.

Replies from: alexander-gietelink-oldenziel
comment by Alexander Gietelink Oldenziel (alexander-gietelink-oldenziel) · 2024-06-05T16:44:41.437Z · LW(p) · GW(p)

We intend to review applications after the submission deadline of June 30th, but I wouldn't hold off on your application.

comment by jacobjacob · 2024-06-05T10:15:24.724Z · LW(p) · GW(p)

Sidenote: I'm a bit confused by the name. The all caps makes it seem like an acronym. But it seems to not be? 

Replies from: gw, mateusz-baginski, vanessa-kosoy, Lorxus, Alex_Altair
comment by gw · 2024-06-05T11:06:37.899Z · LW(p) · GW(p)

I
Love
Interesting
Alignment
Donferences

Replies from: jacobjacob, TsviBT, TsviBT
comment by jacobjacob · 2024-06-05T14:25:38.033Z · LW(p) · GW(p)

ah that makes sense thanks

comment by TsviBT · 2024-06-05T20:02:40.516Z · LW(p) · GW(p)

honestly i prefer undonferences

Replies from: Alex_Altair
comment by Alex_Altair · 2024-06-05T20:43:05.166Z · LW(p) · GW(p)

How about deconferences?

Replies from: TsviBT
comment by TsviBT · 2024-06-05T21:08:29.995Z · LW(p) · GW(p)

idk, sounds dangerously close to deferences

comment by TsviBT · 2024-06-05T21:32:42.813Z · LW(p) · GW(p)

Insightful
Learning
Implore
Agreed
Delta

comment by Mateusz Bagiński (mateusz-baginski) · 2024-06-06T06:33:48.906Z · LW(p) · GW(p)

Intentional
Lure for
Improvised
Acronym
Derivation

comment by Vanessa Kosoy (vanessa-kosoy) · 2024-06-05T18:06:13.285Z · LW(p) · GW(p)

International League of Intelligent Agent Deconfusion

comment by Lorxus · 2024-06-05T23:31:50.849Z · LW(p) · GW(p)

It's the Independently-Led Interactive Alignment Discussion, surely.

comment by Alex_Altair · 2024-06-05T18:14:49.435Z · LW(p) · GW(p)

Interactively Learning the Ideal Agent Design

comment by Lorxus · 2024-08-25T06:02:04.883Z · LW(p) · GW(p)

> https://www.lesswrong.com/posts/r7nBaKy5Ry3JWhnJT/announcing-iliad-theoretical-ai-alignment-conference#whqf4oJoYbz5szxWc

you didn't invite me so you don't get to have all the nice things, but I did leave several good artifacts [LW · GW] and books I recommend lying around. I invite you to make good use of them!

Replies from: alexander-gietelink-oldenziel
comment by Alexander Gietelink Oldenziel (alexander-gietelink-oldenziel) · 2024-08-25T07:34:18.452Z · LW(p) · GW(p)

Thank you Lorxus, that's appreciated. I'm sure we can make good use of them.

Unfortunately, we get many more applications than we have spots so we have to make some tough choices. Better luck next time!

comment by Lorxus · 2024-06-05T23:30:08.149Z · LW(p) · GW(p)

https://manifold.markets/Lorxus/will-lorxus-attend-the-iliad-unconf?r=TG9yeHVz

Replies from: Lorxus
comment by Lorxus · 2024-06-10T16:27:56.834Z · LW(p) · GW(p)

Also: if I get accepted to come to ILIAD I am going to make delicious citrus sodas.[1] Maybe I could even run a pair of panels about that?[2] That seemed extremely out of scope though so I didn't put it in the application.

  1. ^

    Better than you've had before. Like, ever. Yes I am serious, I've got lost lore. Also, no limit on the flavor as long as it's a citrus fruit we can go and physically acquire on-site. Also, no need at all for a stove or heating element.

  2. ^

    There is a crucially important time-dependent step on the scale of hours, so a matched pair of panels would be the best format.

comment by Review Bot · 2024-06-05T19:12:57.282Z · LW(p) · GW(p)

The LessWrong Review [? · GW] runs every year to select the posts that have most stood the test of time. This post is not yet eligible for review, but will be at the end of 2025. The top fifty or so posts are featured prominently on the site throughout the year.

Hopefully, the review is better than karma at judging enduring value. If we have accurate prediction markets on the review results, maybe we can have better incentives on LessWrong today. Will this post make the top fifty?