Short Primers on Crucial Topics
post by lukeprog · 2012-05-31T00:46:19.709Z
Series: How to Purchase AI Risk Reduction
Here's another way we might purchase existential risk reduction: the production of short primers on crucial topics.
Resources like The Sequences and NickBostrom.com have been incredibly effective at building a community engaged in x-risk reduction (whether through direct action or, perhaps more importantly, through donations), but most people who could make a difference probably won't take the time to read The Sequences or academic papers.
One solution? Short primers on crucial topics.
Facing the Singularity is one example. I'm waiting for some work from remote researchers before I write the last chapter, but once it's complete we'll produce a PDF version and a Kindle version. Already, several people (including Jaan Tallinn) use it as a standard introduction they send to AI risk newbies.
Similar documents (say, 10 pages in length) could be produced for topics like Existential Risk, AI Risk, Friendly AI, Optimal Philanthropy, and Rationality. These would be concise, fun to read, and emotionally engaging, while also being accurate and thoroughly hyperlinked/referenced to fuller explanations of each section and major idea (on LessWrong, in academic papers, etc.).
These could even be printed and left lying around wherever we think is most important: say, at the top math, computer science, and formal philosophy departments in the English-speaking world.
The major difficulty in executing such a project would be finding good writers with the relevant knowledge. Eliezer, Yvain, and I might qualify, but right now the three of us are otherwise occupied. The time investment of the primary author(s) could be minimized by outsourcing as much of the work as possible to SI's team of remote researchers, writers, and editors.
Estimated cost per primer:
- 80 hours from the primary author. (Well, if it's me. I've put about 60 hours into writing Facing the Singularity so far, which is of similar length to the proposed primers, but I'm adding some padding to the estimate.)
- $4,000 on remote research. (Tracking down statistics and references, etc.)
- $1,000 on book design, Kindle version production, etc.
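Putting these together, here is a minimal back-of-the-envelope sketch of the per-primer total, using only the figures above (author time is left in hours, since no dollar value is attached to it here):

```python
# Back-of-the-envelope total for one primer, using only the figures above.
# Author time is kept in hours rather than dollars; pricing it would be an
# added assumption, so only the cash line items are summed.
author_hours = 80          # primary author's writing time
remote_research = 4_000    # USD: tracking down statistics, references, etc.
design_and_kindle = 1_000  # USD: book design, Kindle version production, etc.

cash_cost_per_primer = remote_research + design_and_kindle
print(f"Cash cost per primer: ${cash_cost_per_primer:,} plus {author_hours} author-hours")
# -> Cash cost per primer: $5,000 plus 80 author-hours
```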
24 comments
comment by [deleted] · 2012-05-31T00:54:34.938Z
Thanks for laying out these specific proposals, Luke.
Whatever you guys decide to go with, I am much more tempted to donate to SI because of these.
comment by Paul Crowley (ciphergoth) · 2012-05-31T12:02:46.504Z
The AI Risk wiki still seems worth doing first: by rounding out the details of the argument, it should increase the proportion of Luke Muehlhausers to Holden Karnofskys that it creates.
comment by Grognor · 2012-05-31T07:00:26.784Z
Without doing any cost-benefit analysis, I can tell you that, of the three so far, this one gives me by far the most fuzzies, just thinking about it. A scholarly wiki? Boring. Research? Boring. Short primers on crucial topics??? That sounded less boring in my head.
I couldn't tell you why this happened. Maybe I just really liked Facing the Singularity more than I realized. Does anyone else have a similar reaction?
comment by John_Maxwell (John_Maxwell_IV) · 2012-05-31T05:11:55.336Z
I like this idea overall, but SI might want to think twice before it risks becoming known as a group that leaves books promoting its ideas lying around.
Also, I'm doubtful of the claim that such introductory books can only be usefully produced by top Less Wrong contributors.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2012-05-31T05:46:16.247Z
Passing out flyers seems superior to leaving books around. It more closely resembles the awareness-raising methods used by most charities, and I think a flyer can make a more effective sales pitch (with a pointer to a website where you can read more) than a book cover. Additionally, it should be far cheaper per person reached, and it could give Less Wrong users practice with rejection therapy.
I have a friend who passed out flyers with some success for his life extension charity, and claims to have a contact in the Berkeley area who will pass out flyers for cheap. He tried to get Michael Anissimov to design an SI flyer for this guy to pass out, but Anissimov didn't end up going for it. Get in touch with me if you want.
↑ comment by timtyler · 2012-05-31T10:18:26.767Z
I would like to see the LW sandwich board.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2012-05-31T18:58:48.491Z
It feels like the best sandwich board would combine a provocative, intriguing claim with some sort of insider signal that the board user is conversant with advanced math, programming, or what have you.
↑ comment by MichaelAnissimov · 2012-06-01T12:42:01.487Z
If anyone feels that they know the issues (extremely) well enough to co-write a succinct, informative, and punchy SI flyer with me, I encourage them to get in contact: michael@intelligence.org. My other assignments prevent me from following through on this alone, I'm afraid. I do appreciate being encouraged to do this; I just feel that it's too much responsibility to take on alone. Such a flyer would need to be of high quality to give a favorable impression.
comment by Stuart_Armstrong · 2012-06-06T16:09:56.374Z
Unlike the wiki idea, this is something that I can wholeheartedly endorse. Even shorter summaries of the primer (flyer or one-page sizes) would be good too.
comment by Vaniver · 2012-05-31T17:29:11.534Z
$1000 on book design, Kindle version production, etc.
I would recommend at least doubling this budget, I think (with the understanding that you don't have to spend it all). These should look really appealing, and it might be beneficial for them to be illustrated on the interior.
comment by gjm · 2012-05-31T08:25:15.374Z
Is the cover design shown here (1) just for fun here on LW, or (2) something you're thinking of actually doing on actual kinda-book-like entities?
If the latter, then you might want to reconsider the merits of making it quite so blatant a ripoff of the famous "Very Short Introduction" series of books. That seems like it might ring some readers' confidence-trick alarm bells. (It certainly does mine.)
comment by XiXiDu · 2012-05-31T10:33:39.025Z
Looking at the page of Facing the Singularity, I just realized again how wrong it is from the perspective of convincing people who are not already inclined to believe that stuff. The header, the title, the text... wrong, wrong, wrong!
Facing the Singularity
The advent of an advanced optimization process and its global consequences
Sometime this century, machines will surpass human levels of intelligence and ability. This event — the “Singularity” — will be the most important event in our history, and navigating it wisely will be the most important thing we can ever do.
The speed of technological progress suggests a non-negligible probability of the invention, sometime this century, of advanced general-purpose optimization processes that exhibit many features of general intelligence as envisioned by proponents of strong AI (artificial intelligence that matches or exceeds human intelligence) while lacking other important characteristics.
This paper will give a rough overview of 1) the expected power of such optimization processes, 2) the lack of important characteristics intuitively associated with intelligent agents, such as the consideration of human values in optimizing the environment, 3) the associated negative consequences and their expected scale, 4) the importance of research in preparation for such a possibility, and 5) a bibliography of advanced supplementary material.
comment by XiXiDu · 2012-05-31T09:09:50.569Z
...say, at the top math, computer science, and formal philosophy departments in the English-speaking world.
People at top academic departments everywhere in the world speak English... (which is probably true even of the janitors in some Western countries).
↑ comment by D2AEFEA1 · 2012-05-31T14:37:06.618Z
How well do they, though? I've seen a few academics around me with enough command of English to get by, but they might still miss some of the subtler points. They just can't reason as well in English as they do in their mother tongue.
↑ comment by Barry_Cotter · 2012-06-01T08:22:36.143Z
As of 1997, more than 95% of research articles in the Science Citation Index were written in English. Being able to read and write in English is a hard requirement for participation in the community of scholars in STEM disciplines, and somewhere between a hard requirement and very, very useful elsewhere. I doubt there are any top-level philosophers who can't read English well enough to parse extremely complicated arguments. Whether they can write, speak, or listen as well, dunno.
comment by borntowin · 2012-06-01T11:34:58.308Z
A guy who works for a book publisher once told me that they pay about 8 euros per 1,000 words to a good translator for books they translate from foreign languages. So by this calculation you can have a 100,000-word text translated into Romanian for 800 euros.
↑ comment by Kindly · 2012-06-01T14:06:46.372Z
Facing the Singularity is approximately 14,000 words. The hypothetical 10-page primers would probably be even shorter, maybe 3,000 words, although hoping to get them down to 10 pages might be optimistic. So if translations into other languages are similarly priced, you're looking at around $600 for all four translations of Facing the Singularity, or around $100 for the shorter primers.
This doesn't include "checks and improvements by multiple translators", but I imagine those can probably be obtained more cheaply than an actual translation, and it seems like $2000 is far too high an estimate for the cost.
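To make the arithmetic explicit, here is a minimal sketch of these estimates; the euro-to-dollar exchange rate is an assumed figure (roughly the 2012 level), not something stated above:

```python
# Sketch of the translation-cost arithmetic, using the ~8 EUR per 1,000
# words rate quoted above. The exchange rate is an assumption.
RATE_EUR_PER_1000_WORDS = 8
USD_PER_EUR = 1.25   # assumed exchange rate, roughly the 2012 level
NUM_LANGUAGES = 4    # the four translations mentioned above

def translation_cost_usd(words, languages=NUM_LANGUAGES):
    """Rough USD cost to translate a text of `words` words into `languages` languages."""
    eur = words / 1000 * RATE_EUR_PER_1000_WORDS * languages
    return eur * USD_PER_EUR

print(round(translation_cost_usd(14_000)))  # Facing the Singularity: ~560, i.e. "around $600"
print(round(translation_cost_usd(3_000)))   # a 3,000-word primer: ~120, i.e. "around $100"
```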
↑ comment by faul_sname · 2012-06-01T15:48:32.378Z
There are always unanticipated costs. I find that I generally need 1.5-2 times the amount of money whose use I can anticipate beforehand.
comment by Bruno_Coelho · 2012-05-31T22:39:23.631Z
The wiki, research, and short-primer proposals signal a small strategic turn for SI. To me, the short-primer proposal sounds more academic, even if the primers don't come with "Oxford University Press" on the front.
comment by XiXiDu · 2012-05-31T09:24:10.698Z
Before you continue with this, you should maybe try to get someone important to read 'Facing the Singularity' without trying too hard. If that doesn't work, then...
I have my doubts that someone like Terence Tao would read your primer.
For some time now I have been watching the real-time stats for my homepage, especially when I post links in places where people of a calibre similar to Terence Tao's are chatting. And I seldom get more than 2 clicks, even if more than 20 people are conversing in that thread.
Now, it is true that I am a nobody; why would they read a post written on my personal blog? But how would they know that something called 'Facing the Singularity' is more worthy of their attention?
If I really wanted to, I would probably be able to get them to read my stuff. But that's difficult, and it would probably take a middleman who shares a link to it on his blog/Google+/Facebook page and whose stuff is subsequently read by top-notch people.