Posts

OpenAI charter · 2018-04-09T21:02:04.621Z

Comments

Comment by wunan on Why Productivity Systems Don't Stick · 2021-01-16T17:57:01.028Z · LW · GW

Has this new congruency-based approach led to less, the same, or more productivity than what you were doing before, and how long have you been doing it?

Comment by wunan on Matt Goldenberg's Short Form Feed · 2021-01-06T17:44:45.359Z · LW · GW

Is losing weight one of your goals with this?

Like you said, since it hasn't been studied you're not going to find anything conclusive about it, but it may be a good idea to skip the fast once a month (i.e. 3 weeks where you do 88-hour fasts, then 1 week where you don't fast at all).

Comment by wunan on I object (in theory) · 2020-12-30T22:48:43.007Z · LW · GW

I object to the demonstration because it's based on the false assumption that there's a fixed amount of value (candy, money) to be distributed and that by participating in capitalism, you're playing a zero-sum game. Most games played in capitalism are positive-sum -- you can make more candy.

Comment by wunan on Tweet markets for impersonal truth tracking? · 2020-11-10T16:07:51.696Z · LW · GW

Do you have a source for the 80% figure?

Comment by wunan on Seek Upside Risk · 2020-09-29T21:33:09.741Z · LW · GW

I agree that this is a really important concept. Two related ideas are asymmetric risk and Barbell strategies, both of which are things that Nassim Nicholas Taleb writes about a lot.

Comment by wunan on Where is human level on text prediction? (GPTs task) · 2020-09-21T14:21:39.370Z · LW · GW

What is that formula based on? I can't find anything by googling. I thought it might be from the OpenAI paper Scaling Laws for Neural Language Models, but couldn't find it with ctrl+F.

Comment by wunan on Where is human level on text prediction? (GPTs task) · 2020-09-20T13:39:23.202Z · LW · GW

In Steve Omohundro's presentation on GPT-3, he compares the perplexity of some different approaches. GPT-2 scores 35.8, GPT-3 scores 20.5, and humans score 12. Sources are linked on slide 12.
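
For reference (my own conversion, not from the slides): perplexity is the exponentiated average per-token cross-entropy, so taking log2 of these scores gives bits per token, which makes the gaps easier to compare. A minimal sketch:

```python
import math

# Convert the quoted perplexities to bits per token (or per word,
# depending on how the benchmark tokenizes): log2(perplexity).
for model, ppl in [("GPT-2", 35.8), ("GPT-3", 20.5), ("human", 12.0)]:
    print(f"{model}: perplexity {ppl} ≈ {math.log2(ppl):.2f} bits")
# GPT-2 ≈ 5.16, GPT-3 ≈ 4.36, human ≈ 3.58
```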

Comment by wunan on Escalation Outside the System · 2020-09-09T10:38:24.221Z · LW · GW

People are literally looting businesses and NPR is publishing interviews supporting it. They're not just interviewing people who support it -- the interviewer also supports it. What makes you think these aren't actual policy proposals?

They may only propose it for deep social-signalling reasons as you say, but that doesn't mean it's not actually a proposal. Historically, we've seen that people are willing to go through with mass murders.

Comment by wunan on Are we in an AI overhang? · 2020-07-28T13:54:00.329Z · LW · GW

In the Gwern quote, what does "Even the dates are more or less correct!" refer to? Which dates were predicted for what?

Comment by wunan on Are we in an AI overhang? · 2020-07-27T15:36:46.786Z · LW · GW

This was mentioned in the "Other Constraints" section of the original post:

Inference costs. The GPT-3 paper (§6.3), gives .4kWh/100 pages of output, which works out to 500 pages/dollar from eyeballing hardware cost as 5x electricity. Scaling up 1000x and you're at $2/page, which is cheap compared to humans but no longer quite as easy to experiment with

I'm skeptical of this being a binding constraint too. $2/page is still very cheap.
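
To make the arithmetic in the quoted passage explicit (a rough sketch; the ~$0.10/kWh electricity price is my assumption, chosen because it reproduces the quoted figures):

```python
kwh_per_page = 0.4 / 100           # 0.4 kWh per 100 pages of output
usd_per_kwh = 0.10                 # assumed electricity price
hardware_multiplier = 5            # "hardware cost as 5x electricity"

cost_per_page = kwh_per_page * usd_per_kwh * hardware_multiplier
print(f"pages per dollar: {1 / cost_per_page:.0f}")                   # ~500
print(f"cost per page at 1000x scale: ${cost_per_page * 1000:.2f}")   # ~$2.00
```
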
Comment by wunan on [Resource Request] What are good resources for best practice for creating bureaucracy? · 2020-07-09T16:26:05.214Z · LW · GW

The field of mechanism design seems very relevant here.

Comment by wunan on My experience with the "rationalist uncanny valley" · 2020-04-22T22:47:08.477Z · LW · GW

It might help if you try to think less in terms of making rationality and EA part of your identity and instead just look at them as some things you're interested in. You could pursue the things you're interested in and become a more capable person even if you never read anything else from the rationality community again. Maybe reading stuff from people who have achieved great things and had great ideas and who have not been influenced by the rationality community (which, by the way, describes most people who have achieved great things and had great ideas) would help? E.g. Paul Graham's essays are good (he's kind of LW-adjacent, but was writing essays long before the rationality community was a thing): http://paulgraham.com/articles.html

I think the rationality community is great, it has hugely influenced me, and I'm glad I found it, but I'm pretty sure I'd be doing great stuff even if I never found it.

Comment by wunan on Life can be better than you think · 2019-01-21T17:50:05.514Z · LW · GW

I remember reading SquirrelInHell's posts earlier and I'm really sorry to hear that. Is there any more public information regarding the circumstances of the suicide? I couldn't find anything with Google.

Comment by wunan on What is a reasonable outside view for the fate of social movements? · 2019-01-05T00:50:51.593Z · LW · GW

The podcast Rationally Speaking recently had an episode on the Mohists, a "strikingly modern group of Chinese philosophers active in 479–221 BCE." They discuss what caused the movement to die out and draw comparisons between it and the Effective Altruism movement.

Comment by wunan on One night, without sleep · 2018-08-24T14:28:52.300Z · LW · GW

Have you heard about the EA Hotel? Or considered moving to a country with a very low cost of living?

Comment by wunan on One night, without sleep · 2018-08-19T18:52:02.121Z · LW · GW
In my case, I think illness is very much just a symptom of the struggle to get on with things in an interfering environment.

Do you mean you think you have something like Mindbody syndrome/TMS? I thought I had it for a while, but now suspect the root causes are actually physiological, not psychological, for me.

Just to clarify, am I interpreting your post correctly in reading it as saying that the reason you're not operating at your full potential is a chronic illness that causes migraines and other symptoms? If so, this may be something you've already thought of, but it's worth putting a lot of effort into tracking down the root cause of the illness and fixing it (assuming you don't already know the root cause and that there is a potential fix), even if it means temporarily working more slowly on the AI alignment problem. That's what I'm doing, at least.

Comment by wunan on One night, without sleep · 2018-08-17T03:44:10.940Z · LW · GW
I want to attempt with all my strength, to do the specific things that I see to be done.
And this is why the illness of bodily defeat is so bitter;
one's struggle, conducted alone, but in the hope of one day dragging treasures into daylight,
is felled by the weakness of one's own physical vehicle.

I also suffer from a chronic illness that keeps me from pursuing my goals (which I think are the same as your goals) at anything close to the speed at which I feel I should be able to. I don't know if my condition is better or worse than yours, but one thing that helps me is to think about how there are others out there who are a lot like me, but without these limits, and they seem to be doing what I wish I could do. Maybe they'll succeed even if I'm not able to help. And if they succeed, then so do I.

You're not as alone as you think.

Comment by wunan on A friendly reminder of the mission · 2018-06-05T02:52:50.420Z · LW · GW

Related: Nick Bostrom's Letter from Utopia

Comment by wunan on Questions about the Usefulness of Self-Importance · 2018-05-27T17:12:57.251Z · LW · GW

You might like the content on 80,000 Hours, which is pretty popular around here.

Comment by wunan on Double Cruxing the AI Foom debate · 2018-04-27T17:35:54.353Z · LW · GW

Recursive Self-Improvement

Comment by wunan on Rationality and Spirituality - Summary and Open Thread · 2018-04-21T15:39:13.937Z · LW · GW

Book recommendation: Waking Up: A Guide to Spirituality Without Religion by Sam Harris. Discusses enlightenment, meditation, and psychedelics.

Comment by wunan on [deleted post] 2018-03-03T19:42:24.807Z

It sounds to me like most of the negative experiences you described were a result of the pills and are not associated with enlightenment:

I go off citrulline malate for 48 hours.  And it hits me.  Lethargy gone.  Cloudy headed thinking gone.  Ability to be productive returns.  I spend 10 hours at my desk in a row.  I write several thousand words.  I send off 10 emails and clear my inbox.  I power through my to-do list.  I stick to my diet for the first time in months.  I send emails, I round up outstanding notes, reorganise myself.  Reset my GTD system and power through for a day.

I've never heard of lethargy, cloudy-headed thinking, or an inability to be productive as side effects of enlightenment.

The other symptoms you described later in the post, like calmness even in the face of stressors, don't seem negative to me as long as you don't abuse this ability in order to ignore problems. Also, I think the calmness associated with enlightenment might feel significantly different from what you experienced. A lot of people talk about the importance of "responding skillfully" to different situations, meaning feeling anger when you should feel anger, sadness when you should feel sadness, etc., and then being able to let go of those states once they're no longer helpful. This seems different from the vasodilator-induced state of calm you described.

Comment by wunan on [deleted post] 2018-03-03T13:25:04.821Z
Also generally a warning that you might not like enlightenment if you find it.

I don't necessarily disagree (I'm still looking into this topic), but what are you basing this on?

Comment by wunan on [deleted post] 2018-03-02T18:23:20.392Z
It's analogous to a music teacher instructing you to just sit down and play some notes, any notes, for twenty minutes. It would be amazing if you made progress that way.

This is exactly how it felt for me -- I even remember thinking of this exact metaphor after practicing TMI and reflecting on the difference between it and my previous attempts.

Comment by wunan on [deleted post] 2018-03-02T18:13:00.869Z

What were you doing wrong?

Comment by wunan on [deleted post] 2018-03-02T17:36:37.082Z

No problem, I don't think the question is rude. No, I didn't view it as hokey. I was actually very enthusiastic about it right from the start, but never made any progress. TMI was valuable to me because it provided much more granular instructions.

I stopped to reassess about 2 months ago and have not been meditating in that time.

Comment by wunan on [deleted post] 2018-03-02T13:17:54.992Z

I don't think this guide goes into enough detail. I had read instructions essentially the same as these many times and attempted them consistently every day for weeks or months, but made very little progress in terms of improving my attention. It was barely different from following the instruction "just sit and relax for 15 minutes."

In my experience, the problem isn't that meditation is treated as something too complex when it's actually very simple; it's that it's treated as something very simple when it's actually pretty complex.

What did help was reading "The Mind Illuminated," which breaks things down into much more detail. Attempting meditation with the instructions in that book was a very different experience from my earlier attempts. There was a very noticeable improvement in my ability to intentionally maintain my attention within the first few sessions. In fact I made such rapid progress that I stopped after a week because I wanted to take some time to reassess whether this was really a path I wanted to go down (the book provides instructions all the way to "awakening" or "enlightenment"). I'm currently still assessing.

Comment by wunan on A LessWrong Crypto Autopsy · 2018-01-29T19:40:09.610Z · LW · GW

Do we have any data on how well other people with similar levels of interest in tech did?

Comment by wunan on Notes on Mental Security · 2017-12-30T22:35:27.717Z · LW · GW

I've noticed that your writing tends to be very abstract and gestures towards ideas with a lot of potential impact on instrumental rationality, but often doesn't spell things out in a concrete way. Do you think you could add examples and, in general, try to say things explicitly instead of hinting at them?

Comment by wunan on Happiness Is a Chore · 2017-12-23T18:15:57.538Z · LW · GW

So now that you've realized this, do you think you'll be able to use meditation to overcome akrasia?

Comment by wunan on Happiness Is a Chore · 2017-12-20T19:48:24.442Z · LW · GW
I have felt levels of happiness which are far above the upper limit of your mental scale. I know exactly how to be happy. And yet I find myself not consistently applying my own methods. Do you realize how impossibly mind-twisting this situation is? What happens in reality is that I enjoy and see great value in happiness when it happens, but when it doesn't I only work on it grudgingly. It's like with exercise, which is great but I'm rarely enthusiastic about starting it. The problem is not that I don't value happiness enough. The problem is rather that there is no gut-level motivational gradient to get actual happiness. There are gradients for all sorts of things which are crappy, fake substitutes. Once you know the taste of the real thing, they aren't fun at all. But you still end up optimizing for them, because that's what your brain does.

It sounds like you're describing akrasia. Do you think your meditation-based methods are insufficient to overcome akrasia, or have you just not applied them to this end yet?