Nathan Young's Shortform

post by Nathan Young · 2022-09-23T17:47:06.903Z · LW · GW · 37 comments

comment by Nathan Young · 2024-01-02T10:56:49.872Z · LW(p) · GW(p)

I am trying to learn some information theory.

It feels like the bits of information between 50% and 25%, and between 50% and 75%, should be the same.

But for probability p, the information is -log2(p).

But then the information from .5 -> .25 is 1 bit, while from .5 to .75 it's .41 bits. What am I getting wrong?

I would appreciate blogs and YouTube videos.

Replies from: mattmacdermott
comment by mattmacdermott · 2024-01-02T12:16:05.257Z · LW(p) · GW(p)

I might have misunderstood you, but I wonder if you're mixing up calculating the self-information or surprisal of an outcome with calculating the information gain on updating your beliefs from one distribution to another.

An outcome which has probability 50% contains -log2(0.5) = 1 bit of self-information, and an outcome which has probability 75% contains -log2(0.75) ≈ 0.415 bits, which seems to be what you've calculated.

But since you're talking about the bits of information between two probabilities I think the situation you have in mind is that I've started with 50% credence in some proposition A, and ended up with 25% (or 75%). To calculate the information gained here, we need to find the entropy of our initial belief distribution, and subtract the entropy of our final beliefs. The entropy of our beliefs about A is H(p) = -p log2(p) - (1-p) log2(1-p), where p is our credence in A.

So for 50% -> 25% it's H(0.5) - H(0.25) = 1 - 0.811 ≈ 0.19 bits.

And for 50% -> 75% it's H(0.5) - H(0.75) = 1 - 0.811 ≈ 0.19 bits.

So your intuition is correct: these give the same answer, because H(p) is symmetric about p = 0.5.
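
A quick numerical check of the above (a minimal sketch in Python; `binary_entropy` is just an illustrative helper, not a standard library function):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits of a belief assigning credence p to a proposition."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Self-information (surprisal) of single outcomes:
print(-math.log2(0.5))   # 1.0 bit
print(-math.log2(0.75))  # ~0.415 bits

# Information gained on updating from 50% credence:
print(binary_entropy(0.5) - binary_entropy(0.25))  # ~0.189 bits
print(binary_entropy(0.5) - binary_entropy(0.75))  # ~0.189 bits (same, since H(p) = H(1-p))
```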

comment by Nathan Young · 2024-04-24T22:28:02.763Z · LW(p) · GW(p)

I think I'm gonna start posting top blogposts to the main feed (mainly from dead writers or people I predict won't care) 

comment by Nathan Young · 2022-10-03T11:35:14.980Z · LW(p) · GW(p)

If you or a partner have ever been pregnant and done research on what is helpful and harmful, feel free to link it here and I will add it to the LessWrong pregnancy wiki page. 

https://www.lesswrong.com/tag/pregnancy [? · GW]

comment by Nathan Young · 2023-07-08T11:57:03.069Z · LW(p) · GW(p)

Epistemic status: written quickly, probably errors

Some thoughts on Manifund

  • To me it seems like it will be the GiveDirectly of regranting (perhaps along with NonLinear) rather than the GiveWell.
  • It will be capable of scaling rapidly (especially if regrantors who are dishing out a lot of grants can be paid for their time). It's not clear to me that scaling is the bottleneck for granting orgs.
  • There are benefits to centralised/closed systems. Just as GiveWell makes choices for people and so delivers 10x returns, I expect that Manifund will do worse, on average, than OpenPhil, which has centralised systems and centralised theories of impact.
  • Not everyone wants their grant to be public. If you have a sensitive idea (easy to imagine in AI) you may not want to publicly announce that you're trying to get funding.
  • As with GiveDirectly, there is a real benefit of ~dignity/~agency. I guess I think this is mostly vibes, but vibes matter. I can imagine crypto donors in particular finding a transparent system with individual portfolios much more attractive than OpenPhil. I can imagine that making a big difference on net.
  • Notable that the donors aren't public. I'm not being snide; I just mean it's interesting to me given the transparency of everything else.
  • I love mechanism design. I love prizes, I love prediction markets. So I want this to work, but the base rate for clever mechanisms outcompeting bureaucratic ones seems low. Perhaps this finds a way to deliver and then outcompetes at scale (which is also my theory for how GiveDirectly might end up outcompeting GiveWell).

Am I wrong? 

comment by Nathan Young · 2024-04-26T16:35:45.870Z · LW(p) · GW(p)

Nathan and Carson's Manifold discussion.

As of the last edit my position is something like:

"Manifold could have handled this better, so as not to force everyone with large amounts of mana to have to do something urgently, when many were busy. 

Beyond that they are attempting to satisfy two classes of people:

  • People who played to donate can donate the full value of their investments
  • People who played for fun now get the chance to turn their mana into money

To this end, and modulo the above hassle, this decision is good. 

It is unclear to me whether there was an implicit promise that mana was worth 100 to the dollar. Manifold has made some small attempt to stick to this, but many untried avenues are available, as is acknowledging that they will rectify the error later if possible. To the extent that there was a promise (uncertain) and no further attempt is made, I don't believe they take that promise seriously.

It is unclear to me what I should take from this, though they have not acted as I would have expected them to. Who is wrong? Me, them, both of us? I am unsure."

Threaded discussion

Replies from: Nathan Young, Nathan Young, Nathan Young
comment by Nathan Young · 2024-04-26T16:38:52.092Z · LW(p) · GW(p)

Carson:
 

Ppl don't seem to understand that Manifold could literally not exist in a year or 2 if they don't find a product market fit

Replies from: Nathan Young
comment by Nathan Young · 2024-04-26T16:44:29.782Z · LW(p) · GW(p)

Austin said [EA(p) · GW(p)] they have $1.5 million in the bank, vs $1.2 million of mana issued. The only outflows right now are to the charity programme, which even with a lot of outflows is only at $200k. They also recently raised at a $40 million valuation. I am confused by the worry about running out of money. They have a large user base that wants to bet and will do so at larger amounts if given the opportunity. I'm not so convinced that there is some tiny timeline here.

But if there is, then say so: "We know that we often talked about mana eventually being redeemable at 100 mana per dollar, but we printed too much and we're sorry. Here are some reasons we won't devalue in the future..."

Replies from: james-grugett
comment by James Grugett (james-grugett) · 2024-04-26T18:10:23.703Z · LW(p) · GW(p)

If we could push a button to raise at a reasonable valuation, we would do that and back the mana supply at the old rate. But it's not that easy. Raising takes time and is uncertain.

Carson's prior is right that VC-backed companies can quickly die if they have no growth -- it can be very difficult to raise in that environment.

Replies from: Nathan Young
comment by Nathan Young · 2024-04-26T21:22:38.026Z · LW(p) · GW(p)

If that were true then there are many ways you could partially do that - e.g. give people a set of tokens representing their mana at the time of the devaluation, and if at a future point you raise, you could give them 10x those tokens back. A sketch of what I mean is below.
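
A minimal sketch of the kind of scheme I mean (the class and function names here are mine and purely illustrative, not anything Manifold has proposed; the 10x multiple is the one suggested above):

```python
from dataclasses import dataclass

@dataclass
class DevaluationToken:
    """An IOU issued for mana held at the moment of the devaluation."""
    holder: str
    mana_at_devaluation: int

def redeemable_mana(token: DevaluationToken, raised_successfully: bool) -> int:
    # If a future raise succeeds, honour token holders at a 10x multiple;
    # otherwise the tokens pay out nothing.
    return 10 * token.mana_at_devaluation if raised_successfully else 0
```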

comment by Nathan Young · 2024-04-26T16:36:48.073Z · LW(p) · GW(p)

seems like they are breaking an explicit contract (by pausing donations on ~a week's notice)

Replies from: Nathan Young
comment by Nathan Young · 2024-04-26T16:37:18.001Z · LW(p) · GW(p)

Carson's response:

weren't donations always flagged to be a temporary thing that may or may not continue to exist? I'm not inclined to search for links but that was my understanding.

Replies from: Nathan Young, Nathan Young
comment by Nathan Young · 2024-04-26T16:42:29.341Z · LW(p) · GW(p)

From https://manifoldmarkets.notion.site/Charitable-donation-program-668d55f4ded147cf8cf1282a007fb005

"That being said, we will do everything we can to communicate to our users what our plans are for the future and work with anyone who has participated in our platform with the expectation of being able to donate mana earnings."

"everything we can" is not a couple of weeks notice and lot of hassle.  Am I supposed to trust this organisation in future with my real money?

Replies from: james-grugett
comment by James Grugett (james-grugett) · 2024-04-26T18:17:43.102Z · LW(p) · GW(p)

We are trying our best to honor mana donations!

If you are inactive you have until the end of the year to donate at the old rate. If you want to donate all your investments without having to sell each individually, we are offering you a loan to do that.

We removed the charity cap of $10k of donations per month, which goes beyond what we previously communicated.

Replies from: Nathan Young, Nathan Young
comment by Nathan Young · 2024-04-26T21:28:26.133Z · LW(p) · GW(p)

Nevertheless lots of people were hassled. That has real costs, both to them and to you. 

comment by Nathan Young · 2024-04-26T20:18:34.198Z · LW(p) · GW(p)

I'm discussing with Carson. I might change my mind, but I don't know that I'll argue with both of you at once.

comment by Nathan Young · 2024-04-26T16:41:14.268Z · LW(p) · GW(p)

Well, they have a much larger donation than has been spent, so there were ways to avoid this abrupt change:


"Manifold for Good has received grants totaling $500k from the Center for Effective Altruism (via the FTX Future Fund) to support our charitable endeavors."

Manifold has donated $200k so far, so there is $300k left. Why not at least say, "we will change the rate at which mana can be donated when we burn through this money"?

(via https://manifoldmarkets.notion.site/Charitable-donation-program-668d55f4ded147cf8cf1282a007fb005 )

comment by Nathan Young · 2024-04-26T16:36:26.185Z · LW(p) · GW(p)

seems like breaking an implicit contract (that 100 mana was worth a dollar) 

Replies from: Nathan Young
comment by Nathan Young · 2024-04-26T16:37:56.719Z · LW(p) · GW(p)

Carson's response:

There was no implicit contract that 100 mana was worth $1 IMO. This was explicitly not the case given CFTC restrictions?

Replies from: Nathan Young
comment by Nathan Young · 2024-04-26T16:43:36.942Z · LW(p) · GW(p)

Austin took his salary in mana, which was often referred to as an incentive for him to want mana to become valuable, presumably at that rate.

I recall comments like 'we pay 250 mana per user referral because we reckon we'd pay about $2.50', and likewise at the in-person mana auction. I'm not saying it was an explicit contract, but there were norms.

comment by Nathan Young · 2024-04-20T11:24:08.702Z · LW(p) · GW(p)

I recall a comment on the EA forum about Bostrom donating a lot to global dev work in the early days. I've looked for it for 10 minutes. Does anyone recall it or know where donations like this might be recorded?

comment by Nathan Young · 2023-09-26T16:49:14.079Z · LW(p) · GW(p)

No Petrov Day? I am sad.

Replies from: Dagon, Richard_Kennaway
comment by Dagon · 2023-09-26T19:50:48.749Z · LW(p) · GW(p)

The best way to honor Stanislav Petrov is to arrange systems not to need Stanislav Petrov.

comment by Richard_Kennaway · 2023-09-27T09:33:31.417Z · LW(p) · GW(p)

There is an ongoing Petrov Day poll. I don't know if everyone on LW is being polled.

comment by Nathan Young · 2023-07-19T10:43:42.312Z · LW(p) · GW(p)

Why you should be writing on the LessWrong wiki.

There is way too much to read here, but if we all took pieces and summarised them in their respective tags, then we'd have a much denser resource that would be easier to understand.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2023-07-19T12:00:58.583Z · LW(p) · GW(p)

There are currently no active editors or a way of directing sufficient-for-this-purpose traffic to new edits, and on the UI side no way to undo an edit, an essential wiki feature. So when you write a large wiki article, it's left as you wrote it, and it's not going to be improved. For posts, review related to tags is in voting on the posts and their relevance, and even that is barely sufficient to get good relevant posts visible in relation to tags. But at least there is some sort of signal.

I think your article on Futarchy [? · GW] illustrates this point. So a reasonable policy right now is to keep all tags short. But without established norms that live in minds of active editors, it's not going to be enforced, especially against large edits that are written well.

Replies from: Nathan Young
comment by Nathan Young · 2023-07-21T09:29:10.948Z · LW(p) · GW(p)

Thanks for replying.

Would you revert my Futarchy edits if you could?

I think reversion is kind of overpowered. I'd prefer reverting chunks.

I don't see the logic that says we should keep tags short. That just seems less useful.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2023-07-21T13:23:13.193Z · LW(p) · GW(p)

I don't see the logic that says we should keep tags short.

The argument is that with the current level of editor engagement, only short tags have any chance of actually getting reviewed and meaningfully changed if that's called for. It's not about the result of a particular change to the wiki, but about the place where the trajectory of similar changes plausibly takes it in the long run.

I think reversion is kind of overpowered.

A good thing about the reversion feature is that reversion can itself be reverted, so it's not as final as when it's inconvenient to revert the reversions. This makes edit wars more efficient and more likely to converge on a consensus framing, rather than ending with one side giving up in exhaustion.

Would you revert my Futarchy edits if you could?

The point is that absence of the feature makes engagement with the wiki less promising, as it becomes inconvenient and hence infeasible in practice to protect it in detail, and so less appealing to invest effort in it. I mentioned that as a hypothesis for explaining currently near-absent editor engagement, not as something relevant to reverting your edits.

Reverting your edits would follow from a norm that says such edits are inappropriate. I think this norm would be good, but it's also clearly not present, since there are no active editors to channel it. My opinion here only matters as much as the arguments around it convince you or other potential wiki editors, the fact that I hold this opinion shouldn't in itself have any weight. (So to be clear, currently I wouldn't revert the edits if I could. I would revert them only if there were active editors and they overall endorsed the norm of reverting such edits.)

comment by Nathan Young · 2024-03-11T18:07:51.242Z · LW(p) · GW(p)

I did a quick community poll - Community norms poll (2 mins) [LW · GW]

I think it went pretty well. What do you think next steps could/should be? 

Here are some points with a lot of agreement.

comment by Nathan Young · 2023-12-15T10:06:30.567Z · LW(p) · GW(p)

Things I would do dialogues about:

(Note I may change my mind during these discussions but if I do so I will say I have)

  • Prediction is the right frame for most things
  • Focus on world states not individual predictions
  • Betting on wars is underrated
  • The UK House of Lords is okay actually
  • Immigration should be higher but in a way that doesn't annoy everyone and cause backlash
comment by Nathan Young · 2023-12-09T18:47:04.943Z · LW(p) · GW(p)

I appreciate reading women talk about what is good sex for them. But it's a pretty thin genre, especially pieces with any kind of research behind them.

So I'd recommend this (though it is paywalled): 

https://aella.substack.com/p/how-to-be-good-at-sex-starve-her

Also I subscribed to this for a while and it was useful:

https://start.omgyes.com/join

Replies from: rhollerith_dot_com
comment by RHollerith (rhollerith_dot_com) · 2023-12-09T18:58:40.553Z · LW(p) · GW(p)

You don't want to warn us that it is behind a paywall?

Replies from: Nathan Young
comment by Nathan Young · 2023-12-09T21:01:31.194Z · LW(p) · GW(p)

I didn't think it was relevant, but happy to add it.

comment by Nathan Young · 2023-10-30T10:00:43.395Z · LW(p) · GW(p)

I suggest that rats should use https://manifold.love/ as the Schelling dating app. It has long profiles and you can bet on other people getting on.

What more could you want!

I am somewhat biased because I've bet that it will be a moderate success.

comment by Nathan Young · 2023-09-11T14:03:25.274Z · LW(p) · GW(p)

Relative Value Widget

It gives you sets of donations and you have to choose which you prefer. If you want you can add more at the bottom.

https://allourideas.org/manifund-relative-value 

comment by Nathan Young · 2023-08-31T15:44:57.683Z · LW(p) · GW(p)

Other things I would like to be able to express anonymously on individual comments:

  • This is poorly framed - Sometimes I neither want to agree nor disagree. I think the comment is orthogonal to reality, and agreement and disagreement both push away from truth.
  • I don't know - If a comment is getting a lot of agreement/disagreement, it would also be interesting to see whether there is a lot of uncertainty.
comment by Nathan Young · 2022-09-23T17:47:07.108Z · LW(p) · GW(p)

It's a shame the wiki doesn't support the Google-Docs-like draft editor. I wish I could make in-line comments while writing.