itaibn0's Shortform
post by itaibn0 · 2020-05-11T05:19:08.672Z · LW · GW · 3 comments
comment by itaibn0 · 2023-03-02T23:55:53.640Z · LW(p) · GW(p)
I think MIRI's Logical Inductor idea can be factored into two components: one contains the elegant core that explains why the idea works so well, and the other is an arbitrary embellishment that obscures what is actually going on. Naturally, I am calling for this to be recognized, and for people to teach and think about only the elegant core.

The elegant core is infinitary markets: markets that exist for an arbitrarily long time, with commodities that can take arbitrarily long to return dividends, and infinitely many market participants who collectively use every computable strategy. The hack is that the commodities are labeled by sentences in a formal language and the relationships between them are governed by a proof system. This creates a misleading impression that the value of the commodity labeled phi measures the probability that phi is true; in fact what it measures is more like the probability that the proof system will eventually affirm phi, or more precisely the probability that phi is true in a random model of the theory. Of course, what we really care about is the probability that phi is actually true, meaning true in the standard model where the things labeled "natural numbers" are actual natural numbers and so on.

By combining proof systems with infinitary markets, one obscures how much of the "work" in obtaining accurate information is done by each. I think it is better to study these two things separately. Since proof systems are already well-studied and infinitary markets are the novel idea in MIRI's work, that means they should primarily study infinitary markets.
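A minimal structural sketch in Python of the factoring I have in mind. The class and method names are hypothetical and the market-clearing rule is elided, so this is an illustration of the decomposition, not MIRI's actual construction:

```python
# Structural sketch of the proposed factoring. All names here are
# hypothetical, chosen only to separate the two components; the
# price-setting rule itself is elided.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Set

Prices = Dict[str, float]  # commodity label -> current price in [0, 1]
Orders = Dict[str, float]  # commodity label -> quantity bought (+) or sold (-)


@dataclass
class InfinitaryMarket:
    """The elegant core: a market running for unboundedly many rounds, over
    commodities that may pay dividends arbitrarily late, traded by an
    enumeration of every computable strategy."""
    traders: List[Callable[[int, Prices], Orders]]  # (round, prices) -> orders
    prices: Prices = field(default_factory=dict)

    def step(self, round_no: int) -> None:
        # Set prices so the aggregate orders of the traders enumerated so far
        # are feasible. This clearing rule is the part of the construction
        # that is independent of any logic.
        ...


@dataclass
class ProofSystemLabeling:
    """The embellishment: commodities are labeled by sentences, and a proof
    system forces dividends (a proved sentence pays 1, a refuted one 0)."""
    market: InfinitaryMarket
    proved: Set[str] = field(default_factory=set)
    refuted: Set[str] = field(default_factory=set)

    def pay_dividends(self) -> None:
        for phi in self.proved:
            self.market.prices[phi] = 1.0  # settled by the proof system
        for phi in self.refuted:
            self.market.prices[phi] = 0.0
```

On this decomposition, the clearing step is where the predictive work happens; the labeling only dictates which commodities the proof system settles, and when.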
comment by itaibn0 · 2023-01-10T22:51:37.700Z · LW(p) · GW(p)
While I like a lot of Hanson's grabby alien model, I do not buy the inference that, because humans appeared early in cosmological history, the cosmic commons must be taken quickly, which would give a lower bound on how often grabby aliens appear. That inference neglects the possibility that the early universe is inherently more conducive to creating life, so that most life arises early, but these lifeforms may be very far apart.
comment by itaibn0 · 2020-05-11T05:19:09.125Z · LW(p) · GW(p)
Crossposted on my blog:
Lightspeed delays lead to multiple technological singularities.
By Yudkowsky's classification, I'm assuming the Accelerating Change Singularity: as technology gets better, the characteristic timescale at which technological progress is made becomes shorter, so that the time until progress reaches physical limits is short from the perspective of our timescale. At a short enough timescale the lightspeed limit becomes important: when information cannot traverse the diameter of civilization in the time remaining until the singularity, further progress must be made independently in different regions. The subjective time from then on may still be large, and without communication the different regions can develop different interests and, after their singularities, compete. As the characteristic timescale becomes shorter, the independent regions split further.
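To make the decoupling condition concrete: regions must progress independently once the characteristic timescale of progress falls below the light-crossing time of the civilization. A back-of-envelope sketch in Python, using Earth's diameter as an illustrative scale (the numbers are my own assumption, not from the argument above):

```python
# Illustrative back-of-envelope check of the decoupling condition,
# using Earth as a stand-in for the civilization's diameter.

C_KM_PER_S = 299_792.458   # speed of light, km/s
EARTH_DIAMETER_KM = 12_742

light_crossing_time_s = EARTH_DIAMETER_KM / C_KM_PER_S  # ~0.0425 s

# Once the characteristic timescale of progress drops below the
# light-crossing time, a region cannot hear about the latest progress
# elsewhere before making its own next step, so regions advance
# independently.
print(f"Earth light-crossing time: {light_crossing_time_s * 1000:.1f} ms")
```

For an Earth-sized civilization this threshold is roughly 42 milliseconds; for a larger civilization the threshold is correspondingly longer, so the splitting begins earlier in the acceleration.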