Rana Dexsin's Shortform
post by Rana Dexsin · 2021-06-28T03:55:40.930Z · LW · GW · 15 comments
Comments sorted by top scores.
comment by Rana Dexsin · 2024-10-10T21:33:03.175Z · LW(p) · GW(p)
Dear people writing in the TeX-based math notation here who want to include full-word variables: putting the word in raw leads to subtly bad formatting. If you just write “cake”, it gets typeset as though it were c times a times k times e, as in $cake$ (which admittedly doesn't show how awkward it can get depending on the scale). It's more coherent if you wrap each word in \mathrm{} to typeset it as a single upright word, like so: $\mathrm{cake}$.
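For concreteness, here is a minimal LaTeX sketch of the difference; the document scaffolding is my own addition so that the snippet compiles on its own, and only the two math expressions are the point.

```latex
% Minimal sketch (scaffolding is mine, not from the comment); compile with pdflatex.
\documentclass{article}
\begin{document}
Raw: $cake = 3$                 % "cake" set as the product-like c a k e, in math italic
Wrapped: $\mathrm{cake} = 3$    % "cake" set as a single upright word
\end{document}
```

In a full LaTeX document with amsmath loaded, \text{...} behaves similarly and also matches the surrounding text font; \mathrm{} is the option available in the editor here.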
Replies from: Sodium
comment by Rana Dexsin · 2021-07-25T14:38:50.463Z · LW(p) · GW(p)
Since scattered “productivity tips” seem so popular here, here's mine: if you're reading this and there's something more useful you feel like you should be doing, then stop right now, take a moment to sweep out your working memory so it's tidy, and carefully hold onto the idea of the useful thing you could be doing. Remember what it feels like to actually do it. Then close your browser tab and let that happen instead.
Replies from: przemyslaw-czechowski
↑ comment by Jan Czechowski (przemyslaw-czechowski) · 2021-07-26T20:21:13.820Z · LW(p) · GW(p)
I actually happen to be in my very specific allocated time for “30 mins of LW reading and writing”. But usually this site is a procrastination hole for me, so thanks. Still, I must say, it's a very life-improving procrastination hole.
comment by Rana Dexsin · 2024-08-24T22:44:31.741Z · LW(p) · GW(p)
Observation of context drift: I was rereading some of HPMOR just now, and Harry's complaint of “The person who made this probably didn't speak Japanese and I don't speak any Hebrew, so it's not using their knowledge, and it's not using my knowledge”, regarding a magic item in chapter 6, hits… differently in the presence of the current generation of language models.
Replies from: Rana Dexsin
↑ comment by Rana Dexsin · 2024-09-02T05:26:09.204Z · LW(p) · GW(p)
Followup:
How so much artistry had been infused into the creation of Hogwarts was something that still awed Draco every time he thought about it. There must have been some way to do it all at once, no one could have detailed so much piece by piece, the castle changed and every new piece was like that.
Years later, Midjourney happened.
comment by Rana Dexsin · 2024-09-11T00:19:57.630Z · LW(p) · GW(p)
Detached from a comment on Zvi's AI #80 [LW(p) · GW(p)] because it's a hazy tangent: the idea of steering an AI early and deeply using synthetic data reminds me distinctly of the idea of steering a human early and deeply using culture-reinforcing mythology. Or, nowadays, children's television, I suppose.
comment by Rana Dexsin · 2024-02-12T21:48:17.859Z · LW(p) · GW(p)
This SMBC from a few years ago including an “entropic libertarian” probably isn't pointing at what people call “e/acc”… right? My immediate impression is that it rhymes though. I'm not sure how to feel about that.
comment by Rana Dexsin · 2023-07-19T17:24:00.126Z · LW(p) · GW(p)
Currently thinking about the idea that an idea or meme can be dangerous or poorly adapted to a population, despite being true and coherent in itself, because interaction with the preexisting landscape metabolizes it into a different idea that is wrong and/or toxic.
A principle in engineering that feels similar is “design for manufacturability”, where a design can be theoretically sound but not yield adequate quality when put through the limitations of the actual fabrication process, including the possibility of breaking the manufacturing equipment if you try. In this case, the equivalent of the fabrication process is the interaction inside people's minds, so “design for mentalization”, perhaps?
comment by Rana Dexsin · 2024-08-21T17:27:10.514Z · LW(p) · GW(p)
Publishing “that ship has sailed” earlier than others actively drives the ship. I notice that this feels terrible, but I don't know where sensible lines are to draw in situations where there's no existing institution that can deliver a more coordinated stop/go signal for the ship. Relatedly, I notice that letting speed make things unstoppable means that any beneficial decision-affecting processes which can't be, or haven't been, adapted to much lower latencies lose all their results to a never-ending stream of irrelevance timeouts. I have no idea what to do here, and that makes me sad.
Related but more specific: “Give Up Seventy Percent Of The Way Through The Hyperstitious Slur Cascade”
comment by Rana Dexsin · 2024-05-29T16:48:10.059Z · LW(p) · GW(p)
Given the presence of mood fluctuations and other noise, repeatedly being triggered to re-evaluate a decision on whether or not to take a one-shot action when not much has relevantly changed in the meantime seems subject to a temporal unilateralist's curse: if you at time 1000 choose to do the action even if you at times 0–999 didn't choose it and you at times 1001–1999 wouldn't have chosen it, it still happens. The most well-known example that comes to mind of this being bad is addiction and “falling off the wagon”, but it seems like it generalizes.
Replies from: lcmgcd
↑ comment by lemonhope (lcmgcd) · 2024-05-31T06:24:56.429Z · LW(p) · GW(p)
See also gun owners and suicide. The gun is just sitting there.
comment by Rana Dexsin · 2022-06-17T13:39:01.829Z · LW(p) · GW(p)
This post will be more poetic than argumentative. [LW · GW] … harnessed in service of the eight-year-old's misguided love for "spaceships."
In that spirit:
I don't forget just what I make
Oh yeah, it's great, it's spacey ships
And if that's hard or seems unkind
Oh well, whatever, wrapper mind
—“Smells Like Teen AI”, NirvANNa
comment by Rana Dexsin · 2021-06-28T03:55:41.308Z · LW(p) · GW(p)
From Gwern's Epigrams page: “‘thrift’ is achieving one’s goals as cost-effectively as possible and maximizing one’s bang-for-buck; ‘frugality’ is choosing one’s goals to be as cost-effective as possible, and picking a bang which minimizes one’s buck. The former is a virtue; the latter, a vice.”
I would not describe the latter as a vice when the form of ‘picking a bang’ is internal alignment: rather than choosing your ‘real’ goals to be cost-effective, you are instead ‘realizing’ that you had been acting on habits attached to some poorly defined conception of what ‘should’ be your goal on some axis, and are now aligning the part of you that was busy acting as though you cared, so that it no longer has to and the cost of that action is no longer borne.
There are probably other cases, but I don't feel like enumerating them right now. I'm tempted to write more because I ‘should’, but instead I'm picking the lower-cost temporary goal of posting a Shortform entry, as a form of the above non-vicious frugality. Unfortunately, I'm probably not going to get back immediately to the more important thing I was thinking about a few minutes ago; what's more likely is that I'll get distracted by something else, so the non-viciousness of the frugality is somewhat nullified by vice elsewhere.