Rationality Drugs
post by lukeprog · 2011-10-01T11:20:00.159Z · LW · GW
Can drugs improve your rationality?
I’m not sure, but it seems likely.
Remember the cognitive science of rationality. Often, irrationality is a result of ‘mindware gaps’ or ‘contaminated mindware’ — missing pieces of knowledge like probability theory, or wrong ideas like supernaturalism. Alas, we cannot yet put probability theory in a pill and feed it to people, nor can a pill deprogram someone from supernaturalism.
Another cause of irrationality is ‘cognitive miserliness’. We default to automatic, unconscious, inaccurate processes whenever possible. Even if we manage to override those processes with slow deliberation, we usually perform the easiest deliberation possible — deliberation with a ‘focal bias’ like confirmation bias.
What will increase the likelihood of cognitive override and decrease the effect of focal biases? First, high cognitive capabilities (IQ, working memory, etc.) make a brain able to do the computationally difficult processing required for cognitive override and avoidance of focal bias. Second, a disposition for cognitive reflectiveness makes it more likely that someone will choose to use those cognitive capabilities to override automatic reasoning processes and reason with less bias.1
Thus, if drugs can increase cognitive capability or increase cognitive reflectiveness, then such drugs may be capable of increasing one’s rationality.
First: Can drugs increase cognitive capability?
Yes. Many drugs have been shown to increase cognitive capability. Here are a few of them:2
- Modafinil improves working memory, digit span, visual pattern recognition, spatial planning, and reaction time.3
- Because glucose is the brain’s main energy source,4 increases in glucose availability via sugar ingestion should improve memory performance.5
- Creatine improves cognitive performance.6
- Donepezil improves memory performance, but perhaps only after it has been taken for 21 days.7
- Dopamine agonists like d-amphetamine, bromocriptine, and pergolide have all been found to improve working memory and executive function,8 but perhaps only in those with poor memory performance.9
- Guanfacine has shown mixed effects on cognition.10 Methylphenidate (Ritalin) has also shown mixed results for cognitive enhancement,11 though the most commonly reported motive for illicit use of prescription stimulants like methylphenidate is to enhance concentration and alertness for studying purposes.12
- Piracetam is usually prescribed to deal with cognitive deficits and other problems, but has also shown some cognitive benefits in healthy individuals.13
Second: Can drugs increase cognitive reflectiveness?
I’m not sure. I’m not yet aware of any drugs that have been shown to increase one’s cognitive reflectiveness.
So, can drugs improve your rationality? I haven’t seen any experimental studies test whether particular drugs improve performance on standard tests of rationality like the CRT. However, our understanding of how human irrationality works suggests that improvements in cognitive capability and cognitive reflectiveness (via drugs or other means) should increase one’s capacity to think and act rationally. That said, current drugs probably can’t improve rationality as much as demonstrated debiasing practices can.
Should we use drugs for cognitive enhancement? Scholars debate whether such modifications to human functioning are ethical or wise,14 but I think the simplicity of the transhumanist position is pretty compelling:
If we can make things better, then we should, like, do that.15
Notes
1 For a review, see Stanovich (2010), ch. 2.
2 For a broader overview, see de Jongh et al. (2008); Normann & Berger (2008); Sandberg (2011).
3 Muller et al. (2004); Turner et al. (2004); Gill et al. (2006); Caldwell et al. (2000); Finke et al. (2010); Repantis et al. (2010).
4 Fox et al. (1988).
5 Foster et al. (1998); Sunram-Lea et al. (2002).
6 Rae et al. (2003); McMorris et al. (2006); Watanabe et al. (2002).
7 Gron et al. (2005).
8 D-amphetamine: Mattay et al. (2000); Mattay et al. (2003); Barch & Carter (2005). Bromocriptine: Kimberg et al. (1997); Kimberg et al. (2001); Mehta et al. (2001); Roesch-Ely et al. (2005); Gibbs & D’Esposito (2005a). Pergolide: Muller et al. (1998); Kimberg & D’Esposito (2003).
9 Kimberg et al. (1997); Mehta et al. (2001); Mattay et al. (2000); Mattay et al. (2003); Gibbs & D’Esposito (2005a, 2005b).
10 Muller et al. (2005); de Jongh et al. (2008).
11 de Jongh et al. (2008).
12 Teter et al. (2006).
13 Dimond & Brouwers (1976); Mondadori (1996).
14 Savulescu & Bostrom (2009).
15 I think I first heard Louie Helm put it this way.
References
Barch & Carter (2005). Amphetamine improves cognitive function in medicated individuals with schizophrenia and in healthy volunteers. Schizophrenia Research, 77: 43–58.
Caldwell, Caldwell, et al. (2000). A double-blind, placebo-controlled investigation of the efficacy of modafinil for sustaining the alertness and performance of aviators: A helicopter simulator study. Psychopharmacology (Berlin), 150: 272–282.
de Jongh, Bolt, Schermer, & Olivier (2008). Botox for the brain: Enhancement of cognition, mood, and pro-social behavior and blunting of unwanted memories. Neuroscience and Biobehavioral Reviews, 32: 760–776.
Dimond & Brouwers (1976). Increase in the power of human memory in normal man through the use of drugs. Psychopharmacology, 49: 307–309.
Finke, Dodds, et al. (2010). Effects of modafinil and methylphenidate on visual attention capacity: a TVA-based study. Psychopharmacology, 210: 317–329.
Foster, Lidder, & Sunram (1998). Glucose and memory: fractionation of enhancement effects? Psychopharmacology, 137: 259–270.
Fox, Raichle, et al. (1988). Nonoxidative glucose consumption during focal physiologic neural activity. Science, 241: 462–464.
Gibbs & D’Esposito (2005a). Individual capacity differences predict working memory performance and prefrontal activity following dopamine receptor stimulation. Cognitive & Affective Behavioral Neuroscience, 5: 212–221.
Gibbs & D’Esposito (2005b). A functional MRI study of the effects of bromocriptine, a dopamine receptor agonist, on component processes of working memory. Psychopharmacology (Berlin), 180: 644–653.
Gill, Haerich, et al. (2006). Cognitive performance following modafinil versus placebo in sleep-deprived emergency physicians: A double-blind randomized crossover study. Academic Emergency Medicine, 13: 158–165.
Gron, Kirstein, et al. (2005). Cholinergic enhancement of episodic memory in healthy young adults. Psychopharmacology (Berlin), 182: 170–179.
Kimberg, D’Esposito, & Farah (1997). Effects of bromocriptine on human subjects depend on working memory capacity. Neuroreport, 8: 3581–3585.
Kimberg, Aguirre, et al. (2001). Cortical effects of bromocriptine, a D-2 dopamine receptor agonist, in human subjects, revealed by fMRI. Human Brain Mapping, 12: 246–257.
Kimberg & D’Esposito (2003). Cognitive effects of the dopamine receptor agonist pergolide. Neuropsychologia, 41: 1020–1027.
Mattay, Callicott, et al. (2000). Effects of dextroamphetamine on cognitive performance and cortical activation. Neuroimage, 12: 268–275.
Mattay, Goldberg, et al. (2003). Catechol O-methyltransferase val158-met genotype and individual variation in the brain response to amphetamine. Proceedings of the National Academy of Sciences USA, 100: 6186–6191.
McMorris, Harris, et al. (2006). Effect of creatine supplementation and sleep deprivation, with mild exercise, on cognitive and psychomotor performance, mood state, and plasma concentrations of catecholamines and cortisol. Psychopharmacology, 185: 93–103.
Mehta, Swainson, et al. (2001). Improved short-term spatial memory but impaired reversal learning following the dopamine D(2) agonist bromocriptine in human volunteers. Psychopharmacology (Berlin), 159: 10–20.
Mondadori (1996). Nootropics: Preclinical results in the light of clinical effects; comparison with tacrine. Critical Reviews in Neurobiology, 10: 357–370.
Muller, von Cramon, & Pollmann (1998). D1- versus D2-receptor modulation of visuospatial working memory in humans. Journal of Neuroscience, 18: 2720–2728.
Muller, Steffenhagen, et al. (2004). Effects of modafinil on working memory processes in humans. Psychopharmacology, 177: 161–169.
Muller, Clark, et al. (2005). Lack of effects of guanfacine on executive and memory functions in healthy male volunteers. Psychopharmacology (Berlin), 182: 205–213.
Normann & Berger (2008). Neuroenhancement: status quo and perspectives. European Archives of Psychiatry and Clinical Neuroscience, 258 Supplement 5: 110–114.
Rae, Digney, et al. (2003). Oral creatine monohydrate supplementation improves brain performance: a double-blind, placebo-controlled, cross-over trial. Proceedings of the Royal Society of London Series B, Biological Sciences, 270: 2147–2150.
Repantis, Schlattmann, et al. (2010). Modafinil and methylphenidate for neuroenhancement in healthy individuals: A systematic review. Pharmacological Research, 62: 187–206.
Roesch-Ely, Scheffel, et al. (2005). Differential dopaminergic modulation of executive control in healthy subjects. Psychopharmacology (Berlin), 178: 420–430.
Sandberg (2011). Cognition enhancement: Upgrading the brain. In Savulescu, ter Meulen, & Kahane (eds.), Enhancing Human Capacities. Wiley-Blackwell.
Savulescu & Bostrom (2009). Human Enhancement. Oxford University Press.
Stanovich (2010). Rationality and the Reflective Mind. Oxford University Press.
Sunram-Lea, Foster, et al. (2002). Investigation into the significance of task difficulty and divided allocation of resources on the glucose memory facilitation effect. Psychopharmacology, 160: 387–397.
Teter, McCabe, et al. (2006). Illicit use of specific prescription stimulants among college students: Prevalence, motives, and routes of administration. Pharmacotherapy, 26: 1501–1510.
Turner, Clark, Dowson, Robbins, & Sahakian (2004). Modafinil improves cognition and response inhibition in adult attention-deficit/hyperactivity disorder. Biological Psychiatry, 55: 1031–1040.
Watanabe, Kato, et al. (2002). Effects of creatine on mental fatigue and cerebral hemoglobin oxygenation. Neuroscience Research, 42: 279–285.
Comments
comment by Morendil · 2011-10-01T16:27:04.728Z · LW(p) · GW(p)
"Drug X improves performance measure Y" will in general be an incomplete description of the effects of drug X.
To be a rationalist is to be the kind of person who mentally adds "among other as yet undiscovered effects" to every single bullet point above.
↑ comment by MarkusRamikin · 2011-10-01T16:37:41.773Z · LW(p) · GW(p)
Upvoted for naming what was bothering me.
Of course I imagine some drugs are rather well understood by now. But Lukeprog's post doesn't seem to touch on the safety and potential downsides of taking this stuff, which would be useful.
(Also, creepy pill-man is creepy.)
↑ comment by [deleted] · 2011-10-02T00:10:43.749Z · LW(p) · GW(p)
To be a rationalist is to be the kind of person who mentally adds "among other as yet undiscovered effects" to every single bullet point above.
What makes that mental addition a "rationalist" thing to do, rather than simply a good thing to do?
↑ comment by NancyLebovitz · 2011-10-02T12:49:09.810Z · LW(p) · GW(p)
It's specifically about having a more accurate model of the universe. It's not the same sort of thing as brushing your teeth, even though that's also a good thing to do.
↑ comment by NancyLebovitz · 2011-10-03T13:37:13.157Z · LW(p) · GW(p)
General principle: definitions put a thing into a category, and then explain how that thing is different from other things in the same category.
I don't think definitions are how people generally use words-- prototype theory seems more accurate. Prototype theory says that people have best examples of concepts, and then rank actual things according to how close they are to the prototype.
It would be nice to have a theory about how to decide when to use definitional thinking and when to use prototypes, but I don't.
↑ comment by Morendil · 2011-10-02T10:33:37.535Z · LW(p) · GW(p)
It's a five-second skill - you have to train yourself to do it.
↑ comment by DanielLC · 2011-10-03T01:02:50.741Z · LW(p) · GW(p)
"among other as yet undiscovered effects"
And effects that lukeprog didn't bother to state.
↑ comment by Morendil · 2011-10-03T10:39:52.674Z · LW(p) · GW(p)
Those count as "undiscovered" too - undiscovered by at least me. :)
This article cited by Luke offers a more nuanced assessment of drugs like Donepezil, and generally a more balanced take on the subject. For instance, they report that
Detrimental effects on cognition have also been reported: both in healthy young participants (Beglinger et al., 2004) and in healthy elderly volunteers (Beglinger et al., 2005), donepezil administration (5 mg for 14 days and 10 mg for 14 days respectively) caused a slight deterioration of performance on speed, attention and short-term memory tasks.
The same article goes on to suggest that perhaps 14 days is too short a timeframe for the beneficial effects to be felt. However, one can also find studies like this one (not cited by Luke) which show detrimental effects on cognition in healthy subjects over four weeks of treatment.
Neither Luke nor de Jongh et al. report on the frequent side-effects, which (Wikipedia says) include bradycardia, nausea, diarrhea, anorexia, abdominal pain, and vivid dreams.
comment by [deleted] · 2011-10-01T12:26:20.381Z · LW(p) · GW(p)
I'd like to share one day's worth of experience with modafinil.
I noticed a huge difference in alertness. I was filled with an urge to be doing something every second. I don't believe I was more intelligent (some of the work I did that day turned out to be low quality) but I was much more productive. And happy. I felt like I was just "riding the day" -- that going through life, minute by minute, running errands, checking items off my to-do list, and seeing what happened next, was boundlessly fascinating.
I suspect that, at least for me, and maybe for others, most unhappiness is really fatigue, coupled with the guilt of not having accomplished much in a state of fatigue. Simply not being tired makes me deliriously happy. I am not surprised by the study that coffee reduces depression in women, though I know to be suspicious of medical study methodology. The symptoms of clinical depression look a lot like the symptoms of chronic sleep deprivation (fatigue, inability to concentrate, clumsiness, weight gain or weight loss, dramatic and irrational emotions). It's possible that some people with symptoms of depression are actually sleep deprived (or that a typical amount of sleep for a modern-day working or student life is too little for their biological needs.) I had a year when I thought I was losing my mind; in retrospect, it may have had something to do with getting no more than five hours of sleep a night.
↑ comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-10-01T19:29:27.190Z · LW(p) · GW(p)
I had a year when I thought I was losing my mind; in retrospect, it may have had something to do with getting no more than five hours of sleep a night.
Five hours of sleep a night for a whole year? I'm amazed you functioned! One five-hour night and I'm moderately functional, maybe a slightly shorter attention span and more mood swings than usual. Two nights in a row and I'm a zombie unless I drink a lot of coffee. Three nights and I'm a zombie anyway no matter how much coffee I drink. Unless I get 9+ hours of sleep every night, I will feel sleepy at various points during the day.
↑ comment by vi21maobk9vp · 2011-10-02T10:06:27.380Z · LW(p) · GW(p)
It is highly personal.
9+ hours of sleep per night for a month will probably make me feel bad. An average of 6 hours per night may be slowly wearing me out, but this rate seems to be sustainable indefinitely.
But then, if I do not do anything stressful, I can do with 4 hours per night for a month.
↑ comment by matt · 2011-10-04T23:45:16.599Z · LW(p) · GW(p)
4.5hrs of sleep every 24 on everyman 3 since January and I've never felt better!
[full disclosure: the first couple of months were tough and involved much experimentation with schedules close to everyman 3.]
↑ comment by Crux · 2011-10-04T23:48:06.270Z · LW(p) · GW(p)
Never felt better? Do you do any hard exercise?
↑ comment by matt · 2011-10-05T03:54:34.886Z · LW(p) · GW(p)
Not "hard". A Four Hour Body-inspired exercise routine. I'm fit and healthy with as little exercise as I can get away with (pushups, situps, etc. 3 days per week; 2km walk with sprints 3 days per week).
↑ comment by [deleted] · 2011-10-05T18:19:09.837Z · LW(p) · GW(p)
Do you do anything hard involving your long-term memory? Do you use spaced repetition, and if so, has it suffered?
↑ comment by matt · 2011-10-05T19:59:03.576Z · LW(p) · GW(p)
I'm a programmer and manager of programmers. I don't use spaced repetition (I mean to… I've cron'd it to open every morning… but I close it every morning that I figure I don't have time… and that's every morning). I've not noticed any memory deficit.
I think that amounts to: no information.
↑ comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-10-05T02:48:16.820Z · LW(p) · GW(p)
Neat. However, how regimented does your sleep schedule have to be in order for it to work? My main problem with sleeping enough isn't that I have trouble going to bed early enough, as seems to be true for a lot of people. It's that some days I have shifts at work that start at 6 am and then I'm busy until 10 pm, and some days I get home after 11 pm and have to work at 6 am the next day, and somehow even though I sleep 8-10 hours a night on the other days, I never really seem to catch up. (Also, I can't nap during the day, at least not on demand. I taught myself to do it a bit during first-year university, but my schedule no longer allows napping anyway.)
↑ comment by matt · 2011-10-05T04:03:30.007Z · LW(p) · GW(p)
I can usually move naps ±90 minutes with very little negative consequence (±30mins with no consequences). I can skip a nap with coffee at the cost of adding an extra hour of sleep the following night (I had to give up coffee to make normal naps work - trace caffeine doesn't stop me from napping, but does stop the naps from being effective).
Re: "can't nap during the day… on demand" - the adaptation period will fix that.
↑ comment by juliawise · 2011-10-30T12:44:09.209Z · LW(p) · GW(p)
What are "irrational emotions"?
↑ comment by AndHisHorse · 2013-08-08T15:47:24.010Z · LW(p) · GW(p)
An emotion is irrational if it is not appropriate to the situation - for example, social anxiety is irrational if it causes one to avoid pursuing some social opportunities which have a positive expected value (for any utility function, which may or may not carry a heavier penalty for failure than a bonus for success).
↑ comment by Lumifer · 2013-08-08T15:54:15.602Z · LW(p) · GW(p)
An emotion is irrational if it is not appropriate to the situation
Who decides (and how) which emotion is appropriate to which situation?
↑ comment by AndHisHorse · 2013-08-08T17:04:59.542Z · LW(p) · GW(p)
See above. If your emotional state (and I assume we can distinguish a state of heightened emotion from a resting state) causes you to act in ways which do not reflect your evidence-based assessments, it is causing you to act against your rational decisions and is therefore irrational.
I would say that the ability to make this judgement belongs to the best-informed rationally-acting observer: someone who has knowledge of your mental state in both conditions and who can, from the available evidence, estimate whether or not the difference in behavior can be attributed to emotional causes. This observer may very well be you yourself in a resting state, once you have regained your perspective, since you have far more information about your own mental state.
To expand on the example I gave above, someone experiencing social anxiety may suddenly focus on the various ways in which a social interaction can go horribly wrong, even if these futures are not very probable. Basically, anxiety hijacks the availability heuristic, causing an overestimation of the probability of catastrophe. Because this adjustment in probability is not based in evidence (though this point could be argued), it is irrational.
This definition of "irrational emotions" does not depend on the utility function used. If someone weights failure more heavily than success, and will go home unhappy at the end of the night if they have 9 successful conversations and 1 boring dud, they are not necessarily irrational. However, if on previous nights they have frequently gone 10 for 10, yet before entering a conversation they freeze in fear, then their expected value has changed without sufficient reason. That is irrational emotion.
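The distinction being drawn here can be made concrete with a toy expected-value calculation. This is only a sketch, not anything from the thread: the utilities and probabilities below are hypothetical, chosen to show how a loss-averse utility function differs from an evidence-free shift in probability estimates.

```python
# Toy model of the "9 successes, 1 dud" conversation example.
# All numbers are hypothetical illustrations.

def expected_value(p_success: float, u_success: float, u_failure: float) -> float:
    """Expected utility of entering a conversation."""
    return p_success * u_success + (1 - p_success) * u_failure

# A loss-averse utility function: a failed conversation hurts three
# times as much as a successful one helps. By itself, this weighting
# is a preference, not an irrationality.
ev_calibrated = expected_value(0.9, 1.0, -3.0)  # track record: ~9 in 10 go well
ev_anxious    = expected_value(0.5, 1.0, -3.0)  # anxiety-inflated failure odds

# ev_calibrated = 0.6 > 0: entering is worthwhile even with loss aversion.
# ev_anxious = -1.0 < 0: the anxious estimate says freeze. The probability
# changed without new evidence, which is the claimed irrationality.
```

The point the sketch illustrates: changing the utility weights is a matter of taste, but changing `p_success` from 0.9 to 0.5 with no new evidence flips the decision for no rational reason.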
↑ comment by Lumifer · 2013-08-08T17:18:35.191Z · LW(p) · GW(p)
If your emotional state ... causes you to act in ways which do not reflect your evidence-based assessments, it is causing you to act against your rational decisions and is therefore irrational.
So, anything which decreases rationality is irrational? Sounds like circular reasoning to me.
Besides, your original point was that
An emotion is irrational if it is not appropriate to the situation
If I wake up in a burning house, fear is certainly appropriate to the situation and yet it's very likely to decrease the rationality of my decision-making. If I'm making out with someone I like a lot, love/tenderness is appropriate to the situation and will decrease my rationality. Etc. etc.
This starts to remind me of a steel Vulcan :-)
↑ comment by Dorikka · 2011-10-01T18:48:06.232Z · LW(p) · GW(p)
Do you take modafinil on a regular basis? If not, what made you choose not to, given your positive experience? If so, have you noticed any other effects that would be good to note?
↑ comment by [deleted] · 2011-10-01T19:03:00.620Z · LW(p) · GW(p)
I had a one-time trial and I'm planning to see my doctor for more as soon as I can.
↑ comment by loqi · 2011-10-03T23:10:11.318Z · LW(p) · GW(p)
If you don't mind sharing, how do you plan to do this? Is it as simple as "this controlled substance makes my life better, will you prescribe it for me?" Or are you "fortunate" enough to have a condition that warrants its prescription?
I ask because I've had similar experiences with Modafinil (my nickname for it is "executive lubricant"), and it is terribly frustrating to be stuck without a banned goods store.
↑ comment by Douglas_Knight · 2011-10-01T13:58:37.723Z · LW(p) · GW(p)
It's possible that some people with symptoms of depression are actually sleep deprived
I'm skeptical of this. Yes, five hours of sleep is bad for your mental health, but usually in a different direction. Did you have depressive symptoms that year? A key symptom of depression is lack of willpower - depressives don't normally have the willpower not to sleep. Quite the opposite: they sleep more than normal. This would solve simple sleep deprivation. It's possible that they lack something more specific that normal people are able to get by sleeping, but even that does not sound terribly likely to me.
ETA: As various people comment, this is largely backwards. I particularly regret suggesting that people who spend a lot of time in bed get useful sleep. So maybe sleep deprivation contributes to some of the symptoms of depression. But there are other symptoms and I am skeptical that the two are confused.
↑ comment by Vladimir_M · 2011-10-01T15:51:52.136Z · LW(p) · GW(p)
A key symptom of depression is lack of willpower - depressives don't normally have the willpower not to sleep.
For me personally, and I suspect also for a significant number of other people, it takes willpower to go to sleep as well as to wake up early enough. In the morning, the path of least resistance for me is to sleep in, but in the evening, it is to do something fun until I'm overcome with overwhelming sleepiness, which won't happen until it's far too late to maintain a normal sleeping schedule. Therefore, if I were completely deprived of willpower, my "days" would quickly degenerate into cycles of much more than 24 hours, falling asleep as well as waking up at a much later hour each time.
Now, the incentive to wake up early enough (so as not to miss work etc.) is usually much stronger than the incentive to go to bed early enough, which is maintained only by the much milder and more distant threat of feeling sleepy and lousy next day. So a moderate crisis of willpower will have the effect of making me chronically sleep-deprived, since I'll still muster the willpower to get up for work, but not the willpower to go to bed instead of wasting time until the wee hours.
(This is exacerbated by the fact that when I'm sleep-deprived, I tend to feel lousy and wanting to doze off through the day, but then in the evening I suddenly start feeling perfectly OK and not wanting to sleep at all.)
↑ comment by Jordan · 2011-10-01T20:54:33.161Z · LW(p) · GW(p)
(This is exacerbated by the fact that when I'm sleep-deprived, I tend to feel lousy and wanting to doze off through the day, but then in the evening I suddenly start feeling perfectly OK and not wanting to sleep at all.)
I suffer from this as well. It is my totally unsubstantiated theory that this is a stress response. Throughout the whole day your body is tired and telling you to go to sleep, but the Conscious High Command keeps pressing the KEEP-GOING-NO-MATTER-WHAT button until your body decides it must be in a war zone and kicks in with cortisol or adrenaline or whatever.
↑ comment by multifoliaterose · 2011-10-02T15:30:29.353Z · LW(p) · GW(p)
Me too!
↑ comment by Scott Alexander (Yvain) · 2011-10-01T15:04:37.699Z · LW(p) · GW(p)
Depressed people can have either insomnia or hypersomnia; insomnia is significantly more common. Depression-related insomnia is usually "terminal" - people wake up very early and can't get back to sleep.
Strangely enough, there have been some studies showing that depriving depressed people of sleep has a strong positive effect on their mood, but of course then they're too sleep-deprived to enjoy it.
↑ comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-10-01T21:15:55.504Z · LW(p) · GW(p)
Quite the opposite, they sleep more than normal.
Actually, according to my nursing textbooks, depression can manifest either by sleeping more or less than usual. So five hours of sleep a night could, for some people, be a symptom of depression. And I do remember reading somewhere about first-year college or university students developing clinical depression after a few months of unaccustomed stress and lack of sleep. And for most university students, it probably takes willpower to go to bed early, since nearly everyone I know who is my age seems to be on a longer-than-24-hour natural sleep schedule. So lack of sleep could cause depression, although once you were depressed, you might find yourself wanting to sleep more (and having an even harder time keeping up with classes).
Personal anecdote: long periods of sleep deprivation can mess up your neurotransmitter levels enough to cause an episode of psychosis. This actually happened to one of my good friends. (When you're waking up at 4:30 am every day for swim practice, and staying up late for whatever reason including just wanting to have a life, sleep deprivation can very quickly get out of hand.) You probably have to be genetically predisposed, but still...it scares me.
↑ comment by Vaniver · 2011-10-01T21:21:36.300Z · LW(p) · GW(p)
And for most university students, it probably takes willpower to go to bed early, since nearly everyone I know who is my age seems to be on a longer-than-24-hour natural sleep schedule.
It seems likely that this is a combination of youthful endurance plus a lack of night cues (computer screens make fake-sunlight at any time of the night), rather than young people actually having a circadian rhythm that's longer by hours.
↑ comment by gwern · 2011-10-01T21:36:52.627Z · LW(p) · GW(p)
I disagree. That circadian rhythms shift in middle school and up is very well established; please see all the links & citations in http://www.gwern.net/education-is-not-about-learning#school-hours
That it is not a mere preference but a biological reality is one of the reasons I regard melatonin as so useful - fight fire with fire.
EDIT: Of course, it's also true that artificial light and computer screens are not helpful in the least: see the second paragraph in http://www.gwern.net/Melatonin#health-performance So you might say for young people, it's a many-edged problem: they naturally want to go to bed late, their electronic devices exacerbate the original biological problem, and then all the social dynamics can begin to contribute their share of the problem...
↑ comment by Douglas_Knight · 2011-10-01T23:18:44.813Z · LW(p) · GW(p)
I think Vaniver is objecting to the narrow claim of a cycle longer than 24 hours. Without clicking through on your sources, they seem to say that teens have a shifted cycle, not a longer cycle.
In particular, that shifting school later improves sleep suggests that teens have a shifted cycle. If they had an unmoored cycle of longer than 24 hours, the greater light exposure of an earlier start would probably be better.
↑ comment by Vaniver · 2011-10-02T05:15:56.211Z · LW(p) · GW(p)
Douglas_Knight is correct; I'm not challenging "young people want to go to bed late and get up late" but "young people want to sleep six times a week rather than seven" (or, more reasonably, 13 times every two weeks).
↑ comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-10-01T21:28:41.179Z · LW(p) · GW(p)
I do remember reading in a variety of places that young people, especially teenagers, tend to have more trouble sticking to an earlier sleep schedule. But you're right that this isn't necessarily biological in origin. It could just be that young people have a) greater benefits to gain from staying up late, since that's when a lot of socializing takes place, and b) less practice using willpower to force themselves to go to bed, and maybe less incentive, since with their "youthful endurance" they can push through on 2-3 hours of sleep.
And being able to do this, or for example get really drunk and still make it to work early the next morning, is definitely a status thing that people are almost competitive about. Maybe some kind of signalling at work, too: "I'm so healthy and strong, I can afford to get really, really drunk and hardly get any sleep and still function...I must have awesome genes." That could explain how being a complete idiot and passing out on my friend's floor in front of my supervisor when I had an exam the next day somehow made me cooler to all the staff.
↑ comment by [deleted] · 2011-10-01T16:06:57.582Z · LW(p) · GW(p)
I don't think it's everybody -- certainly there are cases of severe depression where the person sleeps 20 hours a day.
Maybe it's more that sleep deprivation can masquerade as depression. That is, if you're tired, slow, unmotivated, hopeless, lethargic, plunged in gloom, and you're sleeping four or five hours a night, your problems might be related to your sleep patterns.
Replies from: Douglas_Knight↑ comment by Douglas_Knight · 2011-10-01T21:53:33.721Z · LW(p) · GW(p)
Sure, fatigue can cause unhappiness, but I don't think it looks like clinical depression. You seem to be holding yourself up as an example. Did anyone think you clinically depressed when sleep deprived?
comment by Scott Alexander (Yvain) · 2011-10-01T14:51:43.572Z · LW(p) · GW(p)
I've been self-experimenting with piracetam the past few months.
I usually study from a site called USMLEWorld with a selection of difficult case-based medical questions. For example, it might give a short story about a man coming into a hospital with a certain set of symptoms, and explain a little about his past medical history, and then ask multiple choice questions about what the most likely diagnosis is, or what medication would be most helpful. These are usually multi-step reasoning questions - for example, they might ask what side effect a certain patient could expect if given the ideal treatment for his disease, and before answering you need to determine what disease he has, what's the ideal treatment, and then what side effects that treatment could cause. My point is they're complicated (test multiple mental skills and not just simple recall) and realistic (similar to the problems a real doctor would encounter on the job).
I've tried comparing my performance on these questions on versus off piracetam. My usual procedure is to do twenty questions, take 2400 mg piracetam + 600 mg lecithin-derived choline, go do something fun and relaxing for an hour (about the time I've been told it takes for piracetam to take effect) then do twenty more questions. It's enough of a pain that I usually don't bother, but in about three months of occasionally doing my study this way I've got a pool of 160 questions on piracetam and 160 same-day control questions. Medicine is a sufficiently large and complicated field that I don't think three months worth of practice effects are a huge deal, and in any case I made sure to do equal piracetam and control questions every day so there wouldn't be a practiced-unpracticed confounder.
I got an average of 65% of questions right in the control condition and 60% of questions right on piracetam, but the difference was not significant.
USMLEWorld also tells you how other people did on each question; I used this information to run a different analysis controlling for the random difficulty variation in the questions. In the control condition I did 2.8% better than average, in the piracetam condition I did 1.3% worse than average; this wasn't a significant difference either.
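(As a sanity check on the non-significance claim, not from the original comment: assuming 160 questions per condition, 65% correct is 104/160 in the control condition and 60% is 96/160 on piracetam. A standard pooled two-proportion z-test, sketched below, indeed finds no significant difference at these sample sizes.)

```python
import math

def two_prop_ztest(k1, n1, k2, n2):
    """Two-sided two-proportion z-test with a pooled variance estimate."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal distribution
    pval = math.erfc(abs(z) / math.sqrt(2))
    return z, pval

# 104/160 correct (control) vs 96/160 correct (piracetam)
z, pval = two_prop_ztest(104, 160, 96, 160)
print(round(z, 2), round(pval, 2))  # prints: 0.92 0.36
```

A 5-percentage-point difference would need roughly ten times as many questions per condition to reach significance at this effect size, which fits the comment's conclusion that any effect is small.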
I do worry that fatigue effects might have played a part; I tried to always rest and relax between conditions, but I was always doing piracetam after control (I wanted to have same-day comparisons to eliminate practice effects, and piracetam lasts too long for me to feel comfortable taking it first and then doing control after it wore off). But I didn't feel fatigued, and I haven't noticed huge fatigue effects when I study a lot without taking piracetam.
In any case, piracetam either has no effect on me in the reasoning domains I'm interested in, or else its effect is so small that it is overwhelmed even by relatively minor fatigue effects.
Replies from: ataftoti, D_Malik, Alexei↑ comment by D_Malik · 2014-03-05T20:47:20.157Z · LW(p) · GW(p)
The main claimed benefit for piracetam is not backwards recall right after supplementation; this seems to be a benefit, but it's small. The main claimed benefit is reduction of long-term cognitive decline with high-dose piracetam over time. See for instance http://examine.com/supplements/Piracetam/#main_clinical_results . (You probably know this; this is directed at the other people reading your comment.)
↑ comment by Alexei · 2011-10-03T00:44:28.077Z · LW(p) · GW(p)
I've taken Piracetam + Choline combination daily for a week (twice) and I've never noticed any positive effects. If anything, I was more irritated and prone to head-aches. Although, I didn't have a solid method of measuring the difference like you, so this is purely anecdotal.
Replies from: Jayson_Virissimo↑ comment by Jayson_Virissimo · 2011-10-04T06:44:12.498Z · LW(p) · GW(p)
If you get headaches you should probably up the choline dosage or use a more bioavailable form like CDP choline.
comment by eugman · 2011-10-02T02:33:24.882Z · LW(p) · GW(p)
Lukeprog, I noticed in your last two posts you've used a stock photo to represent the subject of the post. I may be different from everyone else, but despite the usefulness of this design choice, I associate it with probloggers or whatever you would call them. So, personally (and this is only personal taste), I would try to use them very sparingly. I hope you don't mind my suggestion.
Replies from: Kevin, printing-spoon↑ comment by Kevin · 2011-10-02T11:32:52.873Z · LW(p) · GW(p)
You aren't the target audience for the stock photo, it's a random person seeing Less Wrong for the first time. People like pictures.
Replies from: eugman↑ comment by eugman · 2011-10-02T16:05:32.302Z · LW(p) · GW(p)
I felt I was quite humble in giving my opinion (or maybe just self-effacing). Still, I'm willing to logically concede the point.
Replies from: Kevin↑ comment by printing-spoon · 2011-10-02T21:00:22.969Z · LW(p) · GW(p)
Yeah, and the texture in this picture makes my skin crawl. The pills look like growths or something.
comment by Jayson_Virissimo · 2011-10-03T07:09:04.615Z · LW(p) · GW(p)
Why no love for nicotine?
comment by shokwave · 2011-10-04T06:24:00.859Z · LW(p) · GW(p)
Having experimented with nootropics (using gwern's site as a guide), I can report there is little exciting in the way of "being smarter" - but there is plenty of low-hanging fruit in the stimulants! Being more alert and motivated is a pretty good proxy for being smarter to boot.
comment by Kevin · 2011-10-02T00:19:05.058Z · LW(p) · GW(p)
Lately I've been extraordinarily surprised at how effective potassium and potassium salt are. By which I mean that simple potassium is probably the most positively mind altering supplement I've ever tried.
Replies from: wedrifid, Yvain, anonym, gwern↑ comment by wedrifid · 2011-10-02T08:25:19.974Z · LW(p) · GW(p)
Lately I've been extraordinarily surprised at how effective potassium and potassium salt are.
Wait... Potassium AND potassium salt? You have actually tried using non-salt forms of potassium as a mind altering supplement? That's seriously badass!
Replies from: Kevin↑ comment by Scott Alexander (Yvain) · 2011-10-02T08:17:04.521Z · LW(p) · GW(p)
In what form did you take the potassium?
Replies from: Kevin↑ comment by anonym · 2011-10-02T03:05:42.057Z · LW(p) · GW(p)
Please elaborate. In what ways have you found it to be mind-altering?
Replies from: Kevin↑ comment by Kevin · 2011-10-02T03:13:58.012Z · LW(p) · GW(p)
About 15 minutes after consumption, it manifests as a kind of pressure in the head or temples or eyes, a clearing up of brain fog, increased focus, and the kind of energy that is not jittery but the kind that makes you feel like exercising would be the reasonable and prudent thing to do. I have done no tests, but "feel" smarter from this in a way that seems much stronger than piracetam or any of the conventional weak nootropics.
It is not just me -- I have been introducing this around my inner social circle and I'm at 7/10 people felt immediately noticeable effects. The 3 that didn't notice much were vegetarians and less likely to have been deficient.
Now that I'm not deficient, it is of course not noticeable as mind altering, but still serves to be energizing, particularly for sustained mental energy as the night goes on.
Replies from: pjeby, Nick_Tarleton↑ comment by pjeby · 2011-10-03T15:38:29.137Z · LW(p) · GW(p)
How much did you take?
Replies from: Kevin↑ comment by Kevin · 2011-10-04T01:01:29.513Z · LW(p) · GW(p)
Initially 1 teaspoon of potassium salt in water.
Replies from: Kevin↑ comment by Kevin · 2012-04-13T05:44:27.434Z · LW(p) · GW(p)
Note: I now consider 1 teaspoon at once to be a dose larger than necessary. I only recommend such a large dose at once if it is important to you to be able to viscerally sense potassium coursing through your body. I'd recommend drinking at least 24 ounces of water with that much potassium. Definitely, definitely don't eat it straight with no water.
↑ comment by Nick_Tarleton · 2011-10-04T01:32:44.312Z · LW(p) · GW(p)
How long does the increased energy last?
Replies from: Kevin↑ comment by gwern · 2011-10-02T02:34:30.026Z · LW(p) · GW(p)
Well, I can't say I've heard that one before; resources? And are you sure you simply didn't have a deficiency? Fixing a deficiency can have dramatic effects but be useless for everyone else (eg. that LWer who fell in love with sulbutiamine because he had B-vitamin problems, even though sulbutiamine is only a very mild stimulant to me).
Replies from: Kevin
comment by betterthanwell · 2011-10-01T11:43:53.232Z · LW(p) · GW(p)
Maybe also this: Single Dose of 'Magic Mushrooms' Hallucinogen May Create Lasting Personality Change
A single high dose of the hallucinogen psilocybin, the active ingredient in so-called "magic mushrooms," was enough to bring about a measurable personality change lasting at least a year in nearly 60 percent of the 51 participants in a new study, according to the Johns Hopkins researchers who conducted it.
From the abstract:
A large body of evidence, including longitudinal analyses of personality change, suggests that core personality traits are predominantly stable after age 30. To our knowledge, no study has demonstrated changes in personality in healthy adults after an experimentally manipulated discrete event. Intriguingly, double-blind controlled studies have shown that the classic hallucinogen psilocybin occasions personally and spiritually significant mystical experiences that predict long-term changes in behaviors, attitudes and values. (...) Consistent with participant claims of hallucinogen-occasioned increases in aesthetic appreciation, imagination, and creativity, we found significant increases in Openness following a high-dose psilocybin session. In participants who had mystical experiences during their psilocybin session, Openness remained significantly higher than baseline more than 1 year after the session.
Replies from: gwern, ciphergoth
Openness to experience correlates with creativity, as measured by tests of divergent thinking. Openness is also associated with crystallized intelligence, but not fluid intelligence. These mental abilities may come more easily when people are dispositionally curious and open to learning. However, openness is only weakly related to general intelligence. Openness to experience is related to need for cognition, a motivational tendency to think about ideas, scrutinize information, and enjoy solving puzzles.
Openness is the only personality trait that correlates with neuropsychological tests of dorsolateral prefrontal cortical function, supporting theoretical links among openness, cognitive functioning, and IQ.
↑ comment by gwern · 2011-10-01T19:51:27.667Z · LW(p) · GW(p)
My own impression on reading that yesterday was that your average LWer doesn't really need Openness; what we need is Conscientiousness!
EDIT: I've posted an article based on Spent dealing with Openness: http://lesswrong.com/lw/82g/on_the_openness_personality_trait_rationality/
Replies from: Metus↑ comment by Metus · 2011-10-02T12:40:38.018Z · LW(p) · GW(p)
Now if we only had a drug that increases conscientiousness.
Replies from: wedrifid, NancyLebovitz↑ comment by wedrifid · 2011-10-02T13:03:22.273Z · LW(p) · GW(p)
Now if we only had a drug that increases conscientiousness.
Stimulants in general. And most (other) things that increase dopamine or norepinephrine can be expected to help to some extent. Pramiracetam. Many anabolic steroids increase motivation as a side effect, which is a significant component of conscientiousness.
↑ comment by NancyLebovitz · 2011-10-02T12:45:18.523Z · LW(p) · GW(p)
I think amphetamines can do that, at least for people with ADD.
Is anything known about a physical basis for conscientiousness?
Replies from: VincentYu↑ comment by VincentYu · 2012-07-12T03:48:54.939Z · LW(p) · GW(p)
DeYoung and Gray (2009) wrote a review on the neuroscience of the Big Five traits in The Cambridge handbook of personality.
The two relevant paragraphs on conscientiousness:
When considering research on the biological basis of the various impulsivity-related traits, one must bear in mind that most are related to multiple Big Five dimensions. Zuckerman (2005) noted that many studies have found Impulsive Sensation-Seeking and similar traits to be associated with high levels of dopaminergic function and low levels of serotonergic function. However, he argued that dopamine is associated with the approach tendencies reflected in these traits, whereas low serotonin is related to the absence of control or restraint. Involvement of serotonin in control and restraint is consistent with findings that serotonin is associated with Conscientiousness (Manuck, Flory, McCaffery et al. 1998, Rosenberg, Templeton, Feigin et al. 2006).
Another biological factor that may be related to Conscientiousness is glucose metabolism. Glucose represents the basic energy source for the brain, and a number of studies indicate that blood-glucose is depleted by acts of self-control and that the extent of this depletion predicts failures of self-control (Gailliot, Baumeister, DeWall et al. 2007; Gailliot and Baumeister 2007). Further, a self-report measure of trait self-control, which correlates highly with Conscientiousness, similarly predicts failures of self-control (Gailliot, Schmeichel and Baumeister 2006; Tangney, Baumeister and Boone 2004). Perhaps individuals whose metabolism provides their brains with an ample and steady supply of glucose are likely to be higher in Conscientiousness. If individual differences in glucose metabolism prove to be involved in Conscientiousness, one will also want to know what brain systems are consuming glucose to fuel acts of self-control. The prefrontal cortex seems likely to be involved, given its central role in planning and voluntary control of behaviour, and given that its consumption of glucose appears relatively high (Gailliot and Baumeister 2007). An fMRI study (Brown, Manuck, Flory and Hariri 2006) showed that brain activity in ventral prefrontal cortex during a response inhibition task was negatively associated with a questionnaire measure of impulsivity that is strongly negatively correlated with Conscientiousness (Whiteside and Lynam 2001).
It seems like high levels of serotonin and blood-glucose are associated with high levels of some specific facets of conscientiousness.
↑ comment by Paul Crowley (ciphergoth) · 2011-10-01T12:22:59.203Z · LW(p) · GW(p)
The statistics in the linked paper are very badly done: see Does psilocybin cause changes in personality? Maybe, but not so fast.
comment by Vaniver · 2011-10-01T19:01:05.954Z · LW(p) · GW(p)
Creatine improves cognitive performance.
Isn't this primarily true for vegetarians? I was under the impression that most people have all the creatine their brains can make use of.
Replies from: gwern↑ comment by gwern · 2011-10-01T19:53:47.296Z · LW(p) · GW(p)
Not just vegetarians; if you had clicked through to my page, you'd see my summary:
Replies from: Vaniver
however, Rae 2003 was only in vegetarians, who are known to be creatine deficient (much like B vitamins, creatine is usually gotten in one’s diet from meat), and the other studies are likewise of subpopulations. Rawson 2008 (PDF) studied young omnivores who are not sleep-deprived, and found no mental benefits. However, vegetarians (Rae 2003), the sleep-deprived, and old people may benefit from creatine supplementation.
↑ comment by Vaniver · 2011-10-01T20:01:19.109Z · LW(p) · GW(p)
I did click through to your page; I decided not to quote it directly, which was a mistake. My impression is that of LWers, vegetarians are the most common group (though perhaps there are lots of sleep-deprived people).
Overall, I was disappointed with taking a qualified statement ("creatine deficiency causes intelligence problems; make sure you have enough") and turning it into an unqualified statement ("creatine improves cognitive performance").
Replies from: gwern↑ comment by gwern · 2011-10-01T21:08:57.385Z · LW(p) · GW(p)
My impression is that of LWers, vegetarians are the most common group (though perhaps there are lots of sleep-deprived people).
But still not very common. Vegetarian LWers would be, what, 10% maybe? (Not sure any surveys have covered it, but I don't see it discussed very often).
Overall, I was disappointed with taking a qualified statement
/shrug
That's Luke's description, not mine. I've edited the page to include specific citations for each group and some PDF links, incidentally.
comment by beriukay · 2011-10-30T12:02:17.330Z · LW(p) · GW(p)
I decided to try some of the suggestions here. There was a piracetam powder I ordered. How on earth are you supposed to ingest that crap?!! I have such a powerful negative taste reaction, even disguising the ~2g in 3 glasses of water, or 2 glasses of milk, or in a mouthful of food (though it suggests consuming on an empty stomach)... that even if it was prescribed by a doctor to cure aging, I'd be hard-pressed to take the recommended dosage on a daily basis. What can I do to continue this experiment without having to annihilate my taste buds?
Replies from: gwern↑ comment by gwern · 2011-10-30T15:01:32.375Z · LW(p) · GW(p)
You mustn't have done much reading about piracetam, because everyone complains about the taste - it's impressively nasty, isn't it? (BTW, if you purchased the powder, I guess you noticed the price difference between the powders and the pills; you should have wondered why there was such a price difference and not then been surprised at the taste.) Anyway, what you can do about it:
- hide it with citrus fruit juices (eg. unsweetened grapefruit juice)
- cap the piracetam powder (might be a bit expensive if you don't already own a capsule machine and empty pills)
- 'parachute' (make pills using toilet paper)
↑ comment by pengvado · 2011-10-30T16:43:25.711Z · LW(p) · GW(p)
you should have wondered why there was such a price difference and not then been surprised at the taste
I would have expected any price difference to have something to do with the cost of making pills. If that's not the case... is there a competitive market in powder but not a competitive market in pills, or do all the sellers agree on this same method of price discrimination, or what?
Replies from: gwern, wedrifid↑ comment by gwern · 2011-10-30T16:49:12.892Z · LW(p) · GW(p)
There's usually a cost to the convenience of pills, yes; but the greater the difference, the more convenience is being provided (because otherwise people would just cap their own or not buy the substance at all). Piracetam seems to have unusual differentials, pointing to some greater convenience being provided - which I believe to be related to its revolting taste.
↑ comment by wedrifid · 2011-10-31T01:20:07.478Z · LW(p) · GW(p)
I would have expected any price difference to have something to do with the cost of making pills. If that's not the case... is there a competitive market in powder but not a competitive market in pills, or do all the sellers agree on this same method of price discrimination, or what?
I'm inclined to agree. A large difference between the price of pills and the price of the powder tells us a whole lot more about the depth of the market than about taste. If the market for piracetam were large one would far more closely track the other.
↑ comment by beriukay · 2011-11-01T07:12:50.861Z · LW(p) · GW(p)
You are correct. I basically wiki'd it, glanced at some of the LW material, browsed amazon and saw largely positive reviews for the product. Maybe those people all have capsule-makers.
I'll report back with my experiences with citrus and/or parachuting later. Thanks for the tips!
↑ comment by Vaniver · 2011-10-30T17:17:09.257Z · LW(p) · GW(p)
'parachute' (make pills using toilet paper)
Can't you just chew on a cracker, spit that out, and make a 'pill' out of that? Though I suppose some people might find that more unappetizing than eating paper.
[edit: never mind, grandparent discussed eating it with food and said it was suggested against.]
comment by Paul Crowley (ciphergoth) · 2011-10-01T12:21:35.000Z · LW(p) · GW(p)
Could you expand the "CRT" initialism? I'm not finding it in the linked text on a quick scan. Thanks!
Replies from: lukeprog↑ comment by lukeprog · 2011-10-01T13:04:19.888Z · LW(p) · GW(p)
Ah. That was a copy-paste fail. The link is now fixed.
Replies from: ciphergoth↑ comment by Paul Crowley (ciphergoth) · 2011-10-01T14:02:21.364Z · LW(p) · GW(p)
Thanks! Think it might still be best to expand the initialism in the text, but now I know what you mean.
comment by Iabalka · 2011-10-02T08:25:05.332Z · LW(p) · GW(p)
What about improving rationality with neurofeedback? The theory is that if you can see some kind of representation of your own brain activity (EEG for example), you should be able to learn to modify it. It has been shown that people could learn to control pain by watching the activity of their pain centres (http://www.newscientist.com/article/mg18224451.400-controlling-pain-by-watching-your-brain.html). Neurofeedback is also used to treat ADHD, increase concentration, and "it has been shown that it can improve medical students' memory and make them feel calmer before exams."
Replies from: GilPanama↑ comment by GilPanama · 2011-10-06T07:22:53.155Z · LW(p) · GW(p)
I did quite a bit of EEG neurofeedback at the age of about 11 or 12. I may have learned to concentrate a little better, but I'm really not sure. The problem is that once I was off the machine, I stopped getting the feedback!
Consider the following interior monologue:
"Am I relaxing or focusing in the right way? I don't have the beeping to tell me, how do I know I am doing it right?"
In theory, EEG is a truly rational way to learn to relax, because one constantly gets information about how relaxed one is and can adjust one's behavior to maximize relaxation. In practice, I'm not sure if telling 12-year-old me that I was going to have access to electrical feedback from my own brain was the best way to relax me.
The EEG did convince me that physicalism was probably true, which distressed me because I had a lot of cached thoughts about how it is bad to be a soulless machine. My mother, who believed in souls at the time, reassured me that if I really was a machine that could feel and think, there'd be nothing wrong with that.
I wonder how my rationality would have developed if, at that point, she had instead decided to argue against the evidence?
comment by [deleted] · 2015-07-08T08:37:29.538Z · LW(p) · GW(p)
There are several key reasons that rationalists may underestimate the dangers of drugs. I can also think of one good reason, other than the obvious one (which is the application of rationality techniques including calling on helpful social influence)
hypothesised risk factors
There is a big gap between the kinds of inferences that can be made from the health literature and from popular social commentary. Rationalists may be biased toward basing their decision to use or continue to use drugs on medical evidence alone, without incorporating evidence from common sensibilities or mainstream social commentary.
Drugs may affect our emotions and thus our capacity to make decisions without that stimuli
Drug-positive information is often supplied in contexts that are largely unregulated and unscrutinised, which unscrupulous individuals may be motivated to take advantage of
Anecdotally, drug use elevates others' social dominance orientation, particularly towards non-drug users.
Anecdotally, rationalists are more open to novel experiences than others. They may wish to seek out similarly minded peoples, as I do, and come into the company of drug positive people.
hypothesised protective factors
- Rationalists, at least those who blog here, are more likely to recount experiences, including those which may be more difficult to confront otherwise. Engaging with past experiences in putting them into a coherent narrative is therapeutic.
comment by kilobug · 2011-10-01T15:47:31.692Z · LW(p) · GW(p)
Well, it's not surprising that drugs can help with cognition. But we have to be very careful about two things: the effects they have on other parts of the body, and the long-term effects, both on the body and on the brain itself.
The human body is a very complex and delicate machinery, and the human brain is the most delicate part of it... it's very easy to create long-term problems in it by trying to push it a bit too hard. Just look at professional athletes, and how badly damaged they are after a few years of taking drugs to enhance their performance.
That's why I tend to be very careful about not taking drugs, unless I already have a disorder that needs fixing, and unless advised to take them by a doctor I trust. Taking drugs to increase performance sounds a bit like overclocking CPUs. Sure they'll go faster... but at the risk of increasing bugs, and of a shorter lifespan due to increased heating.
Now if a drug is really effective, with no side effects and no long-term consequences, according to very strict peer-reviewed studies, then ok. But until then, my motto is "careful, your body is complex and delicate, we don't fully understand it, so unless you really have a problem, don't take drugs".
Replies from: wedrifid, AndHisHorse, Lumifer↑ comment by wedrifid · 2013-08-08T17:34:40.190Z · LW(p) · GW(p)
That's why I tend to be very careful about not taking drugs, unless I already have a disorder that needs fixing, and unless advised to take them by a doctor I trust. Taking drugs to increase performance sounds a bit like overclocking CPUs. Sure they'll go faster... but at the risk of increasing bugs, and of a shorter lifespan due to increased heating.
The analogy works well when considering stimulants. However, when considering drugs or supplements that are neuroprotective or that actively promote neurogenesis, the analogy becomes fallacious. Cerebrolysin, for example, is more analogous to opening up your computer and replacing the CPU and RAM with more powerful and more reliable components. Sure, it is invasive and requires caution and knowledge to do, but the life expectancy of the core components is increased, not decreased.
↑ comment by AndHisHorse · 2013-08-08T16:13:21.779Z · LW(p) · GW(p)
I am of the impression that the reason for the health problems of professional athletes is the degree to which they push their bodies (which, perhaps, might not be possible/feasible without drugs and supplements) rather than a direct effect from the drugs themselves.
Further, while I share your caution regarding the risks of causing damage to the body or brain through some unknown mechanism or weakness, there is a point at which I believe people would be best advised to take supplemental drugs. Further, whether or not you have a problem depends on your reference point: as a young man of moderate resources in a developed country, I am not of below-average health for the human race, but I am also not optimizing my physical and mental faculties. (For the time being, anyway).
And there may be some drugs which might reasonably be expected to provide a health benefit which outweighs the probability of "increasing bugs", which it would be rational to take given all but the most extremely loss-averse utility functions. I would say that a superior metaphor would be upgrading your CPU - the process may have unintended side effects, but it may not, and there is fair evidence that it will have some positive outcomes. The difficulty lies in weighing these expectations, which I think is inhibited by setting a hard limit.
Replies from: wedrifid↑ comment by wedrifid · 2013-08-08T17:21:48.902Z · LW(p) · GW(p)
I am of the impression that the reason for the health problems of professional athletes is the degree to which they push their bodies (which, perhaps, might not be possible/feasible without drugs and supplements) rather than a direct effect from the drugs themselves.
Your impression approximately matches my research.
↑ comment by Lumifer · 2013-08-08T18:05:19.832Z · LW(p) · GW(p)
That's why I tend to be very careful about not taking drugs
"Drug" is a fuzzy concept.
Specifically, I don't see a well-defined boundary between "drugs" and "food" (and/or drink). Obvious substances that straddle that boundary are psychoactives -- coffee, alcohol, qat, etc. But if we think about human biochemistry, I can affect my metabolism -- pretty radically, too -- purely by varying my diet.
For example, I can switch my body into ketosis by not eating carbs. No drugs involved, and yet I am seriously messing with the "very complex and delicate machinery" of my body.
Add exercise. "Runner's high" is a well-known phenomenon. Is running a drug?
Add lifestyle, etc. Stress, sleep patterns, etc. all strongly affect your body.
So what's so special about pills and capsules that you have to be so very careful about taking them, while the remaining multitude of ways to affect your body and mind gets a free pass?
Replies from: AndHisHorse↑ comment by AndHisHorse · 2013-08-08T18:37:45.227Z · LW(p) · GW(p)
Because pills and capsules are things which have substantial effects on the human body in ways which bypass common natural pathways. Variations in diet (to an extent) and lifestyle changes (of certain kinds) were common in the ancestral environment, which means that they have been reliably and exhaustively tested on the human race over the course of our entire history as a species.
There are some artificial things which have been well-tested. Alcohol, for example, has effects which are well-known; enough people make use of it that there have been substantial incentives to research it exhaustively, at least to the point where we feel fairly comfortable imbibing it in moderation without fear of catastrophic side effects. The same goes for a lot of other substances, some of which are natural (I would assume that natural psychoactive substances were more likely to be discovered before societies grew very picky about what they put in their bodies, i.e. before the introduction of many regulatory bodies).
The issue with the remaining substances is that we don't have enough knowledge about (some of) them to justify what could be existential risk. There have been enough drug recalls over the years that it has become apparent that a successful clinical trial, possibly funded by the same company which has an incentive to bring the drug to market quickly, is not sufficient evidence to dismiss the possibility. As a result, it is not irrational to take extra care - acquiring extra information and erring on the side of the status quo (which happens to include oneself, currently still alive and well).
Replies from: Lumifer↑ comment by Lumifer · 2013-08-08T18:51:07.966Z · LW(p) · GW(p)
Variations in diet (to an extent) and lifestyle changes (of certain kinds) were common in the ancestral environment, which means that they have been reliably and exhaustively tested on the human race over the course of our entire history as a species.
That is not true with respect to a large part of contemporary Western diet. Things like refined sugar, hydrogenated oils, a wide variety of food preservatives, flavourings and colorings are new and appeared an instant ago on the evolutionary time scale.
To give a basic example, take a look at the ingredients of Coke: high-fructose corn syrup, phosphoric acid, caramel color, caffeine -- I don't think you can make an argument that humans evolved to drink this.
That's not true with respect to lifestyle, too. Sitting pretty motionless on a chair for 10+ hours per day is not something evolution prepared our bodies for.
The issue with the remaining substances is that we don't have enough knowledge about (some of) them to justify what could be an existential risk.
My point is precisely that people in the Western world routinely consume large amounts of these "remaining substances" without a second thought. Why is eating hydrogenated soybean oil, for example, not risky?
By the way, do you consider over-the-counter supplements drugs? Do your arguments apply to them?
Replies from: AndHisHorse, Eugine_Nier↑ comment by AndHisHorse · 2013-08-08T19:12:42.280Z · LW(p) · GW(p)
That is why I included these qualifiers. Things such as alcohol and relatively sedentary lifestyles are either common enough to be well-studied, or pervasive enough to be unavoidable.
There are some risks that come with our environment that we do not evaluate in the same way as we evaluate the choice to start a new medication, because the costs of disentangling ourselves from these incredibly common things are higher (in what I estimate to be a typical human utility function with normal time-discounting; your results may vary) than the opportunity costs of declining to try a new supplement.
Further, there is a sort of natural experimentation occurring with these substances which a large number of people consume; if there are substantial negative side effects to them, odds are good that they will become obvious in others before they become a problem for some given person. We reassure ourselves that, since this has not happened, we have some fairly decent evidence that these popular substances are not terrible. New, rare, and unpopular drugs do not have this "natural experiment" advantage.
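The "natural experiment" point above can be made quantitative with a back-of-the-envelope sketch (my own illustration, not from the thread): if a substance causes an obvious side effect in each user independently with probability p, the chance that nobody in a population of n users shows it shrinks exponentially with n.

```python
# Sketch of the "natural experiment" intuition: a rare side effect is
# near-certain to surface somewhere in a population of millions of users,
# even though a modest clinical trial would very likely miss it entirely.

def prob_side_effect_observed(p: float, n: int) -> float:
    """Probability that at least one of n independent users shows an
    effect that occurs in each user with probability p."""
    return 1 - (1 - p) ** n

# A 1-in-100,000 effect: likely invisible in a 2,000-person trial,
# essentially guaranteed to appear among 50 million consumers.
print(prob_side_effect_observed(1e-5, 2_000))       # roughly 0.02
print(prob_side_effect_observed(1e-5, 50_000_000))  # roughly 1.0
```

This is of course an idealization (it assumes the effect is obvious when it occurs and independent across users), but it shows why popular substances carry a kind of evidence that new, rare drugs lack.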
Replies from: Lumifer↑ comment by Lumifer · 2013-08-08T19:40:34.961Z · LW(p) · GW(p)
You're basically making an argument against anything "new, rare, and unpopular", but that argument applies equally well to drugs, food, and lifestyle.
Remember the original issue? "Drugs are risky", but what is a drug? If I decide that ketosis is great and convert my diet to 80% saturated fat, is that less risky than starting to take a baby aspirin per day just because the first is "food" and the second one is a "drug"?
If I decide to take doses of naringin, that's dangerous because naringin is a drug, right? But if I eat a lot of grapefruits to get an equivalent dose, that's OK because grapefruits are food?
Replies from: AndHisHorse↑ comment by AndHisHorse · 2013-08-08T19:52:22.895Z · LW(p) · GW(p)
I wouldn't argue against taking an aspirin a day any more than I would argue against converting your diet to 80% saturated fats; both aspirin and saturated fats are commonly ingested substances.
If you decide to take a supplement which is found in natural foods, I would not assign that any more risk than eating the equivalent amount of food. Either way, the issue would seem to be in the dosage, provided that the food has been proven safe. If it takes 100 grapefruits to equal a single dose of naringin, however, I would be worried - because you are consuming it in excess of what would ordinarily be expected.
The reason I am less worried about things such as dietary changes is that individuals experience dietary variation fairly frequently, and even from personal experience we know that we have mechanisms which alert us when our diet is lacking (sometimes). However, I do not believe that they are without risk, or that one should simply try out an extreme dietary change without prior research.
It is substances which have been relatively untested, but are in fact designed to subvert our body's mechanisms, which I have reason to worry about. Not to disavow, but to worry about, and to examine more intensely than substances which are probably, as a class, less harmful.
Replies from: Jiro↑ comment by Jiro · 2013-08-08T22:37:10.670Z · LW(p) · GW(p)
I wouldn't argue against taking an aspirin a day any more than I would argue against converting your diet to 80% saturated fats; both aspirin and saturated fats are commonly ingested substances.
I think you should worry about a diet consisting of 80% fat; however, you should worry about it on different grounds than you worry about untested substances.
Replies from: Jayson_Virissimo, AndHisHorse↑ comment by Jayson_Virissimo · 2013-08-08T23:24:07.236Z · LW(p) · GW(p)
I think you should worry about a diet consisting of 80% fat...
Why?
↑ comment by AndHisHorse · 2013-08-08T22:52:46.363Z · LW(p) · GW(p)
Fair. I neglected to include 80% fat as having a standing similar to 100 grapefruits' worth of naringin.
↑ comment by Eugine_Nier · 2013-08-10T02:40:36.537Z · LW(p) · GW(p)
Things like refined sugar, hydrogenated oils, a wide variety of food preservatives, flavourings and colorings are new and appeared an instant ago on the evolutionary time scale.
And the same logic applies to them as well.
My point is precisely that people in the Western world routinely consume large amounts of these "remaining substances" without a second thought.
There's this thing called the organic food movement, you may have heard of it.
Why eating hydrogenated soybean oil, for example, is not risky?
It is.
comment by DuncanS · 2011-10-01T21:42:31.535Z · LW(p) · GW(p)
On the other hand, you should consider what evolution can do. Evolution is not the world's best algorithm for inventing things. However, it is an excellent optimising algorithm. Balancing multiple considerations to decide the optimum amount of substance A in your body is the sort of problem that algorithm should do really well.
Essentially the only exception to this rule is when your cells are reacting to DNA/RNA that doesn't belong to you. If cold virus RNA is making your nose run, stop it by all means. But you should trust your own body on most other matters - adding extra chemicals is likely to turn out worse....
Note what's being optimised here - not intelligence, but biological fitness - how likely you are to reproduce successfully. You might improve intelligence somewhat, but if there isn't a downside somewhere, then Darwin was wrong.
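The claim above - that a simple optimizer reliably finds the balance point for a single quantity, and that the optimum is for fitness rather than intelligence - can be illustrated with a toy hill-climbing sketch (my own illustration; the fitness function and numbers are arbitrary assumptions, not a biological model):

```python
import random

# Toy evolution-as-optimizer: hill-climb the level of "substance A",
# where fitness trades a linear cognitive benefit against a quadratic
# metabolic/side-effect cost. The climber settles near the balance
# point - an optimum for *fitness*, not for intelligence.

def fitness(level: float) -> float:
    benefit = level          # cognitive benefit grows with the level
    cost = 0.5 * level ** 2  # cost (energy, side effects) grows faster
    return benefit - cost    # analytically maximized at level = 1.0

def hill_climb(steps: int = 10_000, step_size: float = 0.01) -> float:
    level = 0.0
    rng = random.Random(0)   # fixed seed for reproducibility
    for _ in range(steps):
        candidate = level + rng.uniform(-step_size, step_size)
        if fitness(candidate) > fitness(level):  # keep helpful mutations
            level = candidate
    return level

print(round(hill_climb(), 2))  # settles very close to 1.0
```

Raising the level past the optimum would still increase the "benefit" term - the analogue of a drug improving cognition - but the optimizer rejects it because total fitness drops, which is DuncanS's point about expecting a downside.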
Replies from: arundelo, wedrifid, vi21maobk9vp, dlthomas, handoflixue↑ comment by arundelo · 2011-10-01T22:58:34.665Z · LW(p) · GW(p)
Or as Eliezer puts it:
Algernon's Law: Any simple major enhancement to human intelligence is a net evolutionary disadvantage.
But here's gwern writing about loopholes in Algernon's Law.
↑ comment by wedrifid · 2011-10-02T08:41:01.435Z · LW(p) · GW(p)
On the other hand, you should consider what evolution can do.
It frustrates me how often this argument against using mind-enhancing substances is used and, more importantly, the weight it is given. Not only is evolution optimizing for different criteria (which DuncanS mentions), it is also optimising for an entirely different environment. Further, our expectation that random chemicals will be bad for us is to a massive extent screened off when we go ahead and test them and find that they make things better!
Yet another situation in which evolution should not be expected to give superior results to what we can come up with with science is when we know what we are going to be doing at a specific time. What is best as a general baseline is not going to be the best state when studying for a test. Which is in turn going to be less good when doing unpleasant and potentially traumatic things that you don't want to remember.
Replies from: DuncanS↑ comment by DuncanS · 2011-10-03T20:27:23.106Z · LW(p) · GW(p)
Consuming chemicals that have been tested is certainly an improvement on consuming chemicals that haven't been.
Consuming chemicals to make your brain work better seems to me to be a rather similar activity to overclocking a computer. Let's add more voltage. Let's pour liquid nitrogen into it. Perhaps it will go faster! Perhaps it will, but will it still be working in five years' time?
First of all, note just how crude these efforts are compared to the technological research undertaken by the companies that actually make microchips. The same is true of the brain - it can make dopamine and deliver it at synapses, exact points of contact throughout the brain. Yet you see people discussing just adding more dopamine everywhere, and thinking that this is in some sense improving on nature in a clever way.
I have to mention a point against myself - which is that I do take general anaesthetics, which, while not intelligence enhancers, are definitely intelligence modifiers for specific circumstances. However, turning brain function off is arguably simpler than trying to make it better.
It is possible, definitely, to improve human intelligence by combining it with a computer. So I'm not claiming that it's impossible to improve on the natural intelligence we all have - it obviously is possible.
What I'm pointing out is that all of these drug ideas are bound to be something that evolution has at some point tried out, and thrown away. And they are really unsophisticated ideas compared with those the brain has actually adopted.
Even the situation dependent argument isn't as strong as you might think - for example your brain has a lot of adaptations to cover the "unpleasant and potentially traumatic things" situation, for example - and these adaptations generally disagree with your view that you shouldn't remember them. It's probably the case that intelligence tests are a novel environment, however....
Replies from: lessdazed, Randolf↑ comment by lessdazed · 2011-10-04T01:28:02.386Z · LW(p) · GW(p)
What I'm pointing out is that all of these drug ideas are bound to be something that evolution has at some point tried out, and thrown away. And they are really unsophisticated ideas compared with those the brain has actually adopted.
↑ comment by Randolf · 2011-10-12T23:27:50.218Z · LW(p) · GW(p)
What I'm pointing out is that all of these drug ideas are bound to be something that evolution has at some point tried out, and thrown away. And they are really unsophisticated ideas compared with those the brain has actually adopted.
Well, there could be many reasons why evolution has "thrown them out". Maybe they are harmful in the long term, maybe their use consumes precious energy, or maybe they just aren't "good enough" for evolution to have kept them. That is, maybe they just don't give any significant evolutionary advantage.
Evolution doesn't create perfect beings, it creates beings which are good enough to survive.
↑ comment by vi21maobk9vp · 2011-10-02T10:00:01.871Z · LW(p) · GW(p)
There can be harmful side-effects, and that topic is not covered by the article; on the other hand, the pure evolutionary argument can be doubted because of the changed environment.
If I stimulate my brain, it is natural to assume my brain requires more energy now. So I probably need more glucose. In evolutionary relevant context, that would make me more likely to starve - after all, I would need more highly valued energy and thinking clearly wouldn't make a killed bull magically appear before me.
This is still true for most of the Earth's population. It is not true for many LessWrong readers, though. There are some primarily-mental jobs now (in some places of the world - the places where LessWrong readers come from). Keeping more things in your mind means being a better programmer, teacher, or scientific researcher. Being better at your profession often helps you to evade starvation. And getting the needed amount of calories - if you already know where to get all these vitamins and microelements - is trivial in these parts of the world.
So, this modification was not a benefit earlier, and it was quite costly; both factors are significantly reduced in some parts of the modern world.
Of course, increased mental capability can lead to some personality traits that make it harder to reproduce; but that is again a question of side-effects and not a self-evident thing. If you consider it harmful, you can try to spend effort on fighting these side-effects - some people report significant success.
↑ comment by dlthomas · 2011-10-05T21:22:13.741Z · LW(p) · GW(p)
Maybe inclusive genetic fitness is not my utility function.
Replies from: soreff↑ comment by soreff · 2011-10-05T21:33:04.041Z · LW(p) · GW(p)
Same here. As a childfree human, maximizing the number of copies of my DNA is right up there with paperclip maximization on my list of priorities. :-)
One other category of exceptions: We aren't in the EEA anymore. In particular, we have much looser constraints on available calories than in the EEA, and that changes the optimal settings even for reproductive success.
↑ comment by handoflixue · 2011-10-05T20:56:05.967Z · LW(p) · GW(p)
you should trust your own body
There's this really pretty large class of issues called "genetic disorders", and a wide variety of other ways the body fails just fine without encountering foreign DNA/RNA... I'm assuming insulin for diabetics also has unexpected drawbacks and isn't really in our best interests?
Or, put succinctly: "Scientists are so ignorant! If it was possible to cure cancer, why didn't we just evolve to not have cancer in the first place?"
Replies from: DuncanS↑ comment by DuncanS · 2011-10-05T21:33:42.262Z · LW(p) · GW(p)
I have heard of genetic disorders, and know that they occur, and why they are found in the gene pool. And in such cases, it's entirely appropriate to take medication for them. I take your point that perhaps I could have mentioned this class of DNA earlier, and that it is appropriate to take medication for that class of diseases, as it doesn't improve your quality of life to leave it as it is.
I didn't think we were discussing genetic diseases particularly. I am convinced of your argument that if you have a genetic disorder that affects your intelligence, you should take medication for it. I don't see why this is relevant to the more general case of people who don't have a particular genetic issue. Evolution is a good balancing algorithm - but it works by trimming the outliers. If you are unlucky enough to be one of the outliers, there's likely something that needs correction. But generally, I don't see the relevance.
As for your second point - well, we did. Cancer is very rare in the natural environment. It's only in our much safer modern environment that we live long enough for it to become a problem again.
Replies from: handoflixue↑ comment by handoflixue · 2011-10-05T23:53:36.800Z · LW(p) · GW(p)
You may have missed my points...
Given the genetic variance in IQ, it's obvious that most people don't have optimal genetics for intelligence. Whether this is a "disorder" is an interesting semantic question, but the point remains that we know that the general class of "human brains" has a maximum that's higher than where most people are at.
Equally, our brains evolved for a much different environment with much different trade-offs. Just like cancer wasn't a threat in our ancestral environment, the ability to solve a second-order differential equation wasn't a benefit.
In short, medical science suggests that, actually, there's plenty of room to improve humans, both because we're extremely inconsistently built, and because we're not built to handle our current environment.
Replies from: DuncanS↑ comment by DuncanS · 2011-10-06T00:08:57.257Z · LW(p) · GW(p)
Actually I agree with all of this - there's a tremendous difference between average intelligence and the top end of the bell curve, and we have no reason to think it can't go higher. We are the first species on this planet to attain general-purpose intelligence, and there's no good reason at all to think either that more isn't possible, or that the process of human evolution in this respect has stopped - quite the reverse, I suspect. We have every reason to assume that at the moment human intelligence is evolving like mad - it's being very strongly selected for in a very large gene pool.
But my point is that this all has nothing to do with the proposed methods of improving the brain. If we really knew how it worked, and were able to model the consequences of our actions better, then it would be less of a guessing game whether there was a longer term price to the short term gain.