Comments
I find the idea that people don't like being intoxicated suspicious. Experiencing euphoria from intoxication has a lot to do with brain chemistry, and it would be very odd if some humans received this effect and others did not.
Another n=1: I like the way intoxication feels when I'm intoxicated, but over the last couple of months I've gone from wanting to enter that state often to avoiding all alcohol on purpose. What changed was realizing on an emotional level that I have tons of interesting (or necessary) things to do, and alcohol limits that by taking away the evening (to drink) and the next day (I feel cognitively worse till the next afternoon, even if I didn't have a hangover). At some point the prospect of drinking became anxiety-inducing for me.
I drink to make parties with friends tolerable because after an hour there is usually an infinite amount of things I'd rather be doing...
You win rationality(1) points for being honest with yourself :).
Even in Europe, places where you don't have to ride in traffic / in the door zone are incredibly rare. Bike paths are cool, but as currently implemented they mostly serve to annoy drivers and pedestrians alike, and there is still a default assumption that where there is no bike path, you'll be riding with traffic.
tens of thousands of lives per year
Try hundreds of thousands per year from just accidents, before even counting health benefits of reduced emissions and smog saving more lives.
I suffer from a severe case of akrasia that makes me work at 10% of my capacity most of the time; here's something I discovered that made me believe the problem is actually in me: my closest friend. I've known her for many years, and I have never once seen her working at less than 110% of her capacity. She worked in groceries and an online bookshop, sold LED bulbs and furniture, managed people, did customer relations and even social media marketing. She wants to be a writer; she hated almost every one of those jobs and felt they were hindering her development, and yet no matter how tired she was, how annoyed or abused by her bosses, she could always find the strength to focus and do her job with scary efficiency. And given that none of the people I've ever worked with had problems with focus or productivity in the ballpark of my own, I can't conclude otherwise than that it's me who is just wired wrongly. If I had even 5% of my friend's professionalism, I could do so much more than I'm able to do now.
-- Mother Gaia, I come on behalf of all humans to apologize for destroying nature (...). We never meant to kill nature.
-- You're not killing nature, you're killing yourself. That's what I mean by self-centered. You think that just because you can't live, then nothing can. You're fucking yourself over big time, and won't be missed.
From a surprisingly insightful comic commenting on the whole notion of "saving the planet".
Thanks for your long and insightful comment. I think it should be edited and put up as a top-level article. It's something that I'd personally love to link my friends to every time they start strawmanning Freud.
A great article, Eneasz.
Reminds me of something that is sitting in my quotes file, apparently coming from a Navy SEAL:
"Under pressure you don't rise to the occasion, you sink to the level of your training."
It's a rephrasing of Kant's categorical imperative.
Well, historically, in the case of basic subsistence activities, winning meant surviving and losing meant dying a horrible death. There are likely some strong adaptations in play here.
It's really, really hard -- I would say impossible -- to prove that variations or changes have not been introduced since the time of a hypothetical original text, copied from handwriting scribe to handwriting scribe.
It might be hard or even borderline impossible, but I do respect people who honestly try. I know, for instance, that Jehovah's Witnesses did a lot of work in cross-correlating as many different copies of the scriptures as they could get their hands on to weed out mistranslations, copy errors, etc. when developing their own translation. So for whatever it's worth, it's nice that some people at least try.
What to do when the "bum comparison principle" argument stops working because the internal, emotional pain won't leave you alone no matter where you go and what you do, and you see no way to stop it, and you gradually, over the years, build an immunity to this argument?
Was it ever socially acceptable?
I'm reminded of movies where people in impossibly tough situations stick to impossibly idealistic principles. The producers of the movie want to hoodwink you into thinking they would stand by their luxurious morality even when the going gets tough.
Strangely, most of the recent movies and TV series I saw pretty much invert this. Protagonists tend to make arguably insanely bad moral choices (like choosing a course of action that will preserve the hero's relative at the cost of killing thousands of people). Sometimes this gets unbearable to watch.
I think it's entirely wrong for Americans to sympathize with Boston victims while disregarding and in many cases outright denying the existence of victims of drone strikes. It's hypocrisy at its finest and especially rich coming from self-proclaimed Christians.
That is exactly the problem with nationalism.
I suspect you're probably saying that it's understandable for Americans only to feel the reality of this kind of cruelty when it affects "their own", and my response is that it may be understandable, but then so are the mechanisms of cancer.
Akin's Laws of Spacecraft Design are full of amazing quotes. My personal favourite:
6) (Mar's Law) Everything is linear if plotted log-log with a fat magic marker.
(See also an interesting note from HN's btilly on this law)
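Mar's Law is easy to check numerically. A small sketch of my own (not from the linked note): any power law y = a·x^k becomes an exactly straight line with slope k on log-log axes, which is why a fat magic marker makes nearly everything there look linear:

```python
import math

# Sample a power law y = 3 * x**2.5 at a few points.
xs = [1, 10, 100, 1000]
ys = [3 * x**2.5 for x in xs]

# Move to log-log coordinates.
log_xs = [math.log10(x) for x in xs]
log_ys = [math.log10(y) for y in ys]

# The slope between every pair of consecutive log-log points is the
# same constant, k = 2.5: a perfect straight line.
slopes = [(log_ys[i + 1] - log_ys[i]) / (log_xs[i + 1] - log_xs[i])
          for i in range(len(xs) - 1)]
print(slopes)
```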
Similar thought:
16) The previous people who did a similar analysis did not have a direct pipeline to the wisdom of the ages. There is therefore no reason to believe their analysis over yours. There is especially no reason to present their analysis as yours.
Relevant link from just yesterday: http://hackaday.com/2013/05/28/shocking-your-brain-and-making-yourself-smarter/ :).
AFAIK, some universities allow you to get a PhD in computer science by submitting a PhD thesis for review and paying some amount of money (~$1200 at my university). This way, one can follow your advice and still get a PhD.
That's the general algorithm for reading STL error messages. I still can't see why people look at you as if you were a wizard, when all you need to do is quickly filter out the irrelevant 90% of the message. A simple pattern-matching exercise.
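To make the "filter out the irrelevant 90%" step concrete, here is a toy sketch of my own (not a real tool): most of the bulk in an STL error is the defaulted `std::allocator` / `std::char_traits` template arguments, which can be stripped mechanically:

```python
import re

def simplify_stl_error(msg: str) -> str:
    """Strip the defaulted allocator/char_traits arguments that pad
    STL type names in compiler errors, and restore std::string."""
    noise = re.compile(r",\s*std::(allocator|char_traits)<[^<>]*>")
    prev = None
    while prev != msg:  # nested defaults unwrap one layer per pass
        prev = msg
        msg = noise.sub("", msg)
        msg = msg.replace("std::basic_string<char>", "std::string")
    return msg

# Roughly what a compiler prints for a plain std::vector<std::string>:
err = ("std::vector<std::basic_string<char,std::char_traits<char>,"
       "std::allocator<char>>,std::allocator<std::basic_string<char,"
       "std::char_traits<char>,std::allocator<char>>>>")
print(simplify_stl_error(err))  # std::vector<std::string>
```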
I delay Googling to the last possible moment on purpose. It's by figuring out stuff by yourself that you really learn :).
This is often said to be done in the name of simplicity (the 'user' is treated as an inept, lazy moron), but I think an additional, more surreptitious reason, is to keep profit margins high.
There's also one much more important reason. To quote A. Whitehead,
Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
Humans (right now) just don't have enough cognitive power to understand every technology in detail. If not for the black boxes, one couldn't get anything done today.
The real issue is, whether we're willing to peek inside the box when it misbehaves.
And so recommendations for more self-control regulation tend to be based on claims that we are biased to underestimate our problem.
There is something to those claims given that pretty much every addiction therapy (be it alcohol, food, porn or something else) starts from admitting to oneself that one has underestimated the problem.
That's something that I think laypeople never realize about computer science - it's all really simple things, but combined together at such a scale and pace that in a few decades we've done the equivalent of building a cat from scratch out of DNA. Big complex things really can be built out of extremely simple parts, and we're doing it all the time, but for a lot of people our technology is indistinguishable from magic.
-- wtallis
Well, but it can also be interpreted as a recursive definition expanding to:
Luck is opportunity plus preparation plus opportunity plus preparation plus opportunity plus preparation plus .... ;).
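The joke reading can even be written down. A toy sketch of my own: treat the definition as luck = opportunity + preparation + luck, and the expansion never closes, only ever bottoming out wherever you cut it off:

```python
def expand_luck(depth: int) -> str:
    """Expand the non-well-founded definition to a given depth."""
    if depth == 0:
        return "luck"  # the unexpanded remainder
    return "opportunity plus preparation plus " + expand_luck(depth - 1)

print(expand_luck(3))
```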
If the way of thinking is so new, then why should we expect to find stories about it?
To quote the guy this story was about, "there is nothing new under the sun". At least nothing directly related to our wetware. So we should expect that every now and then people stumbled upon a "good way of thinking", and when they did, the results were good. They just might not have managed to identify what exactly made the method good, or to replicate it.
Also, as MaoShan said, this is kind of Proto-Bayes 101 thinking. What we now have is this, but systematically improved over many iterations.
(that is, that it was known N years ago but didn't take over the world)?
"Taking over the world" is a complex mix of effectiveness, popularity, luck and cultural factors. You can see this a lot in the domain of programming languages. With ways of thinking it is even more difficult, because - as opposed to programming languages - most people don't learn them explicitly and don't evaluate them based on results/"features".
I like doing math that involves measuring the lengths of numbers written out on the page—which is really just a way of loosely estimating log_10 x. It works, but it feels so wrong.
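It is a pretty tight estimate, in fact: for a positive integer, the digit count equals floor(log10 n) + 1 exactly, so "length on the page" is log10 rounded, not just loosely approximated. A quick check:

```python
import math

def digit_length(n: int) -> int:
    """Length of n written out in decimal - the 'ruler' estimate of log10."""
    return len(str(n))

# digit_length(n) == floor(log10(n)) + 1 for positive integers.
for n in (7, 42, 1234, 10**6):
    print(n, digit_length(n), math.floor(math.log10(n)) + 1)
```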
It has been said that the past is a foreign country. Well, it is certainly inhabited by foreigners, people whose mindset was shaped by circumstances we shy from remembering. The mother of three children who gave birth eight times. The father of four children, the last of whom cost him his wife. Our minds are largely free of such horrors, and not inured to that kind of suffering. That is the progress of technology. That is what is improving the human race.
It is a long, long ladder, and sometimes we slip, but we've never actually fallen. That is our progress.
Sometimes I still marvel about how in most time-travel stories nobody thinks of this.
The alternate way of computing this is to not actually discard the future, but to split it off into a separate timeline.
Or maybe also another one, somewhat related to the main post - let the universe compute, in its own meta-time, a fixed point [0] of reality (that is, the whole of time between the start and the destination of the time travel gets recomputed into a form that allows it to be internally consistent) and continue from there. You could imagine the universe-computer simulating the same period of time again and again until a fixed point is reached, just like the iterative algorithms used to find fixed points of functions.
[0] - http://en.wikipedia.org/wiki/Fixed_point_(mathematics)
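For concreteness, the iterative algorithm alluded to above looks like this (a minimal sketch of standard fixed-point iteration, not anything specific to the time-travel idea): keep applying f until the output stops changing, at which point x satisfies x == f(x).

```python
import math

def fixed_point(f, x0, tol=1e-12, max_iter=1000):
    """Iterate x -> f(x) from x0 until successive values agree within tol."""
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:
            return nxt
        x = nxt
    raise RuntimeError("did not converge")

# cos has a unique real fixed point near 0.739 (the Dottie number).
print(fixed_point(math.cos, 1.0))
```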
This whole post strongly reminds me of "A New Kind of Science" [0], where Stephen Wolfram tries to explain the workings of the universe using simple computational structures like Cellular Automata, network systems, etc. I know that Wolfram is not highly regarded for many different reasons (mostly related to personal traits), but I got a very similar feeling when reading both NKS and this post - that there is something in the idea, that the fabric of the universe might actually be found to be best described by a simple computational model.
See also: http://lesswrong.com/lw/12s/the_strangest_thing_an_ai_could_tell_you/.
(exercising necromancy again to raise the thread from the dead)
We had this situation during our CS studies, in the numerical methods class and in the metrology class. In both cases, most of the students fudged the data in their reports and/or just plainly copied stuff from what the previous year did.
I've never seen or heard of such a school, at least not in my country. Maybe vocational schools grade like that, but in the high schools I know, there's no fitting together, sanding, or measuring anything. It's just memorizing theory and solving exercises.