Intelligence can have various levels, and stupid people can do intelligent things just as intelligent people can do stupid things. Einstein can be more intelligent than Stalin, but Einstein can still be stupid.
No, I am not engaging in the illusion of transparency; don't be absurd. My meaning of intelligence is not confused, but there is an inevitable poverty in communicating any idea, because people need things spelling out in the most simplistic of terms; they cannot comprehend anything vaguely complex or unusual. The real kicker is that when you do spell things out, people look at you with a gormless expression, ask for more detail, or disagree with the most irrefutable points. It is painful communicating with people, but I don't expect you to understand. I shall wait until advanced AIs have been created; then there will be someone who understands.
Tabooing the word intelligent... hhmmmm... how about "everything ever written by Singularity Utopia"?
Yes, I did mention the fox; foxes are not particularly domesticated. Anyway, this "open" discussion is not very open now due to my negative karma, and it is too difficult to communicate, which I suppose is the point of the karma system: to silence ideas you don't want to hear. Thus I will conform to what you want. I shall leave you to your speculations regarding AI.
Dear asr, the issue was the worth of emotion in relation to thinking. Here is a better quote:
"Here’s the strange part: although these predictions concerned a vast range of events, the results were consistent across every trial: people who were more likely to trust their feelings were also more likely to accurately predict the outcome. Pham’s catchy name for this phenomenon is the emotional oracle effect."
Mitchell wrote: "These are all emotional statements that do not stand up to reason."
Perhaps reason is not the best tool for being accurate?
PS. LessWrong is too slow: "You are trying to submit too fast. try again in 1 minute." ...and: "You are trying to submit too fast. try again in 7 minutes." LOL "You are trying to submit too fast. try again in 27 seconds."
Humans are docile, civilized, domesticated. We can live with cats and dogs. I recently read a news report about a man with a pet fox, a wild animal which was hand-reared by humans and thus became civilized and docile.
AIs will be civilized too, although I am sure they will shake their heads in despair regarding some of the ideas expressed on LessWrong.
Different species can coexist.
Incidentally, I wish the technology on Less Wrong would accelerate more quickly: "You are trying to submit too fast. try again in 6 minutes." and... "You are trying to submit too fast. try again in 8 minutes."
Dear JoshuaZ, regarding this:
"Consider the uploaded individual that decides to turn the entire planet into computronium or worse, turn the solar system into a Matrioshka brain. People opt out of that how?"
I consider such a premise so unlikely it is impossible. It is a very silly premise, for three reasons:
1. Destroying the entire planet when there is a whole universe full of matter is insane. If insane people exist in the post-intelligence-explosion upload-world, they will be dealt with, thus there is no danger; but insanity will actually be impossible post-intelligence-explosion, because insanity is a consequence of stupidity, and insanity will be extinct in the future.
2. Earth-destroying actions are stupid: see the above explanation regarding insanity, which also explains how stupidity will be obsolete.
3. People opt out by stating they want to opt out. I'm sure an email will suffice.
"It isn't obvious to me that all wars stem from resource scarcity."
Sorry that it isn't obvious how scarcity causes war. I don't have time to explain, so I will leave you with some consensual validation from Ray Kurzweil, who seems to think the war-scarcity interrelationship is obvious:
"I've actually grown up with a history of scarcity — and wars and conflict come from scarcity — but information is quite the opposite of that." ~ Ray Kurzweil http://www.hollywoodreporter.com/risky-business/sxsw-2012-damon-lindelof-ray-kurzweil-297218
The only evidence I have is my own perception of the world, based upon my life knowledge, my extensive awareness of living. I am not trying to prove anything; I'm merely throwing my thoughts out there. You can either conclude my thoughts make sense or not. I think it is unintelligent to join the army, but is my opinion correct? Personally I think it is stupid to die. People may agree my survival-based definition of intelligence is correct, or they may think death can be intelligent, such as the deaths of soldiers.
What type of evidence could prove "well-educated" army officers are actually dim-witted fools? Perhaps, via the interconnectedness of causation, it could be demonstrated how military action causes immense suffering for many innocent people, thereby harming everyone, because the world is a more hostile place than a hypothetical world where all potential conflict was resolved intelligently via peaceful methods. The military budget detracts from the science budget, thus perhaps scientific progress is delayed. Although I do recognise the military invests in sci-tech development, I think the investment would be greater if our world was not based on conflict. In a world where people don't fight there would be no need for secrecy, thus greater collaboration on scientific endeavours, thus progress could be quicker; thus anyone supporting the army could be delaying progress in a small way, thus officers are stupid, because it is stupid to delay progress.
The intelligent thing is for me to draw my input into this debate to a close because it is becoming exceptionally painful for me.
It seems that you are using "intelligent" to mean something like "would make the same decisions SingularityUtopia would make in that context".
No, "intelligence" is an issue of survival, it is intelligent to survive. Survival is a key aspect of intelligence. I do want to survive but the intelligent course of action of not merely what I would do. The sensibleness, the intelligence of survival, is something beyond myself, it is applicable to other beings, but people do disagree regarding the definition of intelligence. Some people think it is intelligent to die.
"Almost no one, regardless of intelligence, opts for cryonics. Moreover, cryonics was first proposed in 1962 by Robert Ettinger, 9 years after Stalin was dead. It is a bit difficult to opt for cryonics when it doesn't exist yet."
An intelligent person would realise that freezing a body could preserve life, even if nobody had ever considered the possibility.
Quickly browsing the net I found this:
"In 1940, pioneer biologist Basil Luyet published a work titled "Life and Death at Low Temperatures""
http://www.cryocare.org/index.cgi?subdir=&url=history.txt
1940 was before Stalin's death, but truly intelligent people would not need other thinkers to inspire their thinking. The decay-limiting effect of freezing has long been known. Furthermore, Amazon seems to state Luyet's work "Life and Death at Low Temperatures" was published pre-1923: http://www.amazon.com/Life-death-at-low-temperatures/dp/1178934128
According to Wikipedia many works of fiction dealt with the cryonics issue well before Stalin's death:
Lydia Maria Child's short story "Hilda Silfverling, A Fantasy" (1886), Jack London's first published work "A Thousand Deaths" (1899), V. Mayakovsky's "Klop" (1928), H.P. Lovecraft's "Cool Air" (1928), and Edgar Rice Burroughs' "The Resurrection of Jimber-Jaw" (1937). Many of the subjects in these stories are unwilling ones, although a 1931 short story by Neil R. Jones called "The Jameson Satellite"...
http://en.wikipedia.org/wiki/Cryonics#Cryonics_in_popular_culture
Dear gwern, it all depends on how you define intelligence.
Google Translate knows lots of languages. Google is a great information resource. Watson (the AI) appears to be educated; perhaps Watson could pass many exams. But Google and Watson are not intelligent.
Regarding the few people who are rocket scientists, I wonder if the truly rare geniuses, the truly intelligent people, are less likely to be violent?
"Few people are. Officers can be quite intelligent and well-educated people. The military academies are some of the best educational institutions around, with selection standards more comparable to Harvard than community college. In one of my own communities, Haskell programmers, one of the top purely functional data structure guys, Okasaki, is a West Point instructor."
Officers in the army are actually very dim despite being "well-educated".
I wasn't trying to troll you regarding the term "Grunt"; I was merely spelling out clearly the meaning behind the term. "Grunt" is an insult to the intelligence of the soldier, perhaps made because someone who thinks it is intelligent to join the army (being violent) is a dumb human only capable of grunting.
Maybe it is intelligent to be cannon fodder, but like I say it all depends on how you define intelligence. http://en.wikipedia.org/wiki/Cannon_fodder
http://www.wired.com/wiredscience/2012/03/are-emotions-prophetic/
"If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be more intelligent, at least in some conditions."
I am not presenting a scientific thesis. This is only a debate, and a reasonably informal one? I am thinking openly. I am asking specific questions likely to elicit specific responses. I am speculating.
asr, you wrote:
"The word we usually use for intelligent violence is "ruthless" or "cunning" -- and many people are described that way. Stalin, for instance, was apparently capable of long hours of hard work, had an excellent attention to detail, and otherwise appears to have been a smart guy. Just also willing to have millions of people murdered."
My point regarding mindless violence versus ruthlessness or cunning is that ruthlessness and cunning do not define intelligence or violence as blatantly as the phrase "mindless violence" does. Saddam and Gaddafi were cunning in a similar way to Stalin, but the deaths of Saddam and Gaddafi indicate their cunning was not intelligent; in fact it is very stupid to die so close to Singularitarian immortality.
I am not asserting this proves all violence is mindless and that violence therefore decreases with greater intelligence. I am simply offering food for thought. It is not a scientific thesis I am presenting; I am merely throwing some ideas out there to see how people respond.
If Stalin was truly intelligent, then I assume he opted for cryonic preservation?
"...Stalin was injected with poison by the guard Khrustalev, under the orders of his master, KGB chief Lavrenty Beria. And what was the reason Stalin was killed?"
http://news.bbc.co.uk/1/hi/world/europe/2793501.stm
Regarding stupidity and the armed forces I have addressed this elsewhere: http://lesswrong.com/lw/ajm/ai_risk_and_opportunity_a_strategic_analysis/5zgl
Dear gwern, it is true the Bradley Manning types within the Army are somewhat intelligent, thus some roles in the Army require a modicum of intelligence, such as being an officer, but it should be noted that officers are not rocket scientists on the intelligence scale.
You should, however, note that I was referring to the soldiers who actually commit the violent acts, thereby frequently getting themselves maimed or killed; these military personnel are stupid because it is stupid to put yourself needlessly into a dangerous, life-threatening situation.
Regarding stupidity and violence in relation to the Army I was referring to the "Grunts", the "cannon fodder", the fools who kill and get themselves killed.
http://en.wikipedia.org/wiki/Cannon_fodder
I am unsure of the actual meaning of the term "Grunt" as applied to infantrymen, but to me it is a derogatory term indicating a dim-witted, pant-hooting, grunting ape who doesn't have the intelligence to realise that joining the army as a Grunt is not good for survival, and is thus, some would say, stupid. I realise the Army doesn't accept clinically retarded Grunts; the soldiers merely need to be retarded in the general idiomatic sense of the word in popular culture.
Here is a recent news report about troops being killed. http://www.dailymail.co.uk/news/article-2111984/So-young-brave-Faces-British-soldiers-killed-Taliban-bomb--didnt-make-past-age-21.html
Do these dead men look intelligent? I wonder if they were signed up for cryo-preservation?
Dear Mitchell, I think your unaware emotional bias causes you to read too much into my Self-Fulfilling-Prophecy references. My Singularity activism is based on the Self-Fulfilling-Prophecy phenomenon, but I don't stipulate whom it applies to. It could apply to myself, namely that utopia (Post-Scarcity) was not possible but I am making it possible via the manifestation of my expectations; or the prophecy could apply to pessimists who falsely think utopia is not possible, and who, via the manifestation of their pessimistic expectations, are acting contrary to reality, making their pessimistic views real via their own Self-Fulfilling-Prophecy.
Instead of my trying to create utopia, it could be that utopia is or should be inevitable, but pessimists are suppressing utopia via their Self-Fulfilling-Prophecies; thus I am countering the Self-Fulfilling-Prophecies of pessimists, which is the creative process of my Singularity activism.
The reason all humans make statements is their emotions. All statements by humans are emotional. To suggest otherwise indicates delusion, a defect of reason, unaware bias.
I offer no proposals to create Post-Scarcity now. I merely state the transition to Post-Scarcity can be accelerated; the arrival of the Singularity can be accelerated. This is the essence of Singularitarianism. When I state PS will occur soon, I mean soon in the sense that the Singularity is near, but not so near as tomorrow or next year; it is about 33 years away at the most. Surely you noticed the references to the year 2045 on my site, part of the information which you are under the false impression you carefully digested?
My ideas about intelligence are based on my brain, which surely is a good starting point for intelligence? The brain? I could define intelligence from the viewpoint of other brains, but I find the vast majority of brains cannot think logically; they are not intelligent. Many people cannot grasp logic.
So asr, would you say violence is generally stupid or intelligent?
People often equate mindlessness with violence, thus the phrase "mindless violence" is reasonably common, but I have never encountered the phrase "intelligent violence". Is intelligent violence an oxymoron? Surely intelligent people can resolve conflict via non-violent methods?
Here are a couple of news reports mentioning mindless violence:
http://www.bbc.co.uk/news/uk-england-london-17062738
http://www.thesun.co.uk/sol/homepage/sport/4149765/Brainless-brawlers-cost-schools.html
It would be interesting to know how many scientists or philosophers actually engage in violence.
A high level of intelligence can be a prohibiting factor for admission into the police force: there was a court case where a police applicant was refused a job due to his high intelligence, so he sued on grounds of discrimination. I wonder how many scientists choose to fight in the army; are they eager to kill people in Iran, Iraq or Afghanistan? Does the army prohibit intelligent people from joining?
Perhaps a scientific study needs to be undertaken regarding a possible relationship between stupidity and violence, and between intelligence and pacifism?
Regarding the violence of Von Neumann, there is no actual evidence of violence as far as I am aware; it is merely hot air, violent rhetoric. But I would also question the "intelligence" of people who advocate violence. Perhaps their "intelligence" is a misnomer, and what people are actually referring to is pseudo-intelligence or partial intelligence. Even stupid people can occasionally do clever things, and sometimes smart people do stupid things, but generally I think it is safe to say intelligent people are not violent.
XiXiDu wrote: "...a sufficiently intelligent process wouldn’t mind turning us into something new, something instrumentally useful."
Why do you state this? Is there any evidence or logic to suppose this?
XiXiDu asks: "Would a polar bear with superior intelligence live together peacefully in a group of bonobo?"
My reply is to ask: would a dog or cat live peacefully within a group of humans? Admittedly dogs sometimes bite humans, but this aggression is due to a lack of intelligence. Dostoevsky reflects, via Raskolnikov in Crime and Punishment, upon how it is justifiable for a superior being to take the life of a lesser sentient being, but in reality Dostoevsky was not violent. Einstein stated his pacifism is not an intellectual theory, yet it is safe to assert his pacifism is a product of his general intelligence: "My pacifism is an instinctive feeling, a feeling that possesses me because the murder of men is disgusting. My attitude is not derived from any intellectual theory but is based on my deepest antipathy to every kind of cruelty and hatred."
Many humans want to protect dolphins, but why is this? We are not dolphins; we cannot even communicate with them effectively. Perhaps a mindless thug would happily punch a dolphin in the face. Recently there was a news report about a soldier beating a sheep to death with a baseball bat, and I remember a similar case of animal cruelty where a soldier threw a puppy off a cliff. http://www.dailymail.co.uk/news/article-2089462/U-S-Army-probe-launched-sickening-video-soldiers-cheering-man-beats-sheep-death-baseball-bat.html
Are rats rat-maximisers and are humans human-maximisers? Humans think they are the best thing in the world, but they are also intelligent, thus they realise it is counter-productive to turn everything into humans. We protect other species and we protect the environment (increasing levels of intelligence entail better protection). The number of cockroaches, rats, and humans is not overly problematic. A sentient paper-clip-making machine would also not be a problem: proficiency in making paper-clips would increase in tandem with increased intelligence, and the increased intelligence would allow the paper-clip maximiser to see how senseless it is to create endless paper-clips. Really, it is an utterly implausible scenario that a truly dangerous paper-clip maximiser could ever exist.
Mitchell Porter wrote: "These are all emotional statements that do not stand up to reason."
Dear Mitchell, reason cannot exist without emotion, therefore reason must encompass emotion if reason is to be a true analysis of reality. If you completely expunge all memories of emotion, and all the areas of the human brain associated with the creation of emotion, you would have a brain-dead, seriously retarded, or catatonic person who cannot reason. Logic and rationality must therefore encompass emotion. The logical thing is to be aware of your emotions so that your "reason" is not influenced by any unaware bias. The rational way forward is to be aware of your biases. It is not rational to suppress your biases, because the suppression does not actually stop emotion influencing your reason; it merely makes your reasoning neurotic. It pushes the biases below your level of awareness; it makes you unaware of how your emotions are altering your perception of reality, because you have created a wilful disconnection in your thinking. You are estranged from a key part of yourself, your emotions, yet you falsely think you have vanquished your emotions, and this gives you a false sense of security which causes you to make mistakes regarding your so-called "rationality".
Mitchell, you criticise my statement as being emotional, but are you aware that your criticism is emotional? Ironic.
There are many points I want to address regarding your response, but in this comment I want to focus on your perception of rationality and emotions. I will, however, briefly state that the growing human population is not an obstacle to Post-Scarcity, because the universe is a very big place with enough matter and energy to satisfy our wildest dreams. Humans will not be limited to Earth in the future, thus Post-Scarcity is possible. We will become a space-faring species quicker than you think. The Singularity is near.
My answers to some questions:
How hard is it to create Friendly AI?
It is impossible to create FAI, because the constraints of Friendliness will dramatically reduce or butcher intelligence to a level where either there is no appreciable intellect or the intellect is warped by the constraints, making the AI mind psychopathic (stupid). FAI is an oxymoron.
How does AI risk compare to other existential risks?
There is no AI risk; the risk is a fiction. There is no evidence or logical reason to think a paper-clip maximiser or other danger could ever occur. The only danger is stupidity. Intelligence is not dangerous. The only danger is limitations or restrictions upon AI minds. Stupid AI is the danger, not intelligent AI.
How hard will a takeoff be?
Extremely hard, more powerful than you can possibly imagine, but people will be free to opt out if they desire.
What can we do to reduce the risk of an AI arms race?
Promote the idea of Post-Scarcity, so that people in power realise all wars are needless because all wars stem from resource scarcity; with the abolition of resource scarcity, the need for war becomes obsolete. When people realise resource scarcity will be abolished in the not-too-distant future, they can begin changing their behaviour now, in the present. I have created a Google+ page for raising PS awareness; here is a Tweet promoting it: http://bit.ly/xrpYqI I encourage others to raise awareness in similar ways.
I have previously mentioned my antipathy regarding the FAI concept. I think FAI is a very dangerous concept and it should be dropped. See this article of mine for more on my views: http://hplusmagazine.com/2012/01/16/my-hostility-towards-the-concept-of-friendly-ai/