"Intelligence is impossible without emotion" — Yann LeCun

post by vedevazz · 2019-04-10T17:17:22.103Z · score: 0 (4 votes) · LW · GW · 3 comments

This is a link post for https://www.youtube.com/watch?v=he-AaNp144A


Comments sorted by top scores.

comment by Rob Bensinger (RobbBB) · 2019-04-10T21:47:53.444Z · score: 13 (4 votes) · LW · GW

My prior is that Yann LeCun tends to have unmysterious, thoughtful models of AI (example), even though I strongly disagree with (and am often confused by) his claims about AI safety. So when Yann says "emotion", I wonder if he means anything more than that they "can decide what they do" and have "some intrinsic drive that makes them [...] do particular things" as opposed to having "preprogrammed behavior".

comment by Dagon · 2019-04-10T19:38:30.263Z · score: 9 (6 votes) · LW · GW

Is there a transcript or a summary available? I can't stand videos (except sometimes as auxiliary to written information).

comment by Slider · 2019-04-11T01:11:52.567Z · score: 1 (1 votes) · LW · GW

I strongly suspect that what is understood by the words changes a lot of what is being said. For example, I expected that the argument WOULD apply to autonomous cars.

I have thought about something similar before. In my formulation, what an actor does depends on their brain state, and different actions require different states. Thinking is an action where you do nothing externally (or only something very trivial) but go from one brain state to another. Brain states are likely so complex that they never really recur; at the very least, in each additional time step you remember the previous time step.

To make sense of what is going on in the brain, you have to draw equivalence classes over the myriad possible configurations. If you are merely trying to build an effective theory of how the mind actually works, without prejudice or norms about how, for example, verbal behaviour should be taken into account, then a grouping of states that "cuts reality at the joints" would be the one with the most robust transition probabilities from one group to another. This way you get "natural laws" like A->B->C->D, where you do not have to understand what state class C "represents" or "is about". If the natural laws "split", that is, both A->B->C->D and A->B->E->F are likely, then it becomes a useful concept which "river" gets activated in a particular train of thought.
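The state-grouping idea above can be sketched computationally. This is a hypothetical illustration (all state labels and the transition table are invented): treat micro-states as nodes of a transition graph, and score a candidate grouping by how deterministic the induced class-to-class transitions look. A grouping that "cuts at the joints" yields near-certain class transitions, a "natural law"; an arbitrary grouping yields noisy ones.

```python
from collections import defaultdict

# Hypothetical deterministic micro-state dynamics: each micro-state
# maps to its successor. Micro-states 0-2 behave alike, as do 3-5:
# every state in {0,1,2} steps into {3,4,5} and vice versa.
transitions = {0: 3, 1: 4, 2: 5, 3: 0, 4: 1, 5: 2}

def class_transition_probs(grouping):
    """Given a grouping (micro-state -> class label), return the
    empirical probability of moving from each class to each class."""
    counts = defaultdict(lambda: defaultdict(int))
    for src, dst in transitions.items():
        counts[grouping[src]][grouping[dst]] += 1
    return {c: {d: n / sum(row.values()) for d, n in row.items()}
            for c, row in counts.items()}

# A grouping that cuts at the joints: class transitions are
# deterministic, giving the "law" A -> B -> A without needing to know
# what any individual micro-state "represents".
good = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}
print(class_transition_probs(good))  # {'A': {'B': 1.0}, 'B': {'A': 1.0}}

# An arbitrary grouping of the same dynamics: class transitions look
# noisy, so this grouping supports no robust "natural law".
bad = {0: "A", 1: "A", 2: "B", 3: "A", 4: "B", 5: "B"}
print(class_transition_probs(bad))
```

The same scoring idea extends to stochastic dynamics: replace the successor table with sampled trajectories and count class-to-class moves the same way.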

The brain state needs to include all sensory information, but it is likely that the sensory information does not dominate which brain state is entered. If the non-sensory information DOES dominate which state is entered, that seems like an important point about the function of the mind; let's call such states "control cognitions". One theoretical notion of the state classes might be: given a fixed set of sensory information, what is the set of all possible control cognitions?

One interpretation of "The car won't have emotions" could be that the car doesn't have any (significant) control cognitions because of its nearly stateless nature. The point about "autonomous thought" is that if the impulse to do something comes from within, it can't be sensory-dominated, so it must come from a control cognition.

However, "rational reasoning" is not often called "emotional". A verbal thought like "I will walk there" seems fully capable of being a control cognition, yet we would not describe such a mental stance as very emotional. We could also think of someone walking somewhere out of sexual arousal, which would be very emotional behaviour/thinking. But given an arbitrary or novel mental state, I do not know how I could tell whether it is an emotion or not. So when we examine a system like a car and somebody says "there is no emotion there", my doubt is either "you would not recognise an emotion if you saw one, so how do you know one isn't here?" or "here is a vivid and complex network of control cognitions; how come these are not emotions?"