[Link] First almost fully-formed human [foetus] brain grown in lab, researchers claim
post by ESRogs · 2015-08-19T06:37:21.049Z · LW · GW · Legacy · 40 comments
This seems significant:
An almost fully-formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease.
Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases.
...
The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed.
...
Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these “cerebral organoids” were not complete and only contained certain aspects of the brain. “We have grown the entire brain from the get-go,” said Anand.
...
The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.”
...
If the team’s claims prove true, the technique could revolutionise personalised medicine. “If you have an inherited disease, for example, you could give us a sample of skin cells, we could make a brain and then ask what’s going on,” said Anand.
...
For now, the team say they are focusing on using the brain for military research, to understand the effects of post-traumatic stress disorder and traumatic brain injuries.
http://www.theguardian.com/science/2015/aug/18/first-almost-fully-formed-human-brain-grown-in-lab-researchers-claim
40 comments
Comments sorted by top scores.
comment by Viliam · 2015-08-19T07:51:32.038Z · LW(p) · GW(p)
The ethical concerns were non-existent (...) We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.
Don't ever let this guy walk around when someone is in a sensory deprivation tank.
Replies from: Dr_Manhattan, Luke_A_Somers, Richard_Kennaway
↑ comment by Dr_Manhattan · 2015-08-19T19:35:51.756Z · LW(p) · GW(p)
To me the biggest concern was
The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.”
...
For now, the team say they are focusing on using the brain for military research, to understand the effects of post-traumatic stress disorder and traumatic brain injuries.
The goal of studying a brain in pain implies they will need a brain in pain. Seems like ethics should come into that at some point.
Replies from: Viliam
↑ comment by Luke_A_Somers · 2015-08-19T18:19:13.550Z · LW(p) · GW(p)
As bad as the argument is, it's a little different when the brain has never ever been outside one.
Replies from: hyporational
↑ comment by hyporational · 2015-08-20T06:35:46.025Z · LW(p) · GW(p)
How is it a bad argument?
Replies from: Luke_A_Somers
↑ comment by Luke_A_Somers · 2015-08-20T20:19:13.180Z · LW(p) · GW(p)
We don't know enough about brain operation to conclude that sensory stimuli are necessary for ethically sensitive processes to start.
Replies from: hyporational
↑ comment by hyporational · 2015-08-21T15:38:46.712Z · LW(p) · GW(p)
I wasn't sure if we were metaphorically talking about the foetus brain in question or a hypothetical human that's fully grown in an isolation tank. If we were talking about the former, we seem to have a fundamentally different set of ethics. With your clarification I assume we're talking about the latter, in which case I agree with you.
Saying that an undeveloped foetus brain isn't thinking because it hasn't received sensory stimuli is a different argument than saying that a fully grown brain can't think because it hasn't received sensory stimuli.
↑ comment by Richard_Kennaway · 2015-08-19T13:27:44.241Z · LW(p) · GW(p)
Don't ever let this guy walk around when someone is in a sensory deprivation tank.
Tangentially, are those still used? There was a fad for them (especially combined with LSD) something like 40 years ago, but I've hardly heard of them since.
Replies from: g_pepper
↑ comment by g_pepper · 2015-08-19T13:45:42.429Z · LW(p) · GW(p)
Sensory deprivation tanks (aka float tanks) are still a thing. Here's a business in Atlanta with float tanks. (I've never tried a float tank, so I can't speak to their efficacy.)
comment by Richard_Kennaway · 2015-08-19T12:00:40.066Z · LW(p) · GW(p)
I tried to find the original scientific work online, but it appears to be so new that it isn't online yet.
The researcher is Rene Anand, and the conference it is being presented at is the Military Health System Research Symposium, which is in progress right now. Perhaps there will be more information there after the conference. At the moment there isn't even a programme listing, and most of the information that is there is behind a registration wall.
Here is his university's press release, which mentions having grown the brain to the 12-week point and speculates about reaching 16 or 20 weeks.
At what stage is it currently thought that the fetal brain can be said to be conscious? If this brain-in-a-vat were grown to the equivalent of full term, with no sensory or motor nerves, how would we decide whether it was conscious? Or, to put the real issue: how would we decide whether it could legitimately be treated as an inanimate object for experimental purposes? How would a religious person decide whether it had a soul, bearing in mind that it was created from skin cells, not eggs and sperm?
The aim of the research is to create model tissues for studying neurological disorders, but some further possibilities are obvious, along with their moral hazards. For example, give it some sort of sensory inputs and motor outputs, see if it can learn, and look at the effect on its structures. Have the motor outputs cause external effects that produce sensory inputs (think of a baby with a rattle), and watch it learn to control features of its environment. If a brain-in-a-vat can learn to do useful things, could it be practically used as an embedded controller for a complex machine, such as a chemical plant? Or a robot body? How intelligent could these brains-in-vats be? How insane?
comment by [deleted] · 2015-08-19T21:23:00.158Z · LW(p) · GW(p)
I look forward to the paper and the ability to know what they actually did rather than what journalists say about it.
For now, non-vascularized neural tissue isn't gonna be doing much or be very big, especially tissue derived from an animal with as thick and bulky neural structures as a primate.
Replies from: jacob_cannell
↑ comment by jacob_cannell · 2015-08-20T04:38:15.446Z · LW(p) · GW(p)
If they claim to have multiple neuron types and brain structures, it seems that later developments could add the other key ingredients, such as astrocytes and vascular tissue. Even then, it may be one of those situations where there are so many just-so dependencies that you don't get real brain functionality until you have just one more feature, and then one more, and so on ... such that you might as well skip all those steps and start with a fetus in a vat.
comment by [deleted] · 2015-08-19T07:19:08.693Z · LW(p) · GW(p)
How do you give an unplugged brain PTSD?
comment by eternal_neophyte · 2015-08-19T18:47:03.916Z · LW(p) · GW(p)
How the hell can he claim a brain is not conscious just because it's not being stimulated, when consciousness is so very badly understood to begin with?
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-19T19:10:52.641Z · LW(p) · GW(p)
Not just because it is not being stimulated, but rather because it has never been stimulated. That's a rather large difference.
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-19T22:15:18.141Z · LW(p) · GW(p)
I fail to see how you could derive that it's unconscious either way.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T02:28:46.548Z · LW(p) · GW(p)
I am not sure what "conscious" means in this context.
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T09:52:25.272Z · LW(p) · GW(p)
In the context of ethics, most likely something like the capacity for suffering, or for any kind of subjective experience.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T14:49:53.447Z · LW(p) · GW(p)
That doesn't help me -- essentially, you just replaced the word "conscious" with the word "suffering" and that does not clarify much.
Let's try it this way. Here is a black box with something inside it. It does not communicate in any way that's meaningful to you. How can you decide whether it's conscious or capable of suffering? What would you need to measure or observe? What are your criteria?
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T16:55:00.794Z · LW(p) · GW(p)
Tabooing doesn't work here; you can only taboo your terms so far before you've completely severed yourself from the semantics of your language. If you don't understand what suffering is at a visceral level, then no experimental contrivance will clarify the notion for you.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T17:23:59.836Z · LW(p) · GW(p)
If you don't understand what suffering is at a visceral level, then no experimental contrivance will clarify the notion for you.
That's pure hand-waving.
Look: "I think this rock here is suffering. I can't prove it, but if you don't feel it at a visceral level then no experimental contrivance will clarify it for you"
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T17:27:31.081Z · LW(p) · GW(p)
I'm not claiming that rocks or artificially grown foetal brains are suffering. The people involved in this research claim they aren't - if the meaning of that claim is unclear, the onus is on them to clarify it. Until such a time, we are all at liberty to filter that claim through our own intuitively constructed concepts.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T17:54:48.427Z · LW(p) · GW(p)
We are, of course, at liberty. However, it seems to me you don't want them to satisfy their own definition -- that would be too easy -- you want them to satisfy your definition, but for that you should have an idea of what you want clarified and what criteria you expect to be met. Demanding that they clarify something to the satisfaction of your "visceral level" is still hand-waving.
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T18:03:19.759Z · LW(p) · GW(p)
you don't want them to satisfy their own definition -- that would be too easy -- you want them to satisfy your definition
How could I say either way when they don't offer any definition to begin with? My original complaint was precisely that consciousness is not sufficiently well understood to allow anyone to be cavalier about these things in either direction.
Demanding that they clarify something to the satisfaction of your "visceral level" is still hand-waving.
The only one who has demanded that a concept be defined to his satisfaction here is you, when you explicitly requested a definition of suffering in terms of literal significance.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T18:27:09.945Z · LW(p) · GW(p)
If you already have some idea of what the word "consciousness" means, you want to be reassured that the brain tissue in question is not conscious according to your idea.
I doubt you will let "them" define consciousness any way they wish. For example, I can say "X suffers iff X can communicate to me that it wants the current condition to stop". Will you be happy with that? Probably not.
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T18:32:35.648Z · LW(p) · GW(p)
More importantly, I want there to be a serious recognition of the ethical boundaries that are being pushed against by this kind of research due to the fact that neither I nor anyone else can yet offer any satisfactory theory of consciousness. That's the whole motivation behind my original comment, rather than the desire to advance a philosophical dogma, which seems to be what you want to impute to me.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T18:40:28.535Z · LW(p) · GW(p)
You can't talk about ethical boundaries being pushed unless you place that ethical boundary somewhere first. Otherwise we're back to hand-waving: Can I say that because no one "can yet offer any satisfactory theory of consciousness", chewing on a salad is ethically problematic?
Basically, you can't be both worried and unhappy, and completely unspecific :-/
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T18:50:03.857Z · LW(p) · GW(p)
Is there any particular reason to believe that a salad might be capable of consciousness? No.
Is there any particular reason to believe that brains might be capable of consciousness? Yes - namely the fact that most brains insist on describing themselves as such. Does this imply brains are conscious if and only if they insist on describing themselves as such? No. No more than a bird is only capable of flight when it's actually literally soaring in the air.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T18:53:52.364Z · LW(p) · GW(p)
Is there any particular reason to believe that a salad might be capable of consciousness? No.
How can you tell without "any satisfactory theory of consciousness"?
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T19:08:12.705Z · LW(p) · GW(p)
The same way I don't need to understand aerodynamics to know that I have no reason to believe that turtles might be capable of flight. I've never seen a turtle do anything that sits in the neighbourhood of the notion of "flight" in the network of concepts in my head. This type of argument doesn't work against the putative consciousness of foetal brains, since we have good reason to believe that at least brains at a certain stage of development are in fact conscious. To argue that this means we can only have an ethical problem with running dubious experiments on brains at that stage of development is rather like arguing that since you've only ever seen white swans fly, the supposition that black swans might fly too is not justified as such.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T19:40:11.256Z · LW(p) · GW(p)
The same way I don't need to understand aerodynamics to know that I have no reason to believe that turtles might be capable of flight.
You don't need to know the underlying mechanics, but you do need to know what flight is.
You're saying we don't even know what consciousness is.
To argue that this means we can only have an ethical problem with running dubious experiments on brains at that stage of development
No one is arguing that. I am saying that if you claim to have a problem, you have to be more specific about what your problem is and what might convince you that it is not a problem.
"Prove to me something I don't know what" is not a useful attitude.
Replies from: eternal_neophyte
↑ comment by eternal_neophyte · 2015-08-20T19:54:52.813Z · LW(p) · GW(p)
You're saying we don't even know what consciousness is.
Not in the least. I know what consciousness is because I am a consciousness. A theory of consciousness is needed to tie the concept to the material world, so that you can make statements like "a rock cannot be conscious, in principle".
I am saying that if you claim to have a problem, you have to be more specific about what your problem is and what might convince you that it is not a problem
What might convince me is a satisfactory theory of consciousness. Do I have to provide a full specification of what would be "satisfactory" just to recognize an ethical problem? If so, there is hardly anything about which I could raise an ethical concern, since I'd perpetually be working on epistemic aesthetics until all the necessary puzzles were solved. This is just not how anyone operates. We proceed with vague concepts, heuristic criteria for satisfactoriness, incomplete theories, etc. To say that this should be disallowed unless you can unfold your theory's logical substructure in a kind of Principia Ethica is waaay more useless than interpreting ideas through partial theories.
Replies from: Lumifer
↑ comment by Lumifer · 2015-08-20T20:00:37.903Z · LW(p) · GW(p)
Do I have to provide a full specification of what would be "satisfactory" just to recognize an ethical problem?
Not "full", but some, yes. Otherwise anyone can squint at anything and say "I think there is an ethical problem here. I can't quite put my finger on it, but my gut feeling ("visceral level") is that there is" -- and there is no adequate response to that.
Replies from: pangel
↑ comment by pangel · 2015-08-20T20:25:17.704Z · LW(p) · GW(p)
As an instance of the limits of replacing words with their definitions to clarify debates, this looks like an important conversation.
The fuzziest starting point for "consciousness" is "something similar to what I experience when I consider my own mind". But this doesn't help much. Someone can still claim "So rocks probably have consciousness!", and another can respond "Certainly not, but brains grown in labs likely do!". Arguing from physical similarity, etc. just relies on the other person sharing your intuitions.
For some concepts, we disagree on definitions because we don't actually know what those concepts refer to (this doesn't include concepts like "art", etc.). I'm not sure what the best way is to talk about whether an entity falls under such a concept. Are there existing articles/discussions about that?
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2015-08-20T22:41:04.535Z · LW(p) · GW(p)
If I don't know what I'm referring to when I say "consciousness," it seems reasonable to conclude that I ought not use the term.
Replies from: Richard_Kennaway
↑ comment by Richard_Kennaway · 2015-08-21T09:51:24.172Z · LW(p) · GW(p)
What is it, to know what one is referring to? If I see a flying saucer, I may be wrong in believing it's an alien spaceship, but I am not wrong about seeing something, a thing I also believe to be an alien spaceship.
pangel says:
The fuzziest starting point for "consciousness" is "something similar to what I experience when I consider my own mind".
and that is the brute fact from which the conundrum of consciousness starts. The fact of having subjective experience is the primary subject matter. That we have no idea how, given everything else we know about the world, there could be any such thing as experience, is not a problem for the fact. It is a problem for those seeking an explanation for the fact. Ignorance and confusion are in the map, not the territory.
All attempts to solve the problem have so far taken one of two forms:
1. Here is something objectively measurable that correlates with the subjective experience. Therefore that thing is the subjective experience.
2. We can't explain it, therefore it doesn't exist.
Discussion mostly takes the form of knocking down everyone else's wrong theories. But all the theories are wrong, so there is no end to this.
The actual creation of brains-in-vats will certainly give more urgency to the issue. I expect the ethical issues will be dealt with just by prohibiting growing beyond a certain stage.
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2015-08-21T20:02:25.289Z · LW(p) · GW(p)
To know what I'm referring to by a term is to know what properties something in the world would need to have to be a referent for that term.
The ability to recognize such things in the world is beside the point. When I say "my ancestors," I know what I mean, but in most cases it's impossible to pick that attribute out empirically -- I can't pick out most of my ancestors now, because they no longer exist to be picked out, and nobody could have picked them out back when they were alive, because the defining characteristic of the category is in terms of something that hadn't yet been born. (Unless you want to posit atypical time-travel, of course, but that's not my point.)
So, sure, if by "flying saucer" I refer to an alien spaceship, I don't necessarily have any way of knowing whether something I'm observing is a flying saucer or not, but I know what I mean when I claim that it is or isn't.
And if by "consciousness" I refer to anything sufficiently similar to what I experience when I consider my own mind, then I can't tell whether a rock is conscious, but I know what I mean when I claim it is or isn't.
Rereading pangel's comment, I note that I initially understood "we don't actually know what those concepts refer to" to mean we don't have the latter thing... that we don't know what we mean to express when we claim that the concept refers to something... but it can also be interpreted as saying we don't know what things in the world the concept correctly refers to (as with your example of being wrong about believing something is an alien spaceship).
I'll stand by my original statement in the original context I made it in, but sure, I also agree that just because we don't currently know what things in the world are or aren't conscious (or flying saucers, or accurate blueprints for anti-gravity devices, or ancestors of my great-great-grandchild, or whatever) doesn't mean we can't talk sensibly about the category. (Doesn't mean we can, either.)
And, yes, the fact that I don't know how subjective experience comes to be doesn't prevent me from recognizing subjective experience.
As for urgency... I dunno. I suspect we'll collectively go on inferring that things have a consciousness similar to our own with a confidence proportional to how similar their external behavior is to our own for quite a long time past the development of (human) brains in vats. But sure, I can easily imagine various legal prohibitions like you describe along the way.
Replies from: pangel
↑ comment by pangel · 2015-08-22T16:32:19.085Z · LW(p) · GW(p)
I meant it in the sense you understood first. I don't know what to make of the other interpretation. If a concept is well-defined, the question "Does X match the concept?" is clear. Of course it may be hard to answer.
But suppose you only have a vague understanding of ancestry. Actually, you've only recently coined the word "ancestor" to point at some blob of thought in your head. You think there's a useful idea there, but the best you can do for now is: "someone who relates to me in a way similar to how my dad and my grandmother relate to me". You go around telling people about this, and someone responds "yes, this is the brute fact from which the conundrum of ancestry starts". Another tells you that you ought to stop using that word if you don't know what the referent is. Then they go on to say your definition is fine: it doesn't matter if you don't know how someone comes to be an ancestor, you can still talk about an ancestor and make sense. You have not gone through all the tribe's initiation rituals yet, so you don't know how you relate to grey wolves. Maybe they're your ancestors, maybe not. But the other says: "At least, you know what you mean when you claim they are or are not your ancestors.".
Then your little sister drops by and says: "Is this rock one of your ancestors?". No, certainly not. "OK, didn't think so. Am I one of your ancestors?". You feel about it for a minute and say no. "Why? We're really close family. It's very similar to how dad or grandma relate to you." Well, you didn't include it in your original definition, but someone younger than you can definitely not be your ancestor. It's not that kind of "similar". A bit of time and a good number of family members later, you have a better definition. Your first definition was just two examples, something about "relating", and the word "similar" thrown in to mean "and everyone else who is also an ancestor." But similar in what way?
Now the word means "the smallest set such that your parents are in it, and any parent of an ancestor is an ancestor"..."union the elders of the tribe, dead or alive, and a couple of noble animal species." Maybe a few generations later you'll drop the second term of the definition and start talking about genes, whatever.
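That first clause is a recursive closure, and on its own it already pins the concept down. As a minimal sketch in Python (where `parent_of` is a hypothetical lookup returning a person's known parents, not anything from the story):

```python
def is_ancestor(candidate, person, parent_of):
    """True iff candidate is in the smallest set that contains
    person's parents and the parents of each of its members."""
    frontier = list(parent_of(person))  # your parents are in the set
    seen = set()
    while frontier:
        p = frontier.pop()
        if p == candidate:
            return True
        if p not in seen:
            seen.add(p)
            frontier.extend(parent_of(p))  # any parent of an ancestor is an ancestor
    return False
```

The rock and the little sister now fail the test for the same structural reason: neither is reachable by repeatedly taking parents.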
My "fuzziest starting point" was really fuzzy, and not a good definition. It was one example, something about being able to "experience" stuff, and the word "similar" thrown in to mean "and everyone else who is conscious." I may (kind of) know what I mean when I say a rock is not conscious, since it doesn't experience anything, but what do I mean exactly when I say that a dog isn't conscious?
I don't think I know what I mean when I say that, but I think it can help to keep using the word.
P.S. The final answer could be as in the ancestor story, a definition which closely matches the initial intuition. It could also be something really weird where you realize you were just confused and stop using the word. I mean, the life force of vitalism probably seemed like a brute fact for a long time.
Replies from: TheOtherDave
↑ comment by TheOtherDave · 2015-08-23T03:13:51.629Z · LW(p) · GW(p)
Hm.
So, I want to point out explicitly that in your example of ancestry, I intuitively know enough about this concept of mine to know my sister isn't my ancestor, but I don't know enough to know why not. (This isn't an objection; I just want to state it explicitly so we don't lose sight of it.)
And, OK, I do grant the legitimacy of starting with an intuitive concept and talking around it in the hopes of extracting from my own mind a clearer explicit understanding of that concept. And I'm fine with the idea of labeling that concept from the beginning of the process, just so I can be clear about when I'm referring to it, and don't confuse myself.
So, OK. I stand corrected here; there are contexts in which I'm OK with using a label even if I don't quite know what I mean by it.
That said... I'm not quite so sanguine about labeling it with words that have a rich history in my language when I'm not entirely sure that the thing(s) the word has historically referred to is in fact the concept in my head.
That is, if I've coined the word "ancestor" to refer to this fuzzy concept, and I say some things about "ancestry," and then someone comes along with "this is the brute fact from which the conundrum of ancestry starts" as in your example, my reaction ought to be startlement... why is this guy talking so confidently about a term I just coined?
But of course, I didn't just coin the word "ancestor." It's a perfectly common English word. So... why have I chosen that pre-existing word as a label for my fuzzy concept? At the very least, it seems I'm risking importing by reference a host of connotations that exist for that word without carefully considering whether I actually intend to mean them.
And I guess I'd ask you the same question about "conscious." Given that there's this concept you don't know much about explicitly, but feel you know things about implicitly, and about which you're trying to make your implicit knowledge explicit... how confident are you that this concept corresponds to the common English word "consciousness" (as opposed to, for example, the common English words "mind", or "soul", or "point of view," or "self-image," or "self," or not corresponding especially well to any common English word, perhaps because the history of our language around this concept is irreversibly corrupted)?
comment by Orz · 2015-08-29T20:32:26.775Z · LW(p) · GW(p)
Guys, "The brain, which is about the size of a pencil eraser..." It isn't sentient, no matter what kind of stimulation or lack thereof it's been getting.
Now, can they make neurons from the skin cells of people with Parkinson's and Alzheimer's and inject them into their brains to help with the disease?