Woman of 24 found to have no cerebellum in her brain (newscientist.com)
496 points by shahmeern on Sept 11, 2014 | 179 comments



This condition is known as cerebellar agenesis. A review of many of the case studies was done by a prominent cerebellum researcher [1]. Typically the individuals that survive past birth live relatively normal lives but with impaired motor skills which are slower to develop. Their abilities are remarkable given that acute lesions to the cerebellum result in much more significant impairments (e.g. not being able to touch your nose with the tip of your finger in one smooth, coordinated movement).

These individuals probably exhibit diminished cognitive function as well. Only recently has it been recognized that the cerebellum is also involved in cognition [2]. It's interesting to note that you don't need a cerebellum to move or think, but the loss of it impairs both. Contrast this to damage to your motor cortex which can result in paralysis.

[1] Glickstein, M. (1994). Cerebellar agenesis. Brain, 117, 1209-1212.
[2] http://www.ncbi.nlm.nih.gov/pubmed/23996631


> These individuals probably exhibit diminished cognitive function as well.

I wonder how much of that is "the cerebellum is also involved in cognition" and how much of that is "with the rest of the brain picking up the additional load, there's less "processing power" available for other things".


My PhD adviser and I have had that debate a number of times. For the congenital case, we can't say one way or the other. But in the acute case, where there are localized lesions to the parts of the cerebellum thought to be involved in cognition, the effect seems to be primarily cognitive. From anatomical studies, we also know that there are "loops" formed between non-motor areas of the cerebral cortex and the cerebellum.


But couldn't that alternatively be explained by the cerebral cortex being able to act as a "backup" of sorts for the cerebellum, to the extent of "neglecting" its ordinary tasks? It would make sense if the brain were to "consider" motor control to be at a higher priority than cognition, and was designed as such. (Sort of like in a pinch a kitchen might use a cook as a dishwasher.)

I'm using wishy-washy words here, sorry. I don't know how better to explain it.


It's unclear whether the circuitry of the cerebral cortex has the ability to implement the function of the cerebellum (whatever that may be, which we don't really understand). I think we're a long ways off from answering that question.


Well, given this development, I think it's fair to say there's something else that can implement the function of the cerebellum.


Something that can partially implement the function of the cerebellum.


True.


It might be sort of like switching from hardware rendering of video to software rendering.


Is there a way to check for this other than brain scans? My son is incredibly clumsy at 2.5 years old, and he fell off the bed when he was younger, perhaps 1 year old, and had a nasty bang to his forehead. He is a twin and his sister doesn't have as many falls as he does.


I wouldn't worry too much. I was very late learning to walk among other things and a doctor told my mom that I was "not exactly gonna be playing outfield for the Yankees". No Yankees yet but I ended up an all-state athlete, plus a two-sport college athlete. Most days I'm not totally stupid either.


Boys will be boys. I fell out of my bed at the age of 8! As far as I can tell, I am perfectly normal with excellent sense of balance now at the age of 30. As long as your son eats well, sleeps well, and is otherwise healthy, give it a few more years before you really start to worry.


Also, make sure to encourage movement (adults call it exercise; for kids it's play). I used to be really clumsy as a kid before I got into sports as a late teenager.


Consider occupational therapy as well. It was very very helpful for my daughter, who was "behind the curve" with her gross motor skills. It's not just about increasing physical ability, either. Lack of core strength can affect the ability to sit still and focus as the body gets fatigued more easily.


Something tells me that going to your family doctor and saying 'I'm concerned that my son is missing some of his brain' isn't going to get you a referral to radiology, but who knows. If he hit his forehead really hard I'd probably take him in in case of a concussion, and that would probably tell you in some detail.


I did take him at the time and they said he was fine but he was also so small he wasn't really coordinated or talking anyway.


Inner ear problems sound like a far more likely issue. But, I would just talk to your pediatrician vs. soliciting random advice on the web.


My 9 month old had his brain (and other organs) checked whilst in the womb during one of our ultrasound checks. BTW, it might be a country-specific thing. I'm in the UK.

Did your twins get ultrasound checks?


This is very unlikely to be the cause. If you're really worried about his clumsiness, I'd see a doctor -- there are lots of more common causes, many of them not even serious.


Or just "if there's something so wrong with your brain that the cerebellum is completely missing, odds are there is some hard-to-detect abnormalities in the rest of the brain".


Good point, hadn't considered that.


Ah, always look for confounders :)


Could it be that the cerebellum is merely a communication channel for other parts of the brain which are more focused on cognition? After injury one would not expect immediate re-routing of these channels.


A very interesting question!

For me, some of the way I think things through seems very physical to me. The sort of thing you do when you pick up lunch table objects and say, "Ok, this salt shaker is the web server, this fork is the firewall, and..." Except I'm more likely to do it with just gestures, or just thinking about placing things in an imaginary space.

I wonder how much of that is really enlisting the cerebellum versus it being an output-only thing. Perhaps nominally output-only devices do more. As in rubber-duck debugging or Flannery O'Connor's line, "I write because I don't know what I think until I read what I say."


I'm not very invested in this field; however, this "processing power" metaphor sounds like those ideas where "people only use 10% of their brain". The brain doesn't work like that; it's not a general purpose computer. It's made up of several independent mechanisms that have little to do with each other and that have emerged at different points in evolution, driven by different forces. So their creation is rather chaotic.


Generally yes, but neuroplasticity does in fact exist: http://en.wikipedia.org/wiki/Neuroplasticity


I think that sort of misses the point of the metaphor as used here. If one big part of the brain is missing, and thus the mechanisms it enables are unavailable, some other part of the brain (at least in this case) makes up for it by providing an approximation of that mechanism. That other part of the brain is thus not able to perform whatever its normal function would be; even if it's doing double-duty, you expect that there's some capacity limit. When and if that deprioritized function involves reasoning, then you'd expect a drop in reasoning ability. Thus the "processing power" metaphor is simply saying that if a neuron is busy doing something, it can't also be busy doing other things.


There is a very interesting book called 'The Brain That Heals Itself' that argues quite strongly against that.


Are you saying our brains follow the Law of Demeter?


The article mentions:

"Problems in the cerebellum can lead to severe mental impairment, movement disorders, epilepsy [...]"

However, this seems potentially much worse than the symptoms this woman without a cerebellum is experiencing. Is it theoretically possible for people with a damaged but otherwise intact cerebellum to be identified at birth so that they can have the cerebellum removed completely? My thinking is that if you do it early enough, plasticity might allow other parts of their brain to take over and do the job better than the damaged cerebellum would be able to.

This is probably one of the reasons why I am not allowed to perform surgery without a license...


> This is probably one of the reasons why I am not allowed to perform surgery without a license...

Indeed :) A damaged cerebellum at birth might still be useful because it can fulfill some, if not all, of the tasks it is supposed to. We don't know enough about this yet to really make any kind of call about it. For all we know, many people are born with malformed cerebellums but never experience any problems, so we just don't know about them.


I did see a documentary about a young girl with a brain problem. I can't remember exactly (a severe form of epilepsy?), but they wanted to do something pretty drastic to her brain, and do it within the first year, as it still had the neuroplasticity to recover at that stage.


> This condition is known as cerebellar agenesis.

To tease you a little bit, I read this as

> This condition is known as being born without a cerebellum.

I don't know why we need a Latin name for everything!


It certainly makes it easier to find in a book or via search. You have one word for what it's called, instead of having some under "Born without a cerebellum," some as "Missing cerebellum," others in "No cerebellum," "Undeveloped cerebellum," etc.

Plus it's more convenient to say. Not a big deal for us, but to people who deal with crazy medical conditions all day long, describing each one in natural language would be imprecise and time consuming.


Though obviously it's not the reason it was adopted, it is kind of neat that using a dead language for scientific terms disambiguates them cleanly for the purposes of searching.

Any live language would have accidental matches (even quoted) where it's just the obvious thing to say, a la "born without a cerebellum".


To expand on that... it also solves the problem of technical terms evolving new nomenclature (or new meanings for old nomenclature!) over time.


Not a dead language... Greek is still alive, and indeed it sounds like "born without a cerebellum" (but to be fair, the syntax reads like a medical term).


Latin is a dead language; there has not been a native speaker for a very long time [1].

[1]:http://en.wikipedia.org/wiki/Language_death


It is still in constant use as the official language of the Vatican.

They keep having to invent new Latin words and phrases so they can discuss things like hotpants, which are brevíssimae bracae femíneae apparently.

Have a look here - http://usvsth3m.com/post/95991771713/hotpants-flirt-and-othe...

The cashpoint with Latin in Comic Sans is awesome.

Edit: translating from the Latin, Comic Sans is a pretty accurate font name.


Having a dead language map imprecisely to a word (but nobody knows that, because no one really speaks the dead language) isn't any better. It also creates barriers and wastes time in learning the practice. The average individual has to deal with that folly even more when it comes to law.


It's better to have a shorter and noun form for referencing being born without a cerebellum. It's especially convenient for researchers who write about it and have to refer to being born without a cerebellum multiple times in a single paragraph.

It's better to have a shorter and noun form for referencing cerebellar agenesis. It's especially convenient for researchers who write about it and have to refer to cerebellar agenesis multiple times in a single paragraph.


Well, I'd read it as "this condition is known as failing to develop a cerebellum". Even if everyone was born without a cerebellum and you were expected to grow yours by the age of 10, not growing one would be sensibly termed "agenesis".

Also, "agenesis" is greek, like most medical terminology ;)


Aha, thanks :)


Because it's much more convenient to have one single name for it in all of Earth's languages.


Perhaps you're thinking of taxonomy? English names for diseases don't set their names in other languages.


Does "cerebellar agenesis" sound English to you? Almost all things in medicine have almost universal names derived from Greek and/or Latin. Of course, until a few hundred years ago it was because those were the languages of science; these days the terms still fill the same purpose as they did back then - providing a common vocabulary for people from diverse origins.


Yes, "cerebellar agenesis" is an English term. Having Latin and Greek etymologies doesn't make the words Latin or Greek; "cerebellar" isn't even a legal Latin adjectival form.

Here are the titles of the wikipedia article "Cerebellum" in some other languages:

    Lillehjerne (Danish)
    Kleinhirn (German)
    Parengephalida (Greek - Παρεγκεφαλίδα if you can read Greek)
    Cerebelo (Spanish)
    Cervelet (French)
    Otak kecil (Indonesian)
    Smadzenites (Latvian)
    Kisagy (Hungarian)
    Beyincik (Turkish)
    Xiaonao (Chinese - 小脑)
Nobody's copying the English word (well, Tagalog and Malaysian are) -- they're all using their own native terms for "small brain".


Yes, the cerebellum is known as 'lillehjernen' in Danish, and that word is the only word most people know for it. Nevertheless, doctors learn the word cerebellum so they can read what doctors in other countries write.

They also use the word amongst themselves. Googling for 'cerebellum ugeskrift for læger' gives plenty of hits. Likewise for 'cerebellar ugeskrift for læger'.

They might write 'agenese' instead of 'agenesis', though.


I don't know why we need an English name for every keyword in programming languages!


If I had to guess, I'd say it's because it's slightly faster, still makes sense if you have learned the right bits of Latin, and because it makes you feel smart.


More usefully it means doctors fluent in different languages have a very good chance of accurately describing conditions to each other.


Same with Greek! Can't wait for that "not-quite-finished-version-of-a-software-product" to be released!


> It's interesting to note that you don't need a cerebellum to move or think, but the loss of it impairs both. Contrast this to damage to your motor cortex which can result in paralysis.

This strikes me as akin to saying that you don't need a GPU to perform graphical processing, but not having it impairs your graphical processing capability. The brain wires itself throughout a human's development to take advantage of the specialization of its components and their parallelism.

Damaging an adult's cerebellum once those connections are in place would be the equivalent of removing a GPU before trying to play a game that has been developed to rely on it.

At least that's my simplified analogy drawn from my admittedly imperfect understanding of how the human brain and computers work.


Trying to reason about the brain like it's a computer is a very common thing for computer scientists to do. Unfortunately, it's also mostly fallacious (since for a whole host of reasons, neurons don't work like integrated circuits), and simplistic analogies like cerebellum = GPU are probably not going to give much insight into what's really going on. I'm not saying "the brain is beyond human comprehension" or anything silly like that, just that you have to approach it from a biochemical context.


Sure, but the brain is a Turing machine, albeit with hardware acceleration of key functions (e.g., edge detection in the visual cortex). Mathematics, not chemistry, structures the problems it solves, if not the algorithms and heuristics involved. It's utterly fascinating to see how nature tackles the same problems that we solve independently using completely different tools.


I don't think anyone is anywhere close to "understanding perfectly how the brain works".


What fills up that space? Liquid? Scar tissue?


Cerebrospinal fluid in this case and most; it's the general purpose volume-filler of the central nervous system.


This also reminds me of Hemispherectomy[0] where an entire half of the brain is surgically removed in extreme cases to prevent seizures. And amazingly, especially if you do this on younger children:

"Studies have found no significant long-term effects on memory, personality, or humor,[4] and minimal changes in cognitive function overall."

If you don't _really_ need half of the brain and you don't _really_ need the cerebellum, I wonder how little (and what part) of the brain we actually do _really_ need. And then there are so many people living just fine with lesions in so many parts of the brain.

It's just amazing. Imagine going into our code bases and tearing out entire classes or modules; that wouldn't go down well.

[0] http://en.wikipedia.org/wiki/Hemispherectomy


> It's just amazing. Imagine going into our code bases and tearing out entire classes or modules; that wouldn't go down well.

It's probably more like removing half the CPU cores and your clever code being ok with this.


Thinking of the brain as a computing platform is never a good metaphor. Neural networks do not have a rigid delineation between instruction-storage, data-storage, and CPU. Every neuron wears all three hats and intertwingles those concepts.


Von Neumann architecture isn't the only form of computation that exists.


Fair enough. I obviously meant a typical desktop/laptop/smartphone/server/whatever.


True. If it were, what humans do with their biological CPUs would be impossible. Too bad we don't understand our own brains.


In our defense, they didn't grow to be understood but to serve their purpose. So it's akin to a code base billions of years in the making, without any good documentation.


Think you meant 'intertwines'. Thanks for the laugh :D


Actually, it's more like ripping out the GPU and the CPU still being able to handle video decently.


Which is? Hemispherectomy is close to taking out half the cores. Missing cerebellum is close to what you described.


In fact, it's almost like a typical state-of-the-art computer is a limited (perhaps even poor) metaphor for the brain.


I would imagine that it would be like cutting a hologram in half or what you described.


Where does the code live, though? :)


As others have mentioned, I think this is closer to treating them as redundant CPU cores.

Ripping out code would be like... radiation. Mucking with DNA. And that can be really adverse, just like ripping out/changing code.


Radiation would probably be analogous to burning out random transistors, and you could target specific areas if you wanted to.


If you enjoy thinking about this, you might like the book "Blindsight," a sci-fi novel written by a biologist. The main character had a hemispherectomy as a child, and all the major characters are far from neurotypical. The book is creepy as hell, but incredibly interesting.


Maybe the redundancy offers an evolutionary advantage?


Severe brain injuries are all too common in modern times and were probably more common in prehistoric times. X-rays of (mostly) men with foreign objects through their brains turn up surprisingly often. I suppose being flexible about which part of the brain does what would make the difference between a survivable and recoverable injury and a debilitating injury, probably leading to death.

Edit: Steven Pinker, in "The Better Angels of our Nature: Why Violence Has Declined", makes it out to be 15% average death by violence, so, pretty common. Explaining his graphs, he says, “The topmost cluster shows the rate of violent death for skeletons dug out of archaeological sites.” “The death rates range from 0 to 60 percent, with an average of 15 percent.”


WTF?! Why wouldn't this have the same side effects as a stroke? Blindness, deafness and paralysis?


Age is one of the big issues. The brain's fairly good at routing around problems when you're young and the brain's flexible; it's why young polio patients were able to learn to walk again even after spending months in an iron lung. When you get older, because there's a lot more well-established connections, the brain becomes less capable of adapting to sudden loss of functionality.


The answer to that seems to be "yes, it would": https://en.wikipedia.org/wiki/Paul_Bach-y-Rita#Research_into...


> I wonder how little (and what part) of the brain we actually do _really_ need.

Maybe we need the software. The cloud provider sometimes has more available instances, other times fewer.


Not that crazy to say "You only use a certain % of your brain!".

So, naive question: could it be correct to think that if half the brain is "enough" to have a full life, then the other half is utilized (in normal brains) but wasted on unnecessary work? I.e., could it be that the brain is truly under-utilized, under-performing all the time? Like, is it lazy?


Brains are probably lazy for the same reason animals in general are lazy - to save energy. The brain accounts for something like 25% of the body's energy use. I'm guessing it's similar to CPUs where most of the time they are under-utilised but can fire up when needed.


You're only using a certain percentage at a time, but all the different sections of your brain do different things. And they are lazy when they can get away with it; read Thinking, Fast and Slow for some more details, but even when you're thinking hard you're not using your whole brain. It's not at all obvious to me that the glucose supply, oxygen supply, or cooling of the brain are sufficient for you to run it at 100% without dying.


I love how these kinds of discoveries challenge, if not outright shatter, our current scientific understanding of human beings.

Again, I recommend the movie Gattaca: http://www.imdb.com/title/tt0119177/

Don't let medical science try to dictate your potential based on gender, race or anything about your DNA. They're only right until they find out they're wrong.


"Don't let mainstream misinterpretations and gross oversimplifications of medical science try to dictate your potential based on gender, race or anything about your DNA. They're only right until they find out they're wrong."

FTFY.

Most of the issues you're alluding to are not because medical science says something is impossible but because the media hears "X group has slightly elevated probability of Y" and reports it as "X group has Y".


The original comment was articulated just fine. The point is that science of all varieties has been proven wrong over, and over, and over, and over, and over. Science is a constant reevaluation of things we "know".

Thing is, we don't know what we don't know.


http://chem.tufts.edu/answersinscience/relativityofwrong.htm

>My answer to him was, "John, when people thought the earth was flat, they were wrong. When people thought the earth was spherical, they were wrong. But if you think that thinking the earth is spherical is just as wrong as thinking the earth is flat, then your view is wronger than both of them put together."

>The basic trouble, you see, is that people think that "right" and "wrong" are absolute; that everything that isn't perfectly and completely right is totally and equally wrong.


Exactly. There have already been enough examples to show that intelligence, creativity and potential in the brain occur in any race or gender, and often defy genetics.

This is one of the reasons I want to find more ways to get technology to poor cities and third world countries. Not just for their potential as programmers or IT people, but to give them the same access to information that we have right now.

The person who can cure cancer could be sitting in the poorest slums of America, a village in Africa or Asia. The sooner people realize that the more inclined they may be to help.

Sorry to divert so much it's just that your comment rings true in so many ways.


Doesn't the guy in Gattaca probably have a heart condition? Science can definitely tell you that you're in a group that will probably die upon doing X.

Gender and race, even more so, are lousy predictors for most things, but they're not useless.


All science can tell you is that if x people with your condition do y, n of them will likely die. Neither you nor the statisticians can say with certainty whether you're in group n or x minus n.


But sometimes x=n. And sometimes x is pretty close to n and you should listen to the doctor unless that alternative is truly unacceptable to you.


I love how you compare "missing a huge chunk of brain" to "having a certain race or gender."


Creativity and intelligence don't have a strong correlation.


Wow, could you imagine building a computer so resilient that it still works after an equivalently important part disappeared‽


Some high end servers have hot-swappable CPUs, RAM, etc. (using mirroring in the case of RAM [1]). Of course you need to keep enough installed at any one time to keep the system running. Couple it with a suitable dual-ported storage array, and almost any system component can be replaced without taking down the server.

The main reason it's not more common is that few people are willing to pay for it vs. getting redundancy via multiple cheaper servers, especially once you've got enough load that you need to scale out anyway.

[1] http://www.redbooks.ibm.com/abstracts/tips0259.html


Not really a computer but some distributed systems already work that way. I've read somewhere that Netflix (might have been Amazon?) is designed in such a way that it's capable of satisfying certain requirements even when services are down.


Are we talking about a "fallback db servers take over when the main ones drop", or "static assets server morphs into temporary db server if the main one drops"? Because there is a massive difference.


With virtualization/containerization that's not an unreasonable scenario. One of VMware's selling points is that a sufficiently expensive vSphere cloud can expand and contract over its available hardware depending on load/time-of-day. With iLO integration, it can even power off the hardware it's not using and power it on when needed. If one server goes down, its VMs can be redistributed over the remaining hypervisors, leading to potential resource contention but maintaining availability.

So, sort of.


There are crazy cases of plasticity like that; this is more a case of "massive NoSQL caching infrastructure is down, everything else goes into overdrive to still mostly function correctly".


So when your legs stop working and you start crawling around the floor with your hands pulling the rest of your body, is that plasticity too?

It's easy to continue working at increased costs when you don't have a cache. That's... that's what a cache is for; it's not critical to the infrastructure, it's there to reduce the costs.

Not to rain on any parade of course; I'm just pointing out I'd like to see more cases of actual plasticity, like an email server that puts all jobs on hold while it temporarily takes over for a database server that just stopped responding.
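
To make the "cache is an optimization, not a dependency" point concrete, here's a minimal cache-aside sketch in Python. The names and the dict-backed "database" are made up for illustration, not any particular stack; the point is that if the cache layer dies, reads still succeed, just at a higher cost:

    # Hypothetical sketch: cache-aside reads with the cache treated as optional.
    # Dicts stand in for a real cache cluster and a real database.
    import time

    DATABASE = {"user:1": "alice", "user:2": "bob"}   # slow source of truth
    CACHE = {}                                        # fast, but not required
    cache_available = True                            # flip to simulate an outage

    def slow_db_read(key):
        time.sleep(0.05)              # pretend this is an expensive query
        return DATABASE[key]

    def get(key):
        if cache_available and key in CACHE:
            return CACHE[key]         # cheap hit
        value = slow_db_read(key)     # fallback path: always correct
        if cache_available:
            CACHE[key] = value        # repopulate for the next reader
        return value

    print(get("user:1"))              # miss now, hits on later calls
    cache_available = False           # "caching infrastructure is down"
    print(get("user:1"))              # still correct, just slower every time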


I'm not saying servers can be meaningfully plastic; I'm saying that the role of the cerebellum is to assist functions that exist elsewhere, so missing it is not the same as missing a primary functional unit.

The motor cortex is still doing its original job; it just has to work harder.


Caching can be critical if the volume is high enough. There are physical limitations of computing appliances that make sites like Google or Facebook actually impossible to run without extensive caching and indexing layers, not just prohibitively expensive.


I thought that was obvious. OP was making a point about things running without cache at the cost of increased load... I'm sure Google Search couldn't reliably achieve that right now, but we were not talking about Google Search.


Right. I figured I wasn’t telling you anything new, but thought it worth making explicit for the benefit of others.


If one designed their down-scaling properly, the caching layer would be shut off as the system contracted, since it isn't part of solving the domain problem.



One of the features of 'amorphous computing' is the use of a large number of redundant subunits, giving resilience against failure: https://en.wikipedia.org/wiki/Amorphous_computing

However, such an architecture arguably means that any single part is not particularly important.


The brain doesn't seem to be like a single CPU - more like a cluster of millions of CPUs. It wouldn't be surprising if the system could work very well even after you disconnected a few thousand of those CPUs here and there.

(NOTE: disconnected, not destroyed; the woman's cerebellum didn't develop; things would be much different if you tried to surgically remove it)


SSDs work like that.


Riak is just such a storage system.


Not really; Riak is composed entirely of nodes that are more or less identical. Losing your cerebellum would be like losing a component on every one of your nodes, because you only have one cerebellum.


The brain has many levels of structure.

Losing the cerebellum would be like losing a DC for which you have no backup, which then would require you to attempt to duplicate the original functionality in other DCs.


The robustness of evolved systems is just crazy.


If only we had this in computer systems, which are much simpler than a brain, right?


Yeah, try booting up with half a CPU.


It can be done! You can disable cores!


Well, not everyone knows how to do that. And perhaps similarly, not every brain knows how to switch off the cerebellum.


It's all the more humbling when you realize how little power is consumed by all that processing and redundancy.

Compare a favorite computer robustness technique: triple-modular redundancy (TMR).
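
For anyone unfamiliar with TMR, here's a toy sketch of the idea in Python (the fault injection is hypothetical, not any real TMR framework): three redundant units compute the same thing and a majority voter masks a single bad output.

    # Toy triple-modular redundancy (TMR): three redundant units compute the
    # same function and a majority voter masks one faulty unit.
    from collections import Counter

    def vote(outputs):
        """Return the majority value among the redundant outputs."""
        value, count = Counter(outputs).most_common(1)[0]
        if count < 2:
            raise RuntimeError("no majority - more than one unit failed")
        return value

    def tmr(fn, x, faulty_units=()):
        """Run fn(x) on three 'units'; indices in faulty_units give wrong answers."""
        outputs = []
        for unit in range(3):
            result = fn(x)
            if unit in faulty_units:
                result += 1            # simulate a corrupted result
            outputs.append(result)
        return vote(outputs)

    print(tmr(lambda x: x * x, 7))                    # 49, all units healthy
    print(tmr(lambda x: x * x, 7, faulty_units={1}))  # still 49, bad unit out-voted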


This is fascinating because the cerebellum is part of the "reptilian brain", one of the three sections of the mammalian triune brain, alongside the limbic system and the neocortex.

The reptilian brain is responsible for basic motor functions, heart rate, temperature regulation, and balance, and evolutionarily seems to be the part of the brain that is most connected to that of ancient fish and reptiles, as the name implies.

A person who is missing a portion of this rigid subsystem should still be able to think, process new information, and remember it, but might suffer from imbalance and other basic health issues as in fact this woman does. Yet, she can do lots of stuff. Apparently the surviving portions of her reptilian brain are able to compensate for the loss of the cerebellum.

It sheds a whole new light on a phrase like "my cold reptilian hindbrain tells me to ruthlessly proceed". We think of ourselves having this sort of emotionless hindbrain that is moderated by the more modern brain centers for sympathy, empathy, emotion, and higher reasoning. But what if in fact there is no such thing as a ruthless, primitive hindbrain and we are all completely in charge of our behavior, ethically and emotionally speaking?

http://thebrain.mcgill.ca/flash/d/d_05/d_05_cr/d_05_cr_her/d...


Maybe it isn't that the cerebellum is "gone"; maybe the surrounding tissue that bordered the cerebellum has sort of involuntarily taken over the same function. Those nerve endings that would terminate in the cerebellum are probably still intact?


>"[...] the woman joins an elite club of just nine people who are known to have lived without their entire cerebellum. A detailed description of how the disorder affects a living adult is almost non-existent, say doctors from the Chinese hospital, because most people with the condition die at a young age and the problem is only discovered on autopsy."

Is this woman the only one of the nine to have lived this long? Incredible given how critical the cerebellum is.


Well, autopsies are rare, so the number of survivors is unknown.


I would think that people who die very young, like kids, would get autopsied all the time, simply because such deaths are rare and there would be suspicion of foul play or other issues.


This is fascinating considering that the cerebellum contains more neurons than the rest of the brain(!) (source: http://neuroscience.uth.tmc.edu/s3/chapter05.html). I wonder what other problems she experiences (the article only says she started speaking and walking at age 6-7).


The cerebellum is a much more primitive neural network, though. It's feed-forward only - it actually looks like a tree in MRIs - so it's a lot harder to encode anything. Normal neural networks ("connectomes") in the brain are massively interconnected in all directions, which allows for parallel computation; with the tree-like, feed-forward-only architecture of the cerebellum, neurons from different "branches" don't communicate.
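
Purely to illustrate the architectural distinction being described (this is not a model of the cerebellum, and the sizes and weights below are arbitrary): in a feed-forward chain, activity passes through each layer once and never comes back; in a recurrent network, the state feeds back on itself, so every unit can influence every other unit over time.

    # Toy contrast between feed-forward and recurrent wiring (numpy, arbitrary sizes).
    import numpy as np

    rng = np.random.default_rng(0)

    def feed_forward(x, layers):
        """Signals flow one way: each layer's output feeds only the next layer."""
        for W in layers:
            x = np.tanh(W @ x)
        return x

    def recurrent(x, W_rec, steps=10):
        """State feeds back on itself, so all units interact over time."""
        h = np.zeros(W_rec.shape[0])
        for _ in range(steps):
            h = np.tanh(W_rec @ h + x)
        return h

    x = rng.normal(size=8)
    layers = [0.5 * rng.normal(size=(8, 8)) for _ in range(3)]
    print(feed_forward(x, layers))
    print(recurrent(x, 0.5 * rng.normal(size=(8, 8))))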


Interesting. How was it determined that they're feed-forward-only? Can signals not travel back down the "tree"? (Note, I'm not even an amateur neurologist).



I hope she knows all the lyrics to The Ramones' "Teenage Lobotomy".

"Then I guess I'll have to tell 'em / That I've got no cerebellum."

https://www.youtube.com/watch?v=6ssoBUb2cJk


I was just going to post this! First thing I thought of.


The brain knows how to survive, that's for sure... http://www.cnn.com/2009/HEALTH/10/12/woman.brain/index.html?...


The upper image, which I assume is her MRI scan, is interesting. It isn't just her cerebellum missing - her brain stem looks odd too. Where is the pons? Where are the cranial nerves attaching? Need more images! Edit: On closer reading, this article isn't great. "Doctors did a CAT scan and immediately identified the source of the problem – her entire cerebellum was missing (see scan, below left)". Assuming it isn't some sort of problem related to me viewing the article on a phone, that image is an MR. No CATs involved.


? The top image is most definitely from a CAT scan. If it was MR, there would be way more artifact around the eye/nose/throat.


No it isn't. Look at the contrast and the signal from fat and water; look at the tongue. That is a T1 sag. Edit: The article, or the images anyway, appear on multiple sites labeled as MRI. Pretty sure the fuzzy writing says T1 FLAIR on this site: http://io9.com/doctors-discover-a-woman-with-no-cerebellum-1... Edit 2: Here is another link and the ax T2 is seen at the bottom of the images. Edit 3: The actual journal article. New Scientist screwed up and didn't read it properly. The patient had a CT then an MRI, as per the image label. New Scientist only kept the MRI image, as brain MR is way nicer than CT, but kept the journal text saying CT. http://brain.oxfordjournals.org/content/early/2014/08/22/bra...


> Her doctors describe these effects as "less than would be expected"

Understatement of the century? I wonder, then, what parts of the brain (if any) truly are essential for conscious thought?


The main role of the cerebellum is motor function. According to Wikipedia, it's still unclear if it plays a part in some cognitive functions such as attention and language, and in regulating fear and pleasure responses.


Define conscious thought.


Remember things like this when people talk about biological differences between men's and women's brains. Studies sometimes find tiny differences, and then some people claim that's why 19 out of 20 board members are men. It's not bias, it's science!

But if people can live missing massive chunks of their brain, is it really believable that tiny differences can cause such massive societal outcomes?


> But if people can live missing massive chunks of their brain, is it really believable that tiny differences can cause such massive societal outcomes?

Congratulations, you are today's demonstration of 'proving too much': you have also just proven that things like lesions and scars cannot affect cognition, warp personalities, create agnosias and aphasias, and result in bizarre conditions like those Oliver Sacks has so memorably documented, because lesions're so tiny and such small parts of the brain - 'if people can live missing massive chunks of their brain, is it really believable that tiny differences can cause such massive societal outcomes?'


The OP was not talking about the magnitude of the differences, but the nature of them.

Do you believe that neuroplasticity remains constant throughout a person's lifetime? Unless you believe that, your statement is incoherent.

Lesions and scars are acute changes in the brain that happen after birth. That's different from starting out missing a massive chunk of your brain.

OP: "Look at that person who was born with no arms due to a birth defect. They're able to live a fairly normal, happy life. Maybe arms aren't essential to human happiness."

You: "So you're saying if you got your hand mutilated in a garbage disposal, that wouldn't make you unhappy? A hand is much less than a whole arm."

(I'll leave what this makes you a demonstration of as an exercise for the reader.)


> The OP was not talking about the magnitude of the differences, but the nature of them.

And what, pray tell, are the 'nature' of disorders like Cotard's syndrome?

> OP: "Look at that person who was born with no arms due to a birth defect. They're able to live a fairly normal, happy life. Maybe arms aren't essential to human happiness."

Try going back and reading what was said. Your paraphrase is incorrect. Here's a correct paraphrase:

"OP: look at that person born with no legs. They're able to lead a somewhat normal life. This shows that anyone claiming that there might be differences between the fingered and the fingerless such as in fine motor control is a moron - because a leg is so much larger than a finger!"

> I'll leave what this makes you a demonstration of as an exercise for the reader.

Well, it demonstrates you can't paraphrase or follow the logical structure of an argument. I'm not sure what I'm demonstrating; hopefully something good.


What a large chip you must have on your shoulder. This is not even in the same ballpark as discussing gender differences in how brains function. This article is saying that even with certain portions of the brain missing, other portions can compensate. What you are talking about is the different way (if any) that things are processed between males and females.

But good job on sticking it to THE MAN. I'm sure that your social media activism is revolutionizing the world.


Please don't make posts like this on HN. Your sarcasm made the post overwhelmingly negative, and contributed nothing to the conversation. This would have been a much better post had you removed the first sentence and the last two.


Thanks for the redlines on my writing. I'll make sure to incorporate these nifty tips into future correspondence.

However, I think you can agree that you are being a bit extreme by saying that my post "contributed nothing to the conversation". Just because you happen to dislike what I said or how I made the point doesn't invalidate it.

I'm sorry that you find my post distasteful, but given the initial post I think that it was fully appropriate.


You should probably re-read my post. Your negativity contributed nothing to the conversation; your post would have been a great post without it. That was the point my GP post was trying to make.

This is probably why you got downvotes here. HN doesn't tolerate this negativity. Please keep it where it belongs, on Reddit or 4chan. Thanks.


> is it really believable that tiny differences can cause such massive societal outcomes?

Yes it is believable, as evidenced by numerous people that work on, get funding for, and continue to do research on a large slew of topics that contain "tiny differences", such as <insert any sort of engineering>.

EDIT: (Hint: "tiny differences" is fundamental to the notion of calculus.)


To add to other excellent comments refuting this line of reasoning - brains are not rocks, they are very complex systems, and the more complex something gets the more delicate it is, i.e. even a tiny change has a chance to alter the behavior of the system in significant ways.

As an illustration, think of your computer. If you open it up and remove one RAM die, it will most likely run fine, albeit slower. Were you to introduce a "tiny difference", say, swap two pins on a die, you risk getting everything from the machine not booting up, to crashing constantly, to running fine but spewing out nonsense and corrupting data every now and then. Were you to introduce a slightly bigger change - say, saw the die in half - you'd likely fry the whole machine.


It's so amusing how intelligent people can come to such ignorant conclusions. I have no doubt I do it myself from time to time.


You took it in the exact opposite direction of logical.

She's missing a huge portion of her brain and managed fairly well. Wouldn't that lead you to believe that it would take quite a bit of structural difference to cause noticeable societal outcomes?


Indeed. You'd think that just a normal life of experiencing variance between humans would dispel that notion, but people are irrationally wedded to their prejudices.


I have always found it fascinating that the cerebellum has more neurons than the rest of the brain. What the heck is going on in there?



A friend of mine lost the left side of his brain ... And he's all right now!


I'm being sincere here. I am surprised and excited to see interest in this. Someone very close to me has no discernible cerebellum and no one we've seen or known has ever considered it medically interesting.


I felt the same way when I saw this article. My cousin's daughter was recently diagnosed with cerebellar hypoplasia (a missing or smaller than normal cerebellum), and it's been pretty terrifying for them. They live in a rural area where no one has ever dealt with the problem, and they're spending insane amounts of time and money on therapy (in a city an hour away) to help her learn basic skills. She celebrated her 3rd birthday this week and she can't walk, talk, or even stand unassisted.

On one hand, this article thrills me to think that at some point, her daughter might lead a relatively normal life. It's heartbreaking to see the way she suffers right now - like there's more going on in her head than she can tell us, and you can see the frustration on her face when she tries to do things or get her point across. On the other hand, I'm hesitant to send the article to my cousin because I know everything related to her daughter's problem is deeply depressing to her as she's dealing with a frustrated child who makes very little progress from day to day.


"Don't mind the gap . . ."

Perhaps a more tasteful lead-in was in order.


I'm really curious, how would this woman react to alcohol?


Talk about living on "Hard" mode.


That is nothing... I know of several politicians that have no brain at all, and no one has noticed yet. jk,jk


Is the picture of the CT scan real or just an illustration? Because I would assume that even if the woman doesn't have a cerebellum the brain should expand to occupy that space.


The article indicates it's the actual scan and says the space "was filled with cerebrospinal fluid, which cushions the brain and provides defence against disease."


Don't read the article provided too carefully. The image isn't a CT and has been dumbed down from the source. The image is an MR not a CT, and the image below is some sort of made up composite image. Original source with proper images: http://brain.oxfordjournals.org/content/early/2014/08/22/bra...


Please read the article before asking questions:

> The space where it should be was empty of tissue. Instead it was filled with cerebrospinal fluid, which cushions the brain and provides defence against disease.


Sorry, you are right. I only skimmed through the article. It probably took me longer to write the comment than it would have taken me to read the whole article.


It's strange how materialists of all sorts (just look at the comments) take it for granted that, no matter how scientifically absurd, these facts cannot be used as evidence for non-materialistic explanations of life and the world. Everything will be explained by materialistic science, and that is settled.


Okay, but consider the alternative -- that the existence of a non-materialistic reality is assumed without empirical (i.e. material) evidence. If that were to be accepted as a given, then we might build a pseudoscience on that foundation, one that would burden everyone with assumptions about empirically untestable, non-objective properties of reality. We would have created psychology.


Empirical evidence isn't material evidence.


That's exactly what it is.

http://en.wikipedia.org/wiki/Empirical_evidence

Quote: "Empirical evidence (also empirical data, sense experience, empirical knowledge, or the a posteriori) is a source of knowledge acquired by means of observation or experimentation.[1] The term comes from the Greek word for experience, Εμπειρία (empeiría)."

The definition goes on to contrast empirical evidence with reasoning and other ways of approaching analysis -- all the non-materialist approaches.


Read the quote again, you're begging the question.

Empirical evidence is experiential by definition - it can be material evidence if it relates to claims about matter, but it can also be evidence about other domains.

Ex. If God exists, then religious experience is empirical evidence of this. It is probably not material evidence, however.


> Read the quote again, you're begging the question.

Yes, perhaps I am to some extent.

> Empirical evidence is experiential by definition

I would have said it relies on tangible evidence, material evidence. Its status as an experience by an observer, if present, is secondary. I say this because evidence can be gathered without anyone experiencing it directly. Consider Curiosity on Mars. If we read a mass spectrometer's results radioed back to Earth and draw conclusions on that basis, it's a stretch to assert that we've experienced the evidence. Its interpretation certainly involves an observer, but not the evidence gathering itself -- that is often automated, even here on earth.

> Ex. If God exists, then religious experience is empirical evidence of this.

No, I think a spiritual experience contradicts the direct, physical sense of empirical. I usually regard empirical evidence as that kind of physical evidence that forces different, similarly equipped observers into agreement on its meaning.

Example -- when the CMB was confirmed in the mid-1960s, it killed off the last hope for a steady-state universe. Until then the Big Bang's critics were theorizing that the universe created new matter between the galaxies, so even though the universe was clearly expanding, this didn't mean it had a beginning or an end. The CMB detection, which wasn't really anyone's direct experience, falsified this alternative to the Big Bang. And it's objective in the sense that anyone can set up and detect the same evidence using indirect means -- not by direct experience.


I don't know what you mean by "spiritual experience" but I think the previous comment about "religious experience" was referring to tangible, observable phenomena, like a man rising from the dead. This is empirical evidence (i.e. based on experience) but may not necessarily be measured in SI units. Data like this may lead to defensible logical conclusions that (because they are immaterial) cannot be tested experimentally.


Where would that leave something like free will?

I think most people will agree with me that we have the sensation of control over our own actions and yet it's not something that's directly testable per se.


Because the notion of free will isn't a materially testable question, it's not in the realm of science but philosophy.


If what you mean by materialistic is non-supernatural (or just natural) then that much is quite settled, yes.


I don't see any of "these facts" as "scientifically absurd." Which are absurd in your opinion?


Science describes the world as it is, no matter how strange, whether or not it's 'materialistic'.


Science describes the material portion of the world, and ignores the rest, or says: "the rest will be discovered as having a material foundation in the future and by that time we will describe it, we will not bother to describe it now".

This is quite a big assumption, and has nothing to do with describing "the world as it is".


That assumption does not exist. Science will measure and model things without material foundation just as readily. Unless 'material' means 'able to be measured'. Science doesn't really putter around with things you can't measure.


> Science describes the material portion of the world, and ignores the rest ...

Let's say that science doesn't try to analyze those parts of the world not accessible to empirical observation. That could be described as modesty or reticence.

> This is a quite big assumption, and has none to do with describing "the world as it is".

Those who do try to describe the non-material world have a pretty terrible record for reliable results.


"ignores the rest"

by "rest", I assume you are a scientologist talking about theatans.

"This is a quite big assumption, and has none to do with describing "the world as it is"."

I disagree; my favorite thing about the scientific method is that it does its best to drop assumptions. I am not a scientist, but I resort to this kind of thinking when I have to debug code. Suggest a better way and I will try it out.



