Author Topic: The Brain as an Interface to the Body
Dagonee
Member
Member # 5818

 - posted      Profile for Dagonee           Edit/Delete Post 
I'm confused here. The title clearly suggests some supernatural attributes to the theory (the brain being part of the body, describing it as an interface to the body implies an interface to something outside the body; such things are generally described as the consciousness, soul, or spirit).

So people come into a thread that has as its starting supposition some supernaturalist element, and complain because religion comes into it? I don't get it.

Dagonee

Posts: 26071 | Registered: Oct 2003  |  IP: Logged | Report this post to a Moderator
pooka
Member
Member # 5003

 - posted      Profile for pooka   Email pooka         Edit/Delete Post 
I don't think anyone has argued that science has proved that there is no spirit, just that some things that have been attributed to the spirit can be explained by neurology. We are trying to get a groundwork of neurology laid, and then see where we are. And I'm always interested in brain stuff.
Posts: 11017 | Registered: Apr 2003  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
I'm not sure it's quite right to call consciousness or a soul supernatural - although it is certainly something dealt with by religion a great deal.
Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
Destineer
Member
Member # 821

 - posted      Profile for Destineer           Edit/Delete Post 
pooka, Descartes argued (in the Treatise on Man, I think) that it was impossible for the sounds of human speech to be made by anything without a soul. Obviously we now have computers, not to mention tape-recorders, that can make speech sounds.
Posts: 4600 | Registered: Mar 2000  |  IP: Logged | Report this post to a Moderator
Destineer
Member
Member # 821

 - posted      Profile for Destineer           Edit/Delete Post 
quote:
Science might be able to explain the nonphysical some day somehow, for instance, which would be entirely consistent with dualism.

This gets into the difficult issue of defining 'physical.' The definition I prefer is, X is physical iff X is one of the objects or properties appearing in the complete, accurate scientific theory of the world (which is obviously not yet discovered). So by my preferred definition, what you're saying doesn't make sense.
Posts: 4600 | Registered: Mar 2000  |  IP: Logged | Report this post to a Moderator
Dagonee
Member
Member # 5818

 - posted      Profile for Dagonee           Edit/Delete Post 
All I meant is that any matter or energy within the body that forms a permanent part of it would likely still be called part of the body. Therefore, the thing that formed the connection between it and the rest of the body wouldn't be called an "interface to the body." It might be called an interface within the body.

Therefore, the title of the thread implies some non-materialistic construct, which implies the discussion will have to involve non-scientific topics.

Dagonee

Posts: 26071 | Registered: Oct 2003  |  IP: Logged | Report this post to a Moderator
pooka
Member
Member # 5003

 - posted      Profile for pooka   Email pooka         Edit/Delete Post 
Interesting, Destineer. I wonder if that counted parrots, or if he just didn't know about them. I mean, was this a linguistic thing or an opera aficionado thing?
Posts: 11017 | Registered: Apr 2003  |  IP: Logged | Report this post to a Moderator
skillery
Member
Member # 6209

 - posted      Profile for skillery   Email skillery         Edit/Delete Post 
mackillian:

quote:
Neurons in humans: 12-15 billion neurons in cerebral cortex and associated areas, 70 billion neurons in the cerebellum (hindbrain), 1 billion neurons in the spinal cord.

If neurons were analogous to computer bits, that wouldn't be much storage capacity.

Do we know how neurons store information, and how much information can be contained in a single neuron? Do memories of visual patterns take more memory space than spoken language memories?

I'm enjoying this discussion. Thanks for the good information.

Posts: 2655 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
quote:
This gets into the difficult issue of defining 'physical.' The definition I prefer is, X is physical iff X is one of the objects or properties appearing in the complete, accurate scientific theory of the world (which is obviously not yet discovered). So by my preferred definition, what you're saying doesn't make sense.
Well, yes. I'd argue, though, that "science might be able to explain the nonphysical" is something that DOES make sense, and hence indicates your definition is not accurate. I suspect most people on this forum understood what I meant when I said it, at least.

And also, I don't think it's fair to call dualists "arrogant" for claiming something that only your definitions are forcing them to claim. I mean, the dualist could just as easily argue that it's your definition that suggests science will never be able to explain the nonphysical (which it does, no?), not the dualists themselves, and hence it is your definition that is arrogant, not them. You said yourself that "we don't know the limits of future science", but now your definitions tell us that we DO know science is limited to understanding the physical.

Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
quote:
If neurons were analogous to computer bits, that wouldn't be much storage capacity.

Do we know how neurons store information, and how much information can be contained in a single neuron? Do memories of visual patterns take more memory space than spoken language memories?

Well, neurons don't store things like normal computers do. In computers, there is a specific location in memory for each piece of data. Each piece of data takes up a certain amount of space, and has a fixed beginning and end within the memory space.

Neural networks, though, don't store things in a specific location. Instead, they use what is called "distributed" memory, which means that each piece of knowledge is stored across the entire network, rather than in a single spot. If you want to know where "1+1=2" is stored, you can't go to any specific neurons to find it. The knowledge is stored across the whole network, by the exact way in which all the neurons are connected to one another.

This means that in computers, if you destroy one bit, you probably ruin one piece of data but leave everything else exactly intact. Whereas, in a neural network like the brain, if you destroy one neuron, you very slightly alter every piece of data stored in the network, but don't really completely delete anything. Each neuron may be part of millions of different facts.

This sort of system has a lot of benefits for the brain. For one thing, if brains stored stuff like computers, then if a certain neuron contained some critical piece of information for speech, you might lose the capacity to talk by losing just that one neuron. By spreading everything across the whole network, losing individual neurons causes a virtually unnoticeable effect on your knowledge - unless vast numbers of neurons are lost at once.

It's kinda complicated I think, but the important thing to remember from it is that you can't talk of a single neuron storing a piece of information. It's the network as a whole (or at least areas of it) that stores things.
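
If it helps make "distributed" concrete, here's a toy sketch in Python (purely an illustration of the storage idea, not a claim about how real neurons work): a little Hopfield-style network where every stored pattern nudges every connection weight, so no single weight holds any one memory, and knocking out a random chunk of the weights only degrades recall slightly instead of deleting anything outright.

code:
import numpy as np

# Toy Hopfield-style network: memories live in the whole weight matrix,
# not at any single address.  All sizes and numbers here are made up for the demo.
rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))        # three +/-1 patterns to store

# Hebbian storage: every stored pattern nudges every connection a little.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(weights, cue, steps=10):
    """Settle the network from a noisy cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.where(weights @ s >= 0, 1, -1)
    return s

noisy = patterns[0].copy()
noisy[:15] *= -1                                   # corrupt part of the cue

# Knock out 10% of the connections at random: nothing is wholly erased,
# every stored pattern is just slightly degraded.
damaged = W * (rng.random(W.shape) >= 0.10)

print("intact :", np.mean(recall(W, noisy) == patterns[0]))
print("damaged:", np.mean(recall(damaged, noisy) == patterns[0]))

Compare that to a disk: zero out 10% of the bits in one file and that file is simply gone.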

[ April 17, 2004, 11:00 PM: Message edited by: Xaposert ]

Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
If neurons were analogous to computer bits, that wouldn't be much storage capacity.
This is what I mean about complete ignorance being the leading factor in that side of the discussion.

A single "bit" in a computer can hold only a single character's worth of information, like the letter "K." A single cell in the human body—not even a neuron or neuroglion—holds exponentially more information in just the nucleus. It stores exponentially more in just the chromatin of the cell. And by "exponentially," I mean hundreds and hundreds and hundreds of bits' worth of information. In just the chromatin. Even more within the whole of the cell—all of the organelles perform specific tasks. In just one cell. The human body is composed of at least trillions upon trillions of these cells.

Also, these cells become even more complex in "what they do" in that when they work together to form cell systems (tissues), the complexity increases, not decreases. Then, when these systems form organs, the level of complexity is again increased. It gets even more complex at the level of organ systems (endocrine, vascular, digestive, nervous, etc.), and the brain is only part of one of those systems (the nervous system).

The human brain alone can hold more information than, say, Google's famous computer farm. The brain is more complex than even the most famous artificial intelligences, like Kismet. There are many projects out there that can almost simulate the human brain's complexity, but none that can duplicate it. None.

And to further shoot down this computer analogy, I'll also point out that the brain alone is not the source of "who we are." The nervous system alone does not define "who we are." We are an amalgam of our various systems, working together chemically and otherwise, in a symbiotic relationship.

In other words, we are more than the sum of our parts. Trying to define mankind, the mind, or the soul from only one part of one part of what makes a human being a human being is extremely ignorant of what modern medical and physical science has shown us about who we are. Philosophize all you want, but all of the philosophizing in this thread has been from the viewpoint of old, incorrect assumptions about how the body works, which makes the resulting picture of how personality, cognition, memory—everything that makes us who we are—works incredibly flawed.

I am not going to teach a whole Bio 101 class here, but suffice it to say, there is enough misunderstanding of simple animal biology, let alone human biology, rampant in this thread to make the whole discussion seem completely ridiculous. And this is just from a basic biological perspective, not the more distinct specific medical perspectives. And this is why I keep saying that this is all based on fantasy, because the foundation for all the philosophizing is completely flawed, making the conclusions just as flawed (or more). The early great thinkers of the Greek and Roman times had some great ideas, but as medical scientific knowledge has gained more basis and understanding, the original basis for these philosophers' ideas needs to be adjusted accordingly (and correctly). Same with those of the Renaissance and Enlightenment. Same with those great philosophers of the ancient and older-era East.

Without a firm understanding of what is already known about what makes us who we are, all the suppositions are going to come to ridiculously erroneous conclusions.

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
fugu13
Member
Member # 2859

 - posted      Profile for fugu13   Email fugu13         Edit/Delete Post 
Uh, John, a bit can only hold a zero or a one. A byte can only hold a single character (of a very limited character set, of course).
Posts: 15770 | Registered: Dec 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
You're right, fugu. I don't know where my mind was. Of course, that makes what I was saying even more pointed, doesn't it?
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
Bob the Lawyer
Member
Member # 3278

 - posted      Profile for Bob the Lawyer   Email Bob the Lawyer         Edit/Delete Post 
I don't want to step on Mack's toes here, since she's running this thread's bio ship, but I figured I'd answer pooka's pain question. At least, what I think her question was.

What you're talking about is, I think, something called "The Danger Model" which was proposed by a woman named Polly Matzinger. It goes like this:

When a cell "dies badly" (i.e. not by apoptosis; apoptosis is how cells are supposed to die, and a cell that dies in this manner doesn't pop, it shrivels. A lot of people get this confused. Remember: no popping in apoptosis) it spews out its contents. "Bad death" can happen through things like pathogen infection or getting stabbed.

Ms. Matzinger holds that it is internal proteins within your own cells that trigger the immune response, and not foreign bodies. An invading virus in her model doesn't elicit a response until it kills something, which means that vaccination works because of damage done by the needle and not the viral fragments that are injected. There is some scientific evidence for this, but the theory is by no means canon.

Anyway, here's where pain comes in (keep in mind that this is a very simplified model). The immune system's first line of defence is these guys called macrophages. These are big cells that eat up foreign bodies indiscriminately and also signal the heavier hitters to come to the scene and specifically take out the problem. They roam all over the body and are going to be around when things start blowing up. One of the first things they do at a problem site is release two cytokines (cytokines are chemicals released by one cell to trigger a specific response in another cell) called Tumor Necrosis Factor-alpha (TNF-a) and interleukin-1 (IL-1). The aspect of their function that concerns us here is the signalling of increased blood flow (which causes swelling) and increased body temperature. Increasing the blood flow allows for faster transport of cells involved in the immune response, and the swelling isolates the infection to one area so it can't spread. Increasing the body temperature interferes with bacterial and viral replication and increases the rate at which the immune system works. It also messes up a lot of other systems in the body, though, so it's by no means a good thing all the time. Anyway, that's why these two cytokines are good, but back to the topic at hand: why does it hurt?

TNF-a will also cause a depolarization in nearby sensory neurons that are geared to send pain signals, which is normally a good thing. Pain makes you miserable, makes you not want to run around and expose yourself to new threats or irritate the site of infection, and generally makes you conserve your strength.

I don't mean to say that TNF-a is the pain signaller; there are plenty of other things that can cause you to feel pain (heat, deformation of skin cells, etc.). But, with regards to the danger model, it's the guy that's causing the problems.

Action potentials only arise in nerve cells, and cells popping in their general vicinity oughtn't cause them to lose their ability to polarize/depolarize. Indeed, if it did interfere with the propagation of action potentials in some way, you wouldn't feel pain at all.

Hope I haven’t only served to confuse you even more [Wink]

Edit: If you tried to read this before, I'm sorry. One paragraph was a brutal combination of two others that I wrote.

[ April 17, 2004, 11:33 PM: Message edited by: Bob the Lawyer ]

Posts: 3243 | Registered: Apr 2002  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
quote:
Philosophize all you want, but all of the philosophizing in this thread has been from the viewpoint of old, incorrect assumptions about how the body works, which makes the resulting picture of how personality, cognition, memory—everything that makes us who we are—works incredibly flawed.
Not ALL of the philosophizing.... some of it.
Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
Yeah, and the parts that aren't are bad science. That's my point.
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
pooka
Member
Member # 5003

 - posted      Profile for pooka   Email pooka         Edit/Delete Post 
Thanks, Bob, I hadn't known where that pain theory came from. But it puzzled me. I'm not sure I grasped what you were saying about vaccinations. Were you merely outlining the idea or advancing it?

So what about the idea that our skin can only sense hot or not, intense or not, and sharp or not? Is that still the thinking? Also that the tongue can only actually sense 4 tastes? Or are all these a matter of "we have only discovered 4 tastes so far"?

Posts: 11017 | Registered: Apr 2003  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
quote:
The human brain alone can hold more information than, say, Google's famous computer farm.
The two share some design patterns, namely massive parallelism, but I don't think the brain is anywhere near approaching the scale of Google. The smartest people of a generation can hold ~100k digits of pi, or the contents of a bookshelf of medical references, or the score to every symphony in the repertoire. Amazing feats, but nothing rivalling a 300GB hard disk.
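
For rough scale (the book and page counts below are invented round numbers, just to show the orders of magnitude involved):

code:
# Back-of-envelope scale of those memory feats vs. a 300 GB disk.
pi_digits = 100_000              # ~100k digits at ~1 byte each
bookshelf = 50 * 500 * 2000      # 50 books x 500 pages x ~2000 characters
disk      = 300 * 10**9          # 300 GB in bytes

print(f"pi digits : {pi_digits / 1e6:.1f} MB")   # ~0.1 MB
print(f"bookshelf : {bookshelf / 1e6:.0f} MB")   # ~50 MB
print(f"disk      : {disk / 1e9:.0f} GB")        # 300 GB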

The unique parts of the brain are its suborgans for processing speech and vision. They are extremely difficult to match because they represent specialized hardware evolved to specific and complex tasks; present-day computers have to get by with software emulation, as it were.

Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
skillery
Member
Member # 6209

 - posted      Profile for skillery   Email skillery         Edit/Delete Post 
Mr. L:

quote:
This is what I mean about complete ignorance being the leading factor in that side of the discussion.
If I had a mentally impaired child, or if I wanted to interact with such a person, I would want my actions and attitudes regarding that person's impairment to be based on the best knowledge available. If science did not have all the answers, I might add knowledge from other fields.

In my opinion, imperfect knowledge will always result in less than ideal actions and attitudes.

Intolerance of a person's mental impairment or apparent complete ignorance, and an inability to interact with such a person in an ideal manner, indicate a less than perfect knowledge of that person's impairment.

I am interested in this thread because I want to better understand mentally impaired people and be able to have positive interactions with them.

Posts: 2655 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
beverly
Member
Member # 6246

 - posted      Profile for beverly   Email beverly         Edit/Delete Post 
*is in awe of the human brain*
Posts: 7050 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
The two share some design patterns, namely massive parallelism, but I don't think the brain is anywhere near approaching the scale of Google. The smartest people of a generation can hold ~100k digits of pi, or the contents of a bookshelf of medical references, or the score to every symphony in the repertoire. Amazing feats, but nothing rivalling a 300GB hard disk.
[Roll Eyes]

Yet another incredible feat of raw ignorance. A 300 GB hard disk wouldn't be able to hold all the sensory data the brain collects from a simple solitary stroll once around a block. Google's computer farm is nowhere near the complexity of a single human's brain, and it is not the same design pattern. It may have some similarities to parts of the human brain, but nothing like the whole.

quote:
The unique parts of the brain are its suborgans for processing speech and vision. They are extremely difficult to match because they represent specialized hardware evolved to specific and complex tasks; present-day computers have to get by with software emulation, as it were.
Simulation, not emulation. Not even the most complex and advanced projects can pass the basic "human intelligence" tests (not to say that the tests themselves are basic, because determining "human intelligence" is very complex). I can sit in a cardboard box and simulate driving a car, but I'm not driving a car. The level current technology is at with regard to emulating a human is even lower than that.

skillery:
quote:
If I had a mentally impaired child, or if I wanted to interact with such a person, I would want my actions and attitudes regarding that person's impairment to be based on the best knowledge available. If science did not have all the answers, I might add knowledge from other fields.
And using personal computers as your basis for comparison for such a thing would be horribly wrong, and drastically underestimating that impaired person's ability and disability.

quote:
In my opinion, imperfect knowledge will always result in less than ideal actions and attitudes.
Very Hume-like. We can never know for sure. Nice cop-out.

quote:
Intolerance of a person's mental impairment or apparent complete ignorance, and an inability to interact with such a person in an ideal manner, indicate a less than perfect knowledge of that person's impairment.
Ignorance is not an impairment. It is the state of lacking knowledge. I am intolerant of wanton ignorance, where knowledge is assumed but not present. Saying "I don't know" is acceptable, saying "I know" but not really knowing is not.

quote:
I am interested in this thread because I want to better understand mentally impaired people and be able to have positive interactions with them.
Then stop looking at them as a computer with a faulty keyboard or some bad RAM, because it completely misunderstands them and dehumanizes them.
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
fugu13
Member
Member # 2859

 - posted      Profile for fugu13   Email fugu13         Edit/Delete Post 
Quite true, the perceptual data that a human can recall is truly stupendous in its vastness. The brain does use some tricks to reduce the load, though, as far as we can tell. For instance, unless it's particularly unusual, people will often not recall what someone was wearing after a bit, and if they try to visualize an event they will not have that person wearing the correct clothes.

Which suggests seemingly useless or duplicate information gets tossed.

However, even leaving out stuff like that, the pure visual imagery capabilities of the human brain are ridiculously large.

Now, whether or not it's being directly stored is a much more complex question. It's possible only 'hints' are being stored, that trigger certain pathways in our minds, generating the sensations/images/et cetera. But in some ways that's even more amazing.

There's an interesting question related to that: are human thoughts P or NP? And what would the implications of either be?

Posts: 15770 | Registered: Dec 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
But fugu, people can not only develop photographic memory—which indicates the capacity is there—but there are techniques that can be used to help those who forget details to bring them up from the unconscious. Even with the most forgetful (barring brain damage like mine), almost everything is stored; it's the retrieval that has problems.
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
beverly
Member
Member # 6246

 - posted      Profile for beverly   Email beverly         Edit/Delete Post 
Any clue what's up with déjà vu? I just had one today. [Confused]

I had a brief conversation with my husband while handing off the baby to him at a friend's house. I remember the exchange and actions as feeling very familiar, as though I knew what was going to happen next before it happened, but I had the impression of it happening in a different place, at a different time, and with a previous baby. I don't get déjà vu often, but when I do it feels not so much like "this has happened before" but "something eerily, impossibly similar to this has happened before".

Posts: 7050 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
It's mostly just familiarity with a situation, or remembering similar parts of things, that makes it look like something that has never happened before actually did. It has to do with how the brain saves certain details, and I don't know off the top of my head the contributing factors, but the way those things are only kinda remembered helps create déjà vu. In other words, had you no experiences from which to draw analogous connections (and they don't have to be logical, and are often not), then you would not experience déjà vu. Not that having no experience of déjà vu means you have no experience; it's just that without the prior experiences (no matter how dissimilar), there would be no déjà vu.

Or it's a glitch in the Matrix.

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
fugu13
Member
Member # 2859

 - posted      Profile for fugu13   Email fugu13         Edit/Delete Post 
John -- I am not aware of anyone with a complete photographic memory, ever. Most people with a photographic memory I know of either have a perfect short term memory, the ability to selectively remember scenes perfectly for a fairly long time (but not all scenes), or some combination thereof. Never everything, for all time.

Also, I think you'll find a lot of hypnotic recall people just don't recall many details except what they happened to be focusing on.

Posts: 15770 | Registered: Dec 2001  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
The brain's sensory inputs are massive, yes, but its storage is very finite. Google doesn't remember the millions of pages of links that it generates millions of times a day; it remembers the index. The brain doesn't even do that much.

I'm not sure what you mean by visual memory being immense. It's immensely perceptive in some ways -- face recognition is one application that's hardwired and difficult to simulate. But how many faces can someone really remember? Let's be inordinately kind and say 100,000. Lots of distinct features there, but they're undoubtedly stored as patterns (compression). 100k JPEGs is nothing to a disk, and it will preserve a lot more detail to boot. As well as you know someone's face, could you paint it in a 1000x1000 image?

Scenes are bigger, of course. A 300GB disk can "only" store 50 days or so. But that's storing (in some intelligent fashion) every pixel of every frame. Could you reconstruct some given frame N exactly? For all N, 24 each second? Not a chance.
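
To put rough numbers on that (the resolution and compression ratio below are assumptions, chosen only to show where a figure like "50 days" can come from):

code:
# Rough arithmetic behind the video-storage comparison (assumed numbers).
disk_bytes = 300e9                    # 300 GB disk
fps        = 24
raw_frame  = 640 * 480 * 3            # assumed resolution, 3 bytes per pixel
ratio      = 300                      # assumed (aggressive) compression ratio

raw_days        = disk_bytes / (raw_frame * fps) / 86400
compressed_days = raw_days * ratio

print(f"uncompressed: {raw_days:.2f} days")        # a few hours of raw frames
print(f"compressed  : {compressed_days:.0f} days") # tens of days, compressed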

Please give some real evidence before rolling your eyes.

[ April 18, 2004, 01:58 AM: Message edited by: Richard Berg ]

Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
I didn't say hypnotic recall. I said other techniques. There are not only relaxation techniques, but sessions one can go through to remember certain events. Also, memory can be tied to surrounding events, which can immediately bring up things you missed before by making connections. It's all about how one makes use of our ability to make connections in our head.

And I have known people with photographic memory who can remember mass detail, including numbers, pages of text, names, and other minutiae for very long periods of time. Heck, one of the techniques I had to learn when rebuilding my memory was to commit details to long-term memory willfully instead of unconsciously. It can be done, and the amount of information we're capable of storing is staggering.

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
Please give some real evidence before rolling your eyes.
Read the last few posts between fugu and myself for just a fraction of how human memory surpasses modern technology by light years.

A hard drive stores data as single bits, while the human mind can store overlapping, incongruent data with only the slightest of connections. Additionally, the amount of space a drive needs to store even one single byte of information exceeds the space required for a single cell to store massive amounts of data. As far as memory goes, just the motor control memory a person has—say, just to run (which requires a great deal of thought that you never even realize)—far exceeds anything the Google farm, or even a room full of your "300 GB hard drives," can store. Technology can simulate walking, but making a machine that can run like a human is, to date, nigh impossible due to the complexity of the act (which is controlled falling, put in its simplest terms). We cannot create systems like those that exist within the human being using today's technology, because today's technology does not have the efficiency or capability to actually imitate those things accurately.

But, to sum up: [Roll Eyes]

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
skillery
Member
Member # 6209

 - posted      Profile for skillery   Email skillery         Edit/Delete Post 
Mr. L:

quote:
I am intolerant of wanton ignorance, where knowledge is assumed but not present.
Then why would you voluntarily join a thread based on such? Were you hoping to learn something? Or is teaching your avocation? Interesting technique.

Is photographic memory commonly associated with a high I.Q.?

Posts: 2655 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
Is photographic memory commonly associated with a high I.Q.?
Nope, but depending on the test, it can be mistaken for it.
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
quote:
Technology can simulate walking, but making a machine that can run like a human is, to date, nigh impossible due to the complexity of the act
Complexity != storage capacity. There's no doubt we perform extremely difficult calculations, but all the evidence we have points AGAINST doing so via a giant lookup table.
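
A toy way to see the contrast (nothing to do with how the brain actually does it): an algorithm that computes answers on demand costs a few hundred bytes of code, while pre-storing the same answers as a lookup table costs space in proportion to every input you might ever care about.

code:
import math
import sys

# Compute on demand: a tiny amount of "algorithm", negligible storage.
def sine(x):
    return math.sin(x)

# Lookup table: precompute answers for a mere million sample inputs.
table = {i / 1000.0: math.sin(i / 1000.0) for i in range(1_000_000)}

print("code object :", sys.getsizeof(sine.__code__), "bytes")
print("lookup table:", sys.getsizeof(table) // 2**20, "MB, not counting the stored floats")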
Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
skillery
Member
Member # 6209

 - posted      Profile for skillery   Email skillery         Edit/Delete Post 
quote:
but making a machine that can run like a human is, to date, nigh impossible
Sony has almost done it. The trick with running is that both feet are off the ground momentarily. Their Qrio robot can jump and land and maintain stability, which is a pre-requisite to running.

I got to see it jump at the CES show in January.

[ April 18, 2004, 02:46 AM: Message edited by: skillery ]

Posts: 2655 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
And notably, they didn't achieve this by adding more and more disks. They did it by working smarter.
Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
Going back a little bit towards the original topic, Skillery's earlier suggestion that some memories might be "stored" in the soul in some way might offer an alternative explanation of how we are able to remember so much data. This would require some sort of "talking" between the brain and the mind/soul, however, which seemingly would in turn require a rather complex bending of the currently accepted laws of physics within the brain.

At the same time, it's hard to say exactly how far so many billions of neurons can go. We can build hard-drives with more bits, but as I mentioned earlier, hard-drives operate on different principles than neural networks. The last I heard (and I could be wrong) we did not have the capacity to build an artificial neural network at anywhere near the complexity of the brain. Without being able to test it, I don't think we should presume to be able to guess how far it can go. Maybe it CAN vastly exceed the abilities of a room full of 300GB hard disks. I don't see why that would be hard to imagine.

Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
Shlomo
Member
Member # 1912

 - posted      Profile for Shlomo   Email Shlomo         Edit/Delete Post 
This is a fair amount of oversimplification, but it seems to me that:

"Secularization" resulted (and perhaps still results) from new scientific discoveries because many theologians simply say, "Science/rational thinking cannot explain this. Therefore, God/whatever must be behind it." The result of this horrendous logic is that the more we know, the less power God/whatever has, and the fewer believers. So it doesn't work to justify things with "We lack a better explanation."

In this case, we are talking spirits. Ami never said (that I can recall) that we should accept the "interface" theory because we do not understand parts of brain function. If she did, she's setting herself up for a fall. But in the original thread, she justified herself simply by saying the theory made sense. It does appear to make sense. However, it also makes sense to say that hormones, collectively, comprise the spirit. Or that the spirit controls hormones. Or, beverly, if you prefer, the spirit is the brain. Personally, I do not see what difference it makes. Either way, the brain will evolve a certain way to better channel the person's "spirit" (personality?) and the hormones will evolve as well. And if a person is irrational/retarded, the spirit is getting lost somewhere in transit. A better question is where we will go with this knowledge. How to best channel the spirit, or maximize endocrine efficiency or memory storage or whatever else?

But none of this adequately explains emotions. If certain people say certain things to you, you will feel pain or pleasure. Not "physical" pain or pleasure (as in knife wound/endorphins) but "emotional" pain or pleasure. You could conceivably call emotional pain a "conditioned response". Dogs can be made to run to their dishes at the ringing of a bell, and people can be made to cry at a certain collection of sounds from the mouth of an individual.

My philosophy may seem to leave no room for religion, but it does. You could simply say (as I do) that God represents all that is good, a perfect emotional response, a perfect spirit-body interface, etc., and that getting "close to God" is getting closer to good (God with an extra O). It would still be possible to have a "son of God", as Christians hold. This isn't any religion's "accepted" doctrine, but it is actually one that will withstand science...many doctrines have no such claim.

Posts: 755 | Registered: May 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
Complexity != storage capacity.
You are incorrect. The more complex the storage unit is, the more can be stored. This is even true in hard disks. Do you even understand how a hard disk works (if not, read this)? It is just a fancy writable record player. They are disks with magnetic blocks on them, crammed together tightly (link). The reason we have been able to fit more and more storage on a hard disk is that we keep managing to cram more and more on each platter (disk) in the hard drive, but there is a point where no more can be fit. Additionally, the way the hard drive accesses the data is exactly like a record player, with a little head that must move around inside to a certain block on the platter (link).

There are inherent flaws to this type of data storage, not just in the way it's stored (like a record player), but in the simple fact that it's stored in binary bits to begin with. Human data storage is light years ahead of binary in terms of storing, because humans can store information on top of information on top of information, with no recordable end to capacity, only limits on retrieval for some (most people never realize how much of an event they actually have stored). So, you see, you can list to me all the various hardware in the world, but there are set physical limits to their capability, all due to the basic concept of how they work to begin with. Someday, we might come up with technology to begin matching the human mind, but today's bits and bytes just can't match the capacity or complexity of the human mind, which is why there exists no computer or multi-computer system that can even come close to actually emulating the human brain. Memory and data storage isn't even the determining factor, and computers can't match it.

quote:
There's no doubt we perform extremely difficult calculations, but all the evidence we have points AGAINST doing so via a giant lookup table.
Calculations != intelligence. Doing a mathematical calculation is even lower than elementary; it's so basic as to be something that we do on a cellular level constantly. A computer, no matter how complex, can only perform; it cannot create. Not without human guidance. The reason even the most advanced computer systems need human guidance is that a computer system cannot imagine something up from nothing. Without a preprogrammed formula set from which to decisively draw, nothing is done. Not even the most advanced computers can intuitively merge separate instructions on the fly and come up with alternates in a blink. There are CPU instruction sets that set themselves up to anticipate certain things under certain conditions, but even then those are still statically assigned instruction sets that the CPU must follow or else start over.

On top of this, the level of instructions a CPU could hold is nowhere near the capacity a human brain could hold. If you don't think that is pertinent, then I really can't help you, because it means you have a whole lot to learn that would take far more time than I'm willing to sit here and explain to you, both in terms of computer hardware and human biology. I know more about computer hardware than I do about human biology, but you don't even know enough about human biology to make an accurate comparison. You keep thinking "how many frames per second" is some kind of incredible indication of capabilities, when the human mind is so advanced it doesn't even need to break something down into frames. It sees real motion, and modern technology can't even come close to copying that.

However, Richard, you keep coming off as a typical "computer geek" guy who thinks that because he uses a computer a lot, he knows a lot about how all computers work. Your 300 gigabyte hard drives are a joke to even the brain stem alone, let alone the whole brain. Your CPU calculations are a joke, since the human body dwarfs that level of processes at the cellular level, let alone at the conscious level (once again, think stroll around the block). You need some more education on the basics of how a computer works, though, so here. When you can grok that, I'll introduce you to the more complex stuff.

skillery:
quote:
Sony has almost done it. The trick with running is that both feet are off the ground momentarily. Their Qrio robot can jump and land and maintain stability, which is a pre-requisite to running.

I got to see it jump at the CES show in January.

That is so laughable. That thing, as well as the other robots out there like the Asimo, can only simulate such activity. They can do it in a laboratory or on a display floor. In other words, they can follow very specific instructions in a vacuum. The human body, when it runs, is not just executing a specific set of instructions. Every step is handling the controlled movements in direct relation to the body and the ground (YMMV depending on coordination). Those robots "run" on an even less advanced scale than a baby who is just learning to walk and trying to run.

Nope, it looks the same, but is very much not. Hence the "simulation" and not "emulation."

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
quote:
The more complex the storage unit is, the more can be stored.
You're misquoting me. I think it was obvious to everyone else that we were discussing the complexity of the problem (walking) and therefore the complexity of its solution. This is completely independent of the details of the storage system being used. Algorithms do take up space, yes, but they take up a lot less than you think. I'll bet the source code to every algorithm ever submitted to a major journal would fit comfortably on a CD-R.

I have seen no evidence offered that the internal logic of the brain requires several orders of magnitude more space. There are certainly no trends in CS research that indicate a correlation between code size* and efficiency.

*Let's be crystal clear and say we're talking about the text segments of object files (a gzip'd repository is too language-dependent, though still nowhere near creating order-of-magnitude differences). You have to link to so many libraries these days that executable size is misleading.

quote:
The reason even the most advanced computer systems need human guidance is that a computer system cannot imagine something up from nothing.
Neither can a brain, unless you're about to switch sides and resort to completely independent influences. (That stance practically defines dualism.)

quote:
If you don't think that is pertinent, then I really can't help you
The question is somewhat pertinent from an inquisitive point of view. I just don't give any credence to your answer.

quote:
You keep thinking "how many frames per second" is some kind of incredible indication of capabilities, when the human mind is so advanced it doesn't even need to break something down into frames. It sees real motion, and modern technology can't even come close to copying that.

I work with desktop video every day from both a developer's and a user's POV, so I know computer processing and storage is not magic. It is, however, far superior to a human's. "Breaking something down" is lossy data compression -- very, very lossy. Unless you have a startlingly insightful definition of "real motion," I can't think of any way in which the brain can compete.

This is not an indictment of overall brain capability. There are many areas (which I've already listed) where its internal data structures perform vastly better than anything we've come up with. But it does debunk the notion that visual memory is one of them; visual memory is instead more akin to tasks like multiplying 100,000-rank matrices, which everyone has to agree give computers the edge.

quote:
However, Richard, you keep coming off as a typical "computer geek" guy who thinks that because he uses a computer a lot, he knows a lot about how all computers work.
Please. I've taken graduate classes in computer architecture, operating systems, AI, and numerical machine learning.

It's you who seems bent on assigning grandiose attributes to the brain that are not supported by the evidence. Just because our high-level and biochemical descriptions of brain activity haven't been unified into a single theory doesn't mean the intermediate processes deserve mythical status.

quote:
The human body, when it runs, is not just executing a specific set of instructions. Every step is handling the controlled movements in direct relation to the body and the ground
Precisely. Our abilities don't hinge on 100 million if-then statements; the exact mechanism is unknown, but it's certainly much closer to fuzzy logic, pattern recognition, and other processes that drastically reduce the necessary storage capacity below what you claim.
Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
quote:
Please. I've taken graduate classes in computer architecture, operating systems, AI, and numerical machine learning.

It's you who seems bent on assigning grandiose attributes to the brain that are not supported by the evidence.

Because I already said that I'm not going to teach a Bio 101 class in this thread. Obviously, you haven't even the knowledge you'd get from that general ed class, because you think your "Graduate level" CS classes make you somehow able to compare it to biological science. That's like an automotive engineer (a mechanic) thinking that because he knows how to build a car from parts, he's entitled to equate it with something as complex as the human body.

But whatever, continue thinking that you know enough about the brain to compare it to a glorified record player (a 300 GB HDD).

That's why all you deserve is a [Roll Eyes]

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
Richard Berg
Member
Member # 133

 - posted      Profile for Richard Berg   Email Richard Berg         Edit/Delete Post 
Actually, I think the limit is more like 10GB, maybe 50GB if we're feeling kind to our geniuses. I know you'll step in to get the last word, but I'm through discussing this subtopic until there's some evidence and/or reasoning demonstrated.
Posts: 1839 | Registered: May 1999  |  IP: Logged | Report this post to a Moderator
skillery
Member
Member # 6209

 - posted      Profile for skillery   Email skillery         Edit/Delete Post 
Richard Berg:

quote:
Algorithms do take up space, yes, but they take up a lot less than you think.
I'll agree with you there. Algorithms make up about 5-percent of the code I write. The rest is anomaly processing for unexpected input. Sony's robot can run...in a straight line, on a hard, smooth surface, but it probably can't handle running in the rain, on loose gravel. That's the remaining 95-percent of the code that has yet to be written.

As for processing visual information, I wish I could recall the exact color of my kitchen wall when I try to match it at the paint store. Sammy Sosa wishes he could remember the exact pattern of the wood grain on his favorite bat, so that he would know when someone swaps it with a corked bat. The brain seems to be cutting corners to save storage space when it comes to recording visual information.

Posts: 2655 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
Pod
Member
Member # 941

 - posted      Profile for Pod           Edit/Delete Post 
um, john, just to let you know, you're wandering away from both reality and the point of this thread [Wink]

As someone who studies both brains and computers, i feel at least like i know generally what's going on in the field.

Psychological models of how people's memory function look like this:

Sensory buffer:
under a second of massively parallel visual memory
~2 seconds of auditory buffer
--->
working memory:
(debated, but for argument's sake is typically acknowledged), here's where tasks which take conscious attention use memory. If you're multi-tasking and failing, it's because you can't store all the input in working memory.
--->
long-term memory:
Where "permanent" information is stored. It's extremely unclear how this works on a neurological level. All sorts of junk goes in here. Gets queried all the time for previously stored info (and there are interesting computational models of this).

What i'll first point out is the gigantic sensory store (as far as we can tell, basically all immediate sensory information is available there for a very short duration of time; that way, if you miss something, you've got an instant where you can still retrieve the visual or auditory input and stick it in working memory). However, what makes it into working memory is -significantly- pared down. People couldn't function if a) all the sensory info available to them had to be attended to, and b) they had to store (effectively, memorize) all the information available to them.

Some people have good mnemonics for remembering things and such, but so far as we can tell, there is no such thing as "photographic" memory, just different ways of chunking input. People who have more practice with these methods will remember more, but it's done via a different storage algorithm, rather than simply piping raw input straight to long-term memory.
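
If a picture in code helps, here's a cartoon of that three-stage pipeline (the capacity number and the made-up "details" below are illustrative only, not a model of actual neurons):

code:
from collections import deque

WORKING_CAPACITY = 7                      # working memory: a handful of chunks

class MemoryModel:
    """Cartoon of sensory buffer -> working memory -> long-term memory."""
    def __init__(self):
        self.sensory = []                              # everything, very briefly
        self.working = deque(maxlen=WORKING_CAPACITY)  # old chunks get pushed out
        self.long_term = set()                         # whatever gets consolidated

    def perceive(self, stimuli):
        self.sensory = list(stimuli)       # all raw input lands here for an instant
        return self

    def attend(self, interesting):
        for item in self.sensory:
            if interesting(item):          # only attended items reach working memory
                self.working.append(item)
        self.sensory.clear()               # the rest simply fades
        return self

    def rehearse(self):
        self.long_term |= set(self.working)    # consolidation, heavily hand-waved
        return self

m = MemoryModel()
m.perceive(f"detail-{i}" for i in range(10_000))
m.attend(lambda s: s.endswith("7")).rehearse()
print(len(m.long_term), "of 10000 details made it into long-term memory")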

Alright, on to more philosophical things.

The belief that there are things that science cannot solve (for instance, how people work) is usually called mysterianism, not dualism.

There are several sorts of dualists, and not many educated philosophers have believed in mind-body dualism in a long time. It's not a credible or tenable way to view the world. Many dualists in this day and age are property dualists, which merely has to do with what it means to be something (say, a person) and how that's different from being made of stuff (matter). I've also typically heard "mentalists" referred to as "Idealists," which is what i believe they were originally called (this harkens back to George Berkeley).

Oh, i notice that tresopax has mentioned some things on the subject. eh.

As for the raw storage capacity of the brain, this depends on two things: how much incoming data can be compressed in various ways, and how much actual information capacity we have. Since we don't know the format by which the brain stores stuff, we're out of luck on the latter question. Let's just say it's a really big number.

Posts: 4482 | Registered: May 2000  |  IP: Logged | Report this post to a Moderator
Pod
Member
Member # 941

 - posted      Profile for Pod           Edit/Delete Post 
And skillery, that is exactly the wrong way to think about how robots, or anything else that walks, function.

The point is that there is a class of problems that needs a general solution; you don't just solve a specific instance of the problem (walking in ideal conditions) and call the whole issue solved. Running in other conditions isn't a set of sub-problems of the ideal case that can be patched on. The ideal case is what you get for free by solving the general problem.

Posts: 4482 | Registered: May 2000  |  IP: Logged | Report this post to a Moderator
Pod
Member
Member # 941

 - posted      Profile for Pod           Edit/Delete Post 
And just to let you know, john, psychology has been carried along by a good number of computer scientists.
Posts: 4482 | Registered: May 2000  |  IP: Logged | Report this post to a Moderator
beverly
Member
Member # 6246

 - posted      Profile for beverly   Email beverly         Edit/Delete Post 
quote:
It's mostly just familiarity with a situation, or remembering similar parts of things, that makes it look like something that has never happened before actually did. It has to do with how the brain saves certain details, and I don't know off the top of my head the contributing factors, but the way those things are only kinda remembered helps create déjà vu. In other words, had you no experiences from which to draw analogous connections (and they don't have to be logical, and are often not), then you would not experience déjà vu. Not that having no experience of déjà vu means you have no experience; it's just that without the prior experiences (no matter how dissimilar), there would be no déjà vu.

Or it's a glitch in the Matrix.

I wonder if it has to do with the complex way the brain saves information. Perhaps a very similar thing did happen to me, and as my brain was logging the current info away across a complex net of neurons, a bit here, a smattering there, the pattern was recognized as being very similar to a pattern recorded once before and inadvertently brought up the fairly inconsequential but comparable memory. If my brain's OS were Windows, I would have gotten an error message telling me that there was already a file saved under that name.

Or maybe it is just a glitch in the Matrix.

Posts: 7050 | Registered: Feb 2004  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
quote:
Because I already said that I'm not going to teach a Bio 101 class in this thread. Obviously, you haven't even the knowledge you'd get from that general ed class, because you think your "Graduate level" CS classes make you somehow able to compare it to biological science. That's like an automotive engineer (a mechanic) thinking that because he knows how to build a car from parts, he's entitled to equate it with something as complex as the human body.
John,
If we are getting into qualifications, what are yours, that you are under the impression that you can pass judgement on the extent and validity of other people's knowledge in all these areas? In addition to repeatedly claiming people in this thread have no understanding of biology, you have also rejected the capacity of the computer science discipline and the philosophy discipline to deal with this issue. It would be one thing if you WERE teaching Bio 101 in this thread, with reasons and explanations to back up your claims, but as it is you are expecting us to just take your word for it that everyone is wrong and you are right. What background do you have in ALL these fields that you can do this?

I can tell you from direct experience that even undergraduate A.I. programming courses get into quite a bit of detail on the capacities and functioning of the brain. It's a field that has shaped the way biopsychology thinks about the brain.

Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
Pod
Member
Member # 941

 - posted      Profile for Pod           Edit/Delete Post 
And tresopax, that's not true.

Computer science has shaped cognitive psychology, not biopsychology. The biopsychologists i've talked to are extremely disdainful of both cognitive psychologists and computational neurodynamics.

Posts: 4482 | Registered: May 2000  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
This is what I hate about computer geeks: they are limited to only speaking in terms of computers, which, when talking about the human brain, falls far short of the actual subject.

So, let me first quote Peter Drake, a grad student in Computer Science & Cognitive Science at Indiana University:
quote:
Any answer to this question should be taken with several grains of salt. Digital computers and brains don't work the same way. For one thing, every memory location in a computer is created equal. You can move stuff from one location to another without losing any information. In the brain, on the other hand, certain cells specialize in certain jobs. While there is considerable plasticity (the ability to change what some part of the brain does, enabling the brain to recover from injury), there's nothing like the uniformity seen in a computer. Secondly, processing and memory are completely separated in a computer; not so in the brain. Finally, data in computers is digital, and not really susceptible to "noise". In the brain, there are continuous voltages.

With those caveats, let's look at numbers. The brain contains 10^11 neurons -- in other words, 100 giganeurons. Each one has synapses connecting it to up to 1000 other neurons. Many researchers believe that memories are stored as patterns of synapse strengths. If we suppose that the strength of each synapse can take on any of 256 values, then each synapse corresponds to a byte of memory. This gives a total of (very roughly) 100 terabytes for the brain.

For more info, see the book "Mind and Brain: Readings from Scientific American".

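Spelling out the arithmetic in that estimate (these are just Drake's own assumptions restated, nothing new):

code:
# Drake's back-of-envelope estimate, restated.
neurons             = 10**11   # ~100 billion neurons
synapses_per_neuron = 1000     # up to ~1000 connections each
bytes_per_synapse   = 1        # 256 possible strengths = 1 byte

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(total_bytes / 1e12, "terabytes")   # prints 100.0 terabytes
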
Of course, even a rather optimistic outlook on computer-level intelligence matching humans says:
quote:
It may seem rash to expect fully intelligent machines in a few decades, when the computers have barely matched insect mentality in a half-century of development. Indeed, for that reason, many long-time artificial intelligence researchers scoff at the suggestion, and offer a few centuries as a more believable period.
And all of this still makes the same point I do: it's not just silly to equate modern hardware/software to the human brain, it's downright stupid (especially for a CS grad student). Whether it's in a few decades or—as leading CS researchers say—a few centuries, the fact remains that comparing current technology to the human mind is misguided at best, asinine at worst.

But go ahead, keep the incredibly ignorant assumptions. After all, they make watching your sci-fi TV and reading your sci-fi books so much easier.

Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator
Xaposert
Member
Member # 1612

 - posted      Profile for Xaposert           Edit/Delete Post 
quote:
There are several sorts of dualists, and not many educated philosophers have believed in mind-body dualism in a long time. It's not a credible or tenable way to view the world. Many dualists in this day and age are property dualists, which merely has to do with what it means to be something (say, a person) and how that's different from being made of stuff (matter).
Actually, it has only been unpopular since the 60s, which is not a very long time in philosophy. And there are a good number of philosophers who do favor that position today, so I don't think it's fair to call it not credible or tenable as though that were an accepted fact. (I even suspect, given that materialists are beginning to resort to extremes like eliminativism these days in order to defend against dualism, that the materialist movement is headed into decline. But that's me, of course, and I could be biased.)
Posts: 2432 | Registered: Feb 2001  |  IP: Logged | Report this post to a Moderator
John L
Member
Member # 6005

 - posted      Profile for John L           Edit/Delete Post 
And to make it simple to explain why they can't be compared, here:
quote:
Ever since computers have been around, people have tried to compare them with the human brain, but this really cannot be done. A megabyte is an exact measure of the number of bits (like light switches) which can be used to store digital information inside a computer. One megabyte is just over a million bytes (a byte is 8 bits, or switches) and so you know exactly how much information can be stored.

The brain is organic--not digital--and so memory is not made up of two-way switches or bits. The exact way the neurons in the brain work is still unknown, but they appear to mesh together so that memory is really a complex, developing network of cells. These cells gain value as they link to others so, as you learn more and remember more, your capacity for learning increases.

Human memory is governed more by feelings and emotional associations than by exact data. Computers easily store abstract numbers and require much more room for pictures or sound, but humans usually experience the opposite.

The brain is more flexible than fixed-capacity memory chips. It is designed to expand and no-one has ever completely filled their brain to the point that they cannot know anything else. Any nominal brain capacity would far exceed computer memory ranges. Be proud to own such a remarkable device.

Current technology cannot match the flexibility that comes along with the complexity of the human mind, let alone its ability to handle so many concurrent processes without direct monitoring (breathing, blood pressure, balance, etc.). Add to that the processes that we do handle on lower levels (walking, chewing, swallowing) and higher levels (hand-eye coordination, speaking), and you have capabilities that modern technology can't even come close to. The current level of technological marvel is at about the level of a housefly.
Posts: 779 | Registered: Dec 2003  |  IP: Logged | Report this post to a Moderator