On Mon, Apr 23, 2012 at 01:11:08AM -0400, Dave McGuire wrote:
On 04/22/2012 06:57 PM, Guy Sotomayor wrote:
How come
2 of us here (at least) seem to have contradictory findings?
Because it's anecdotal. BTW here's the definition:
(of an account) not necessarily true or reliable, because based on
personal accounts rather than facts or research
This is one thing that's quite backwards about our society. "I saw it
with my own friggin' eyes" is discounted as "not necessarily true or
reliable".
That is just plain fucked up.
Unfortunately, it is not. IIRC there have been several studies into the
quality of various types of evidence. It turns out that "eyewitness accounts"
are among the most unreliable types of evidence around, complicated by the
fact that their margin of error is at best very hard to quantify outside of
a lab setup.
Even when people are _not_ attempting to lie, evade, or shade the truth,
and are in fact doing their level best to give correct accounts, those
accounts aren't worth much. And this is pretty much down to the fact that
the human nervous system and brain have _not_ evolved as a high-grade
recording system:
- data acquisition is incredibly lossy - the brain tends to filter out all
  the "unimportant" stuff very early on, static signals fade into the
  background, and the signal preprocessing done in our sensory organs and
  the lower parts of the brain has evolved to work basically as a set of
  high-pass filters: fast changes (like a tiger jumping at you from the
  underbrush) get noticed and processed at high priority, while very slow
  changes (like someone _very_ slowly sneaking up on you) normally go
  unnoticed (yes, with training and continuous attention one can notice
  things like that, but it imposes a high cognitive load)
- data processing is optimized for pattern recognition and will in fact
  detect patterns where there are none (Rorschach tests, UFO sightings,
  "ghosts" howling in a drafty attic, ...); it also favours "well known"
  patterns, i.e. it is prone to getting into a rut even when the pattern
  doesn't really fit
- data processing ruthlessly discards any information that is not currently
  deemed important (parade a few pretty girls in bikinis in front of
  20-something boys and they won't notice anything else)
- memory storage is done in stages, and each stage has a pretty aggressive
  garbage collector running (standard test: say, for the last 5 years you
  have always driven the same route to work - if someone inserts low-grade
  (not immediately attention-grabbing) events into that route, would you be
  able to recall them 3 years later? I bet you won't)
- stress does horrible things to perception and memory retention
- the brain is very, very fond of delegating routine tasks to "lower-level
  automated processing": on that commute above, after five years you won't
  be paying careful attention to the route you drive, but will mostly do it
  on "autopilot", and unless something really extraordinary happens (you
  wait at a traffic light and it suddenly rains red roses), you will
  remember almost nothing at all of that commute a few days later - but
  you've done it so often that your brain will happily synthesize the
  memory from bits and pieces of previous times you drove the route
- also: false memories - with the right recall stimuli, you'll swear up
  and down that a specific event happened in a specific way, when in fact
  it didn't happen quite that way
Yes, the limits of the human data acquisition, processing, and storage
system _can_ be overcome to a significant degree, but it requires training,
dedication, and constant attention - which implies a significant cognitive
load.
Example: drive about 30-50 miles at night in fall. Assume that on random
stretches of the road, frogs will be crossing the road in large numbers.
Your task is to flatten as few frogs as possible, ideally none. And yes,
there are plenty of almost-looks-like-a-frog-from-50m-distance-in-headlights
leaves on the road.
Then a few days later, do a similar drive, but this time your only task is
to arrive safely. Compare how fatigued you are each time.
The human brain aggressively uses data pruning and routine task automation
because otherwise the cognitive load would be utterly exhausting.
The human brain is also very, very vulnerable to biasing of any kind.
The research into the human brain can be utterly fascinating and at
the same time be very depressing. To use the title of a talk at the
OSCON 2011 conference: "All your brains suck."
Kind regards,
Alex.
--
"Opportunity is missed by most people because it is dressed in overalls and
looks like work." -- Thomas A. Edison