Friday, February 01, 2008

Now leaving Kansas

The Defense Advanced Research Projects Agency (Darpa) is researching how computers that read brain waves may one day speed up the way intelligence analysts detect targets in satellite images, and also alert platoon leaders when soldiers are losing situational awareness.

It's science fiction, right? Nope.

Aviation Week reports that with new technology you can know what you don't know you know.

The analyst’s brain is treated as a sensor: Electrical activity it produces is recorded from electrodes placed on the scalp, the same way electroencephalography (EEG) is used in hospitals to monitor brain activity. Then, when the analyst looks at one of the images flashing by, a scalp plot shows when there is increased brain activity.

As images flash by, the analyst is asked to look for a target such as an airplane. After viewing about 50 of the smaller images (chips), he is asked if he saw an airplane—and he may answer “no.” But digital signal processing of the brain wave activity reveals that, in fact, he did see an airplane on slide 32.

“This process allows us to do triage on large amounts of visual information we get from different sources and improve an analyst’s ability to go through a large amount of imagery,” says Smith. In fact, the analyst can do the job 5-7 times faster using the triage system than unaided. This is because the triage system picks up brain waves showing recognition of a target even before the human analyst is cognizant he has spotted it.
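The core of the triage idea, stripped to its simplest form, is: record an EEG epoch for each image chip, and flag any chip whose evoked response stands out from the baseline, whether or not the analyst consciously reports it. Here is a minimal sketch of that flagging step. Everything here is invented for illustration (the synthetic amplitude numbers, the single-threshold rule, the choice of chip 32 as the hidden target); the real Darpa system presumably uses trained classifiers over multi-channel EEG, not a one-line threshold.

```python
def flag_target_chips(epoch_peaks, k=3.0):
    """Return indices of image chips whose evoked-response peak exceeds
    the set's mean by more than k standard deviations."""
    mean = sum(epoch_peaks) / len(epoch_peaks)
    var = sum((p - mean) ** 2 for p in epoch_peaks) / len(epoch_peaks)
    threshold = mean + k * var ** 0.5
    return [i for i, p in enumerate(epoch_peaks) if p > threshold]

# Synthetic peak amplitudes (microvolts) for 50 chips: background activity
# alternating around 2 uV, with one strong response at chip 32 -- the
# airplane the analyst said he never saw.
peaks = [1.9 if i % 2 == 0 else 2.1 for i in range(50)]
peaks[32] = 9.0

print(flag_target_chips(peaks))  # -> [32]
```

The point of the sketch is only that the signal of recognition can be extracted from the recording after the fact, which is why the system can surface targets the analyst would answer "no" about.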

The technology is apparently very useful in situations where an analyst is processing huge quantities of visual data. (I'm wondering whether this has to do with operators staring at video downlinks from UAVs). The article continues, "The NIA project aims to help the intelligence community deal with the growing problem of having an enormous amount of 'visual media' flowing in for review. It is currently taking the analysts too long to turn the data into usable information that can be acted on by decision-makers and war fighters."



6 Comments:

Blogger James Kielland said...

A fascinating article. What's most interesting to me about it is that it marks a reversal. Until now, humans have used computers to do dull data-processing jobs. Now we have computers using human brains to do high-speed data processing.

There was a buzz some years ago about "fuzzy logic" and other ways of enhancing pattern detection in software. As the Darpa process is explored further, it seems likely we'll see experiments subjecting human analysts to ever more data, with the computer finding ever more interesting ways to use human brains to detect patterns in visual data: patterns the computer can't find on its own, and that the human is unaware of detecting.

I'm imagining something like "The Matrix meets A Clockwork Orange."

2/01/2008 04:59:00 PM  
Blogger newscaper said...

Years ago (late 80s/early 90s) there was a classic "gotcha" in attempts to apply AI to this very image-recognition task. In Cold War terms, it was the difficulty of automating the spotting of Soviet tanks in satellite and high-altitude aerial photos.

A system using a [computer based] neural network was designed and then "trained" with a data set containing photos with and without tanks.

The interesting thing about neural networks is that they can "learn" to do things without it at all being clear *how* they do it -- there is no visible, explicit algorithm in the usual sense, rather only self-tuning connection weights between "neurons".

Anyway, at first they seemed to be having great success, well on their way.

When another data set was used for more testing, in preparation for showing off the system to more of the brass, a big "Oh shit" was discovered -- the damned thing seemed to no longer work.

Turns out, the NN was not recognizing tanks at all, but rather cloudy days -- all the tank photos in the training set had clouds and those w/o were clear.

The net correctly learned the wrong thing.
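The failure above can be reproduced in a few lines: give a classifier only one feature (average image brightness) and a training set where that feature happens to correlate perfectly with the label, and it will separate the training data flawlessly while learning nothing about tanks. The data, the single brightness feature, and the nearest-centroid rule here are all invented for illustration; the original system was a neural network on real photos, but the mechanism of the mistake is the same.

```python
def train_brightness_classifier(samples):
    """samples: list of (brightness, has_tank) pairs. Learns a midpoint
    threshold between the two class means (nearest-centroid, 1 feature)."""
    tank = [b for b, t in samples if t]
    no_tank = [b for b, t in samples if not t]
    midpoint = (sum(tank) / len(tank) + sum(no_tank) / len(no_tank)) / 2
    tank_side_low = sum(tank) / len(tank) < sum(no_tank) / len(no_tank)

    def predict(brightness):
        # "Tank" is whichever side of the midpoint the tank examples fell on.
        return (brightness < midpoint) if tank_side_low else (brightness >= midpoint)

    return predict

# Confounded training set: every tank photo is dark (cloudy day),
# every no-tank photo is bright (clear day).
train = [(0.2, True), (0.3, True), (0.25, True),
         (0.8, False), (0.7, False), (0.75, False)]
predict = train_brightness_classifier(train)

print(all(predict(b) == t for b, t in train))  # -> True: perfect on training data
print(predict(0.8))  # -> False: a tank photographed on a clear day is missed
```

The classifier "works" right up until the confound breaks, which is exactly what happened when the second, less cloudy data set arrived.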

2/01/2008 06:19:00 PM  
Blogger Wretchard said...

There was a guy I listened to who was designing a neural net to detect network intrusions. The idea was that his neural net would "learn" what his network normally looked like and hence recognize an intruder when it showed up.

During the Q&A I asked what would happen if the intruder used another neural learning system to learn what his inbound network traffic looked like so that it could be impersonated. Interesting times we live in.
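The objection is easy to make concrete. An anomaly detector that learns a statistical baseline of normal traffic flags only what deviates from that baseline; an intruder who has learned the same baseline and shapes his traffic to match it never deviates, so he is never flagged. The packets-per-second feature, the numbers, and the simple mean-plus-k-sigma rule below are invented for illustration, not taken from the talk.

```python
def learn_baseline(rates):
    """Learn mean and standard deviation of normal traffic rates."""
    mean = sum(rates) / len(rates)
    std = (sum((r - mean) ** 2 for r in rates) / len(rates)) ** 0.5
    return mean, std

def is_anomalous(rate, mean, std, k=3.0):
    return abs(rate - mean) > k * std

# Baseline learned from normal traffic (packets per second).
normal = [100, 102, 98, 101, 99, 100, 103, 97]
mean, std = learn_baseline(normal)

print(is_anomalous(500, mean, std))  # -> True: a crude scan stands out
print(is_anomalous(101, mean, std))  # -> False: mimicked traffic sails through
```

Any detector defined purely by "distance from normal" inherits this blind spot: whoever can measure "normal" can impersonate it.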

2/01/2008 06:25:00 PM  
Blogger Mad Fiddler said...

My Neural Network
can beat up
YOUR Neural Network.

2/01/2008 11:57:00 PM  
Blogger Mad Fiddler said...

I recommend that anyone interested in Artificial Intelligence check out www.kurzweilai.net

2/02/2008 12:05:00 AM  
Blogger Mad Fiddler said...

Dear James K,

As an animation designer/producer I made the transition from traditional methods --- pencil drawings traced to acetate cels, painted on the reverse side, then photographed with a motion picture camera --- to electronic graphics in the mid-1980's.

Becoming increasingly immersed in computer graphics for entertainment, interactive arcade and CD games, and Flash-based web entertainment, I found myself repeatedly doing "pixel surgery" along with the other artists and animators on the teams.

That is, we human artists were repeatedly used to clean up, correct and enhance the raw output of the computer software. Imagine, for instance, what happens to text when it's scaled to a height of only four pixels. Each letter becomes an unreadable jumble. Making a phrase of such crumbled lettering legible could be an all-day task. (Atari arcade games in the 1990's had a standard screen resolution of only 320 by 240 pixels.)

Consider the thousands of hours anyone has to spend to become and stay fluent with a proprietary operating system, its file management procedures, and any software applications used to produce income.

Consider the plague of repetitive use injuries to the wrists of computer slaves.

To a very great extent you are right in saying we have been using computers to do a lot of dull data processing.

On the other hand, throughout history humans have repeatedly been wakened from their fantasies of controlling their tools to shape their world, to the nightmarish reality that in fact the tools are re-shaping us to conform to their needs.

2/02/2008 12:53:00 AM  
