It's not enough for the U.S. military to be able to monitor you from afar. The U.S. Army wants its drones to know you through and through, reports Danger Room, and it is imbuing them with the ability to recognize you in a crowd and even to know what you are thinking and feeling. Like a best friend that at any moment might vaporize you with a Hellfire missile.
Of the handful of contracts the Army just handed out, two are notable for their unique ISR capabilities. One would arm drones with facial recognition software that remembers faces, so targets can't disappear into crowds. The other sounds far more unsettling: a human behavior engine capable of weighing informant tips against intelligence data and other evidence to predict a person's intent. That's right: the act of determining whether you are friend or foe could be turned over to the machines.
That's a bit disquieting whether you are an insurgent or not. But back to the overarching topic at hand: the U.S. military is pulling in more ISR data than it knows what to do with these days, much of it useless noise that's inconsequential to ongoing operations. And, as Danger Room notes, the strategy in Afghanistan has shifted from winning hearts and minds through nation-building projects to targeting specific bad guys.
The hard part is keeping up with the bad guys, and that's where Progeny Systems Corporation's "Long Range, Non-cooperative, Biometric Tagging, Tracking and Location" system comes into play. The facial recognition layer of its technology is pretty standard: take some 2-D pictures of a target's face, use them to build a 3-D model, and then use that 3-D model to recognize the face later.
But that's not necessarily easy. It's difficult enough for computers to pull off biometric facial recognition when the subject is stationary and looking straight at the camera. Toss in the many variables inherent in aerial ISR--a moving target who may be in profile or looking downward, a moving drone, low-resolution cameras, etc.--and it's a major challenge.
Progeny's system, if it works the way the company and the Army envision it, needs just 50 pixels between the target's eyes in a 2-D image to build the 3-D model. "Any pose, any expression, any face," the company's lead biometric researcher tells Danger Room. From that model stored in Progeny's database, the system could identify the target from an even lower resolution image or video.
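The 50-pixel figure works as a simple resolution gate: a frame is only useful for building the 3-D model if the eyes are far enough apart in the 2-D image. The article doesn't describe Progeny's actual implementation, so here is a minimal, hypothetical Python sketch of that gate; the function names and coordinates are invented for illustration.

```python
# Hypothetical sketch of the resolution gate described in the article:
# a 2-D face image is only usable for 3-D model construction if the
# inter-ocular (eye-to-eye) distance is at least ~50 pixels.
import math

MIN_INTEROCULAR_PIXELS = 50  # threshold reported by Danger Room

def interocular_distance(left_eye, right_eye):
    """Euclidean distance in pixels between two (x, y) eye centers."""
    dx = left_eye[0] - right_eye[0]
    dy = left_eye[1] - right_eye[1]
    return math.hypot(dx, dy)

def usable_for_enrollment(left_eye, right_eye):
    """True if the frame's resolution supports building the 3-D model."""
    return interocular_distance(left_eye, right_eye) >= MIN_INTEROCULAR_PIXELS

# A frame with eyes 60 px apart clears the bar; 30 px does not.
print(usable_for_enrollment((100, 200), (160, 200)))  # True
print(usable_for_enrollment((100, 200), (130, 200)))  # False
```

Once a high-enough-resolution frame produces the 3-D model, recognition against later, lower-resolution footage happens downstream of this check.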
The closer the drone is to the subject, the better all of this works. But Progeny also layers in a second kind of recognition that works at more than 750 feet. This "soft biometric" system takes in a bunch of non-facial but outwardly observable data--skin color, height and build, age, gender--to build a broader, whole-body model for its vision algorithms to work with. If a person is moving through a crowd, Progeny claims, a drone circling high overhead can keep track of him or her using this whole-body identification system alone.
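The gist of soft-biometric tracking is that a person is reduced to a coarse feature vector, and the tracker keeps following whichever detection in the next frame looks most like that vector. Progeny's actual algorithm isn't public, so the following Python sketch is purely illustrative: the features, weights, and values are all assumptions.

```python
# Illustrative soft-biometric matching: pick the candidate detection whose
# coarse whole-body profile is closest to the tracked target's profile.
# Feature names, weights, and numbers are invented for this sketch.

def soft_match(target, candidates, weights):
    """Return the index of the candidate with the smallest weighted
    absolute-difference distance from the target profile."""
    def distance(profile):
        return sum(w * abs(profile[k] - target[k]) for k, w in weights.items())
    return min(range(len(candidates)), key=lambda i: distance(candidates[i]))

target = {"height_cm": 178, "build": 0.6, "age": 35}
candidates = [
    {"height_cm": 160, "build": 0.3, "age": 20},
    {"height_cm": 177, "build": 0.6, "age": 33},   # closest to the target
    {"height_cm": 190, "build": 0.9, "age": 50},
]
weights = {"height_cm": 1.0, "build": 10.0, "age": 0.5}
print(soft_match(target, candidates, weights))  # 1
```

The appeal of this kind of matching is exactly what the article describes: it degrades gracefully with distance, since rough height, build, and coloring survive at resolutions where a face does not.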
But what good is tracking if you don't know who your enemies are? Another contract, handed out to Charles River Analytics, seeks to develop a human behavior engine known as Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS). It mashes up all kinds of behavioral data into a system that churns out an assessment of adversarial intent, determining whether a subject has built up enough resentment toward the U.S. and its aims to be a potential threat.
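The article doesn't say how ABACUS actually combines its inputs, but "stacking informant info against intelligence data against other evidence" is, at bottom, an evidence-fusion problem. A standard stand-in for that is naive-Bayes log-odds combination, sketched below in Python; the prior, the evidence items, and the likelihood ratios are all invented for illustration and are not from the contract.

```python
# A minimal, hypothetical sketch of evidence fusion into a single
# "adversarial intent" probability, using naive-Bayes log-odds as a
# stand-in for whatever ABACUS actually does. All numbers are invented.
import math

def intent_probability(prior, likelihood_ratios):
    """Fuse a prior probability with per-evidence likelihood ratios
    (P(evidence | hostile) / P(evidence | not hostile)) via log-odds."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    return 1 / (1 + math.exp(-log_odds))

# Invented example: a weak 5% prior, two suspicious observations
# (informant tip, pattern-of-life match) and one benign one (community tie).
evidence = [4.0, 2.5, 0.5]
p = intent_probability(0.05, evidence)
print(round(p, 3))  # 0.208
```

Even in this toy form, the fragility the commenters worry about below is visible: a single mis-weighted likelihood ratio swings the final probability substantially.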
So pretty soon the drones may know who you are, where you're going, and what you're planning to do when you get there.
From Enemy of the State to Eagle Eye. All we need now is a global military network to centralize the decentralized ISR activities (i.e. Skynet) and we'll be on our way to Terminator Salvation. Add in time travel and we've got the entire franchise, complete with the first three films and the alternate-universe Sarah Connor Chronicles.
Honestly, I have two problems with this type of tech: invasion of privacy (if they are used round the clock in the CONUS), and the surrender of autonomy to machines. We've been operating automobiles and aircraft for over a hundred years now, and without a 100% safety rating on any of these devices, design flaws remain a major contributing factor to accidental deaths. Robotics and computer science are no different.
While we may not have a future of killer cyborgs trying to control or kill us based on a sort of universal logic, system and mechanical error over a widespread area of a given network could produce a similar result.
You gotta keep humans in the loop.
Although I am the first to say that people need to stop preaching that everything is going to be Skynet, I will say this worries me. A tiny mistake in this computer could mean taking out the wrong target, exposing itself if it is undercover, and/or killing an ally, both of which are horrifyingly bad errors that would be all too easy for it to make.
Also, is that a histogram it is using to judge the difference between images? I like to dabble in Photoshop, and many of my pictures have very similar histograms, especially something like a sky picture and a water picture, which are completely different to us but, read like that, would show a nearly unnoticeable difference. Once again, errors could be detrimental.
Also, before people say they could add in a human interface to make sure none of those errors happened: if they did that, then what is the point of the digital one?
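The commenter's histogram worry can be shown concretely: two images with completely different content can have identical intensity histograms, so a histogram alone cannot tell them apart. A toy Python sketch with invented 2x2 "images":

```python
# Toy demonstration that a histogram discards spatial layout: two tiny
# "images" with different pixel arrangements share an identical histogram.
from collections import Counter

sky   = [200, 200, 100, 100]   # bright band over a dark band
water = [100, 200, 100, 200]   # alternating pattern: a different scene

def histogram(pixels):
    """Count how many pixels fall at each intensity value."""
    return Counter(pixels)

print(histogram(sky) == histogram(water))  # True: indistinguishable
```

This is why real systems compare spatial features (edges, landmarks, texture) rather than raw histograms alone.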
The rights of the many outweigh the rights of the few, unless you're one of the super rich.
Unless you are the programmer of this drone and you're super pissed off at your boss, who is one of the super rich.
It is still important to be nice to your neighbors, eh.
lol I got ya bad sci fi movie incoming....angry employee programs robot to go after boss!! LETS MAKE IT BRO!!!!
Now that would be entertaining. But that's the problem with the type of technology outlined in Enemy of the State.
Even with humans in the loop, the rich and powerful have the assets at their fingertips to target whomever they choose.
For a disgruntled employee bred from years of social awkwardness and outcast status, this is the ultimate platform for Revenge of the Nerds: A Vengeance to End All Vengeance.
Hey! That's what you should call it!
YES!!!! Need to call SyFy Channel and pitch it
Whatever gets the job done, I say do it. Yes, we should limit the number of robotic systems, because if we keep pushing the technology further, our troops might rely on the robot instead of themselves to survive. To me that's unacceptable; it's never good to be dependent. You should always be independent.
This is what you get for tagging all those pictures on Facebook.
I guess I should take off the pictures of the tall glass challenge and photos of my ###. The nerds will blackmail me into office style servitude as the official coffee boy.
I just thought of spinoff. Office Space: The Xerox Machine Strikes Back.
Damn my bad grammar.
HA, would be sweet if it transformed and attacked 'em as they walked away!! We could fade in on that scene as the opening, then go back to "6 hours earlier" and show the life of the machine lol
This idea was inspired by the Ninja Killer Copier from the third Transformers movie.
Reminds me of the Minority Report scene where John Anderton has to enter his old job's building and there are cameras everywhere snapping pictures every second.
The people of the world only divide into two kinds, One sort with brains who hold no religion, The other with religion and no brain.
- Abu-al-Ala al-Marri
Please be nice to Aldrons Last Hope. Rumor has it, he has a nuclear powered drone in the sky with Flux Compactor Laser site Fermilab Nuetroning beam connected to the internet and the IBM Computer listening for........ YOU!
Or not, this cannot be confirmed or denied.
Details at 10pm at your local News channel, be there.
Non-Seeps RULE! Give homage to aka Spartacus!
And thanks to the iPhoto/Picasa/Facebook "Faces" applications, we can all contribute to their image database. I recommend using this technology on protesters for effective crowd control. All you need is a well-armed flying Sony heli-camera. Actually, Nikon takes a better picture.
@pheonix1012, yeah, I saw on the news yesterday that there are private investigators now who do data mining on a target via the web. And all pics taken with smartphones have GPS coordinates, so not only will they know what you look like, but also where to find you.
My friend / co-worker had his Facebook deleted by the RCMP because he's in the application process of becoming an officer.
Hopefully, it will recognize our troops and friendly troops, and eliminate friendly fire incidents by drone.
If it walks like a duck, talks like a duck, acts like a duck, looks like a duck, and associates with other ducks, and is in a place that ducks usually frequent, then it is a duck for all practical purposes.
If I were a military sniper, that's what I would think.
Why would you hold drones to a different standard?