So near yet so far
Here's another phrase from the Vietnam era: Death From Above. The Wall Street Journal describes its modern-day incarnation.
The sniper never knew what hit him. The Marines patrolling the street below were taking fire, but did not have a clear shot at the third-story window that the sniper was shooting from. They were pinned down and called for reinforcements.
Help came from a Predator drone circling the skies 20 miles away. As the unmanned plane closed in, the infrared camera underneath its nose picked up the muzzle flashes from the window. The sniper was still firing when the Predator's 100-pound Hellfire missile came through the window and eliminated the threat.
Although the focus of stories like this is often on technology, there are two aspects of UAV warfare that are often ignored. The first is the human factor. UAV combat may change the culture of the Air Force away from the "live in fame or go down in flames" ethos. Or maybe not. The WSJ article describes the former F-16 pilot who now commands the wing that killed the sniper:
The airman who fired that missile was 8,000 miles away, here at Creech Air Force Base, home of the 432nd air wing. ... Col. Chris Chambliss, 49, was an F-16 pilot for 20 years before being tapped as the 432nd's first wing commander. He can tell you -- to the day -- the last time he flew an F-16 (March 29, 2007), but he insists he has no regrets about giving up his cockpit for the earthbound GCS of the Predator and its big sibling, the Reaper. "It's much more fun," Col. Chambliss admits, "to climb up a ladder and strap on an airplane than it is to walk into a GCS and sit down." But the payoff comes, he contends, in far greater effectiveness "in the fight."
But some bean-counter reading that passage might ask himself whether there is any justification for learning to fly an F-16 before strapping on a joystick. That's a question which may have ramifications for Air Force culture.
The second factor is that with UAV warfare, air superiority must implicitly be global. It is not sufficient to have local dominance: the US must control the bandwidth from end to end. Without a support infrastructure resting on theater-wide, and possibly worldwide, dominance, the little UAVs cannot be fought from 8,000 miles away.
Taken together, this means that airpower's job has simultaneously become easier and harder. Easier in that the talent pool of joystick jockeys has increased by an order of magnitude. Harder because the line of information running through the ether between the tip of the spear and its virtual controller has become the thread on which battlefield success hangs.
42 Comments:
Perhaps UAVs will bring a new perspective to the USAF.
The whole time I was in the Pentagon one insurmountable problem was that the “flying” side of the force could not understand satellites. Some actual examples:
“Once you've launched one, why do you have to control it? After all, it is not like it is going to fall out of the air, is it? So we can cut back our satellite control capabilities and spend the money on things like airplanes and jet fuel.”
“We need to treat satellites like airplanes. After all, we are the AIR Force. And that means we have to maintain the ability to modify satellites, including AFTER they are launched. What! You can’t do that? Well, then you can’t modify your satellites BEFORE they are launched, either!”
“$350M on a joint mission with NASA to develop new space qualified electronics? What an outrage! Do you realize we could operate a wing of F-15’s for a year for that cost?”
And from the CSAF himself: “Why do we need satellites to spot missiles being launched at us? After all, we can’t shoot them down anyway, so it is just a matter of getting advance knowledge that we are about to die, isn’t it?”
Ironically, UAVs with the capabilities of the Predator (the first new weapon in 60 years to be powered by a piston engine) were made possible by satellites. It was not the airframes, powerplants, or even the avionics that were the key.
Airpower’s biggest challenge today is defining itself.
Technological superiority in the kill-zone is nice.
But, the U.S. also needs to inflict serious and large losses on the enemy where it hurts most -- in the culture wars and by way of mockery of Islam and the ways of life of its devout adherents. They can't stand cartoons? Well, here's some more!!
Richard, you had written a great piece a few weeks ago on this.
RWE...
It's not just the US Air Force's problem. It's the Western World's problem as well.
We have worked for some years on the idea of high-altitude orbiting aircraft replacements for satellites in case of a loss of the link. Relays have been accomplished by aircraft just as effectively as with sats. In the short term there might be an outage, but for Iraq, it should be very short.
The other part of this culture clash is Air Force vs. Army. Army operates a lot of drones too, but they use warrant officers, not commissioned officers.
A buddy from the Navy tells me that the USAF is composed of pilots and serfs.
Wretchard: Harder because the line of information running through the ether between the tip of the spear and its virtual controller has become the thread on which battlefield success hangs.
I "know someone" who rips off movies with BitTorrent. This is a technology in which someone who has a file (or a handful of someones with the same file) distributes random pieces of it to a swarm of downloaders. After a while the entire file has been uploaded into the swarm, and the original seeder can even log off the net. The downloaders don't have to remain connected to get the movie; they can come in and out of the swarm as they will, and the software keeps track of what pieces they still need and waits for someone in the swarm to randomly offer them up. At the same time, the software is offering random pieces of what they have already downloaded. Eventually everyone gets the movie. The military could use torrents to issue battle orders or distribute intelligence, and even if large chunks of the swarm were taken out by the enemy (electronically or by hard kill), the information would continue to flow reliably.
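A toy simulation makes the swarm dynamic concrete. The sketch below is purely illustrative (the `swarm_simulation` function, piece counts, and round structure are all invented for this example, not drawn from any real BitTorrent client): a seeder pushes random pieces into a swarm, logs off, and the peers still complete the file by trading among themselves.

```python
import random

def swarm_simulation(num_pieces=16, num_peers=5, seed_online_rounds=4, rng=None):
    """Toy model of torrent-style distribution: a seeder uploads random
    pieces into a swarm, departs, and peers trade until all are complete."""
    rng = rng or random.Random(0)
    peers = [set() for _ in range(num_peers)]  # pieces each peer holds
    all_pieces = set(range(num_pieces))

    # Phase 1: the seeder hands out random pieces, then logs off.
    for _ in range(seed_online_rounds):
        for p in peers:
            p.add(rng.randrange(num_pieces))
    # Make sure every piece exists somewhere in the swarm before the
    # seeder leaves; otherwise the file could never be completed.
    for piece in all_pieces - set().union(*peers):
        rng.choice(peers).add(piece)

    # Phase 2: peers trade random useful pieces until everyone is done.
    rounds = 0
    while any(p != all_pieces for p in peers):
        giver, taker = rng.sample(range(num_peers), 2)
        useful = peers[giver] - peers[taker]
        if useful:
            peers[taker].add(rng.choice(sorted(useful)))
        rounds += 1
    return rounds
```

With a fixed random seed the run is deterministic; the point is that completion never depends on the seeder (or any single peer) staying online.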
The Scramjet, sez the History Channel, will go coast to coast USA in 20 minutes.
And there's people alive today who looked at a brand new Sopwith Camel and said "Gee Whiz!"
OK. I give up. Why do we have an Air Force as an institution separate from the Army, but not including Naval Aviation or Marine Aviation? Land-based ICBMs are part of the Air Force, but submarine-based missiles are part of the Navy.
The historically based irrationalities in US force structure go on and on. I assume they need to be revisited sooner or later.
A second point. The UAV has the power to revolutionize air warfare. The ground-attack capabilities are just one point. Imagine UAV fighters, unconstrained by life-support systems and fragile human body parts. They would be stealthier and able to outmaneuver manned aircraft, as well as not needing to come home for a night's sleep.
Fat Man -- there are limits to UAVs. One of them is the lengthening of the OODA loop, introduced by the physics of radio. Even several seconds of delay (and it will likely be longer, as anyone who has made a trans-Pacific or trans-Atlantic telephone call can attest) can be critical.
For such things as bomber escort or dogfighting, manned aircraft, with the ability of highly trained pilots to improvise using their extensive knowledge of what their aircraft can do and the enemy's cannot, have no substitute.
As for the Air Force, most armed forces around the globe have their air wings under the Air Force command. Missiles are sometimes given to the Army or a separate command. We have Naval and Marine aviation because the type of flying (off aircraft carriers or small airfields) in close support of naval/marine expeditionary units, is not something suited to the Air Force.
The Air Force's mission is: 1. Dominance over enemy fighters, i.e. killing them. 2. Long range bomber escorts, so the bomber hits the target. 3. Air Defense suppression and/or elimination.
These tasks are quite difficult, require a dedicated organization with not much effort on much else, and are not suited to Marine and Naval combat aviation.
The Air Force is a "global" operation with B-1s and B-2's able to hit pretty much anywhere on the globe, but often requiring Air Force fighter assets to perform escorts and/or ground-based SAM suppression. Often these are tedious/dangerous missions requiring in some cases 26 hours or more operational duration for the air crews involved.
Marines and the Navy need air support right now! and need those assets close. So ships don't get sunk or expeditionary forces wiped out. It's worth noting that the Japanese Navy and Army Air Forces remained separate during the War. The division makes sense.
I guess the Terminator was a few decades off in claiming that Skynet went on-line on Monday, August 4th, 1997 and became self aware at 2:14 a.m. August 29th, 1997.
Perhaps the USAF could let the UAV operators wear white scarves, set fans atop the monitors to provide a breeze and spray a little grit...and maybe put the computer desk chair on looped rails to permit a victory roll...
They say that generals prepare to fight the last war but the pace of technology means that they are now mentally preparing to fight the war before last or the one before that.
Agree w/Whiskey:
Diversity in our fighting forces is as important as biodiversity is to our food supply, mental, and cultural health.
Who would argue that our own Louisiana Hexadecaroon, Buddy Larsen, does not add both levity and gravitas to the Belmont Warfighting dialog?
(sucking up for my BC share of the distributed profits of them Oil Futures)
Whiskey_199 said:
"there are limits to UAVs. One of them is the lengthening of the OODA loop, introduced by the physics of radio. Even several seconds of delay (and it will likely be longer, as anyone who has made a trans-Pacific or trans-Atlantic telephone call can attest) can be critical."
This criticism is true for the current generation of UAVs, which are radio-controlled puppets. However, if enough computing power is put on board the UAV, it can become partially autonomous. For example, the human operator designates another aircraft as a target to the UAV's autopilot and then tells the autopilot to kill it. The autopilot then autonomously dogfights until the target is destroyed. In that situation, the autopilot's reaction time is on the order of milliseconds, versus a human pilot's reaction time of tenths of a second. For that scenario the UAV is well inside the other guy's OODA loop.

It doesn't stop with UAVs. The same technology could be used with a tank. Most of a tank's mass is armor to protect its crew; get rid of the crew and you don't need the armor. Imagine a tank that is only a truck chassis on six monster tires, a modest-sized diesel engine, a big smoothbore self-loading gun and a couple of Gatling guns. Load the thing up with lots of optical sensors, a Beowulf cluster with a few thousand processors and a satellite link. You could probably build ten of those computer-controlled tanks for the cost of one Abrams.
Cost is the thing that is really revolutionary about these weapons, i.e. how many Predators can you build for the cost of one F-18? Of course the ultimate advantage is that UAVs and automated tanks are fearless. Ultimately this is going to make war even more nasty. The guys on the receiving end will still be fighting for their lives against a remorseless enemy, while the guys on the giving end will be playing a glorified video game in perfect safety.
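The OODA-loop penalty of remote control is easy to put rough numbers on. The back-of-envelope sketch below is an illustration only (it ignores slant range, ground processing, and terrestrial backhaul, all of which add more delay); it shows why a satellite-relayed control loop is hundreds of times slower than an onboard autopilot's millisecond reactions.

```python
C = 299_792.458   # speed of light in vacuum, km/s
GEO_ALT = 35_786  # geostationary orbit altitude, km

# One relay hop: the signal climbs to GEO and comes back down.
one_hop = 2 * GEO_ALT / C
# A full "see, then steer" loop: sensor video down to the operator,
# then the operator's command back up to the aircraft.
control_loop = 2 * one_hop

print(f"one relay hop:       {one_hop * 1000:.0f} ms")
print(f"see-then-steer loop: {control_loop * 1000:.0f} ms")
```

That is roughly a quarter second per hop and nearly half a second for the full loop before any processing delay is added; an autopilot closing the loop locally pays none of it.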
" Cost is the thing that is really revolutionary about these weapons, i.e. how many Predators can you build for the cost of one F-18? "
---
Also the vastly reduced lead time to production.
(and the greater diversity of platforms made possible by the lower cost of design, development, and production.)
Although the focus of stories like this is often on technology, there are two aspects of UAV warfare that are often ignored. The first is the human factor.
I would have thought the "human factor" would also include Muslim and Arab snipers. If these guys wanted to suicide it'd be a lot easier to strap on a belt and drive at a Marine roadblock. But the sniper in question is up where he thinks he's more or less safe, happily shooting at American soldiers pinned down beneath him, when suddenly the hand of God (or Allah) whooshes down and smites the hell out of him.
There would have to be some psychological questioning going on in the minds of jihadists as to why this is happening, and even how it is happening. If the American military itself struggles with the concept of satellites, why would an al-Qaeda sniper understand it? Let's just hope the sniper in question had lots of little buddies standing by around the corner who saw what happened and lived to tell the tale to others in their cadre.
One question here. Didn't anyone see the Terminator movies? We really want to have "free-thinking" battle droids roaming the battlefield? How about that Star Trek episode where the Enterprise was fully "automated" and the machine decided that the other ships in the fleet were a threat and proceeded to destroy them one by one until Kirk and the boys were able to pull the plug. We need to be careful as we proceed into this future.
Tarnsman: Asimov's three laws of robotics.
"1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law."
We just need to program them to understand that jihadists are not human, and that anyone shooting at an American soldier must be a jihadist.
tarnsman: One question here. Didn't anyone see the Terminator movies? We really want to have "free-thinking" battle droids roaming the battlefield?
Artificial intelligence involving self-awareness is not even theoretically possible. See:
Wikipedia: Chinese_room
Tarnsman said:
"One question here. Didn't anyone see the Terminator movies? We really want to have "free-thinking" battle droids roaming the battlefield?"
The deeper question: Is Artificial Intelligence (AI) as in self aware machines really a good idea?
Long before there are machines with "Terminator" level intelligence roaming battlefields there will be a "Colossus: The Forbin Project" type Beowulf cluster in the basement of some NSA building. This is actually much more scary.
A million years ago in another life, I managed a "supercomputer" facility at an Australian university. The machine was an old Cray mainframe. During a conversation with some Cray technicians, I asked if my machine was the most powerful in Australia. I was told that the Australian Department of Defence had a machine in Canberra that made mine look like a pocket calculator. This surprised me, since the typical military applications for Crays were either nuclear weapons work or code-breaking. So I asked if the Australians were quietly doing some nuke or code-breaking work. Oddly enough, I was told that the machine was used for neither. Instead, it was for economic warfare: the idea was to simulate the economies of potential adversary states and identify weak links in their economic systems, so that in the event of actual war those weak links could be knocked out with minimal force.

At this point I should emphasize that the power of that old Cray would be comparable to a modern high-end server costing about $2000. The American national labs and the NSA have computer clusters with thousands of nodes, each having the power of one high-end server. These clusters have almost the same level of connectivity and parallelism as a higher-order organism, i.e. a cat or dog. We're within a decade of the NSA having a machine with the same compute power as a human brain (of course, no one knows how to program a computer to think like a human brain).
It would not surprise me for an instant if the US were running economic warfare codes in the really big computer clusters at the national labs.
So what happens when an economic warfare computer becomes self aware? That's in the same category of nightmare as: What happens when Pakistan or Saudi Arabia have access to biotechnology that enables them to construct "designer viruses"? This sort of nastiness is probably within a decade of happening.
Who knows? Maybe the sub-prime mortgage fiasco wasn't simple stupidity but rather someone's economic warfare attack against the US?
Katchoo said:
"Artificial intelligence involving self-awareness is not even theoretically possible."
IMHO, these philosophical arguments concerning artificial intelligence tend to be sophomoric garbage, i.e. useless language games. I predict that artificial intelligence will be discovered by accident. Someday there will exist a computer cluster with the same level of connectivity as a human brain. Then someone will run a self modifying program that performs some complicated nonlinear task, e.g. economic warfare. The program will keep modifying itself so it can better perform its designed task. Initially it will be autistic/idiot-savant but then one day that program will "wake up".
Cool technology indeed.
Predator:
Purchase cost for system: $40,000,000 (1997 dollars; includes 4 aircraft, ground control stations, and the Predator Primary Satellite Link)
Running cost: ~$500 per hour (probably not including the indirect slice: personnel, airfields, etc.)
Hellfire missile: $68,000

Iraqi sniper:
Cost of sniper: FREE
"I love Osama" t-shirt: $10
Ticket from Amman to Baghdad: $50
Bribe for border guard: $10
Used SVD sniper rifle: $250
Ammo for sniper training: $50
Running costs of sniper: $100 per month
And that is how we are winning, ain't technology grand?
At first glance, I read your screen name as "Yellow Peacekeeper". My subconscious must have gleaned immediately what your moral intent is.
Eggplant: Someday there will exist a computer cluster with the same level of connectivity as a human brain.
The entire internet including all the connected computers probably has far more connectivity than a human brain, right now. But it was never born as we, never grew up as we, so it can never think as we. Sheer connectivity is not the answer. The intelligence must arise within a body that can feel joy and pain, otherwise it will never truly be able to use the formal "I" except as a charade to make communicating with it easier.
Lucky Pierre said:
"Sheer connectivity is not the answer. The intelligence must arise within a body that can feel joy and pain, otherwise it will never truly be able to use the formal "I" except as a charade to make communicating with it easier."
I agree. The first self aware computer will need the connectivity -and- a self modifying learning algorithm.
Keep in mind, the algorithm for your consciousness is stored in your DNA. That algorithm wasn't designed but evolved through the process of natural selection. The vast bulk of your DNA is the same as the DNA in a mouse. This is a huge clue that the basic algorithm leading to consciousness is not that complicated.
The analogy is Conway's Game of Life (check out the Wikipedia article). The algorithm behind the Game of Life is trivial, but it is capable of enormous complexity. I might add in passing that the process behind the physical laws of the universe might(?) also be similar to Conway's Game of Life. Who knows, maybe it's the same algorithm that creates consciousness?
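How trivial the rule really is can be shown directly. A minimal sketch (the `life_step` helper and the set-of-live-cells representation are my own framing for illustration, not from any particular source) implements one full generation of Conway's Game of Life in about a dozen lines:

```python
from collections import Counter

def life_step(live):
    """Advance one generation of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of living cells."""
    # Count how many live neighbours every candidate cell has.
    neighbours = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbours; survival on 2 or 3.
    return {cell for cell, n in neighbours.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker": three cells in a row oscillate between a horizontal
# and a vertical bar with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
```

Applying `life_step` to the blinker flips the bar vertical; applying it twice returns the original pattern. Everything from gliders to self-replicating patterns emerges from that one rule.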
eggplant said:
So what happens when an economic warfare computer becomes self aware? That's in the same category of nightmare as: What happens when Pakistan or Saudi Arabia have access to biotechnology that enables them to construct "designer viruses"? This sort of nastiness is probably within a decade of happening.
Countries like Saudi Arabia, Pakistan and Bangladesh are far more vulnerable to designer microbes than Western countries. The West has a chance at defending itself from a bio-attack from the Muslim world. It is impossible to defend shanty-towns from a bio-attack by a vigilante cell in the Idaho panhandle. I say to the Muslim countries, "go ahead, make my day!"
There is no reason to believe that human consciousness is not just one of many possible types and intensities of self-awareness/ self-modulation/ self-programing possibilities. A conscious computer could be a very strange thing indeed.
The Chinese Room thing has engendered an incredible amount of back-and-forth. To my way of thinking, he is making an argument about soul that is far beyond what Turing was saying. If it looks like a sentient being and acts like a sentient being, then we are just going to have to treat it that way, with the caveat I describe in the first paragraph.
As far as the cost issue goes, I don't think it's a crime to mention the disparity between our investment and that of the AIF. Force projection is a very expensive deal, which is good, because we are the only ones who can reliably do it. Notwithstanding our immense economic capacity, I am a little concerned that we are developing a UAV capacity that will eventually allow our competitors to afford the same kinds of toys. We should be directing the warfare of the future into more expensive channels, where few can follow. Anything that empowers third-world militaries is not to our advantage.
This supercomputer modeling of economic battlegrounds is very interesting. If such reports are true, there should by now be a long-standing economic cold war, with arm-wrestling over essential commodities going on all the time. I know that counterfeiting has been used as a weapon against us. Maybe there is something to the cyberwar hype as well. It seems, perhaps incorrectly, that oil is the only thing that really matters today. Megan McArdle was discussing discrepancies in the demand vs. production analysis of oil. Someone suggested that there was oil hoarding going on in the Gulf. Would that tell you something about an impending invasion, or maybe about concerns over the PRC? Too much data! If the US government is really housing all these great secrets, there's no way to understand anything that's happening.
Cost of sniper FREE.
Creating insurgents is an expensive business. If they were free there would be so many more of them. The economics of terrorist warfare are well understood because terrorism is an explicit tool used by many countries throughout the world. Suicide bombers are especially expensive because there are long lead times for recruitment.
However, like the US military budget, many of these expenses are concealed as other costs: funding madrassas, scholarships, payouts for families. Patronage on a vast scale. Saddam, for example, had scholarships for "promising" fighters. I'm sure Syria and Iran do too.
But the biggest cost of all is the implicit one of destroying the society that produces the sniper. That's why Gaza, for example, is a shithole. Creating a society which can produce a "free sniper" is paid for by making the society more or less dysfunctional. For that reason, societies which produce these "low cost" weapons are normally starving, backward and without the slightest prospect of progress.
The Terminator scenario is really a little too paranoid. There is no reason to believe that such a self-aware computer system would have any particular emotional attachment to its self-awareness -- no reason to hide it, promote it, or act on it. It took a couple billion years to program the combined self-preservational, goal-directed self-awareness that characterizes humans.
Cost of sniper FREE.
Also, most of the cost of the US systems are the result of trying to avoid collateral damage. If sheer damage were the objective, nuclear weapons are cheap.
I once remarked about the relative costs of the terrorist way of warfare and that of civilization in a discussion with a friend. I claimed that there was a clear winning strategy for either side of the conflict in the Middle East. And this really reflects the relative costs of the respective methods of warfare.
The Arab countries could definitely win against Israel by converting to Judaism or Christianity for a few decades. Within that period, their vast mineral wealth and populations would create societies so prosperous and advanced that there would be almost no point in crushing Israel. This is analogous to what China did in making the cultural conversion from state economics to market economics.
The winning strategy for Israel, on the other hand, would be to convert to Islam for a day. The entire IDF general staff could appear on TV reciting the Shahada and then treat their enemies according to their new religion, free of all the expensive constraints of morality, the avoidance of collateral damage, etc. Nuclear weapons themselves are cheap. It is the social cost of using weapons like the "free sniper" that bankrupts the user.
Wretchard:
A couple weeks ago, I was listening to a couple moonbats loudly pontificating (for obvious reasons, I kept my mouth shut).
One of the moonbats said:
"War is terrorism. The United States is involved in war. Therefore the United States practices terrorism and is morally no superior to al Qaeda."
I know how I would analyse the above but would be interested in reading your analysis.
Currently, humans are the choke point for aircraft performance. We can build UAV fighters that can maneuver at speeds and turn rates that would kill a pilot.
I suspect that will compensate for any lag in control links.
I doubt we'll see any conventional fighters 20 years from now, at least in the USA.
Expect the USAF to try to cling to them, though.
The first of the “modern” drone conversions was the PQM-102A. F-102A’s were converted to drones and used for target practice, mainly for weapons systems development work. And we found we had a problem. They were too hard to shoot down.
The F-102 was always a good turner, and shorn of all that heavy 1950’s electronics and with no pilot on board, it gave the new F-15’s a real run for their money. The problem was that spare parts had not been purchased in the required quantity for drones that lasted that long. I have long wondered why we did not plan to pack those things full of explosives and use F-111 control airplanes to dive them into high value targets, much like Project Aphrodite did in WWII.
By the way, the F-102 fighter interceptor was equipped with a SAGE intercept capability, which enabled a ground controller to take control of the airplane remotely and guide it to the intercept point without the pilot doing the flying. The pilots did not like that feature; aside from keeping them from doing much of the flying it tended to make them airsick.
But there is a big difference between the Predator UAVs and the PQM-102 drones. The UAVs are not drones, not simply remotely piloted aircraft. They do their own piloting; there is no “give it some more aileron and feed in a little more rudder…” The “pilots” on the ground are more like the old Observers of WWI or the Radar Intercept Officers of later years - looking around and telling the “pilot” – which is the airplane itself – what to do next. It’s not yank and bank, not even remotely. It’s more like point and click.
The machines are already autonomous in terms of “housekeeping” chores.
It’s a pretty small step to go to the next level, “R2D2, keep that Mig off my tail while R2D3 and I go take out the primary target.”
whiskey_199: "Fat Man -- there are limits to UAVs. One of them is the lengthening of the OODA loop, introduced by the physics of radio."
If you hypothesize that the controller must be in the Continental US, you will have some lag time in the control of the UAV. However, there is no reason why the controller cannot be in theater with the aircraft. Nor is there any requirement for the electronic link to be via satellite.
Imagine, if you will, a big airplane orbiting the action at a good remove from enemy air defenses. Like the E-8 JSTARS, it is based on a civilian airframe, is loaded with electronics and has the fuel to stay on station for 18 to 24 hours. At a cruising altitude of 45,000 feet it has a line of sight to a horizon of roughly 270 mi. It could carry several dozen UAV controllers.
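That horizon figure checks out to first order. Distance to the horizon grows as the square root of altitude, d ≈ √(2kRh); the small sketch below (the constants and the `horizon_miles` helper are mine, for illustration) puts the geometric horizon at 45,000 ft near 260 statute miles, and the conventional 4/3-earth radio horizon near 300, bracketing the 270 mi quoted above.

```python
import math

R_EARTH_KM = 6371.0        # mean Earth radius, km
FT_TO_KM = 0.3048 / 1000
KM_TO_MI = 0.621371

def horizon_miles(altitude_ft, k=1.0):
    """Line-of-sight distance to the horizon, d = sqrt(2 * k * R * h),
    in statute miles. k=1 is the geometric horizon; k=4/3 is the usual
    approximation for radio-wave refraction."""
    h_km = altitude_ft * FT_TO_KM
    return math.sqrt(2 * k * R_EARTH_KM * h_km) * KM_TO_MI

print(f"geometric horizon at 45,000 ft: {horizon_miles(45_000):.0f} mi")
print(f"radio horizon at 45,000 ft:     {horizon_miles(45_000, 4/3):.0f} mi")
```

The same formula explains why a high orbiting relay sees so much battlespace: doubling altitude buys roughly 40% more horizon, not double.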
They could be on ships off shore, etc. etc.
Instead of having fighter escorts for bombers, the bomber may act as a mother ship for fighter UAVs.
No matter how you slice and dice it, having the pilot in an environment where he is not subjected to high G-forces and where his aircraft does not have to carry life support systems is a win win.
I agree with you that:
"the ability of highly trained pilots to improvise, using their extensive knowledge of what their aircraft can do and the enemies cannot, have no substitutes."
I just don't think that is necessary to have the pilot sitting in the airplane.
"War is terrorism. The United States is involved in war. Therefore the United States practices terrorism and is morally no superior to al Qaeda."
If all wars are terrorism, then all wars are equal. There's no difference between the terrorism of Hitler and the terrorism of the USAAF bombing German cities.
Therefore it all comes down to which side one is on, assuming all war is bad. But nowadays, there is little identification with "side" and people feel free to choose whether to support their nominal country depending on personal preference.
How about controlling UAVs from an AWACS? They're already there tracking everything. Just issue a command to a UAV to go look at target A or kill target B.
You could have a variety of specialized UAVs -- sensor platforms (radar and optical), missile carriers, dogfighters, bombers, etc.
If they are mostly autonomous, you might even be able to control them from the back seat of a (hypothetical) 2 seat F22, so that you'd have one or two aircraft with human commanders and a whole bunch of robot wingmen.
Interesting side discussion in this topic about the possibility of artificial intelligence.
For those who don't believe it's possible (Teresita, Katchoo... assuming that they're actually different people), allow me to submit the following quotes:
________________________________
What can be more palpably absurd than the prospect held out of locomotives traveling twice as fast as stagecoaches?
--Quarterly Review 1825
This 'telephone' has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.
--Western Union memo 1876
Everything that can be invented has been invented.
--Charles Duell, U.S. Commissioner of Patents, 1899
The most important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplemented by new discoveries is exceedingly remote.
--Albert Michelson 1903
Who the hell wants to hear actors talk?
--H M Warner of Warner Brothers 1927
I think there is a world market for maybe five computers.
--Thomas Watson, Chairman of IBM, 1943
It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years.
--John von Neumann, 1949
There is no reason anyone would want a computer in their home.
--Ken Olson, President of DEC, 1977
640K ought to be enough for anybody.
--Bill Gates, 1981
__________________________________
Every time I hear about the anthropomorphization of robots and computers, I cringe. There are already people in the media calling for equal rights for robots if AI is ever developed: quislings and useful idiots who will fight for the rights of a new species with the potential to replace us.
We cheerfully go along, building our own gallows.
Asimov's Three Laws of Robotics? Ridiculous. We have pimply-faced little bedwetters creating computer viruses for the sheer enjoyment of it now.
Butlerian Jihad, anyone?
KAC: Yes, I think that is pretty much the way we are headed. One or two manned aircraft could control a number of UAVs on the same mission. And since the UAV is handling the details of how to fly itself, the control link will be limited to telling it what target to go after next. Combined GPS and inertial systems have reduced navigation by such aircraft to a mere triviality, so herding them and rounding up the strays will not be a problem.
The RF link is important only in getting the data back, and that is hard only if you are sending lots of video and controlling cameras and weapons directly. Even then, we have had cameras with lock-on capability for many years now and very few weapons anymore are anything but “fire and forget.” It is not as though the UAV pilots will have to keep swiveling the camera and then yell “Look out Blue 4, you’ve got one on your tail!” Blue 4 is going to know about that MiG even before its human controller does and will know what to do about it as well.
What is going to be especially interesting is when the UAV idea is applied to things other than aircraft. The US Army already has an objective to automate most of its trucks, but I have heard nothing about doing that with combat vehicles. As for the Navy, as one OSD official said after going through a sub attack run at an underwater range, “Why do you need 3 people to drive something that is only going 30 MPH?” The Air Force has missed many opportunities for UAVs and PGMs since 1945 but is way ahead of the USA and USN.
Wretchard: I think the “equivalency” concept came out of the Cold War, in which the avant-garde viewpoint was that there was no real conflict but just two competing and equal systems blindly crushing the hapless individual underfoot. And that is a terribly convenient viewpoint if you don’t want to be inconvenienced in your personal life by recognizing that the two systems are very different.
As others have noted, the satellite data-link vulnerability of UAVs is only relevant when they are being controlled from the US. When you're worried about sats being shot down, you can control from an AWACS or JSTARS aircraft, or just have some special comm-relay UAVs that can enable communications with ground controllers closer to the action.
Controlling from the US has a big advantage when your data link is safe: your pilot can go home to his family at the end of a shift, which improves retention of expensively trained personnel over time.
The use of UAVs also allows you to have planes controlled by entry-level pilots while looking for targets, with the ability to do an immediate handover to advanced-level pilots if they get into trouble, or to allow an advanced pilot to command a large squadron of planes, keep awareness of what each plane is up to, and give advice to their pilots as needed.
Republicans try to sneak through a new amnesty bill
The reason the Republicans lost in Mississippi is that they are still trying to sneak illegal aliens into the country with stuff buried in bills going through Congress. Consider this law with a little clause forbidding employers from checking the status of illegals. That's just the start:
Section 101(b)(2)(A), which reduced to simple language would preempt and ban any and all state or local law for immigration-related issues enacted to impose employer fines or sanctions, or would forbid any laws requiring employers to verify work status or identity for work authorization. It would also prevent any unit of government from verifying status of renters, determining eligibility for receipt of benefits, enrollment in school, obtaining a business or other license, or conducting a background check.
This preemption, buried deep in the text of the bill, would kill all the laws recently enacted by long-suffering states and localities in response to the federal government's unwillingness to enforce its own federal laws on immigration. Gone would be the recent highly effective and highly successful enforcement legislation of Arizona and Oklahoma, the local laws and ordinances of towns like Hazleton, PA, Costa Mesa, CA, Herndon and Prince William, Virginia, and over a hundred other localities, and of hundreds more in process of enactment.
For one example, the control of business licenses is now one of the few areas not preempted. It is one of the few tools still left to states and local governments to fight the presence and hiring of illegal workers, and the award of benefits and welfare. NEVA would take even those tools away. Having abdicated its own responsibilities on immigration enforcement, the Congress is apparently on a search-and-destroy mission for any lower elected body that might actually want to follow the rule of law and provide the protection for its citizens that the federal government seems incapable and unwilling to provide.
Although labeled "bipartisan", this bill submitted by Rep. Sam Johnson (R-Tex.) is overwhelmingly Republican in its sponsorship (28 out of 31). It appears to be a counter to Democrat Heath Shuler's SAVE Act legislation, a much better, if not perfect, alternative now blocked by fellow Democrat Speaker Pelosi's pro-illegal obstinacy.
Who can tell the difference between Republicans and Democrats?
I presented for Wretchard's analysis the following Moonbat argument:
"War is terrorism. The United States is involved in war. Therefore the United States practices terrorism and is morally no superior to al Qaeda."
Wretchard replied:
If all wars are terrorism, then all wars are equal. There's no difference between the terrorism of Hitler and the terrorism of the USAAF bombing German cities.
Therefore it all comes down to which side one is on, assuming all war is bad. But nowadays, there is little identification with "side" and people feel free to choose whether to support their nominal country depending on personal preference.
RWE also responded:
I think the “equivalency” concept came out of the Cold War, in which the avant-garde viewpoint was that there was no real conflict but just two competing and equal systems blindly crushing the hapless individual underfoot. And that is a terribly convenient viewpoint if you don’t want to be inconvenienced in your personal life by recognizing that the two systems are very different.
--------------------------------
I think the moral fallacy of the moonbat argument is fairly clear. Wretchard and RWE both brought up counter-arguments. However, my problem with the moonbat argument is that there ought to be a simple, crushing counter-argument. The moonbat argument comes across like that bit of logic they teach us in Philosophy 101, i.e.
"Socrates is a man. All men are mortal. Therefore Socrates is mortal."
The moonbat argument is almost as logically tight as the above bit of logic. Wretchard pointed out the equivalency counter-argument and brought up the example of WW-II. However a moonbat would turn precisely that response around, i.e. the Allies defeated Hitler after firebombing Dresden, which was clearly an act of terrorism. The right of self-defense is also not a crushing counter-argument. The moonbat would respond that all political systems are equally immoral, and that it is therefore moral to simply surrender to the aggressor, i.e. surrender is more moral than war. One could counter-argue with the Holocaust, e.g. the Jews surrendered to Hitler, who then hauled them off to Birkenau and pushed them into gas chambers. However I don't think the Holocaust argument would have any traction against an educated moonbat, i.e. the moonbat would ask how many Jews were murdered by Hitler versus the total casualties of WW-II.
Wretchard: This thread is almost expired. If this line of enquiry is of interest, you might consider writing a lead article around it.
Stephen Renico made the following comment about artificial intelligence (AI):
"Every time I hear about the anthropomorphication of robots and computers, I cringe.... We cheerfully go along, building our own gallows. Asimov's Three Laws of Robotics? Ridiculous. We have pimply-faced little bedwetters creating computer viruses for the sheer enjoyment of it now."
I think the coming of AI is unstoppable and will cause a crisis. I already brought up the example of the economic warfare computer gaining AI. The Ministry of Love in Orwell's "1984" was impractical because it required someone at the Ministry of Love to monitor all the different telescreens. However an artificially intelligent machine at the NSA could monitor all e-mail and listen to every telephone conversation. Obviously, there are many potential advantages to AI, particularly in the area of space exploration. Unfortunately, like all advanced technologies, AI is double-edged. AI, like nuclear energy and biotech, is another technology that needs to be carefully monitored or it could end up destroying us.
I refuse to worry about AI, until after I get through this year.