The US Military's Epic Cyber-Fail

Image: Flickr/Jamie Cox

Charlie Reed, a fantastic Japan-based reporter for Stars & Stripes, thoroughly confounded me last weekend with a long story extolling the US military's efforts to court young, cyber-savvy "hackers" to aid the Pentagon's nascent Cyber Command. "As CYBERCOM assumes authority of the digital battlefront," Reed wrote, "the military is embracing the hacker community in its pursuit of people to help secure the DOD’s 15,000 networks and potentially exploit weaknesses in outside networks."

The piece was well-written and solidly reported, except for one thing: not one of its 1,200-plus words was "Bradley," "Manning," or "WikiLeaks."

Sure, the story has US commanders acknowledging that it takes a hacker to beat a hacker. It has them admitting that building "an effective cyberforce with these types of individuals requires fundamental changes to the military culture." (Under "understatement" in the dictionary, see that quote.) But here is the extent of the culture shift that Lt. Col. Gregory Conti, a West Point cybersecurity researcher, thinks is needed:

"Ultimately you want to create an environment that’s so cool, so bad ass, they don't want to leave," Conti said. "We can do that."

Allow me a moment to shudder.

Conti's unfortunate analysis on how to snag a choice code jockey isn't an isolated paradigm fail. Anyone who's worked in public affairs, info ops, or cybersecurity knows the Defense Department lacks a fundamental understanding of how hacking, or social media, or open digital systems, or really anything else cybernetic, works.

As a public affairs contractor for Multinational Corps-Iraq, I worked with the general staff to start a Corps Facebook account. Thus ensued a laborious process of having to get special SPAWAR computer terminals to work on sites like Facebook, which are blocked on the military's standard internet system, the Nonsecure Internet Protocol Router (NIPRnet). Even after that, staff officers expressed tons of misgivings about the project: Could it be hacked? Could people leave rude, obnoxious comments? Can we still edit and review content before it's posted? While some of the concerns were well-founded, most demonstrated to me that their speakers had no clue what Facebook and its media peers were good for -- just that they were somehow critical.

On the same day Reed's story was published, the Associated Press reported that liberal, anti-war filmmaker Michael Moore donated $5,000 to the defense fund of Private Manning, who's suspected of passing hundreds of thousands of secret US war reports and diplomatic cables to WikiLeaks. "He did a courageous thing and he did a patriotic thing," Moore said.

Manning's case is instructive. He was a cyber-savvy intel weenie, a person with precisely the sort of skill set the military wants. But he was also reportedly megalomaniacal, depressed, and gay -- three things that, along with hackerishness, the military doesn't exactly smile upon.

And that's exactly the point: As Reed's article proves, US officials want the hacker's skills -- which they don't comprehend -- and they don't want his temperament, which they comprehend even less. These sorts of kids don't want to feel cool. Most of them have spent their lives eschewing what's cool. No, to appeal to them, the military must do several things it's proven loath to do: It must argue, in detailed ways, that the defense establishment is a force for good, not for ill, in the world. It must be unafraid to devolve responsibility to these personal entrepreneurs, rather than trying to micromanage their mystifying work via a stultifying bureaucracy.

And finally, it must stop criminalizing homosexuality.

The rub: The Pentagon must do all this and still be able to screen out people who, like Manning, have no business dealing with secret information. That may be the Stars & Stripes story's starkest implication: In its zeal to co-opt folks like Bradley Manning, the military risks being co-opted by groups like WikiLeaks.

So be it, DOD: Unless you're ready for a real revolution in thought, you'll go to cyberwar with the hackers you have, not the ones you want.


Zombie Ants: Should Governments Be Scared?

Comes word this week that Harvard researchers have made an exciting breakthrough in their research on the origins of, well, "zombie ants." The story and its truly creepy supporting photograph have led me to some unsettling (if irrational) thoughts on human conflict and all that. (Of course, it's nothing Daniel Drezner wasn't already considering a year ago, almost to the day.)

You see, for 48 million years, a parasitic fungus called Ophiocordyceps has existed on forest beds, waiting to be picked up by tree-dwelling ants that happen by in their travels. According to the Guardian, once locked onto its insect quarry, the fungus:

grows inside the ants and releases chemicals that affect their behaviour. Some ants leave the colony and wander off to find fresh leaves on their own, while others fall from their tree-top havens on to leaves nearer the ground.

The final stage of the parasitic death sentence is the most macabre. In their last hours, infected ants move towards the underside of the leaf they are on and lock their mandibles in a "death grip" around the central vein, immobilising themselves and locking the fungus in position.

This news simply is not the sort of thing one can prepare oneself for.

Of all the flora and fauna on Earth, fungus strikes me as one of the simpler life forms. And yet, even before the Himalaya mountain range was born, here was a creature that could occupy a living host and "steer" it as a vehicle: bidding the poor ant to reach a suitable fungus-growing altitude, then putting the host in "park" and letting it die, its function fulfilled.

Again: a fungus. No nanotechnology, no artificial intelligence, certainly no servos. A fungus had one of nature's most industrious animals do its physical bidding, somehow bringing the neurons and synapses to heel in a coherent, productive way. The liberal arts major in me is mystified and repelled.

The armchair strategist in me thinks: How can our enemies use that?

I'm no chemical or biological weapons expert, so if you are, tell me if I'm crazy, please: Can you imagine a future powdered agent, not unlike weaponizable anthrax or botulinum toxin, that spreads a fungus capable of commandeering a human brain? Could particular strains be developed to direct hosts into this behavior or that: jumping out of windows, refusing to eat, choking strangers out? Could it even be used to turn reasonable, free-thinking individuals into PBIEDs -- that is, suicide bombers?

Setting aside the zombie genre for a minute (my preferred anti-zombie paradigm, social constructivism, doesn't get much respect anyway), mind control has been mined by film producers before, from the suspenseful (original) Manchurian Candidate to Reggie Jackson's would-be royal assassin in The Naked Gun to the M. Night Shyamalan vehicle The Happening (which is laughably horrid even by M.-Night-Shyamalan-vehicle standards). So much of my gathering horror over some "zombie ants" is likely rooted in my moviegoing as well as my lack of serious scientific training.

But still. I can't be the only person unsettled by this whole system of ant seppuku, can I? And if fungal mind control can be a reality, then we'd better go beyond a zombie theory of international relations: We'll need a zombie biological defense strategy. (Drezner, are you listening?)

Whether you find my imaginings poignant or plain batty, throw a comment below with your thoughts on the subject. And don't make me force you to do it. I know things. Things about ant brains.


The Big Brother BlackBerry Backlash

Forgive me if I'm scared and confused. I thought we were living in an interconnected, supercharged, 2.0, flattened McWorld. And yet some of the earth's most populous countries have the temerity to deny Tom Friedman's genius by banning that 21st century version of the wheel, printing press, gunpowder, and sliced bread all thrown together in a plastic skin: the BlackBerry.

OK, I overstate things. Friedman, of course, is a dunderhead. And I traded in my disintegrating BlackBerry last year for Apple's handheld Hal 9000. And most of the nations in question -- among them Saudi Arabia, the UAE, Indonesia, India, Lebanon, and Turkey -- seek not to ban the smartphones themselves, but to block your ability to use them. Unless, that is, the phone's manufacturer, Research in Motion, grants those governments access to its private servers, where your text messages, calls, and emails are encrypted and sent on their way.

This "domino effect" of hard-line telecom strategies could devastate American policy in the affected nations, where US military and diplomatic personnel routinely communicate with the uber-secure BlackBerries. Even Barack Obama's got one. (Top US officers in Iraq often rely on clunker cell phones from local providers like AsiaCell and Iraqna, but spying on them is like trying to jam a smoke signal.) But these countries want to mete out a punishment for RIM's inspired success at encryption, the geeky science that ensures your communications stay between you and their intended audience.
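The whole fight comes down to one property of symmetric encryption: the ciphertext is gibberish to everyone except whoever holds the key. Here's a toy sketch of that idea -- emphatically not RIM's actual cryptography; the cipher, key, and message names are all illustrative assumptions -- showing why the location of the keys, not the strength of the math, is what these governments are really negotiating over. If the keys sit on RIM's servers, anyone granted server access reads the traffic.

```python
# Toy XOR stream cipher built from SHA-256 (illustration only -- do not
# use for real secrecy). The point: decryption needs nothing but the key.
import hashlib


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom byte stream from key + per-message nonce."""
    out = b""
    counter = 0
    while len(out) < length:
        block = key + nonce + counter.to_bytes(8, "big")
        out += hashlib.sha256(block).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    """XOR plaintext against the keystream."""
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))


# XOR is its own inverse, so the same function decrypts.
decrypt = encrypt

key = b"subscriber-device-key"   # hypothetical: who holds this is the issue
nonce = b"msg-0001"
msg = b"meet at the usual place"

ct = encrypt(key, nonce, msg)
assert ct != msg                      # unreadable in transit
assert decrypt(key, nonce, ct) == msg # trivially readable with the key
```

In this sketch the carrier could hand over terabytes of `ct` and a government would learn nothing; hand over `key`, and every message falls open. That, in miniature, is what the server-access demands amount to.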

The funny thing is that right here in the United States -- that great bastion of free enterprise and cyber-ingenuity -- the federal government once extracted similar access agreements from private telecom carriers. As I type this, I can look outside my San Francisco office window and see 611 Folsom Street, the AT&T building at the center of the Bush administration's warrantless wiretap controversy. Most of AT&T's overseas phone and Internet signals, not to mention entirely domestic transmissions, run through this building. And it's here that the company in 2003 allegedly built an entire room -- a "shadow hub" -- that collected those transmissions for review by the National Security Agency. Big Brother has terabytes of your information, even if he lacks the resources to look at it all.

And yet. The US still is one of the freer nations on the planet. While warrantless wiretaps, cell-phone traces, and decrypted personal signals scare the bejeezus out of me, I can only imagine what that power can mean in less progressive societies like the ones considering BlackBerry bans. In Saudi Arabia, witchcraft is still a capital offense; best not to speak too colloquially on the phone, I guess, lest you be mistaken for an incubus. Should Indonesia get access to a BlackBerry server, as it desires, you'll want to think twice about discussing that adventure trip to East Timor over the phone. Likewise with your Nepal or Pakistan travel plans if you're phoning someone from India, which is also threatening Google and Skype over access to their decrypted signals.

In America, we've had our share of ruthless spies. Bill Belichick. Dick Cheney. And while they were both (in my considered estimation) irredeemably evil, neither of them had the expansive media access that these countries seek. That access warps governments. Whether or not they can handle and decrypt all those transmissions, they'll think they can.

At the end of the day, the Gulf states and other nations that attack BlackBerry encryption aren't being evil...yet. Mainly they're just being dense: Information today has too many potential conduits -- and too many methods of concealment -- to all be caught, tagged, and released by regulators like so many wild bears. "This is about the Internet," RIM's CEO told the Wall Street Journal. "Everything on the Internet is encrypted...If they can't deal with the Internet, they should shut it off."

Even Tom Friedman can understand that. Why can't the Saudis?


Analysis, Bad Analysis, and Damned Lies

I fully intended to start writing on national security here by now, but it’s time again to break that rule. Perhaps it’s a quibble, but when respected American liberal pundit Matt Yglesias (whom I admire) blogged Wednesday morning on the state of the US economy, he let his elitism show. It’s been bugging me ever since, because it highlights the death of a Clinton-era, measured liberalism and its replacement with an aloof, divisive Ivory Tower stridency that succeeds only in distancing itself from mainstream political discourse.

Yglesias’ brief post was un-ironically titled “Elite Isolation.” He cited a government study showing the unemployment rate among college-degreed Americans to be about 5 percent, while unemployment for non-college workers hovered around 13 percent. And here was his analysis (my emphasis added):

Virtually every single member of congress, every senator, every Capitol Hill staffer, every White House advisor, every Fed governor, and every major political reporter is a college graduate. What’s more, we have a large amount of social segregation in the United States—college graduates tend to socialize with each other. And among college graduates, there simply isn’t an economic crisis in the United States.

His point, apparently, was to argue that since the college grads in power don’t face an economic crisis, they fail to address the needs of those myriad unwashed noble savages who lack degrees and financial means. He's clearly never read any of the dozens of trend stories like this one or this one, churned out weekly during the Great Recession.

Where to begin with the inanity – and soft bigotry – of this analysis, I don’t know. But it’s more than a quibble. His analysis was picked up by at least two other prominent bloggers: the Washington Post’s Ezra Klein (the founder of Journolist, of which Yglesias was a member – an isolated elite echo chamber if there ever was one) and my own colleague at Mother Jones, Kevin Drum (who, to his credit, mitigated Yglesias’ “let them eat cake”-ness with his usual dose of self-doubt and inquiry). A half-truth is introduced by a media elite, and within minutes, it becomes the new conventional wisdom of the left, propagating across social media.

I feel compelled to flout that “wisdom.” Only a happily employed, non-creative-class, Beltway-ensconced, elite scion of a comfortable family would assert that college grads aren’t facing an economic crisis. It’s not just a claim with no warrant; it’s insulting to a large swath of the American electorate, and it’s precisely the sort of myopia that gets American liberals, and journalists, painted as out of touch, wine-sipping, privileged naifs.

Beyond the prima facie wrongness and insensitivity of Yglesias’ claim, his argument’s fallacies stand as a cautionary tale to any analyst in any field:

1) Missing the forest for the trees. A single unemployment figure doth not a picture of an economic crisis make. Yglesias doesn’t look at other factors, such as indebtedness – college grads have more of it, in student loans, credit cards, mortgages, and the like – and underemployment: Americans ages 18-29 are 50 percent likelier to be employed part-time, or for low pay, than their elders. Half of those young people are college graduates. And when an early-career professional ends up underemployed, it can stunt her career growth and earnings for life.

2) Building on a sand foundation. Even if unemployment figures told the whole story of an economy, these unemployment figures wouldn’t, because they’re underreported. After a layoff in 2008, and another in 2009, I spent a total of seven months unemployed. Yet because I was so junior, I couldn’t apply for unemployment compensation, and so I wasn’t counted. Neither was my wife, a doctoral student who’s been unable to find work for two years. Lots of people in lots of situations aren’t counted in the unemployment totals.

3) Poor definition of terms. College graduates include 22-year-olds fighting for jobs, and people in their 60s with tenure and a retirement pension. They include people with only a bachelor’s, people with MBAs, and holders of Ph.D.’s, whose job fields are overcrowded and underfunded. They also include white and minority workers: Jamelle Bouie at the American Prospect points out contra Yglesias that “for most of the recession, the unemployment rate among black college graduates has greatly surpassed the rate for whites.” To make a monolith of “college graduates” is to show no sensitivity to the hardships individual college grads face.

4) A straw man – possibly a racist straw man. Yes, the poor don’t have the benefit of education. And their plight is pathetic. And they need us college grads to tighten our belts and take up their burden. As a progressive, I don’t find any issue with the substance of this, but the style - Yglesias’ explicit description of “college grads” as a powerful elite bloc, and his implicit depiction of the “real” unemployed as the exploited proletariat - represents the pinnacle of privileged guilt and the soft bigotry of low expectations.

5) Either-or. Social benefit is not a zero-sum game between “we” the college graduates and “they” the poor, downtrodden service workers. You can reform the student loan system, extend unemployment insurance and food stamps, and increase job opportunities for all at the same time.

6) Red herring. A college degree isn’t what renders the Beltway crowd elitist and out of touch: the Beltway’s status obsession is. But thanks, Matt, for lumping all of us Florida State, Navy, and Iona College grads in with yourself, John Boehner, and Ben Bernanke.

I think we could fall into an infinite regress here, so let me wind up with a personal note. My wife and I have seven degrees and one nonprofit job between us. Since October 2008, I was unemployed for a one-month stretch and a six-month stretch. She’s been unable to find gainful employment the entire time. Our debt burden is heavier than the non-college-educated – we owe student loan originators the equivalent of a home mortgage, but we don’t own a house. We’ve moved five times in the past two years for four jobs and two unpaid internships; only one move was comped by an employer. Both of our working-class parental households lack the ability to support us. (In lumping college grads, Yglesias, Klein, and Drum also made no attempt to distinguish between educated offspring of the educated, on one hand, and educated offspring of the working class, on the other. For his part, Yglesias is a graduate of Harvard and the elite Dalton prep school. I sense his safety net is stronger than ours.)

Obviously, one’s personal experience isn’t scientific or objective, but this is also the situation of the vast majority of 22- to 32-year-olds I know in the American South and Northeast. Not that we suffer any more or less than the “uneducated” poor. It’s not a competition to see who suffers more, and by effectively treating it as such, people like Yglesias not only miss the forest for a single statistical tree, they alienate the same skeptical electorate that they need to sell on a wider safety net for all.

Sorry, but it’s the sort of empathy for the worst-off that can only be delivered by the Beltway-embedded jet set. Yglesias is a smart, talented partisan for the progressive cause. But if he wants to be even smarter, he should ditch the Department of Labor studies, leave the macroeconomics to Robert Reich and Paul Krugman, and go actually talk to some college graduates in the workforce – outside the Beltway.


The New York Times' Twitty Orientalism

One can’t help but love-hate the New York Times. It has all the erudition, the exuberant intelligence, and the pure twittiness of a first-semester graduate student. Like that erstwhile pupil, a NYT trend-story reporter has a knack for unearthing a great insight...and teasing a dubious inference out of it...and framing it as the most important revelation the world has ever seen.

This Sunday, I didn't have to search long to find a golden Times contribution for a grad-level colloquium on silly: a business story titled "But Will It Make You Happy?" The trend is Americans spending less disposable income on retail goods, an alleged shift from conspicuous to "calculated" consumerism. Fair enough; but the localization of this trend is an order of magnitude twittier. It nods to a common and insidious NYT theme, one that’s killing intelligent discourse on America’s wars in Afghanistan and Iraq: the cult of the simple.

Subjects of said Times story are Tammy Strobel and her husband Logan Smith, Californians of similar age and status to yours truly. Harried by the demands of her project-management job and lifestyle, Strobel "stepped off" what she calls the "work-spend treadmill":

Inspired by books and blog entries about living simply, Ms. Strobel and her husband, Logan Smith, both 31, began donating some of their belongings to charity. As the months passed, out went stacks of sweaters, shoes, books, pots and pans, even the television after a trial separation during which it was relegated to a closet. Eventually, they got rid of their cars, too. Emboldened by a Web site that challenges consumers to live with just 100 personal items, Ms. Strobel winnowed down her wardrobe and toiletries to precisely that number…

Now, Ms. Strobel and her husband live joyously and harmoniously in a 400-square-foot Portland, Oregon, studio. He's a grad student; she's a freelance writer. They have virtually no expenses, so she's free to hike, exercise, volunteer for a yoga nonprofit, and love life.

Sounds awesome, truly. We all want to believe that a simplification of the Thoreau type is possible. Even I bought a copy of The Four-Hour Work Week once upon a time. Then I remembered that I attended very expensive schools, possess a whopper of a debt burden, work at a nonprofit, and have a wife, a dog, and two cats. Simplification, in my case, ain't so simple. Nor is it for most people.

And yet the Times loves to titillate its well-educated, favorably heeled audience with this old myth that life can be reduced to its basic atomic elements, and the force binding them is pure wondrous joy:

Status quo living - (cars + mortgage + malls) + (organic food * outdoor living)

Quod erat demonstrandum.

Faith in simplification and reductionism is also what drives many Westerners' fascination with "non-Western" things: from the cultural (see Buddhism, Sufism, Baha'ism, Theosophy, and Ms. Strobel's affection for yoga) to the commercial (oil and natural gas reserves) to the strategically significant (the Suez and Vietnam, once upon a time). Many of us come to embrace the otherness of these things not merely because they're exotic, but because they're exotically simple - especially compared to our crazy, breakneck, technologically fueled, dual-overhead-cam-driven, democratically messy lives. I can fondly recall my first excursion to Uzbekistan as a high-school exchange student, mystified by a distant society that, it seemed, could be fully explained by Tamerlane, Islam, and hot chai.

Innocent enough, I suppose. Except that when I returned, Rotarians, local news reporters, and college admissions officers all sought my expertise on "the Uzbek."

OK, you say. Orientalism exists. Undergrads know that. And to be sure, at points in the past decade, there’s been a greater push toward nuanced understandings of global culture in some circles. We’ve witnessed the birth of a new post-9/11 generation of academics, think-tankers, and diplomats, as likely to read Said and Spivak as Galula and Van Creveld. In the US security establishment, the ascendancy of Gen. David Petraeus and his mandarins has mitigated a lot of the old simple conflict memes. But he’s no savior, and the very fact that we now talk of “surges” and “human terrain” as game changers suggests the military’s cultural advance has been a very incremental one.

More concerning, though, is how in the past year or two – since the election of Barack Obama, really – public discourse on the wars in Iraq and Afghanistan has fallen into a feedback loop of stupid simplifications. War skeptics begat war cynics, who begat isolationists and conspiracy theorists, who think WikiLeaks is the savior of mankind, the Taliban are the rightful “owners” of Afghanistan, and everything will be better once Western forces leave both countries. And skeptics of immediate withdrawal have followed suit with simplistically stupid arguments, selling US-led troops as the saviors of civilization and counterinsurgency as a can’t-miss strategy. (See: Time magazine and this distressingly racist post.)

Both sides, whether they realize it or not, are being fatally orientalist. First, they assume that they understand the complexities of the conflicts and the distant cultures they affect. Second, they purport to be interceding on behalf of those cultures. Finally, they assume that what the US and its allies do – leaving or staying the course – will make all the difference to those cultures. At a time when everybody should be acknowledging that there are no panaceas, everybody is offering one. 

I have no counter-panacea. I will, however, recommend two books that should help to complicate your views on the wars, while hopefully purging the Times trend-story simpleton out of you: 

Aftermath: Following the Bloodshed of America’s Wars in the Muslim World, Nir Rosen. Anyone who’s read journalist Rosen knows where he stands; The Weekly Standard angrily wrote that his access to insurgents beat other reporters’ because “he’s on their side.” But in this postmortem on the waning wars, Rosen tells a richly reported, nuanced story that often gives well-intentioned military interveners their due. Especially fascinating is his argument that the US didn’t just fuel the Sunni-Shia civil war in Iraq; it virtually created it out of thin air. Said would be proud. Aftermath is due out in November, and I’ll publish my review of an advance copy at my other job.

The Betrayal: A Devastating Report on the Sabotage of Our "Other War" in Southeast Asia, Col. William R. Corson. Published in 1968, and distressingly never reprinted, this book should be on your Amazon used-books list. Corson commanded the Marines’ small-unit pacification program in Vietnam, and his account – part jeremiad, part self-advertisement – is still one of the best doses of realism injected into the COIN debate. Even today, the smartest Marines read Corson to appreciate the limits of American power and know-how.