Riffing on politics in his most recent stand-up
routine, the comedian Chris Rock laments that nobody thinks anymore.
Nobody considers an issue and lets it roll around in his head for a
while; we have become a nation both addicted to snap judgment and
suspicious of anyone willing to say, "It depends," or "I'll have to
think about that." Rock's raucous audience offers loud applause on this
point, not laughter. If retired Adm. John M. Poindexter had been there
to hear Rock's rant, he likely would have applauded too. For he and
Rock share this concern over America's growing reliance on snap
judgment and resistance to reasoned debate—two trends that played a
part in the not-exactly-total destruction of Total Information
Awareness.
Poindexter still slips sometimes and
talks about Total Information Awareness (TIA) in the present tense.
Despite the fact that he resigned from the Defense Advanced Research
Projects Agency (DARPA) a year ago, despite the fact that DARPA
subsequently dissolved the Information Awareness Office (IAO) he had
built, and despite the fact that DARPA ostensibly canceled TIA (the
broad-ranging program designed to apply technology-based intelligence
as a counterterrorism measure), Poindexter still firmly believes in TIA
(he pronounces it Tia, like the woman's name). In fact, he
says, TIA has gone away in name only. And he cautions that if the
debate about its merits remains emotional, rather than reasoned, the
nation may well end up with a less effective, but more invasive, set of
technologies to combat terrorism.
Hardly humbled by the public
maelstrom surrounding his project and his eventual resignation (events
he says he largely foresaw), Poindexter seems energized by the
controversy. If anything, he says now, TIA didn't go far enough. It
needed to encompass more of the national security infrastructure (not
just intelligence) and more of the national policy infrastructure (not
just technology).
"One of the reasons I continue to
speak out is that the solutions to the counterterrorism problem involve
other parts of the national security community—especially other
elements of the Department of Defense, State, FBI, Homeland Security
and the [National Security Council] staff," he says. "They all play an
essential role."
Poindexter spoke about TIA at the
CSO/CIO Perspectives conference in April in Carlsbad, Calif. It was his
most broad-ranging discussion of TIA since leaving DARPA; and he used
the opportunity not only to promote the concepts behind TIA, but also
to defend himself against criticism from Congress, the media and
privacy advocates.
The connotations associated with his
name are legion, but the one that doesn't readily spring to mind is
that of Poindexter as technocrat. Today he appears ruggedly fit in a
way that belies his age (68). He looks distinctly trimmer than the man
who testified before Congress nearly 20 years ago, wearing Navy dress
blues that somehow made him look more like a sedentary CEO than an
intrepid sailor. His tan now sets off a white dustbroom mustache. He is
sharp-eyed and slightly wary in his manner. But when he steps outside
into the bright California sunshine, producing a pipe and beginning to
work its barrel, he rhapsodizes about sailing his yacht and looks every
bit the Navy admiral.
Nonetheless, he is also a bona fide
geek. Enthusiastic about new technologies, Poindexter is devoted to the
idea that ambitious, creative IT systems can help solve complex
problems such as the risks posed by asymmetrical terrorist threats.
"The 9/11 Commission is identifying the exact problems that we were
trying to get technology to solve. So I keep pushing the idea," he says.
Yet, ideas like TIA must negotiate
the roiling confluence of security and technology with democratic
principles, including privacy rights. Can the nation strike a balance?
(See "With Liberty and Surveillance for All.")
Where is the line between security and invasions of privacy? To what
extent should citizens control intelligence activities that probe data
about their lives?
These are some of the questions that
we were curious to pose to Poindexter, who until recently has been
largely absent from the debate that his DARPA initiatives triggered. "I
think it is very difficult today to have a reasoned public discourse on
any controversial subject," says Poindexter with characteristic
understatement. "Certainly, election years present a complicating
factor."
TIA's Origins
The generative spark for TIA was John Hinckley Jr.'s attempted
assassination of President Ronald Reagan in March 1981. Poindexter, who
was then a White House military assistant (he became national security
adviser in 1985), credits that event with getting him and others
thinking about the problem of "crisis preplanning." Poindexter set up a
crisis preplanning group at the White House, as an adjunct to the
Situation Room. Spurred by the assassination attempt and subsequent
events like the bombing of the Marine barracks in Beirut, its scope
soon widened. The group considered terrorism scenarios even then, and
explored the tantalizing possibility that ambitious data analysis might
reveal the outlines of future events. However, after the Iran-Contra
affair, which eventually led to Poindexter's conviction on five felony
counts, including lying to Congress (a conviction that was
subsequently overturned), crisis preplanning efforts stalled.
Fast-forward to 1996. DARPA issued
what is known grandly as a Broad Agency Announcement, or BAA. These are
exactly what they sound like: proclamations or calls to arms for some
broad problem the agency intends to research. In this case, the BAA
announced that DARPA wanted to develop information technologies that
would help "identify potential future crises and our options for
preemption and prevention—which sounded a lot like what we had been
doing in the 1980s," says Poindexter.
Eventually, the BAA led to Project
Genoa in 1997. That research, which later morphed into TIA, was meant
to encompass many specific projects under one umbrella. The data
mining application that most people associate with TIA was simply one
of the most prominent projects.
"Now, you've got to understand that
in the R&D environment, you try to generalize problems, make them
as expansive as possible, so that the technology you develop will have
broad applicability," Poindexter says. "Nobody—myself included—believes
that we could ever achieve total information awareness. But the
government needs to set goals and long-range objectives. Total
information awareness is a good [research] goal."
In large part, the I in TIA refers
to information about transactions. Poindexter had been thinking, as
early as the crisis preplanning days in the Reagan White House, that
terrorist operations require preparation. And preparation can be viewed
as a collection of transactions—even everyday, innocuous ones such as
buying an airplane ticket or signing up for flight school. It can also
include somewhat less innocuous and more suspicious ones such as buying
large amounts of fertilizer or a crop duster.
The problem, of course, is that the
few suspicious transactions are embedded among many innocent ones. "It
would be ideal if we could have an uncontrolled flow of information,"
says Poindexter, meaning ideal from an intelligence perspective. "But
we realized you can't do that." So, it was understood within Project
Genoa that technology would need to be developed to seek activity
patterns that fit the intelligence community's idea of suspicious
behavior.
Within a year of Project Genoa's
founding, it was clear to Poindexter that TIA's most important work
would be to help preempt asymmetric threats, what he calls the "new
brand of terror," relying on the use of unconventional weapons and
tactics against an overwhelmingly superior military force. The phrase
"total information awareness" was presented publicly as early as
1999. Project Genoa, including the
project that would become TIA, even experienced some technical success
from 1997 through 2002, a period in which it received $42 million in
funding.
Then 9/11 happened. Some Project
Genoa managers felt that the technology they were working on could have
prevented the tragedy. Poindexter is more circumspect. "Now, I don't
think I would say that officially. But certainly I felt a great
frustration that we had not been able to avoid 9/11," he says. After
the attack, he suggested that DARPA establish a Total Information
Awareness office and invest a significantly greater amount of money in
the effort.
Overcoming Controversy
Thus, Poindexter joined DARPA to head the Information Awareness
Office in January 2002. He was mindful of his own controversial profile
and concerned that it might be a problem for him and for TIA—especially
in the eyes of Congress. "But we thought I could stay long enough to
get the R&D programs started, and we achieved that," he says. "I
didn't particularly want to come back into government and run it.
However, in the end, it seemed like the only way we could get it off
the ground.... And I truly felt that the country had a serious problem.
I had ideas about how [it] could be solved, and felt that I could make
a contribution. But I had never planned to stay very long."
Anticipating controversy, Poindexter
says he felt that it was important to move quickly. So he suggested
something radical for DARPA: Develop the technology and the
policy to govern its use in parallel rather than serially. He
understood that policy-based objections to TIA's underlying technology
might retard the technology's development. But if the policy were to
evolve concurrently, and to forthrightly address anticipated
objections, then the project stood a chance of surviving to fruition.
Poindexter saw privacy as the mother of all objections. "We were not
blindsided by the reaction to TIA," he says. "I knew from the beginning
that privacy was going to be a huge issue, especially with regard to
applying Total Information Awareness in counterterrorism. Because if
the technology development was successful, a logical place to apply it
was inside the United States."
So, he says, part of the early
policy development was to initiate a "reasoned, open public discussion
of the privacy issues." The National Academy of Sciences (NAS)
initially expressed interest in studying the problem, Poindexter says,
but backed out, anticipating a maelstrom (correctly, as it turned out).
"I took the money I would have used for NAS and enlisted the aid of
some Washington think tanks to begin seminars and conferences about the
issue of what kind of policy framework would make sense to put around a
set of technologies like TIA."
If anything, DARPA projects such as
TIA are remarkably open to the public—especially when compared with
corporate initiatives, where competitive advantage is at stake. All of
IAO's privacy work, done in tandem with the actual technology
development, occurred more or less in plain sight—probably, according
to Poindexter, in plainer sight than most other DARPA projects because
of the IAO's decision to pursue the policy formulation track.
Yet despite the transparency, TIA
was still savaged as the incarnation of some Orwellian nightmare. Was
it? Poindexter certainly doesn't think so. But he sees that as almost
beside the point. More troubling to him (and more illogical as well) is
the fact that no one took advantage of the openness of the project.
There was no debate. Instead, there was an invective-laced rush to
judgment.
When he talks about what happened
("a discussion that was not totally open, and certainly wasn't
reasoned"), Poindexter displays hardly a trace of emotion. Instead, he
speaks of the public fracas over TIA with the dispassion of a judge,
though without ever relinquishing his absolute faith in the
virtues of TIA. "A lot of our critics feel that the way that you
preclude some future policy that you don't particularly like is that
you prevent the technology from being developed," he says. "And I think
that's a very serious problem that we have—the idea that if you limit
technology development, then that is the policy."
Misconceptions
Poindexter seems more baffled by the media's treatment of TIA than
he is by TIA's ultimate undoing. Poindexter believes that his effort to
engage the privacy issue both in technology and in policy was a rare
gesture. If anything, he says, DARPA got very few good ideas back from
the R&D community on how to protect privacy. (The Palo Alto
Research Center did have some excellent ideas for creating a "privacy
appliance," he says, for which PARC received a contract.)
In addition to the media's painting
the project with broad, Orwellian strokes, Poindexter says some
reporting was just dead wrong. He never intended to build a single,
central database to collect data on every transaction by every
American. Architecturally, he thinks it's a poor idea. Ditto for the
idea of warehousing all of that transaction data.
He also took umbrage at the notion
that he was going to manage some TIA-based "product." He cited a
privacy advocate who leaked news about TIA to John Markoff of The New York Times.
Poindexter wouldn't name this person. But he says that either "through
ignorance or through mischievousness," the advocate suggested that
DARPA was going to implement the technology it was developing.
"DARPA was not ever going to implement
TIA," says Poindexter. "I mean, DARPA is not an operational agency,
it's [intended for] R&D. Again, we were starting an R&D
program. We wanted to be as expansive as possible to make sure we
didn't preclude some good ideas."
Although Poindexter was often cast
by the media as some sort of evil genius bent on invading citizens'
privacy, he regards that as inaccurate and unfair. For the record,
here's what he says would be out-of-bounds in a TIA-like project:
"Uncontrolled access to data, with no audit trail
of activity and no [outside] oversight, would be going too far. This
applies to both commercial and government use of data about people." To
be acceptable, he insists, TIA would have required the "privacy
appliance" proposed by PARC. (Poindexter saw a potential solution to
the problem of identity theft as an ancillary benefit of the PARC
concept.)
But what about abuse of the TIA
system? What would stop the government from using it against common
crimes rather than for counterterrorism? What would stop insiders from
improperly using the data?
Nothing, says Poindexter. That's a legitimate concern.
"I don't think it's a technology
issue. It's a policy issue," he concludes. And this is exactly what he
had hoped to address from the beginning of the IAO process—the focus on
policy, the transparency of the process. Showing rare emotion, he
admits to being flustered by the inability of politicians and the media
to accept that he thought seriously about these issues, and that they
were in fact being addressed. It's the job of "Congress and the judicial
branch and the executive branch, after appropriate debate, [to]
establish whatever policies are appropriate," he says.
But to simply put a halt to
promising technology out of fear that policy will fail to control its
use? "It's like saying that we shouldn't develop M16 rifles because
they may be used by criminals." He shakes his head incredulously.
Successes
After an initial furor, during which Poindexter battled these
misconceptions, the outrage dissipated. The IAO even managed some
successful trials while dealing with public fallout—including the
creation of "Vanilla World," a virtual world not unlike the popular
Electronic Arts Sims computer games. Vanilla World's 2 million
virtual citizens eventually included potential terrorists making
suspicious transactions. Other programs made progress too, and were
eventually wrapped into TIA to be tested in an operational setting.
(Poindexter is careful to note that they all remained experimental;
none ever replaced operational systems.)
"TIA was being used by real users,
working on real data—foreign data. Data where privacy is not an issue.
And those users were working on real problems. And the experiment's
metrics were being measured so we could figure out whether the
technology was really helping or not. We also got feedback from users
on what needed to be modified."
In other words, at this point it was
a typical big IT project. But Poindexter believes it was better
designed than most because it focused on iterative development. "That's
the way you develop these big systems. You do it on a small scale. And
you accept failure as a possible outcome of some of the experiments. If
you don't get failures, you're not pushing hard enough on the
objectives."
Poindexter likes to talk about the
"bathtub curve." The three phases of intelligence are research,
analysis and production. If you chart the amount of time spent on each,
you see a curve that looks like a bathtub, with most resources going to
research and production and the least going to the most important part:
analysis. One of TIA's objectives was to invert the curve, take time
out of research and production and put it into analysis—since "humans
are still the best thinking machines for analysis." It worked, says
Poindexter; TIA appeared to upend the bathtub curve.
Assassination Futures
But this momentum collided with another controversy, one that erupted
last summer over a TIA project called "FutureMAP" and ultimately
proved to be the undoing of the Information Awareness Office, TIA and
Poindexter.
FutureMAP (short for Futures Markets Applied to Prediction) was an
experiment to see whether a futures
exchange—wherein terrorism experts could bet on potential future
national security events—might have value in predicting the likelihood
of such events. Economists have lately become enamored of futures
exchanges. The idea is that if you give people an economic incentive to
make accurate predictions, they will produce better-formed judgments
about future events in pursuit of profit. Such exchanges are being widely tested
and have proven to be at least partly effective in other domains, such
as predicting future telecom policy.
One of the contractors working on
FutureMAP posted on its website such potential futures as the
assassination of Yasser Arafat, the overthrow of the King of Jordan and
a missile attack by North Korea. When these postings came to light,
critics argued that they amounted to an online casino where people
could profit from betting on death and disaster. A vituperative
political feeding frenzy ensued.
"Oh, I think the concept [of a
futures market] is clearly sound," says Poindexter, coolly analyzing
the controversy. Give smart people with information an incentive to be
right, and they will be more right than if they have no incentive.
Another benefit to the incentive system: It provides an avenue for
disgruntled terrorists to attempt to profit from their insider
knowledge, and, in placing their bets, to reveal that knowledge to
those watching the market.
"If the concept had proven
successful, it would probably have been implemented in a couple of
ways, " Poindexter says. "One would be open markets on some questions
and closed markets (maybe within the intelligence community) on the
more sensitive kinds of questions. The problem we were struggling with
within the closed market was what the incentive would be. You probably
wouldn't use dollars. But those are all questions that need to be
explored."
However, after FutureMAP was outed
and the so-called assassination futures unearthed ("We never would have
approved those questions being put out to the public," says
Poindexter), the reaction was swift and terminal. Sens. Ron Wyden
(D-Ore.) and Byron Dorgan (D-N.D.) wrote to Poindexter: "Spending
taxpayer dollars to create terrorism betting parlors is as wasteful as
it is repugnant. The American people want the federal government to use
its resources enhancing our security, not gambling on it."
Poindexter resigned two weeks later,
though he denies that the FutureMAP furor spurred his resignation. He
says he'd been planning to leave anyway.
Flashing Lights
"I think if I had to do it over again," he says, "I'd do it the
same way. I would just put more resources into getting the public
diplomacy part much stronger than we were able to. DARPA has a $3
billion budget, and there's a single public affairs person and a single
legislative affairs person. There's no full-time [legal] counsel. I
told the director of DARPA that I think it's a significant problem if
DARPA is going to continue to take on controversial issues. A full
public affairs, legislative affairs and legal staff has got to be on
hand."
Poindexter cites one instance where
a bigger support staff could have helped him in his own presentation of
TIA. He recalls a particular schematic diagram of one TIA project
where, in the middle, there was a little box: "a filter," he says,
referring to the so-called privacy appliance. "The purpose of that
filter was very complicated. But essentially it was there to provide
privacy protection. But we didn't make the box very big, and it wasn't
really clear what it was for."
If you presented the project without
focusing on that filter, he says, "it was a scary thing." He concedes
that he should have been more sensitive to the privacy issue from a PR
standpoint. "Although I knew it was a huge problem, in our public
materials we probably should...have tried to be more precise in talking
about privacy. Explain it in bigger detail, and put it up in flashing
lights."
In other words, a full-fledged
marketing team could have helped win TIA a more even-handed reception,
or limited the negative spin that overtook it. Either way, says
Poindexter, the damage to TIA pales next to the possible long-term
effects on DARPA, if it becomes reluctant to tackle controversial
projects.
"It's very important that DARPA and
the government continue to do controversial research," he says. "DARPA
has been successful in the past because they take on some of these
controversial issues."
Regrets
Does John Poindexter regret having gone back to the government?
"No. No, I think that we raised a lot of interesting issues. That's one
of the advantages of DARPA. This brainstorming we do, once DARPA begins
to think about a problem, that provides a lot of leverage. You get
universities thinking about the problem. Furthermore, once good ideas
surface and the R&D community begins thinking about the issues,
potential solutions are imagined." The work then takes on a life of its
own, though, he notes, "not necessarily with government funding. It
just doesn't happen overnight."
Already, he says, Carnegie Mellon
University has created a center to address the interface between policy
and technology, especially privacy protection technology. Syracuse
University's graduate schools of law and public administration recently
hosted a joint event focused on security and privacy.
"I also think it's important for
commercial companies; they need to be much more sensitive to the way
that personal information can be used for marketing.
"See, I really believe that we don't
have to make a trade-off between security and privacy. I think
technology gives us the ability to have both. Privacy issues are being
discussed. There's a lot more discussion. And so the reasoned, open
public discussion that I wanted to achieve is finally beginning to take
place. But unfortunately, in my opinion, Congress overreacted too
early, for political reasons."
Appropriations
One of the reasons Poindexter talks about TIA in the present tense
is that large portions of the work begun at the IAO are continuing—a
fact that at least one GAO attorney says might surprise even some
members of Congress. But the ongoing work has been moved onto
classified, or "black," parts of the defense budget—where it's free
from public scrutiny.
Ironically, Poindexter argues, the
politicization of TIA led to an even worse scenario for privacy
advocates than what they had before; now, because much of the work is
classified, there won't be any public discussion.
"The defense appropriations bill,
which is unclassified, says that we're going to close down the
Information Awareness Office, we're going to close down TIA. But, oh,
by the way, some of the parts of TIA are not controversial, [and] we're
going to move them into the classified annex of the budget. And where
they are moved is classified. Exactly what they do is classified.
"However, I can tell you that PARC,
which had a major [TIA-related] contract on privacy protection, has
publicly acknowledged that their contract has ended. So, what Congress
has done is that they've stopped the research in the privacy protection
area. And, in my opinion, that eventually is going to be a problem for
the administration.
"The privacy work was part of what
was canceled. But I think it should continue. And I think that
eventually it will be continued. I'm an optimist."
Reach Senior Editor Scott Berinato at sberinato@cxo.com.
PHOTOGRAPHY BY DRAKE SOREY