A matter of TRUST

CITRIS Director Shankar Sastry will lead the newly dedicated Team for Research in Ubiquitous Secure Technology.
(photo by Aaron Walburg)

With computers running everything from financial institutions to oil, gas,
and fuel pipelines to virtually all communications, the need for
reliable, trustworthy systems is greater than ever. Yet with the
introduction of each new application and platform, the opportunities
for and frequency of sabotage have risen exponentially, creating a
societal-scale Catch-22, in which our technological advantages have
provided us greater functionality but have left us more vulnerable to attack.

Enter the Team for Research in Ubiquitous Secure
Technology (TRUST). Led by UC Berkeley, TRUST brings together partners
from industry and academia (see box for a complete list) in a
well-coordinated research effort not only to address the technical and
social aspects of cyber security but also to ensure that information is
rapidly transferred to those who need it. The National Science
Foundation launched the center in April with a contract of $19 million
in funding to be distributed over five years, with an option for
another five years at roughly the same level of support. Formal
dedication of the center took place on October 21.

In the
following interview, CITRIS director Shankar Sastry, who will be
leading TRUST, discusses some of the reasons why our systems are so
insecure nowadays and the unique approaches TRUST is taking to make our
computer infrastructure safer in the future.

It seems like a week doesn't go by without some new worm being unleashed
or some new vulnerability being exposed. With all our firewalls,
education, and other protections, why are our systems still so
insecure?
Because they’re inadequate. The
Internet and computers in general were developed for a smaller group of
people, mostly scientists and engineers, who used it for their work.
The culture of development of software and systems on the Internet has
been one of putting in more features and adding more functionality,
with a sense of it being a trusted place where people would all behave.
But as the Internet has grown up and become an engine supporting our
daily life, the full spectrum of people who show up in
society—including the criminal element, hacktivists, and disgruntled
people—have been let into this infrastructure. As a result, the current
infrastructure is simply unable to cope with the variety and number of
attacks.
As this chart from a Carnegie Mellon University study highlights, with each new tool comes an increase in attacks.
(Source: Carnegie Mellon University)

What can be done?

We can’t throw away the Internet and start over, so we have to fix it as
we go. Doing that requires more than building firewalls and better
cryptography. What’s needed is a whole system that features defense in
depth, so that when you breach one layer of defenses there is another
layer and then another. It has to address what you do when things are
compromised, how you operate through attacks. At the same time, we have
to bring back societal trust relationships, the notion of developing
trust with people you communicate with, and then communicating more
with them if you have greater trust with them.
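
Defense in depth is a general security pattern rather than a TRUST-specific design. As a purely illustrative sketch (the layer names and checks below are hypothetical, not drawn from any TRUST system), independent checks can be chained so that breaching one layer alone is not enough to get through:

```python
# Illustrative sketch of defense in depth: a request must pass every
# independent layer; compromising a single layer does not admit it.
# All names and checks here are hypothetical.

def firewall(request):
    # Layer 1: only allow traffic on an expected port.
    return request.get("port") == 443

def authenticate(request):
    # Layer 2: require a valid credential.
    return request.get("token") == "valid-token"

def rate_limit(request, seen={}):
    # Layer 3: throttle repeated requests from one source.
    src = request.get("source")
    seen[src] = seen.get(src, 0) + 1
    return seen[src] <= 3

LAYERS = [firewall, authenticate, rate_limit]

def admit(request):
    # A request is admitted only if every layer approves it.
    return all(layer(request) for layer in LAYERS)

req = {"port": 443, "token": "valid-token", "source": "10.0.0.1"}
print(admit(req))           # True: passes all three layers
print(admit({"port": 80}))  # False: blocked at the first layer
```

The point of the layering is exactly what Sastry describes: an attacker who breaches the firewall still faces authentication, and one who steals a credential still faces rate limiting.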

How is TRUST different from other attempts to build more secure systems?

TRUST is ushering in the third generation of cyber security. The first
generation focused on preventing attacks. The second worked on
detecting intrusions and limiting damage. The third, and what TRUST is
working on, is operating through attacks. Even if attacks do succeed,
how do you keep critical infrastructures from going down? How do you
think about network weather? How do you build systems that degrade
gracefully? Our goal is to do a combination of research in these areas
that gets transitioned into industry immediately. At the same time,
we’ll be working on longer term solutions. That may involve more
substantive rethinking, which would be harder for industry because they
have such an investment in the status quo.

TRUST emphasizes developing new technologies. Do you have some examples?

Technology is obviously an important part of the cyber security agenda, but
equally important are TRUST’s socio-economic and privacy agendas. Even
if we do all the most fantastic research in the world, if HP, Juniper,
Cisco, and others don't put it in their routers and if Microsoft
doesn’t use it in its secure products or its trusted computing systems,
it would be completely irrelevant, because a weakness in even a few of
these systems exposes all of us. So what we need to do is not
only think about the technology, but the societal context, primarily
the privacy.

For example, say you’ve got a service
provider who says, “I’d like to keep you safe from attack, but to do so
I need to know everything you’re doing on your computer.” The issue
there, of course, is privacy more than technology. Maybe you trust the
service provider enough to let them in to do it, but maybe they’re
caching your data and using it to troll for trends in what you do. That would
be objectionable. They might say they need to do that to
protect everyone else who is using their services. The question is where
you draw the boundary between being left alone and the common good.

TRUST partners

Academic:
Carnegie Mellon University
Cornell University
Mills College
San Jose State University
Smith College
Stanford University
University of California, Berkeley
Vanderbilt University

Industry and laboratory:
British Telecom
Cisco Systems
ESCHER (Boeing, Lockheed Martin, Ford, GM, Raytheon)
Hewlett Packard
Oak Ridge

How will the ideas and solutions that come out of TRUST be tested and shared?

We have a number of test beds. We’ve built a 1/64th replica of the
Internet, which is going to be used to test wireless and worm defenses.
And there are other test beds which address vulnerabilities in physical
infrastructures, like monitoring devices for oil and gas pipelines and
electrical power grids. The NEST test bed for wireless sensor networks is another example.

TRUST has received major funding from the National Science Foundation and
industry partners, and is working in tandem with several other academic
institutions. Why is it important for us all to work together?

It’s important to work together because the problems are too big for any one
university or any sector of industry to solve by itself. Also, it won’t
matter if we fix it only at the University of California, Berkeley, because the
problems are really at the societal scale.

How does TRUST fit in with the larger aims of CITRIS?

The reason that TRUST is such an integral part of CITRIS is because CITRIS
is about information technology and societal systems. If we are going
to trust information technology to be the bricks and mortar of our
infrastructure, it has to contain within it everything we know about
society, which rests on trust relationships. And until we get that, it’ll be
a brittle infrastructure. So it’s absolutely critical to me; it’s one
of the most critical parts of getting IT into our infrastructures.


For more information:

TRUST’s Web site

“2 professors go fishing for phishers” by Carrie Kirby (San Francisco Chronicle, July 25, 2005)

“Campus to Direct New Research Center” by Cristina Bautista (Daily Cal, April 14, 2005)

“U.S. Grant Offered to Team Studying Computer Attacks” by David Bank (Wall Street Journal, April 12, 2005)

“A Nest of Sensors” by David Pescovitz (Lab Notes, Oct/Nov 2005)

Network Embedded Systems Technology Web site