>> Good afternoon again
and welcome to GW.
My name is David Dolling and
I'm the Dean of the School
of Engineering and
Applied Science here
at the George Washington
University.
I don't think it would
be news to anybody
that there's been a lot of
documents out there recently
with respect to cyber security.
We've seen the White
House's executive order aimed
at protecting our
critical infrastructure.
We've seen a report from the
Security Company Mandiant
that exposed a Chinese
military hacking group.
And very much in the news
now although it's faded
after the press conference
yesterday afternoon have been
documents related
to NSA's PRISM
and telephone surveillance
programs.
Well today, we are
very fortunate
to have several
experts with us
to discuss their various
viewpoints on surveillance,
on cyber security, and the
future of the internet.
As we try to be secure
while protecting privacy,
we do need to be guided by
knowledge from many disciplines
and in fact, this was stated
quite explicitly last December
at the launch of the GW
Cyber Security Initiative.
The initiative's chairman,
former DHS Secretary Michael
Chertoff spoke of the need
for universities
like ours and others
to give students an
understanding of the triad
of technology, business,
and policy and how they interact
to create difficult cyber
security challenges.
While I'm on the podium here,
I'd like to just take advantage
of a captive audience because I
know you're all too polite to cheer
or get up and move out to say
a few things about the School
of Engineering and
Applied Science here at GW
which is a very thriving
and growing enterprise.
Our Department of
Computer Science
in fact anticipated
Secretary Chertoff's remarks
by launching last fall
the Master of Science
in Cyber security in our
Computer Science Department.
It's the first program
in Washington DC designed
specifically to respond
to the large and
fast growing need
for technical cyber
security experts locally,
nationally, and internationally.
Underscoring our commitment
here to cyber security education
and public service are our NSF-
and DHS-sponsored CyberCorps
scholarship programs.
These programs fund students who
are going to be future leaders
in the federal government
on cyber security
and indeed some of
them already are.
Each week, the CyberCorps
students meet in class often
with current or former
federal employees as well
as industry experts to learn
about the major cyber security
issues so they're prepared
to meet the challenges that
will certainly greet them
after they graduate
here and take jobs.
This course which was
previously only available
to CyberCorps students is now
available to all GW students
as a hybrid course
conducted mostly online
with a short intensive
in-person learning
experience at the beginning.
Starting next year,
some doctoral students
in the Graduate School
of Education
and Human Development
here at GW studying human
and organizational learning
will also be taking this class
as a required part of their
cyber security focus.
Well, let me say that
today's event which looks
to be very well populated
and looks like it's going
to be very interesting
is sponsored
by the Cyber Security Policy
and Research Institute.
We abbreviate that around
here to CSPRI of the School
of Engineering and
Applied Science.
In addition to overseeing
the CyberCorps program,
it facilitates and executes
interdisciplinary research
and education in cyber
security across GW.
One example of this is its
work with the Graduate School
of Education and Human
Development supporting the
National CyberWatch
Center in a joint project
to develop the nation's
cyber security workforce.
More information on that program,
on CSPRI cyber security
scholarships, and on the School
of Engineering and Applied
Science can be found
in the materials available
at the registration desk.
We're happy to have
as a cosponsor
of this event today the Internet
Society and it's my pleasure
to now introduce Paul
Brigner, Regional Director
of the North American Bureau at
the Internet Society, to come up
and tell you a few
words about it
and its role
in today's event.
So, on behalf of GW, the
School of Engineering
and Applied Science,
CSPRI and the Department
Of Computer Science,
welcome to GW
and I hope you have
a great afternoon.
Thank you.
[ Applause ]
[ Pause ]
>> Thank you Dr.
Dolling and thank you
to the George Washington
Cyber Security Policy
And Research Institute
for partnering
with us on this event today.
And thanks to all of
you for joining us
and that means not only everyone
here in the room but also those
who are at a remote
hub in New York City.
That remote hub is
being sponsored
by the Internet Society's
New York chapter.
I know we don't have
a good view of them
with the camera right
now but what's going
to happen is they
will walk up to
that camera and ask questions.
And by the way, they've
been meeting
on this topic already
this morning.
They've had their
own meetings today,
so maybe that gives you an idea
of how passionate that group
of individuals is about
the topics we will be
discussing today.
So, thank you to the New York
chapter for getting together
to join us as a remote hub.
And definitely, thank you
to all those on live stream.
We have a very good crowd
already gathering on live stream
to watch us from
all over the world.
My first order of business today
is to make sure that you're aware
of the Internet Society.
If this is the first time you're
joining from one of our events,
we are a global nonprofit
cause-driven organization
with a very
straightforward vision.
That is the internet
is for everyone.
We work to achieve that vision
by promoting the open
development, evolution, and use
of the internet for the benefit
of all people throughout
the world.
Some of our key functions
involve facilitating the open
development of standards,
protocols, administration,
and technical infrastructure
of the internet; supporting
internet education
in developing countries;
promoting professional
development and community
building to foster greater
participation and leadership
in areas important to the
evolution of the internet;
and fostering an environment
for international cooperation,
community, and a culture
that enables internet
self-governance.
While our staff lead specific
projects on these topics,
much of our work is achieved
through the work of our members
and chapters around the world.
And speaking of chapters,
I would like to give
special recognition
to the local DC chapter.
I am very fortunate that our
headquarters is just not far
away in Reston, Virginia so
I'm able to interact personally
with our DC chapter
on a regular basis.
And I can attest to the fact
that they are really a
first class group of people
that regularly hold very
interesting meetings related
to the Internet Society's
mission.
So, if you're local
and you're not involved,
you're really missing out, I
do hope you will get involved.
And if the leaders who are here
from the internet society DC
chapter would stand briefly,
I just want to make sure
that you're recognized.
So, great.
So, you know who they are.
Please make a special
effort to talk
to them today during
this event if you'd
like to get involved
with the chapter.
[ Pause ]
One of my key functions
at the Internet Society is
to work closely with our
chapters in the region not only
to plan iNET events like this
but to assist with their events
and to have an open
dialogue on policy issues.
That dialogue in turn
helps me and my colleagues
at the Internet Society to form
our policy positions and help
to set the agenda for
our regional meetings
like the one we are
attending today
and that brings me
to today's agenda.
The topic of this iNET
has really been driven
by the interest of our members
in the North American
region and around the world.
It so happens today that one of
our panelists is the President
and Chief Executive Officer
of the Internet Society,
that's Lynn St. Amour.
So, she will have
the opportunity
to address the concerns
of our constituency today.
Now, without further delay,
I would like to introduce
Professor Lance Hoffman.
He is the Director of the Cyber
Security Research and Policy--
I'm sorry, the Cyber
Security Policy
and Research Institute here at
George Washington University.
I first met Professor Hoffman
when he sponsored a debate
on PIPA and SOPA and
I knew that he was,
based on that experience, I
knew that he was game to talk
about some very controversial
topics.
So, I guessed right, he
was interested in talking
about this one as well and
it's been really a pleasure
to partner with him
on this event.
So, I want to sincerely
thank you for that.
It's really been a
great experience.
I'd also like to thank all
of our fantastic panelists
who have agreed to
speak here today.
We really have an all-star
group of people
who have taken their time out
to share their views with you.
I'd like them all to come up
and take their seats on stage now
if they would and I
will turn this event
over to Professor Hoffman.
One thing I forgot to mention
is that if you want to Twitter
or tweet about this event,
please use the hashtag
#iNETDC, #iNETDC.
Thank you all very much.
[ Applause ]
[ Noise ]
[ Inaudible Discussion ]
>> Good afternoon everybody.
Let me add my welcome to that of
Dean Dolling and to Paul Brigner
and let me also thank the
Internet Society, our cosponsor
for this event for
their programmatic
and financial assistance
in making it possible.
I also want to thank from
CSPRI, my Associate Director,
Dr. Costis Toregas and our
interns and research assistants
who have worked on this and are
helping here today: Trey Herr
[phonetic], Dustin
Benanberg [assumed spelling],
Jaime Moore [assumed spelling],
and Greg Ziegler
[assumed spelling].
My role today is to set the
stage for our excellent panel
and then to moderate that panel
discussion on cyber security
and the future of the internet
given the recent revelations
about PRISM and other
surveillance programs.
The panelists have each agreed
to summarize their thoughts
in a four-minute statement
so we could have lots of time
for questions from the
audience including the audience
in cyber space.
After the audience question and
answer session and a short break,
participants will return for
a very interesting round table
discussion where we will see
their world views on this,
where their world views agree,
and where there are tensions,
in a round table informed by
their previous statements
and the audience
questions beforehand.
I'm delighted that
for this event,
we will have an experienced
moderator, Professor Steve
Roberts, leading the
round table discussion.
Many of the students and
faculty here may recognize Steve
as the Shapiro Professor
of Media
and Public Affairs here at GW.
But others, including
those viewing this
on the internet, may be
more familiar with him
from his appearances
as a commentator
on many Washington
based television shows,
as a political analyst
on the ABC radio network,
and as a substitute host
on NPR's Diane Rehm Show.
Another role where he takes
material from experts
and clarifies it to facilitate
informed public discussion.
But before the round table,
let me identify the
participants on stage.
We will not have lengthy
introductions here
since their bio sketches have
been made available to you
in the brochure which you
have gotten or can get
at the registration desk.
But anyway, let me
introduce very briefly
and you see their name
tags in front of you,
Danny Weitzner is the
Director and Cofounder
of the MIT Computer Science
and Artificial Intelligence
Laboratory.
>> Not the whole lab.
>> Not the whole lab?
>> I'll explain.
>> OK. Lynn St. Amour
is the President and CEO
of the Internet Society.
Let's see, who's next.
Laura, Laura DeNardis
is a professor in the School
of Communication at
American University.
Melissa Hathaway is President of
Hathaway Global Strategies.
Let's see.
Who's down there?
I don't have you on my list.
>> Oh no.
>> I apologize.
>> That would be me.
>> Oh Randy, of course.
Randy from-- I don't have here.
I mean, OK.
Randy-- where are you from?
I'll let you introduce yourself.
I'm sorry.
[Inaudible Remark]
>> I'm the University
IT Security Officer
at Virginia Tech.
>> I didn't want
to get it wrong
and say something
wrong-- wrong university.
Sorry Randy.
OK, let's see.
Leslie Harris is from Center
for Democracy and Technology.
And last but not the least,
John Curran, President and CEO
of American Registry
for Internet Numbers.
OK, so without further ado, I am
going to suggest that we each go
down the-- I have to
decide, I'm going to leave it
to I guess either Danny or
John on who wants to go first.
>> No, no, no.
Go ahead.
>> John?
>> OK.
>> OK?
>> OK John, go on.
Four minutes please.
>> Well, thank you.
I'm happy to be here and I
want to say thanks for the
opportunity to present at this
panel. Keeping my remarks short,
I will say that we're
in an interesting period of
time with respect to the future
of the internet and
how it's governed.
People may not realize
it but the fact is,
there is a coordination
function that's existed
since the earliest days of the
internet which is necessary
so that we can actually
use the internet
so that it actually functions.
For example, computers
on the internet have
unique identifiers.
Several, actually; one of
them is the IP address,
which is part of what
ARIN is
involved in.
We can't have computers
using the same IP addresses.
Every computer needs
its own IP address.
Likewise, there's coordination
of things like domain names
to make sure that a
domain name is assigned
to one organization,
et cetera, et cetera.
This is called technical
coordination
and it's been a function of the
internet, necessary function
for the protocols
to work as designed
since the very beginning.
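[Editor's illustration] The coordination described here can be sketched in a few lines of Python using only the standard library. The specific values are assumptions chosen for safety: 192.0.2.1 is an IPv4 address reserved for documentation, and "localhost" resolves to the loopback address on essentially every machine.

```python
import ipaddress
import socket

# Registries such as ARIN hand out blocks of globally unique
# IP addresses so that no two machines on the public internet
# claim the same one. 192.0.2.1 is a reserved documentation
# address, used here purely as an illustration.
addr = ipaddress.ip_address("192.0.2.1")
print(addr.version)  # 4 (a well-formed IPv4 address)

# DNS coordination ensures a name is assigned to one holder and
# resolves to the right address; "localhost" is resolved locally
# to the loopback address without touching the network.
print(socket.gethostbyname("localhost"))  # 127.0.0.1
```

This only demonstrates the two namespaces (addresses and names) whose uniqueness the coordination function guarantees; it does not query any registry.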
People may not realize that
this used to all be done
by one gentleman, Dr. Jon Postel,
who we no longer have with us
but was an amazing individual.
And in the late '90s, we
actually went about setting
up infrastructure to enable
this technical coordination
so that IP addresses and
domain names could be available
to everyone globally for
their use on the internet
but coordinated so that
they actually interoperated
so that they work as expected.
This structure involves
a lot of organizations.
ARIN is one; there are five
regional registries throughout
the globe.
ARIN is the one that handles
North America, so Canada,
the United States and about
half of the Caribbean.
Then there are four
other regional registries
and then there is the DNS
coordination that goes
on through organizations
such as ICANN.
The Internet
Corp-- [laughs].
Sorry. The Internet--
[Noise] --
Corporation for Assigned
Names and Numbers.
So, the fact that these
bodies all work doing technical
coordination is very important.
But, they've been doing it based
on a historical trajectory.
A trajectory that originated
with projects and programs
out of the US government
which have
since been increasingly
moved out to independence
but we're not quite there yet.
In fact, some of you may be aware
that these organizations were
under supervision, somewhat
indirect but still supervision,
of programs such as NTIA in
the Department of Commerce.
And so, one of the things that
we now find ourselves facing is
that there is a unique
role in the US government
with respect to the internet.
And this unique role is one that
is based on its experience:
a steady hand,
behind the scenes,
helping the internet
mature and grow.
Recently however, we've now
seen that there's another hand
that may be busy doing
other things
such as surveillance
and cyber security.
And so, the question that comes
up is how do we reconcile those?
How do we reconcile an open
global transparent internet run
for everyone with
the possibility
that there is also surveillance
going on and other things
that may not meet
the expectations
of global internet users.
And so, this is the
challenge in front of us.
It's particularly
highlighted because of the US
government's unique oversight
role in this, and it's brought
to the forefront
by the recent discussions where
we now see that it is not simply
for the benefit of
all the internet,
but there's also some
national priorities
and national initiatives that
the US government also pursues
over the same infrastructure.
I think that's the challenge
that we will now face more
so than ever in light
of the revelations
for the last few months and
the net result would be a lot
of the discussion in
forum such as this.
Thank you.
>> Thank you and I'll try
to make this iPhone
behave a little better--
>> OK.
>> Although--
>> I thought I was over on time.
>> Yes.
>> You were right on
time, you were great.
>> OK.
>> Leslie?
>> It's time, I guess.
So, I also want to get us out
of the DC bubble and the focus
on the rights of people
in the United States.
I've been talking about
that for the last six weeks.
And Congress is about to
consider an amendment that's
going to rein in
the NSA's ability
to collect our metadata
from phones.
I want to pivot away from that
'cause I think there are three
other things that we ought
to be considering at least:
the reactions of
governments and internet users
around the world; what this
means for the architecture,
not just the governance, the
architecture of the internet;
and also the
protection of human rights
of global internet
users, and that turns
out to be a very
thorny question.
Certainly for governments,
I think this entire kerfuffle
has reinforced the concerns
that the global internet has
undermined their sovereignty
and control over their citizens.
And it's further
illuminated this sort
of privileged position
of the United States.
Certainly, with respect
to ICANN and some
of the other critical resources,
we continue to have a
disproportionate share
of internet traffic, although
that's declining and obviously,
we have a dominance in our
global internet companies.
For some, and the EU is I
think a good example,
this illuminates
their inability
to protect the human
rights of their own citizens
and we have a major fight coming
up in the data directive there
about whether they're
going to allow data to come
to the United States at all.
At the same time, it's an
extraordinary opportunity
for authoritarian governments
to exert more control
over the people and data in their
own borders and, you know,
Russia's now dusting off
legislation that has to do
with national servers.
You're going to see
that elsewhere as well
and I think it's fair to worry
whether this carefully honed
US narrative of the
wise, trusted, neutral steward
of the internet can
really hold and it's fair
to ask whether we're moving
towards the balkanization
of the internet as
governments seize this moment
to impose local server
requirements or worse
and we saw some of these
proposals last year at the WCIT.
I know we're supposed to be
acronym free; I'll explain--
routing requirements
to either avoid the
United States or literally
to direct the routing of traffic
and the worst outcome
being literally ring fences
to national or regional
networks.
If I were in Latin America and
noticed that there was 80 percent
of my traffic still coming
through the United States,
I may start to wonder why
we're still relying on Miami
to reach most of the world.
Some of that might be
a weird salutary effect
that we finally get internet
exchange points in some places
that have not had them.
But I think governments becoming
involved directly in the routing
of internet data would
pose a profound challenge
to the open end-to-end internet
and I think all these things
are going to be on the table.
I think you can also kind of
expect to see a reexamination
of the world's relationships
with our cloud providers,
already seeing that.
You know, there's a Netherlands
provider advertising no PRISM,
no surveillance, no
government backdoors.
And finally, I think we may have
lost some of the rapprochement
that happened in the governance
wars in the last year.
There are a number of important
forums and discussions coming
up about various treaties
and how much various treaty
bodies ought to be able
to impose their--
interpose themselves into internet
debates, I think we're going
to lose some ground there
and it could be I
think quite troubling.
Last point, global
internet users are furious.
They believe their human
rights have been violated
and I think they
don't yet realize
that it's not entirely clear
whether they have any recourse
for that.
There's a lot of ambiguity
in human rights law.
Human rights law
pretty much applies
if countries make commitments
to protect the human rights
of people within their
borders or their control,
I don't think we've begun to
understand or have any clarity
on how-- who's responsible
for human rights
where there's non-physical
action
such as electronic surveillance.
And they're not going to be
happy when they understand
that FISA and all
the conversations
in the United States about
safeguards and minimization are
to protect our rights but
have absolutely nothing
to do with them.
I don't think congress
is going to care.
But I think there's a
massive loss of respect
and when we're talking about
five zetta-[inaudible], whatever
that is, of non-US
person data going
into a new Utah data center, I
think that the irony at the end
of the day will be that one
of the FISA permissible
activities is collecting
intelligence for foreign
affairs and foreign policy.
And I think at the end of
the day, we may see people
of the world uniting with
their governments around kinds
of restrictions and ring fencing
of the internet that will come
to both undermine rights
and undermine the
openness of the internet.
So, that is my anxious
person's guide to the--
to what's been happening.
>> Thank you Leslie.
Randy?
How many of you noticed or
counted how many surveillance
devices you've passed
to get to this room?
Yeah. I just walked in from the
restaurant a couple blocks away.
I counted 22 surveillance-style
devices just in the walk outside
of this building and there are
at least six in this room right
now, not counting your
smart phones and all that.
I hope you all can find the six
because they're pretty
obvious as to what they are.
My point on all of this
is that as a practitioner,
I'm the CISO for Virginia Tech.
My office is responsible for
monitoring and responding
to any attacks against
our network infrastructure
at Virginia Tech.
We have our main campus down
in Blacksburg, Virginia.
It's about four and a
half hours from here.
We have a Northern
Virginia campus right
across the river in Ballston.
And so my charge,
my office is charged
to monitor any attacks
from there.
Anybody who has ever managed
an internet infrastructure has
known since the very beginning
that the internet has never
been anonymous in the sense
of tracking a machine.
From day 1, we've
always been able to track
where a machine was.
We were not able to track
who is at the machine
with any reasonable
amount of accuracy
but we've always been able
to track where machines are.
The big thing was of course
data, storage capabilities.
We didn't have enough
data storage capabilities,
disk drives.
In 1992, when I got involved--
first involved with the computer
security stuff, you know,
we had a one gigabyte drive and
we thought that was, you know,
the entire disc storage
in the entire free world.
And, you know, nowadays, with
the huge disk farms that are
out there, that is allowing
this collection of data.
So, from our standpoint,
that's always been the case.
So, my challenge is,
you all let this happen.
You all knew that this was
happening and you let it go on.
There is nothing new about the
PRISM stuff and all of this.
It's kind of ironic for people
in my world 'cause we all
sit back and we go, you know,
in 2010, the Washington
Posts, Dana Priest
and William Arkin wrote an
excellent series called Top
Secret America where
they're talking about all
of the Beltway companies
that are building the
surveillance technologies
that the federal government
and other entities are using.
It was all out there, an excellent
series in the newspaper;
I read it back then.
Newsweek came out in
2008 where they talked
about a whistleblower
who mentioned
NSA's warrantless
wiretapping, so to speak.
It was all out there.
If you go back and you look
through the press things,
there was always something
there and nobody reacted to it.
So, it's ironic from
my viewpoint
that everybody is having
this big flop about it now
because it's been there
and you let it happen.
So, my charge to you is don't
let it happen again, OK?
When you do read
about these things,
you need to influence your
legislators about what's going
on 'cause they do not
understand the technology.
One quick example, in Virginia,
there are data breach
notification laws if a social
security number is disclosed.
You know, somebody
gets a spreadsheet
and it explodes out there.
Yet I can go to a county
courthouse website and look
up public records: deeds,
divorce decrees, whatever,
and see what's on those documents.
Yet the county clerks can't
redact that information
because until recently,
the law forbade it, OK?
Everybody was in a rush for
eGovernment but never thought
about what's in those
documents that are going there.
So, we need to be
able to influence
and educate the legislators
who are--
who are building the policy.
So, this is the whole thing.
I'm a technologist and everybody
in here, the computer scientists
and all that, you're all
in here with me.
We help build this
infrastructure.
We may have had a
misgiving where we thought
that as builders we could
control the people who use it.
But as, you know,
previous history shown
with the atom bomb development
and all these other things,
that's not always the case.
Builders do not control
the controllers.
That's my first point.
The second point is that, again,
you all have smart phones.
Actually I don't.
I have kind of a dumb phone
and people always laugh.
They go, "You know,
you're a technologist,"
and I just have a little,
you know, Samsung Integrity
with the nice little
flippy thing.
And I said, "Because the
guys in my lab have busted
into the smart phones
and tracked everybody
all over the place."
But actually the reason
why I don't is
because the battery life is not
long enough for my taste, OK?
But a great quote: Lynn Parramore
[assumed spelling] wrote a great
article quoting the things
that are supposed
to make our lives easier.
"Smartphones, Gmail,
Skype, GPS, Facebook,
they have become
tools to track us.
And we've been happily
shopping for the bars
to our own prison one prison at
a time, one product at a time."
So, we're in the-- we're
to blame for all of this.
We are allowing that to happen.
The last thing I want to talk
about is this Federal
word play thing.
Whenever I confront a student
or a professor who's been
violating a university policy,
you let something
go out on the net.
And I go. "Why did you do that?"
And it's kind of like watching,
you know, a six-year-old.
"I don't know, you know, it
wasn't my fault," you know.
That type of stuff.
And you get this
word play, you know.
Richard Pryor, a long time
ago had an excellent thing
when his son broke something
and he asked his son what happened.
And all the word play that his
son came out with, you know,
"Some invisible man came
out and broke the thing."
That's what we're seeing now.
We're seeing this when they
say we're not collecting data
on you.
Well, OK, you're not collecting it
technically but you're buying it
from people who are
collecting it from us.
So, that's-- that's
one thing there.
The network companies that say,
you know, the data providers,
the Verizons, the Googles,
they say, "Hey, you know,
we're not willingly giving it
to the federal government."
Well, of course not.
You're being subpoenaed
for the documents.
You're not giving it
willingly to the-- to them.
You know, Zack Holman
wrote a great little tweet.
It says, "We don't give
direct database access
to government agencies."
That quote has become the
new "I didn't inhale," OK?
And so the key word is direct.
Marketers have been collecting
information on us all the time.
And, in fact, that was
one of the big things
that helped us with 9/11.
On a positive side,
that did help us
after 9/11 identify the hijackers
when the marketing companies
and the credit card companies
realized that they could help
because they had that data.
So, we need to be sure that
things are done correctly.
Everybody says we're
doing it legally.
And that's correct.
The law states they can do it.
It's the creation of
that law that's the flaw.
And that's my surveillance guide
telling me it's time to quit.
Those are my points.
>> Thanks, Randy.
Melissa?
>> Thank you.
Well, I think it's
important to look
to our past to inform
our future.
And I'm going to take a
little bit different direction
than my colleagues.
The very first transmission of
the internet was October 29th,
1969 and it was an e-mail
between two universities.
And today, more than 204 million
e-mails are sent per minute.
More than 1,300 new mobile
users are added per minute
to the internet, 47,000
applications are downloaded,
100,000 tweets, 1.3 million
videos are uploaded to YouTube,
two million searches to Google,
and six million Facebook views.
The internet is part of
every part of our life
and over the last
almost 45 years,
we have embedded the internet
in every part of our society.
And it is the backbone of our
core infrastructure-- of
every country's core
infrastructure.
It represents e-government,
e-banking, e-health, e-learning,
the next generation of
air traffic control,
the next generation
of power grids
and every other essential
service has been concentrated
onto one infrastructure,
the internet.
And that is putting
our businesses
and our national
security at risk.
And I-- when I speak about
"our," I'm really speaking
as a global citizen.
If I were in France I would be
speaking about it in France.
If I were in Iran or
Israel, it's the same way.
And so I think that
that has really begun
to change the conversation
that we can't have any one
single point of failure,
economic failure and/or national
security failure to any one
of our-- of our core
businesses or infrastructures.
And so when we're talking about
cyber security and we're talking
about different government
actions and we wrap it all
into one conversation, I ask
you to start to think about it
as multiple conversations.
And it's not helpful
to bundle into one term
of cyber security
and/or surveillance.
So, I'd like
you to think about it
in six different ways
of why our governments
and why our industry are
talking past each other.
The first is when we're
talking about cyber security,
we're sometimes talking
about political activism.
Those who would like to bring
transparency to policies
and to our initiatives
that they don't agree with.
In the United States, one could
say that that was WikiLeaks
that brought about a great
amount of transparency
to US policies as they
leaked our information
into the internet.
All right, one could argue
that that was also Snowden.
But in other countries,
political activism is being
conducted using Twitter and/or
Facebook to organize people
in squares like Taksim
in Turkey or in Egypt
to express political
discontent with their government
with the intent to
overthrow the government.
Political instability.
And so when our governments
are starting to talk
about surveillance on the
internet and/or filtering
of the internet, some believe in
political democracy and freedom
of speech on the
internet and others do not.
And there are different
mechanisms of how they're using
that surveillance
or that technology
around political activism.
Now that should not be
confused with organized crime
on the internet, the real modern
day bank robber who's stealing
ones and zeros, which
are real dollars,
out of your credit accounts
or out of your real banks,
and that is being passed on as
a cost to our citizens.
We have many of our
governments who are talking
about the importance
of organized crime.
When I was just in Europe
just a few weeks ago,
it was 30 million dollars stolen
out of 45 different
cities in 30 minutes.
That's a real problem
for our banks.
It's a real problem
for our credit cards.
And we're having to deal
with that organized crime
and real theft of
ones and zeroes.
Now that should not be confused
with intellectual property theft
and industrial espionage.
That's very different.
And many of our government
leaders in the United States
and many government leaders
in Europe are talking
about the unprecedented theft
of intellectual property.
Theft, meaning
illegally copying the plans,
processes and/or next
generation technologies
out of our corporations
for the economic advance
of their companies
and/or countries.
And so the intellectual property
theft is then not the same
as espionage.
And there are governments
that are conducting espionage.
Most governments do, to steal
the plans and intentions
of other governments and
to know their capabilities.
In the United States, we
sort of bundled the two,
IP theft and espionage.
When we're talking about it, in
fact, we're talking about it
as Pearl Harbor and other
very exaggerated terms.
If we are going to bundle
intellectual property theft
with espionage, then we have
to be willing to walk away
from espionage
as a government.
And I don't see any
government willing
to walk away from espionage.
So, why don't we talk about
what's really the problem
and that's intellectual property
theft and/or the protection
of intellectual property
and patents, et cetera.
There are two other areas that
are becoming more concerning
for most companies
and countries.
the first is disruption
of service.
And this is distributed
denial-of-service attacks actually
degrading real services
in your e-banking
and your e-infrastructures,
preventing our banks
from allowing you to actually
access those infrastructures
and/or capabilities.
And in the United States,
we just had a significant
distributed denial-of-service
campaign against our financial
institutions,
and so have many others
in Asia and Europe.
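One common building block for shedding such floods is per-client rate limiting. A minimal token-bucket sketch in Python; the rate and capacity numbers are invented for illustration, and real mitigations operate at far larger scale upstream:

```python
import time

class TokenBucket:
    """Per-client budget: `rate` requests/second, bursts up to `capacity`."""

    def __init__(self, rate, capacity, now=None):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity                     # start with a full burst budget
        self.last = time.monotonic() if now is None else now

    def allow(self, now=None):
        """Return True if this request fits the client's budget."""
        now = time.monotonic() if now is None else now
        # Refill in proportion to elapsed time, never past capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                               # over budget: drop or challenge

bucket = TokenBucket(rate=2.0, capacity=5.0, now=100.0)
# A burst of 8 simultaneous requests: the first 5 pass, the rest are shed.
print([bucket.allow(now=100.0) for _ in range(8)].count(True))   # → 5
```

The same budget-and-refill idea appears in routers, load balancers, and API gateways; the hard part in a real distributed denial-of-service event is telling clients apart, not the arithmetic.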
And then finally the
destruction of property.
There was just recently
malware released
against Saudi Aramco
which destroyed 30,000
of their computers.
And when we start to actually
think about destruction
of property and how one
might recover from that
that is a different set of
capabilities than you would deal
with from organized crime
and/or political activism.
So, as we talk about this
on our panel over the course
of the next hour, I ask
you to start to think
about which problem
are you talking
about because we're not
talking about the same thing
in each of our conversations.
I'd like to wrap up by noting
that over 100 countries have
these capabilities,
and they are using these
capabilities to deal
with these different problems,
whether it's political activism
to overthrow a government
or it's political activism
to bring transparency
to policies they don't like,
which is different
from organized crime
or intellectual property theft.
And there are three strategic
things that are happening
in the global order of things.
First, some are using disruptive
technology like a Stuxnet
or a Shamoon to bring
down core infrastructures
or core businesses
around the world
or some are implementing
surveillance tools to bring
about transparency or to
show the vulnerabilities
for those infrastructures
to be brought down.
So, disruptive technologies
are being used.
Second, strategic alliances
are also being wielded
to actually gather that power
and control over the internet.
And that's playing out in
the UN and the ITU and NATO
and ACION [assumed spelling]
and other forums
where countries can
align against each other
for particular motives.
And then finally there
are strategic properties.
And I mean that in the
very real sense.
There are 25 internet
service providers
that control 90 percent
of information flow
on the internet.
There are internet
exchange points
that actually control the
flow of technology and/or ones
and zeroes from continent to
continent or within a continent.
And there are data aggregators,
like a Google or a Facebook,
who actually have more
information on us than any
foreign intelligence service
of any country.
So, you have to think
about where are the
strategic properties
and how they're being used
by all governments not
just by one government.
Thank you.
>> Thank you, Melissa.
Laura?
>> Good afternoon everyone.
I'm very delighted to be here.
And I wish to thank the Internet
Society for the invitation
and for GW for hosting
it as well.
I view PRISM as an
opportunity to draw attention
to the implications of broader
global internet governance
conflicts that have
implications for economic
and expressive liberty.
I spent the last two
or three years researching
and writing a new book
called The Global War
for Internet Governance.
And it's going to be
published later this year
by Yale University Press.
Now in this book, there are
approximately four pages
at the end of acronyms.
So, what I've done is I've
challenged myself today
to not use any acronyms.
So, what I would like to ask you
to do is to pound on the table
if I use an internet governance
acronym, so pay attention
to that, and I know some
of you will do that.
But what I tried to do
in the book is describe
the various layers
of how the internet is
already governed and what some
of the current debates are that
I expect to shape the future
of freedom and innovation
in the coming years.
Now what is internet governance?
This panel has already
described it.
If I had to give one definition,
I would say internet governance
is the design and administration
of the technologies
that are necessary
to keep the internet operational
and then the enactment
of substantive policy
around those technologies.
But there is no single system
of internet governance.
John said it best when he was
describing the various roles,
names and numbers,
administration,
standard setting, private
interconnection arrangements
between telecommunication
companies, the privacy policies
that are enacted by social
media, by search engines
and other information
intermediaries.
And, of course, cyber security
governance not necessarily
enacted by governments
but by entities
such as the certificate
authorities that are handing
out digital certificates
and things like that.
So, those are just
a few examples.
So, we need to take this
conversation outside
of discussions about
just governments.
However, one of the themes that
I do take up in the book
and in my work in general is
that internet governance
conflicts are the new spaces
where political and economic
power is working itself
out in the 21st century.
We see this with PRISM,
we see this with Stuxnet,
we see this with the turn
to infrastructure
for intellectual property
rights enforcement,
with governments cutting off
access during political turmoil.
And as Melissa said,
we had denial-of-service
attacks often used
to suppress human
rights and expression.
So, internet governance points
of control are really not just
about keeping the internet
operational although
that is absolutely vital.
But they are also a proxy
for broader political
and economic conflicts.
So, the fact that PRISM
draws attention to some
of these broader global
internet governance issues is
an opportunity.
But keep in mind that government
surveillance, and censorship
for that matter which is in my
opinion an even greater problem
around the world
is not something
that happens in a vacuum.
So, it's delegated and
it is made possible
by certain arrangements
of technical architecture
and by private ordering.
So, infrastructure governance
just to give you a few examples,
infrastructure governance is
directly tied to privacy issues.
So, I have an information
engineering background.
I am also a social scientist
who studies the politics
of technical architecture.
And I can say, to use
a Harry Potter analogy,
that there are some dark arts
of internet governance:
capabilities with good intentions
that can be put to other uses.
So, one of these, for example,
is deep packet inspection.
I didn't use the acronym,
so no pounding on the table.
This is a capability that
allows network providers
to inspect the actual
content of packets sent
over the internet rather
than just the packet headers.
So, this can be used
for a variety
of very important functions
such as network management,
detecting viruses and worms.
It could also be used for
customized advertising,
now getting outside of
the operational role,
or for surveillance
or for throttling
and blocking of traffic.
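The distinction Laura draws, reading the actual payload rather than just the headers, can be made concrete with a short Python sketch. It hand-assembles a toy IPv4/TCP packet (the addresses and checksums are placeholder values, not real traffic) and contrasts header-only inspection with deep inspection of the content:

```python
import struct

def build_packet(payload: bytes) -> bytes:
    """Hand-assemble a toy IPv4+TCP packet (checksums left at zero)."""
    ip = struct.pack("!BBHHHBBH4s4s",
                     0x45, 0, 40 + len(payload),   # version/IHL, TOS, total length
                     0, 0, 64, 6, 0,               # id, flags, TTL, protocol=TCP, cksum
                     bytes([10, 0, 0, 1]),         # source address (placeholder)
                     bytes([93, 184, 216, 34]))    # destination address (placeholder)
    tcp = struct.pack("!HHIIBBHHH",
                      49152, 80, 0, 0,             # ports, sequence, acknowledgment
                      5 << 4, 0x18, 65535, 0, 0)   # data offset, flags, window, cksum, urg
    return ip + tcp + payload

def shallow_inspect(packet: bytes) -> dict:
    """Header-only view: all a traditional router looks at."""
    ihl = (packet[0] & 0x0F) * 4                   # IP header length in bytes
    src = ".".join(str(b) for b in packet[12:16])
    dst = ".".join(str(b) for b in packet[16:20])
    sport, dport = struct.unpack("!HH", packet[ihl:ihl + 4])
    return {"proto": packet[9], "src": src, "dst": dst, "sport": sport, "dport": dport}

def deep_inspect(packet: bytes, signature: bytes) -> bool:
    """Deep packet inspection: read past both headers into the payload."""
    ihl = (packet[0] & 0x0F) * 4
    tcp_len = (packet[ihl + 12] >> 4) * 4          # TCP data offset in bytes
    return signature in packet[ihl + tcp_len:]

pkt = build_packet(b"GET /index.html HTTP/1.1\r\n")
print(shallow_inspect(pkt))           # addresses and ports only
print(deep_inspect(pkt, b"GET /"))    # → True: the content itself is visible
```

A router doing its normal forwarding job needs only the first function; everything Laura lists, from virus detection to advertising to throttling, starts with the second.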
So, this is a very significant
development made possible only
by advances in processing
power and storage.
And we need transparency
and accountability
in issues like this as well.
So, another area of
infrastructure related
to privacy is the hidden
identity infrastructure
that makes possible business
models that are based
on online advertising.
So, this is a good thing
for freedom of expression
because there are free products
that we are able to use,
but we are almost at the
point where, considering
this identity infrastructure,
anonymous speech is
almost impossible.
We have technical identifiers
at the level of hardware,
like Ethernet cards; virtual
identifiers; locational
identifiers via cellphone
location, wireless fidelity,
AKA wifi (my one acronym),
or the global positioning
system; and things
like platform mediation and real
identification requirements.
So, if you put this all
together, this is at the heart
of business models that
we need, at the heart
of online advertising, at the
heart of having free software.
But it also is the
technical capability
that can enable new
forms, even newer forms
of surveillance in the future.
So, these are new
opportunities for surveillance.
Two other infrastructure issues
I'll mention quickly
quickly include a current
rethinking and redesign of the
"who is" protocol
which keeps track
of who is registering a
domain name and, of course,
also the issue as has
already been mentioned
of internet exchange points.
How these are governed and
how they are distributed
around the world is something
that is very related
to civil liberties.
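The "who is" protocol itself is strikingly simple: a client opens a TCP connection to port 43 of a registry server, sends the domain name followed by a carriage return and line feed, and reads the reply until the server closes the connection. A hedged Python sketch; the server name shown is an assumption for .com lookups, other domains use other registries, and response field names vary:

```python
import socket

def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """The whole wire protocol: TCP port 43, query + CRLF, read until close."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_whois(response: str) -> dict:
    """Pull 'Key: Value' fields out of a response (first value wins)."""
    fields = {}
    for line in response.splitlines():
        if ":" in line and not line.lstrip().startswith(("%", "#", ">>>")):
            key, _, value = line.partition(":")
            fields.setdefault(key.strip(), value.strip())
    return fields

# record = parse_whois(whois_query("example.com"))   # requires network access
sample = "Domain Name: EXAMPLE.COM\nRegistrar: RESERVED-IANA\n"
print(parse_whois(sample)["Domain Name"])            # → EXAMPLE.COM
```

Part of why the protocol is being rethought is visible right here: it is unauthenticated plain text with no standard response format, and every query exposes who is asking about whom.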
The Internet Society, which
we'll hear from next, has really
been at the forefront
of this area:
having more internet
exchange points,
looking at the criticality
of them for human rights
for infrastructure development
and as concentrated points
of information flows.
So, the truth is that global
internet choke points,
of course, exist.
And the internet is governed
not by any one entity
but multi-stakeholder governance
which I'm sure we will take
up in the roundtable later.
But this governance
is no more fixed
than the architecture is fixed.
So, the architecture
is constantly changing
and the governance is
constantly changing as well.
The basic theoretical
or conceptual framework
of my own work is
that arrangements
of technical architecture are
also arrangements of power.
So, it's critical for the public
to be engaged in these debates
because the future of
internet architecture
and governance is
directly related
to the future of
internet freedom.
So, I appreciate the opportunity
to discuss that here today.
Thank you very much
for listening.
>> Thank you, Laura.
Lynn?
>> Good afternoon.
This has been a very
comprehensive set
of speaking points, I
have to say to date.
So, the Internet Society is
a cause-based organization.
We advocate for an
open global internet.
The recent revelations
about the mass-scale
interceptions, not only
by the US and the UK but by
many, many other countries
around the world, have
serious implications
for the open global internet.
ISOC is an international
organization.
We have members, organizational
members, and chapters
in virtually every country
of the world, perhaps even
every country of the world.
We have headquarters
just outside of D.C. here,
in Northern Virginia, and in
Geneva, Switzerland.
And we have very senior policy
and technical staff in a little
over 20 countries of
the world, oftentimes combined
in the same individual,
which is more
and more the future in any case.
I spent 27 years in
Europe, just moved back
to the US a little
over a year ago.
And the one thing I'd really
like to do is to make sure
that here in D.C., in
this country's capital
that we recognize that this
is not just about US citizens
and it's not just
about the foreigners
that the US is surveilling.
This affects every
individual in the world.
It affects some of
them very directly
but it will affect the
internet we all have access
to going forward to tomorrow
and to generations to come.
If we are not careful, we will
actually rob both individuals
today, tomorrow and future
generations of all the freedom
and the benefit and
the innovation
that the internet has brought.
The Internet Society actually
deals an awful lot in principles
and the principles that make the
internet what the internet is.
First and foremost,
it's a platform.
It allows everybody to go out
and develop what they choose to,
to access what they choose to,
to innovate on material that's
out there and make
that available.
If we're not careful,
we'll lose all of that.
So, in particular recently,
there's the unwarranted
collection and storage, and the ease
of correlation amongst all
the data that's collected.
And I actually don't
differentiate a lot
between metadata and content.
You can get so much
information from metadata
that we shouldn't kid
ourselves by saying,
"It's just metadata we're
collecting, it's OK."
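Lynn's point is easy to demonstrate: even with every message body discarded, who-contacted-whom records expose relationships and social structure. A small Python illustration over invented call records:

```python
from collections import Counter

# Invented metadata records: (caller, callee) pairs only, no content at all.
records = [
    ("alice", "bob"), ("alice", "bob"), ("alice", "clinic"),
    ("bob", "clinic"), ("bob", "dave"), ("alice", "bob"),
]

# Tie strength: how often each pair communicates.
pair_counts = Counter(tuple(sorted(pair)) for pair in records)

# Centrality: who appears in the most distinct relationships.
degree = Counter()
for a, b in pair_counts:
    degree[a] += 1
    degree[b] += 1

print(pair_counts.most_common(1))   # → [(('alice', 'bob'), 3)]
print(degree.most_common(1))        # → [('bob', 3)]
```

Six content-free records already yield a strongest tie, a hub of the network, and a contact with a clinic; at the scale of a national call-record database, the same counting reveals far more.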
That collection
will undermine many
of the key principles
and relationships.
And in particular some of those
natural conclusions will start
to impact the physical
infrastructure
of the internet itself whether
it's using some of the IXPs
as choke points or
whether it's using some
of the technical
capabilities that exist
to help with surveillance.
Those are all things
we want to, I think,
think through very,
very carefully.
One of the principles we argue
for is multistakeholder
dialogue.
And it's because so much of what
we're all facing whether it's
in a policy or a technical
or a social environment
has never been done
in the world before.
We're breaking barriers
every single day and we need
to bring everybody to the
table for a discussion
and move forward
thoughtfully and carefully.
That is even more so when we
look to governments particularly
in their role in
protecting citizens.
We believe that the internet
must be a channel for secure,
reliable, private communication
between entities
and individuals.
And surveillance without due
process is simply unacceptable.
And as some other articles
have said recently, frankly,
it's very creepy as well.
And if that makes it more
personal, this is good
because we need everybody to
care about what's happening now.
We also challenge the
view that policies
to ensure security
must always come
at the cost of users' rights.
I'd also argue with the view
that we all knew it was coming,
so what are we concerned about?
Due process wasn't followed.
That's what we're
concerned about.
Did most people that
pay attention
to this field understand that
you could do all these sorts
of things with the data?
Yes. Did we believe our
governments were doing it
without due process
and certainly
without an adequate
level of transparency?
I might answer yes
but hopefully--
hopefully we didn't
and [inaudible]
to happen as Frank said.
One of the things we'd actually
like to do is really
get everybody to focus
on the multistakeholder model
and a lot of the principles
that have given us the internet,
and to find new forms, structures
and processes to address that.
Don't revert back to "we
need a new institution."
I don't think that is the answer
in almost every situation.
Some of the recent proposals
that have been put forward
to address some of these
aspects would call for treaties.
Treaties are largely
intergovernmental.
They don't allow for
the private sector.
They don't allow
for civil society.
And honestly I don't
know how you get some
of those private
sector companies
to sign on to a treaty.
So, I don't think treaties
are the answer either.
We're going to need to create
new processes and new forums
and certainly at the
core of all that ought
to be thoughtful
informed dialogue.
So, I think, you
know, it's our hope
that as these discussions
continue across the world
that we recognize and
come to agree again
on some other high
level principles.
Some of the ones I'd throw
out for further debate:
that unwanted surveillance,
even in the furtherance
of national security,
is not acceptable.
Unwarranted surveillance
is not acceptable.
That disproportionate
surveillance is also
not acceptable.
That surveillance
without accountability is
not acceptable.
And further, turning
to something a little
more positive,
that there should be
transparency with respect
to policy and its
implementation,
that we should be harnessing the
expertise of all stakeholders
to discover better ways
to protect citizens
in a global community.
And most importantly,
core to everything we do,
that we uphold human rights.
And so I look forward to
the discussion for the rest
of the day, and thank you.
>> Thank you, Lynn.
Danny?
>> Thanks, Lance.
So, thanks to the Internet
Society and GW, Lynn and Lance,
for getting-- Paul for
getting us together.
And at the end of the
panel, I'm always reminded
of what a very distinguished
member of Congress said
near the end of a very
excruciatingly long hearing:
everything's been said, but
not everyone has said it.
So, I subscribe to much
of what has been said
by many of my fellow panelists.
I want to just make three points
about what I think we've
been experiencing as a result
of this surveillance
debate over the last month.
I think fundamentally what
we've had is a certain degree
of a crisis in confidence
and a crisis in trust
about the internet environment.
The question is, should
you trust it or not?
And as Melissa noted,
it's, I think, useful to try
to break down to some extent
our current sources of trust
in the internet in particular,
in society in general.
I want to just make three
points about the institutional,
legal and political levels of
trust that we tend to look to.
To start with the
institutional, you heard from--
you heard from John and
from Lynn about some
of the institutions that
make the internet work.
You know, I think
it's interesting
when you look purely
technically at the internet.
People pretty much trust that
the packets are going to arrive
more or less in the right order,
and enough of them
will get there
that you can get your
message through.
You can watch video.
We don't have a
lot of debates about that.
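That everyday trust rests on a simple mechanism: each segment carries a sequence number, and the receiver buffers out-of-order arrivals until the gaps fill in. A toy Python sketch of that reassembly, illustrative only and nothing like the full TCP state machine:

```python
def reassemble(segments):
    """Deliver payloads in order despite out-of-order arrival.
    `segments` is an iterable of (sequence_number, payload) pairs."""
    buffer = {}
    next_seq = 0          # the next sequence number the application expects
    delivered = []
    for seq, payload in segments:
        buffer[seq] = payload
        # Flush every contiguous segment starting at the expected number.
        while next_seq in buffer:
            delivered.append(buffer.pop(next_seq))
            next_seq += 1
    return b"".join(delivered)

# Segments arrive shuffled by the network; the application still sees order.
arrivals = [(1, b"lo, "), (0, b"hel"), (3, b"rld"), (2, b"wo")]
print(reassemble(arrivals))   # → b'hello, world'
```

Add acknowledgments and retransmission timers on top of this and you have the core of why "enough of them get there": the endpoints repair the network's disorder themselves, with no trusted middleman required.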
I will note, just as a kind
of an observation of the sociology,
in a certain way, of people
in the internet technical
community:
There is a sense in some
ways, as Lynn suggested,
that if any third party
can get in the middle
of a communication stream
between two parties,
that is, in a sort of
very idealized sense, a
technical failure.
It's the internet
not working properly.
Now that's a narrow
technical view
of the internet environment.
We, of course, have a
broader legal and social
and political view of our
societies, and we do recognize
that surveillance
and espionage will happen.
But part of the challenge, I
think, in closing the trust gap
that we have is to articulate
better what those sorts
of expectations are.
We do have administrative
institutions
that run the
domain name service
and IP address assignment,
thanks to people like John.
They just kind of work;
people grumble about
ICANN.
But so far, you know, people
are getting new domain names.
They are being maintained.
Nothing has completely
fallen apart at that point.
What I think is more complicated
on the institutional trust
front is that in many cases,
we're used to looking
to government
to establish trustworthiness
in society.
And certainly governments
believe their job is
to establish trustworthiness
in society.
But as you've heard, many
of our sources of trust
in the internet are
actually not governmental.
They are working perfectly well
largely without governments.
And I'll come back
and talk about that.
But again because of that
somewhat unusual circumstance
that Lynn alluded to, these
multistakeholder processes,
we probably have to understand
that a little bit better
so that people can trust
those environments.
You know, when we
look at legal institutions,
we rely a lot on our
legal institutions
to establish trust in society.
We hope that our legal
institutions make it
so that most people and most
institutions mostly do the right
thing most of the time.
We don't expect perfection
but we do expect our legal
institutions to set standards
and that there are consequences
when they're not followed.
I think when we look at
the privacy issues raised
by the current surveillance
practices
that had been revealed, the
problem that we have, I believe,
is that every one would like
there to be a sense of privacy
in our communications
environment.
But I think we have
a lot of confusion about
just what that ought to mean.
We tend to think about
privacy particularly
in the computer network
environment as being able
to keep things secret.
Well, I think we all understand
now that we don't have a lot
of secrets, and we rely
on lots of third parties
to maintain our information.
So, we don't keep secrets as
well as we may be used to.
And my own view is that that
means we have to start thinking
about privacy more as a question
of whether information is used
properly whether it's misused,
whether it's used
to discriminate unfairly
against people.
But that's going to require a
certain amount of discussion.
Lots of people have
mentioned accountability.
We're going to need better
accountability mechanisms
for these more complex
privacy rules.
When privacy is not
a binary phenomenon
that is either it's secret or
it's not, we need mechanisms
to assure trust in the way
personal information is handled.
I think there's a very
simple analogy actually
that we can draw from
the financial world.
Huge parts of our economy,
huge parts of the
world's economy run based
on a pretty well-understood
set of accounting rules.
We're used to looking at
balance sheets for corporations
and having some sense of trust
that those balance sheets
reflect what's actually going
on in the financial life
of the corporations.
We don't expect to see
all the transactions
in the general ledger in order
to look and get a picture
of whether the corporation is
profitable, not profitable,
paying its taxes
correctly, not, et cetera.
Now, this doesn't
always work perfectly,
but I think the analogy,
particularly to the NSA
surveillance situation, is quite strong.
We do as citizens of the
United States want to be able
to have a sense of confidence
that our intelligence agencies
are following the rules
that they say they're following.
My guess is they probably do
about 90 percent of the time.
But we want to know that
there's some accountability
to those rules.
I think we understand that
we're not going to be able
to send auditors, independent
auditors inside classified
environments and have
them come back and report
on everything they found.
But if we follow this balance
sheet model, if we follow
a methodology of assessing
how information is used,
we can get that sense
of trust back.
And finally, as a
matter of political trust
and by political,
I don't mean kind
of small P Washington politics.
I really mean politics
in the sense
of how we organize
ourselves as a society.
As people on this
panel have noted,
our sources of political
trust are complicated
in the internet environment
and unusual.
We are used to the
idea of states,
of governments, exercising
authority directly
on institutions, often
on intermediaries,
but not in the internet
world. If you consider
the analogy
between telephone
networks in the past
and internet service providers
today: telephone networks and
broadcast networks were
really creatures of the state.
They were authorized by
the actions of legislatures
and therefore controlled in that
way at the local, state, federal,
and even international
legal level.
The internet has not
happened that way.
It was never a creature of the
state; it was really,
in many ways, a creature of
individual and voluntary action
by some people on this panel and
by others all around the world
who participate in making these
internet institutions work.
But now, when we're
nervous about how they work
and how states work,
we have to rethink some
of these relationships.
So, we're going to have
to get used to the fact
that governments, I
would submit, cannot reach
into these internet institutions,
like the Internet Engineering
Task Force, like ICANN,
like the World Wide
Web Consortium,
and achieve exactly
the result they want.
But of course at the same time
governments will make laws
and rules about how individuals
and corporations act whether
for intellectual
property protection
or privacy protection
or anything else.
So, I think that what this--
the whole surveillance
experience has revealed is
that the rules and expectations
that we have are quite
a bit more nuanced
than the very binary
technical behavior
of the internet environment.
And I think our challenge
is now to do a better job
of articulating just
what we expect of
all these different
institutions at all these levels
and how we're going
to find accountability
to those expectations.
Thanks.
>> Thank you Danny and thank
you everybody on the panel.
I think what we'll do now,
I'm going to open up first
to the panelist for
five or 10 minutes
so they can ask questions
of their fellow panelists,
make further, you know,
comments, go some back and forth
that way for five or 10 minutes.
In the meantime if
you in the audience,
if you have a question would
you please if you're physically
in the audience here
at GW step up to one
of these two microphones in
either of the aisles, form a line
and when your turn comes please
state your name and affiliation
and then ask your question.
If you are in the internet
world, if you're off
in the clouds somewhere and
want to communicate send it in,
Paul any other directions
on that
or you have some
questions already?
While you're waiting, I want to
give the panelist a chance first
but any other protocol?
>> So, they type the
questions into live stream
and I'll be reading them
here at one of these mics.
So, that's for the
live stream panelists.
In New York they can just line
up at their microphone there.
>> In New York they can
line up at their microphone
and we'll see them
live up there?
>> We'll see them,
yes, that's right.
>> OK.
>> Very good.
>> All right.
This will be interesting.
OK, but first let's get to the--
give the panelists a chance
to ask questions, Leslie?
>> So, I want to ask a question
of Danny, because we talk
about the concern of persuading
people, you know,
weaning people off of
government as sort
of the governance structure
for the internet, and trying
to educate them that it's
not really government,
it's all of these other
institutions, and we do this
in a multi-stakeholder
way, and many people
up here have spent
considerable time trying
to move people to that model.
It just seems to
me that the rest
of the world right now is going
to think we've been involved
in a sleight of hand
that basically we're
saying governments stay out,
governments don't get
too involved here.
All these other
multi-stakeholder institutions are
really the core of
the internet and then
at the same time
building an environment
where the United States
uses its historical dominance
to basically trump all
of that governance.
So, I hear what you're saying
but it was a hard argument
to make to the rest of
the world beforehand.
I'm just curious
whether you're continuing
to say it with a straight face.
But I'm having trouble.
>> It's a habit.
I guess what I would say
is the ability to make
that argument really
depends on, as several people
have said, the layer
at which you're making
the argument.
I don't believe that the NSA
surveillance changes one bit the
question of who should be
setting standards for TCP/IP
or who should be determining
how domain names get assigned.
I do believe, as you
said, Leslie, that there is
a question about government
espionage activities,
whether it's the NSA
or GCHQ or MI5 or, you know,
the German intelligence agency,
the French intelligence
agency, any one of them,
all of which are doing
exactly the same thing.
I do believe that we now have
to have a more public discussion
about what our expectations are
for surveillance
including espionage
in the internet environment.
But that shouldn't get confused
with the question of whether all
of a sudden we need
a treaty about how
to set internet technical
standards.
And I believe that, you know,
I'm highly uncomfortable talking
about the rest of the world.
But I think that what we saw in
the internet governance debate
at the United Nations at the--
in Dubai, where the proposition
from "democratic" countries like
Iran and China and Russia
to exert greater control
over the internet environment
was rejected not just
by the usual suspects, that
is, the 34 OECD countries
who you could somewhat
expect to do that.
But by another 20 countries,
from Africa and South Asia
and different parts of
the world, like Brazil, that
I think have recognized that
while we could always do better
with the current internet
governance arrangements,
they're working, and messing
with them has a cost
for all those countries.
So, I think as long as we
keep these issues distinct
and don't confuse them
I think there's a way
to have both discussions
in a coherent way.
>> Well, so I agree with you.
>> I know you do.
[laughs]
>> But it seems to me
there's an opportunity here
for many countries to
combine them together,
to not make them distinct.
The opportunity to
reclaim control
by basically bringing
this together.
>> Sure. So, what
did Russia say?
What did Russia say this-- you
know, Russia said in response
to the NSA we need more
government control--
>> Right.
>> -- of both the
internet infrastructure
and the internet companies.
It's a-- I mean, I think the
question is how does anyone say
that with a straight face
when the concern was undue
government intrusion?
But, yeah, I mean it--
people who want to make those
arguments will find ways
to make them.
>> I worry that many of those
in between are suddenly going
to become more persuadable
because of the outrage.
>> So, I have a question
for my fellow panelists.
And it's sort of following
what Danny just raised.
In December we had an
interesting conference,
the World Conference
on International
Telecommunications.
I'll use the acronym
WCIT, not 'cause I want
to have the table pounded;
I explained the acronym first.
>> You did.
>> But you hear the
term WCIT so I want
to pronounce it as
you might hear it.
But the World Conference
on International
Telecommunications was looking
at some treaty arrangements,
some tariffs,
regarding interconnection,
and there were a number
of very interesting proposals
that showed up in Dubai.
And I'm struck by the
timing because that happened
and now we have a lot of
events regarding surveillance,
and so on and so forth.
And I guess with my panelists
I'd ask the question,
does anyone care to speculate,
if the order had been reversed,
what the outcome
would have been?
Because I've had a
few people suggest
that that's an interesting
exercise.
I am not sure whether or not
the outcome that we had in Dubai
in December is what
we would be seeing
if in fact it were being
held next month instead.
>> Hold on, that was my concern.
That's my concern.
Well, I think that if you
look at the World Conference
on International
Telecommunications,
it was an important renegotiation
of a treaty
that hadn't been renegotiated
since 1988, is that right?
'88. And as Americans we look
at these negotiations as
a single negotiation and I think
that that's a poor perspective
because the WCIT was really
about how does one monetize
the internet and to pay
for core infrastructure.
And it is one of a multi-series
of negotiations, the world--
there's another
telecommunications conference
on policy, the WTPF, which
just happened in Geneva,
and there'll be the World
Summit on the Information Society
that will be culminating
in 2014 I think, right?
2014, '15.
>> So on, yup.
>> And there will be 24
negotiations to go between now
and then that will ultimately
be where things will land.
And then the World Summit
on the Information Society
will set the strategy,
which will be executed
in the WTPF for policy,
in the WCIT for regulation,
and in the international
standards organizations
for the overall technology.
So, I think if you are a
person in the corporate world
and you're worried about these
things you shouldn't just look
at one of these forums
as one off.
And if you're worried about it
from a government perspective
it's not just the policy forum
or the regulatory forum,
it's all of these forums
and they're all interconnected.
So, I would say that yes, I
think you actually suggested
that it was a positive outcome
at WCIT in Dubai and I think
that the United States lost
in that negotiation but--
and I think that the United
States and many are going
to continue to lose and it's
going to be a quick erosion
of our stance, not a slow
erosion of our stance.
>> I'll take one more
question from the panel.
Does any other panelist have
a question before we go
out to the audience?
Going once?
OK.
>> Mike Nelson with
Bloomberg Government
and with Georgetown University.
I want to commend the Internet
Society for a great panel,
we got the lawyers,
we've got the techies,
we've got the scholars,
we've got the activists,
and that's great but we
tend here in Washington
to talk about the policy.
And Danny I tweeted your--
a [inaudible] of your talk
which I thought was exceptional.
And that was--
>> You got it to 140 characters?
>> Less than that.
But you basically said, data
will flow, we can't really focus
as we used to on
controlling that flow,
we have to control the
misuse of that data.
Policy makers are starting
to understand that.
But I think we also
have to figure out a way
that techies can start
implementing systems
that reflect that.
And the first way to
do that is to make sure
that systems are more
transparent so that we can see
where data is being misused.
And I guess I'd challenge the
audience, and the panel,
to think not just about privacy
but about transparency by design.
We've heard about privacy by
design but has anybody thought
of examples of where we're
building in transparency
so it's in the technology
and other places
where we could do that better?
Just an open question.
>> Laura?
>> OK. I think that's a really
great question and a good point.
I'll give one example of where
I think the transparency is
excellent and then
a couple of examples
where I think we need
more transparency.
What is one of the oldest and
most venerable institutions
of internet governance?
The Internet Engineering
Task Force jumps to mind.
So, this is the-- one
of the standard setting
organizations for the internet.
There are many others
but they have set many
of the core standards so they
have a tradition of being open
in three different ways.
They're open in the
development of a standard
in that anyone can participate.
Now, granted there are a lot
of barriers to participation,
it requires a lot of
technical knowledge,
it requires in many
cases money and time to go
to some of the events.
But it is basically
open to anyone.
They are also open
and transparent
in the actual
specification.
A standard is
not really software
or hardware; it's a specification
that is written down
and people can go
online and view it.
So, I would differ with some
of the panelists and say that,
"Oh the technology
is-- it is political."
So, the technology designers
make political decisions
in the design whether they
like to call it that or not.
So, sometimes privacy
is designed in.
Think about encryption
standards for example.
Think about unique identifiers
and the privacy implications.
Yet the specification is
open so there is some degree
of accountability where
people can view it.
It's also open in
the implementation
because it results in
multiple competing products
that are based on that standard.
So, that's an example.
In other cases we don't have a
lot of transparency and I agree
with you that we need more.
So, here is an example, how do
we look at interconnections?
So, this is an area where
I'm worried because of all
of the calls for greater
government regulation
of interconnection in an area
that has worked fairly
well up until now.
Well, part of the reason
we're seeing these calls
for regulation is that we
can't really see what the
agreements are between
these private companies.
So, I think it would be more
helpful to have transparency
in an area such as that as well
as other infrastructure areas.
>> So, I want to go
back to Danny's premise
which you apparently
support which seems to be--
data will flow, everybody
will collect it
and we should only
focus on the uses.
And that's certainly a
discussion we've had on sort
of the consumer side
of the ledger?
You know, Google will
collect it and where we have
to focus our attention
is how they're using it.
I have to submit the government
is collecting information,
and certainly the US government
collecting information
is subject to-- I mean, in
our own country it's subject
to this little thing called
the Fourth Amendment, and I know
of no case, and I'm
willing to stand corrected,
that says you can go in--
I mean, this is essentially
what the government is arguing
in NSA collection.
And they're calling it
acquisition and not collection:
that acquiring the
information is not collection.
And that any rights that
might attach don't happen
until you open and
look at the packets.
I don't know, you know, I've
never heard anybody come
into somebody's house, walk
away with their desk drawer
and say, "We're acquiring it.
We'll let you know if we ever
get around to looking at it."
And so-- Danny and I have
some disagreement although I'm
getting persuaded
more and more by his--
by the question of use
in the commercial side
of this big data world.
But, I'm not willing to go
there in terms of governments.
And I think it's a really
dangerous thing to do.
>> So I-- just to
clarify, my observation
that data will flow is not
meant as a moral conclusion.
And I think the extent
to which we choose to put limits
on how much data government
can acquire from those
who have already
collected it, right.
Because that's what
we're talking about here,
it's very important and I think
that historically we have relied
on technical barriers to large
scale information collection
as a way to limit how much
power the government has
and we don't have those anymore.
So, we're going to have
to get very explicit
about what limits we think
government should have
on both the collection
and the use front.
My only observation kind of
back to your point Mike is
that we are much
better technically
at managing collection
limitation
than we are managing
use limitation.
Just as a pure matter
of the kinds
of computer science techniques
that we have available
and I would submit
that's because, you know,
the computer security community
cryptographers have taught us a
huge amount about how
to keep data secret
and control access to data.
There's a whole other set
of disciplines developing
in computer science that try to
characterize information usage,
track information usage,
but it is a different--
but there is less progress
on that because I think
that that's not a
view of privacy
that computer scientists
have previously focused on.
So, I think you're
exactly right to point
out that we need
more work there.
>> I'd actually like to hear
from the techies.
Because I wasn't thinking
so much about transparency
at the institution
level at layer eight.
I was thinking more
the lower levels.
Well, like--
>> Oh, I'll give the--
let me give the--
>> -- like route tracing,
you know, when we send an e-mail
to each other you can find out
where it bounced along the way.
>> Let me give the techies
about two minutes before we move
on to another question
because we will have a chance
to circle back in
the round table again
so we can take another
bite at the apple here.
>> Yeah.
>> Go ahead.
>> I just want to--
just very quickly.
Stealing is stealing.
It doesn't matter whether
it's in the physical world
or in the cyber world.
So, whatever techniques you
have to collect data on a theft
in the physical world, you
can apply the same techniques
in the cyber world.
The difference is speed.
I mean it takes me
physically a long time to go in
and steal a laptop from
the gentleman's desk there,
but I can steal all the
information on his laptop
in a second, probably as long
as it took me to say it, as long
as I can connect
to it over the net.
So, the comment about
the government, you know,
and the search warrant in
the Constitution: you know,
that's why it's there.
That's why we have
the Constitution, to
follow those laws.
And you can still achieve
the same goals and stay
within the constitutional
limits.
It's been done.
It's just that what's happened
is we got a bunch of people
at the policy level that
didn't understand what the
implications are.
And that's where-- everything
that they've done
has been legal.
The law is the problem,
not whether
they're doing it.
>> I would question that.
>> But--
>> OK. I'm going to move
this along because I want
to give Joanne a chance and
I'm sure Steve is getting a lot
of fodder for the round table
to come back to later on.
>> So, I want
to pick up on one thing,
which is the internet has
a tradition or convention,
I wouldn't say tradition,
a convention, of protocols
which behave in an open manner.
And so, this is for
example, your--
when you're looking at how
packets flow, there are commands
like traceroute that let you see
how they go through the internet
and you can map them through an
exchange point most of the time.
When you look at mail headers,
you can look at e-mail headers
and you can actually see
how my e-mail
got from point A to
point B, and it's visible
and you can see those
most of the time.
When you look at a packet that
you've received you can look
at the source address and
say, "Oh, where is that from?"
And you can look it up and
find it most of the time.
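The e-mail-header transparency just described can be sketched with Python's standard `email` module. This is purely illustrative, not something shown at the panel, and every hostname and address below is hypothetical:

```python
from email import message_from_string

# A toy raw message with hypothetical Received headers (most recent first),
# as a mail client would see it. Each relay prepends its own Received line.
raw = """\
Received: from mx.example.org (mx.example.org [198.51.100.7])
Received: from relay.example.net (relay.example.net [203.0.113.9])
Received: from sender.example.com (sender.example.com [192.0.2.1])
From: alice@example.com
To: bob@example.org
Subject: hello

Hi Bob!
"""

msg = message_from_string(raw)

# Reading the headers bottom-up reconstructs the hop-by-hop path,
# the e-mail analogue of traceroute: visible, but not guaranteed accurate.
hops = [h.split()[1] for h in reversed(msg.get_all("Received"))]
print(hops)  # ['sender.example.com', 'relay.example.net', 'mx.example.org']
```

As the speaker notes, nothing obliges these headers to be accurate; any relay can write whatever it likes, which is exactly the convention-not-obligation point.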
Now, there is no
obligation that any
of this information is accurate.
It's all sufficiently accurate
to keep the internet
running or at least so far.
And we hope it will
just keep running,
but there's no actual
obligation, and you need
to be careful because while
there would be some benefit
to having an attribute that says
there's an actual obligation
to make this accurate, then
I see people emerging saying,
"Well, wait a second, now I'm
worried about my anonymity
because now you can trace my
IP, or you can trace my e-mail."
So, we have just a
convention of transparency
which has been enough to
keep the internet running.
There is a question on
whether or not that's going
to actually work long term.
The point that was made by
Randy which is that the rate
at which you can attack
something digitally is a lot
faster than physically.
And this means that the ability
to have an accountable internet
where we can actually figure
out who sent the bomb threat
or figure out who sent the
e-mail which cause the problem
at the school is
potentially a very large duty
and it may require us making
a tradeoff between anonymity
and curated anonymity in
order to have accountability.
Right now it's not
clear that one
or the other is the
right answer.
>> So, we may-- we'll circle
back to this in the round table.
Let me move on to the
next question over here.
>> All right, I'm [inaudible]
cofounder of Codex for Africa.
This is more to as
the techie side too
but at the same time
from a global view.
We represent thousands
of developers in Africa;
in the upcoming years there are
going to be more software
developers who are going
to be creating thousands of
applications on mobile and web.
And one of the things is what
are the strategies there,
when you have emerging countries
or emerging economies jumping
on the bandwagon of using
the internet to do transactions
with the US or the
western world.
What do you think would
happen, because I could be
in a country like Senegal,
you know, do some hacking,
I can do anything I can because
you don't have access to the--
maybe the continent
level network.
But as well as the
western world,
what are your strategies
and all that?
>> Well, this is where
the law comes into play.
What we hear on the
techie side for instance is,
of course there are laws
on hacking in the US,
but there are no laws
on hacking in China,
as we would understand it.
So, each country gets to define,
you know, what hacking is.
Some may take it as an
assault against the government
if you are hacking, you know,
some might just say it's
a plain and simple theft.
So, you're going to have to-- I
would think you'd have to look
at each individual
country's definition
of what they considered to
be hacking in that case.
Now, on the other side, again,
if you're writing an application
for someone like that then
you're going to do logging,
you're going to do all of this
type of stuff because you want
to provide an audit trail,
if you will,
if it's a financial
application with some record
of whatever transactions
your software handles.
So, there's always going to be
a record of something that you--
either your app did or
where your app went.
Whether it's accurate or
not that's another question
but there's always
going to be a record.
And again, metadata
analysis I can use that
and make some inferences
as to what you did even
though I can't see what you--
what's in your individual
packets if you encrypted them.
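As a hedged illustration of that point about metadata inference, here is a toy Python sketch; the flow records, hostnames, and thresholds are all made up, and real traffic analysis is far more sophisticated:

```python
from datetime import datetime

# Hypothetical flow records: (timestamp, destination, bytes transferred).
# Payloads are assumed encrypted; only this metadata is visible.
flows = [
    (datetime(2013, 6, 1, 2, 14), "mail.example.net", 4_200),
    (datetime(2013, 6, 1, 2, 15), "mail.example.net", 3_900),
    (datetime(2013, 6, 1, 2, 16), "video.example.com", 52_000_000),
]

def infer(flow):
    """Guess the activity from metadata alone -- crude, but no decryption needed."""
    ts, dest, size = flow
    if size > 10_000_000:
        return "bulk transfer / streaming"
    if dest.startswith("mail.") and ts.hour < 6:
        return "late-night e-mail"
    return "unknown"

print([infer(f) for f in flows])
# ['late-night e-mail', 'late-night e-mail', 'bulk transfer / streaming']
```

Even these three records support inferences about when you were awake and what you were doing, without ever opening a packet, which is the speaker's point.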
>> John?
>> So, that's an amazing
wake up call here.
So, we have this internet
that's remarkable.
It allows people to interact,
people in different countries
with different expectations
to interact.
And yet, the conventions
by which governments work
haven't evolved fast enough.
The idea that a citizen in country
A is interacting with a citizen
in country B and it might be
illegal in one country and not
in another is a whole
new concept
that governments are going
to take some time to try
to figure out how to deal with.
So, we have a problem.
We literally have an
internet that has capabilities
that governments haven't
come to grips with how
to handle their duties
and responsibilities.
I know governments
that feel very strongly
about protecting their
citizens against pornography
or against certain
types of content.
I know governments
in other countries
that feel very strongly that
that's up to each citizen.
But the reality is that that
intersection has now happened
because of the internet.
Two governments can have
very different views
on what their responsibilities
to their citizens are.
So, to answer your
question in a general case,
the internet moves
faster than governments.
Governments literally do
not know how to interact
with the situation you describe.
On a practical matter, if
you end up doing something
of significance, something
that causes a lot of harm
or actually, there's a very
nice list of types of attacks:
destruction of property, theft
of intellectual property.
There's different types
of extortion or DDoS,
if you actually do something of
a major magnitude you'll find
out that law enforcement does
cooperate between countries
and it works very well.
It's just not set up for
the scale of the internet.
And the state
of the cooperation
between law enforcement
and various computer
response teams is one step
above fax machines ringing
and going back and forth.
We have the mechanism to
handle the really bad events
fairly slowly.
We don't have anything
to handle the scale
of automatic attacks
happening 24 hours a day
around the entire
globe which don't--
>> Let me move to Danny.
He wanted to say something
and then I'm going to move
on to the next question.
By the way, I can't
really talk to anybody
in New York once
I ask a question.
So, what they do I guess
they'll stand up or something
or else go through Paul.
Danny, go ahead.
>> I just want to make one
observation about the challenge
of international law
enforcement cooperation.
Most people in this room
are probably familiar
with the SOPA bill that was
proposed in the United States.
In a certain sense we only
have that debate with all
of its cataclysm because of the
failure of current mechanisms
in international law
enforcement cooperation.
We had that debate, the
Congress was considering
that law blocking access
to websites outside
the United States
that might have infringing
content because people
in Congress were concerned
that US Law Enforcement didn't
have an effective way of working
with law enforcement
authorities from the countries
where the infringement
was actually happening.
I think that we have to get a
lot better as John is suggesting
at law enforcement cooperation.
There are realms where that
works reasonably well but it's--
the problem is it's mostly
cooperation in the form of how
to make a criminal conviction
against someone to stick.
Law enforcement cooperation
mechanisms are good
at exchanging evidence,
making sure
that you have the information
you need to, you know,
bring someone to
trial but not good
at actually stopping
the behavior
that may be harmful
as it's happening.
So, I think it's a
very big challenge.
>> All right, go to Paul for a--
>> I actually did see that
David Salmon [assumed spelling],
president of the New York
Society chapter, would
like to ask a question.
So, why don't we go
to him if he will--
I think they have a
little bit of a delay.
So, he might just be
hearing this in a second.
>> OK David.
[ Inaudible Remark ]
>> OK, that's your cue David.
>> OK. I'll speak to the
camera then or-- yeah.
OK, so.
[ Inaudible Remark ]
OK, we have some delay here.
My question is what sort
of scenarios do our
panelists envision
and what would be some
alternative solutions
if our government specifically
Congress is unwilling or unable
to rein in agencies such as the
NSA in terms of surveillance?
If they can't do
it or choose not
to what would be
the consequences?
Are there technical solutions
or are there possible solutions
that could be implemented
without the US government?
>> OK. The panelists are being--
the panelists are being asked,
is there a way to work around Congress?
>> I would just
speak to it
as an individual, not
as a panel member.
But if Congress can't rein
in a government agency we
have a lot more serious problems
than what's going
on in the internet.
>> When was the last time
we passed a federal budget?
>> I mean, they are
a government agency.
Congress has to do that.
Now, again, there is
classified information
and all this other stuff that
can influence how law is passed.
But quite frankly if Congress
didn't have that power
to do it then we wouldn't see
General Alexander making a run
up to talk to members, I
think was it today or--
>> No, it's tomorrow.
>> Yeah, the vote is tomorrow.
So, you know, doing that,
so certainly I would think
Congress has the ability to do it
and should have the ability, or
else we don't have a democracy.
>> OK, any other
panelist weighing in on that?
Lynn?
>> Well--
>> You can just say quicker
than I'm-- oh, I'm sorry.
>> No please.
>> That, you know, there
are ways for people
to manage their traffic.
I know, I'm certain
there would be even more ways
for people to manage their
traffic and choose the routing.
We actually are quite
concerned about that
because it will make the global
internet much less resilient.
It will-- putting something
in a box doesn't mean it's--
it may be protecting it from
one country actually examining it
but it doesn't protect it
from another country.
And so, I'm not quite sure
what David's question was but,
you know, if it's about
routing and the ability to route
around what you see as a
problem, I think we need
to be very, very
careful about what some
of those potential solutions are
and because I don't think
it will address the question
you're-- the problem you're
trying to route around.
And in fact it will over time
make the global internet much
less resilient.
>> Paul?
>> OK, I have a question
from live stream.
It's from Garth Gram
[assumed spelling].
He asks, is the real
issue autonomy
and self-determined
choice rather than privacy?
And if so, what is
the role of identity
in addressing the
issue of trust?
>> Well, I'll take
one step at that.
I think that in the
discussions of privacy
over the last 10 years or
so, maybe longer I think
that we've gotten a little
bit distracted by the promise
of individual choice as
somehow the key to privacy.
I think that a lot of
the privacy values--
and what I mean by that
is the dialogue boxes
that everyone sees on, you
know, one website or another
where you have to click here
to accept the privacy policy
or swat away a dialogue
box to proceed.
And I think certainly in-- it's
actually a remarkable point
of convergence between the
United States and Europe
in the last couple years
both the White House
and the Federal Trade Commission
issued major privacy policy
statements that noted
the limitations of this
so-called notice-and-choice or
individual determination model.
The Europeans certainly have
pointed that out as well.
A lot of the things
that we value associated
with privacy are
collective values.
We want to make sure
people can associate freely,
can engage in politics,
can engage in commerce,
can seek medical
care, et cetera.
And I think that giving
people the choices to opt
out of those things or
somehow control their identity
when they are trying to
speak to their doctor or make
a public political
statement really seems
to be exactly the opposite of
some of our core privacy values.
So, I think that
there are situations
in which individual
autonomy is quite important
but a little bit also to the
last question about the NSA.
I don't think we get ourselves
out of these privacy problems
by just giving people, you
know, 20K long encryption keys
to wield against everyone else.
>> Laura?
>> That's a really
interesting question and I want
to tie it back to something
that I think Danny said
before about data will flow.
And, you know, while
I agree with that,
that's also not necessarily
the case.
We have interconnection disputes
that have resulted in outages,
we've had countries
that have cut off access
for their citizens.
We have areas of the world
that have infrastructures
of complete censorship.
We have digital divide
issues, we have trends away
from interoperability where
we're going in the cloud
to more proprietary
protocols for example.
So, it's-- so data will
not necessarily flow.
But one of the things
that is required for it
to flow is this issue of trust
that the questioner brought up.
So, I think I can mention just
a couple of areas or maybe three
that are very important.
So, the trust has always existed
between network providers
to exchange information about
IP addresses that are either
in their control or that they
can reach on the internet.
That's done through the
Border Gateway Protocol.
There have been examples
where false routes have been
advertised whether intentionally
or not and outages
have occurred.
So, there are efforts
underway now to secure
that, which are very necessary.
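Why a falsely advertised route diverts traffic comes down to longest-prefix matching. Here is a minimal Python sketch, not presented at the panel, with hypothetical prefixes and AS numbers:

```python
import ipaddress

# A simplified routing table: advertised prefixes and the AS that announced them.
# Routers prefer the most specific (longest) matching prefix.
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "AS64500 (legitimate)",
    # A false, more-specific announcement covering part of the same space:
    ipaddress.ip_network("203.0.113.0/25"): "AS64666 (bogus)",
}

def best_route(addr):
    addr = ipaddress.ip_address(addr)
    matches = [n for n in routes if addr in n]
    # Longest-prefix match: the /25 beats the /24 wherever both apply.
    return routes[max(matches, key=lambda n: n.prefixlen)]

print(best_route("203.0.113.5"))    # AS64666 (bogus) -- traffic is diverted
print(best_route("203.0.113.200"))  # AS64500 (legitimate)
```

Whether the bogus announcement is malicious or a misconfiguration, the effect is the same, which is why the securing efforts the panelist mentions (cryptographically validating who may originate a prefix) matter.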
So, we have to build
trust into the network.
It's not something that
we can just assume.
It has to be designed in,
it has to be built in.
The same thing with how the
domain name system works.
We have servers located around
the world that resolve queries
of domain names like maybe I'm
up here looking at cnn.com,
I'm not but the domain
name server would resolve
that into its IP address
and route the information.
Well, that can be gamed also
and there can be a
false return of a query.
So, things like domain
name system security extensions
have to continue
to be implemented.
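The shape of that validation can be sketched in Python. Real DNSSEC uses public-key signatures carried in RRSIG and DNSKEY records; the HMAC below is only a standard-library stand-in for a signature, and the key, names, and addresses are all hypothetical:

```python
import hmac, hashlib

# DNSSEC lets a resolver check that a DNS answer was signed by the zone's key.
# This toy uses HMAC purely so the *shape* of validation is visible with the
# stdlib alone; it is not how DNSSEC actually signs records.
ZONE_KEY = b"hypothetical-zone-key"

def sign(name, ip):
    return hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).hexdigest()

def validate(name, ip, signature):
    return hmac.compare_digest(sign(name, ip), signature)

# An honest answer carries a signature made with the zone key...
good = ("www.example.com", "192.0.2.10", sign("www.example.com", "192.0.2.10"))
# ...a gamed answer returns a different IP but cannot forge the signature.
forged = ("www.example.com", "203.0.113.66", good[2])

print(validate(*good))    # True
print(validate(*forged))  # False
```

The point survives the simplification: once answers are signed, a false return of a query is detectable instead of silently accepted, but only if the validation is designed and built in.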
You know, these are just
a few of the examples,
an answer to the question
that it has to be designed in,
same thing with website
authentication.
If I'm saying that correctly the
role of certificate authorities
and how they verify
through digital signatures
that a website is who
the website says it is.
So, again, this is an
example of the politics
of the architecture; it's
not just about agreements
between people but about
designing this trust
and identity into
the infrastructure.
>> I'm going to move on
to the next question just
because I want to get the
audience as much opportunity
as I can because the
panelists are going to be able
to circle back on this
later on over here.
>> So, I'm Luke Wadman [assumed
spelling], I'm a student working
as a policy
analysis intern for
IEEE this summer.
I'm working on internet
governance issues.
And does the panel
think there is any--
would it be a good way to
frame the debate on privacy
and internet governance and
et cetera in economic terms
because if I've learned
anything in my time in DC it's
that catching the ear
of our congressional
representatives is easy
if you start talking about
jobs and job creation
and we've already talked a
little bit about the impact
of things like PRISM on US-based
IT companies like Google
and Facebook, particularly
in the EU
but also around the world.
One case is that Google isn't
really competitive in China
and that's probably going to
continue in that direction.
So, would that be a good way to
frame this whole conversation
and actually encourage some sort
of positive congressional
action?
>> So, I would never say
that I have any idea how
to encourage positive
congressional action.
I just want to put
that on the table.
I think it is true
that everything that's
happened this sort
of last six weeks inside this
sort of NSA bubble has happened
without any reference
as to what it might--
the impact that it might
have on US industry.
And I think it is possible
that the impact at least
in the short run may be severe.
On the other hand a
lot of people may carry
on about being unhappy
about discovering
that the NSA is sucking
up their data.
Historically, when
the big comp--
if you look at the SOPA fight
which I think was
probably the first time
that US internet industry sort
of held hands with activists
and technologists, it does
get Congress's attention.
I will say though that
National Security is different.
It is just always
different and the arc
of National Security has been
more, more, more since 9/11
and the question is whether
these revelations have sort
of pushed us beyond the
more and more place.
>> OK. Let's move over here.
>> Hi, I'm Susan Aaronson
with GW, I'm a professor here
and I work with the
Worldwide Web Foundation
on measuring internet openness.
And I want to ask you a
question that relates to trust,
the trust of policy makers.
So, in the last couple of
days we've seen [inaudible]
and Angela Merkel
make these delightful
comments about server locations
and threats in terms of privacy.
And again, I wonder
if there are--
so, they're basically saying
if the server can't be located
where we can control it, where our
privacy rules dominate, we might
not accept for example some
of the things the
United States wants
in the trade agreement or--
and we see similar things with
the Trans-Pacific Partnership
and I just wonder if you
could talk a little bit
about this now.
You know, you can always, for
national security reasons,
you can always say you have
a particular policy in place
and it's not protectionist.
But this is opposite
of that, right?
They're saying that
for privacy reasons,
they want to essentially
protect their citizens
from their information
being traded
by having the server
location in the United States.
If I may add one other
thing which is Frank La Rue,
who works for the-- who is
the UN Special Representative
on Freedom of Expression,
he has said basically
that the US' failure to protect
the privacy is a violation
of its human rights' obligation
because that is a
basic human right
under the Universal
Declaration, blah, blah, blah.
So I want to hear your comments.
>> Sure. I'll take this one.
The Trans-Pacific
Partnership
and the US Free Trade Agreement
have always had, and always
will have, to address the data
privacy laws.
The Safe Harbor that
we had in place
in 2001 will actually
expire with the agreement.
And the United States, as you
know, has no national umbrella
for data breach and
data privacy.
We have 47 individual states
with their individual programs
and no national umbrella.
And so, if we're calling
for congressional action
that had an economic, you
know, significant impact,
there are 52 pieces of
legislation currently
in the 113th Congress
around cyber security,
about 10 of which are
around data breach.
It would be wonderful
if we could get
to some bipartisan
agreement on that
so we could enable the
overall Free Trade Agreement
to move forward between
the two continents.
More specifically though, noting
the 47 different state laws
and noting the difference
between the Europe
and the United States,
cloud computing
and where the data is
stored follows the geography
and will always follow
the geography.
So if there's a data breach
here in the state of Virginia,
it follows a different
set of rules
than in Massachusetts
or in California.
And a company, whoever the
company might be, actually has
to know all sets of laws
for that particular state
in this case in order to follow
the regulatory compliance,
et cetera.
That also is the same
if it's stored
in the United Kingdom:
it follows the United Kingdom's
laws, and the Netherlands,
and Brazil, and you
pick the place.
And so when the EU and
the leaders of Germany
and elsewhere are talking
about the data protection
and data privacy, and they are
looking at the United States
and worried about how our data--
we're protecting
data and the privacy,
then it would also be important
to understand how the European
companies are mirroring data
in other countries like Brazil
or South Africa or
Egypt or China or India,
et cetera because the data
always follows the law
of the geography
that it sits in.
>> I'll just add to that
that one of the things to--
I mean, obviously, the
NSA revelations have sort
of strengthened the EU's
hand in a discussion
that was going long before.
We have a particular view
of what privacy means.
It doesn't match up, going back
to this question earlier about,
you know, what economic kind of
motivation might move Congress.
One would think the US
companies would move forward
on a comprehensive
data protection regime
in the United States that
might be more flexible
and perhaps reflect the internet
more than the European one,
but I haven't seen them
step forward on that.
I think it would be interesting
for somebody to ask the question
about the various EU countries
in their surveillance regimes
because as much as I say
some very unpleasant things
about ours, I think you
would find that it--
with the exception
of our capacity
for the just incredible
scale of collection,
that the actual legal
protections are no better
at best and probably
a lot worse.
>> Just one comment on
the trade discussions.
Susan, I think
it's a very good,
it's a very important question.
I certainly think that,
you know, as you well know,
better than probably anyone
in this room, you know,
trade agreements have always
made exceptions for things
like national security,
public morals,
sometimes consumer
protections, things like that.
I think what we see now is
that simply pushing them off,
those issues off into the
exception category is not going
to work so we need
some kind of mechanism
that on the one hand
respects the fact
that governments do
have a legitimate
and important interest in
protecting their citizens
against unsafe products,
against human rights violations
if that's the way they do
privacy, against, you know,
security breaches,
what have you.
But tying that to the
location of data is, I think,
just an overly simplistic way
of accomplishing that purpose,
and I think, you know,
you can make fancy--
well, you can make fancy trade
arguments about why that's not,
you know, most favored
nation treatment.
But I think the bottom line is,
we used to have trade agreements
that were fundamentally about
tariffs and we've mostly dealt
with those issues, and now we
are going to trade agreements
that are fundamentally about
the non-tariff barriers
that exist between economies.
So we're going to have
to deal with that either way.
I think that in the, you
know-- for some period of time,
I think the surveillance issues
will cloud those discussions
but we'll come back
to them at some point
and they will be
the same issues.
So-- and I-- the only thing I--
the only final thing I
would say, I think that one
of the big challenges
we're going to have
in the trade context on
internet issues is the challenge
that we found with ACTA,
that the ACTA was making--
[Inaudible Remark] Sorry, oh,
sorry, oh God, Steve is going
to throw that look at me.
So there was a trade agreement
involving intellectual property
enforcement whose
acronym is ACTA
and was very strenuously opposed
by civil society groups all
over the world because they
didn't know what it was,
they didn't know what was in
the agreement and they argued,
I think-- I thought,
quite legitimately
that if there are
going to be rules made
about intellectual property
rights that affect individuals,
there should be some
public discussion
of what those rules are.
Trade people believe they
somehow can't negotiate
in public.
And that's a sort of an article
of faith in the trade world.
They're going to have to learn
to be a little bit more public
if they're going to get anything
done on these issues is my deal.
>> Lynn, you got the last
word before the break.
>> Just quickly.
Not only-- it wasn't that they
wouldn't negotiate in public,
they would not authorize
a release
of the documents
post-negotiation.
They were not available
publicly.
>> So this sounds
like a thread going--
circling back to NSA and FISA
and all that, but I am sorry
that we have hit the point
where we promised we were going
to take a very brief five-minute
break that you're going
to have a chance to stretch
your legs, then we're going
to come back and people on the
panel will mix it up some more,
even better I suspect,
moderated by Steve Roberts.
But let's take a quick
five-minute break.
There will be one other
opportunity for you all to meet
and greet at least
most of the panelists
and there's a reception that
starts at 5:15 afterwards.
But for now, five-minute break,
we'll convene at 4 o'clock.
Thank you very much.
[ Inaudible Discussions ]
Well, thanks for
sticking around.
I know it's been
a long afternoon.
As Lance said, I'm
Steve Roberts.
I'm a professor here at
GW in the School of Media
and Public Affairs,
right across the street.
And my job is to try to
crystallize some of the issues
that we've been discussing
and [inaudible] some
conversation among
the panelists.
And I want to start by quoting
a couple of things I heard.
Lynn, for instance, said that
"unwanted surveillance
is not acceptable."
Leslie talked a lot
about human rights.
But there was a phrase
that I did not hear the entire
first hour and a half except
in passing, and that word
was "national security."
And so I want to pose this
question to the panel.
Isn't national security
a human right?
Isn't safety a human right?
Isn't one person's unwanted
surveillance another person's
protection from danger
and from terrorism?
So, I want to ask everybody,
what's the tradeoff here?
The whole idea in the first half
of this panel was the importance
of the freedoms in the internet.
But what are the limits,
and what are the
legitimate tradeoffs,
and how do we balance
legitimate human rights
against legitimate
rights to be safe
from terrorism and
other threats?
Who wants to start?
Go ahead, John.
>> So I'm going to almost answer
the question, but not quite.
It is true that the
governments have certain roles
and responsibilities.
And one of those roles is
there's a certain protection,
a certain defensive role
that a government feels
it has to provide.
And the question is how do
governments do what they see
as their obligations,
their actual responsibilities
given the internet.
The internet has not been
very good at this, OK?
So, --
>> If you'll excuse me.
It's not just an obligation,
it's not just something
in passing, this
is the first obligation
of government, is to protect--
>> Right.
>> -- people.
Isn't it the first obligation?
>> Indeed. One of the things that
[inaudible] does, of course,
is we're responsible for
maintaining a registry
of IP addresses and that
registry is often used
because someone does
something in cyberspace
and the first thing
law enforcement would
like to know is, where is
that in the real world?
Because the real world has the
people and organizations
and the cyberspace has to do
with domain names
and IP addresses.
And so, governments
feel they have
to enforce laws for example.
And yet the internet wasn't
built with an interface
for government, saying, "If
you're trying to do your duty,
here's how you go about
finding that person.
Here's how you go about doing
what you see as an obligation."
So, I know a lot of
people look at this
and they go this entire area of
governments and what they want
to do with the internet and
they want to take control
and they want to
do surveillance.
If you turn it around
and look at it,
remember that from the
government's perspective,
in many cases, these governments
feel they have an obligation
and the internet is
actively preventing them
from doing something
they're required
by their citizens to do.
So, we need to not omit the fact
that the internet doesn't
provide a friendly interface
to government.
So, some people would see that
as a feature but it's a fact
that shouldn't be
overlooked in the discussion.
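The lookup John describes, resolving an IP address against registry allocation records to find a real-world registrant, can be sketched as a longest-prefix match over a registry table. This is a minimal illustration with hypothetical entries using documentation address space; real registries such as ARIN publish the authoritative records via WHOIS and RDAP.

```python
import ipaddress
from typing import Optional

# Tiny hypothetical allocation table: CIDR block -> registrant.
# A real registry holds millions of such records; these entries
# use documentation address space and invented organization names.
REGISTRY = {
    "203.0.113.0/24": "Example ISP, Springfield, US",
    "198.51.100.0/25": "Example Hosting Co, Berlin, DE",
    "198.51.100.0/24": "Example Carrier, Toronto, CA",
}

def lookup(ip: str) -> Optional[str]:
    """Return the registrant of the most specific block containing ip."""
    addr = ipaddress.ip_address(ip)
    best = None
    for cidr, registrant in REGISTRY.items():
        net = ipaddress.ip_network(cidr)
        # Keep the longest (most specific) matching prefix.
        if addr in net and (best is None or net.prefixlen > best[0].prefixlen):
            best = (net, registrant)
    return best[1] if best else None

print(lookup("198.51.100.7"))   # the more specific /25 wins over the /24
print(lookup("192.0.2.1"))      # not allocated in this toy table -> None
```

This is the direction of mapping law enforcement wants: from a cyberspace identifier back to a responsible party in some jurisdiction.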
>> Tradeoff.
What's the tradeoff?
Leslie?
>> So, I want to back this up.
This is not a new question
and it doesn't have
to do with the internet.
I mean, we have a--
an international
human rights framework
that explicitly makes national
security an exception
in human rights treaties.
It is an obligation of
the countries themselves
to protect our national
security.
But we also have an entire
developed jurisprudence
about publicly enacted
transparent law,
proportionate law, fair
process, remedy and oversight.
And, you know, I think if
we just throw this
into the internet context, we're
missing that there are basic
bodies of law and norms.
And the question here is not
whether there's a tradeoff,
it's whether you
reach a balance.
And I think when you have
most of this process secret,
the judicial process
secret, the oversight secret,
the interpretation
of the law secret,
then you cannot achieve
that balance.
The question at the
end of day is balance.
I certainly would take
issue with the idea
that the internet has been
unfriendly to law enforcement
and it's always this, you
know, we're going dark
and we can't see anything.
I think if we learned anything
over the last couple of months,
it's that law enforcement
really has access
to much more information
and that there is
therefore a temptation
to use what technology
has created
to go beyond what a balanced
human rights frame would allow.
And I think that's
what our problem is,
not that they're not
supposed to be [inaudible].
It is a first order of
business for government.
They are supposed to
protect their citizens.
But Steve, if you look at this
framework that's been created,
it was a framework that was
created on a battlefield in Iraq
to make sure that you had every
last possible bit of information
to make sure you would
know about the IED.
And this whole haystack
and needle analogy assumes
that collecting a
haystack is proportionate,
and I don't think it is.
>> But the President
has said this program
of surveillance is
essential, the Director
of national security said that
it's essential, the Chairman
of the Senate Intelligence
Committee,
a liberal former mayor
of San Francisco had
said it's essential,
what's your quarrel with this?
And how do you rebut their
argument when everybody
who has had access to the
secrets says this is justifiable?
I'd like other people
to deal with this.
>> Please.
>> Yeah. [Multiple Speakers]
>> OK. So, it's-- there's
so many portrayals
of this as a binary.
And, you know, I think
the issue is more one
of degrees, and
a more granular one.
But there are some
binaries here, right?
We either have a
constitution or we don't.
We either have a fourth
amendment or we don't.
But if you start looking
at the actual practices
and I don't think we have a full
picture of exactly what's going
on but based on some of the
things that we've heard,
there are ways to enact the
necessary national security
without crossing the lines
that would be unacceptable
to the majority of the people.
So, having low level-- so
layers of control and layers
of accountability in the
processes of surveillance,
right, so not just having any
low level analyst being able
to take a fire hose of
information and download
that onto their computer just
to exaggerate the point, right?
So, there's the granularity,
there's getting the
information that's necessary,
there's the process
of accountability,
there's the issue
of judicial review.
So, I think that there are ways
to enact the necessary
national security now
that our public sphere
is online and now
that we have the privatization
of that public sphere
without going to the extremes
of basically having
the fire hose analogy
and just downloading
whatever data about anybody.
>> Anybody else on the panel
who want to pick on this?
Please, go ahead.
>> So, you want us to talk about
tradeoffs but I actually want
to suggest that I think a more
accountable clear system is
better for national security.
I think what's happening
now where you have very,
very broad authorities that
are unclear is almost the worst
of all possible worlds
for both civil liberties
and national security
because you have--
you have policy officials,
other governments, activists,
et cetera, poking
around in the business
of the intelligence community,
and that's not a very good thing
for them frankly; they want
to be able to do what they do
quietly. But in
order for that to happen,
there has to be a clear sense
of what the rules are.
One sense in which-- I
largely agree with Leslie
but the one sense in which
I think this is an internet
problem is it's really
kind of a 9/11 problem,
and then I think a lot of
what's going on is going
on under the kind of exceptional
basis that we've handled a lot
of national security
issues post 9/11.
And to quote the President
again, ironically enough,
just a few weeks before this
whole surveillance story broke,
the President went to the
National Defense University,
you know, and gave a
speech saying, number one,
it is bad for a country
to be perpetually at war,
and number two, that
the threat level
from Al Qaeda was basically
below the level that it was
at 9/11, and that we should--
and that we should start
treating that threat,
that national security threat
as part of the norm not
as an exception that
we have to respond
to with these kinds
of exceptions.
This whole surveillance
program was created
as the President's
surveillance programs, you know,
as an exception, and
that's its problem.
If the problem is not--
>> Although continued by
a Democratic president--
>> Yes, that will-- that's
right but I think that it needs
to be continued on a more
clear accountable basis.
And so I don't tend to accept
that the issue gets solved
by saying more surveillance,
less surveillance.
I think that the issue
is surveillance according
to what rules and with
what accountability.
>> Lynn, I quoted you--
please, your turn.
>> Yeah. I mean, I'll come
to my point [inaudible]
but just-- I said
unwarranted surveillance.
I recognize the difficulty;
"unwanted" is
from the perspective
of the individual.
So, it's very much
"unwarranted."
>> Thank you.
>> But-- And I just
really wanted
to echo the three comments
that have been made here
very, very, very strongly.
When-- In a system
like the internet
which is breaking all barriers
where we're so interconnected
and so interdependent, we
have to change our paradigm
of looking at this and move
it to one of managing risk.
We can talk about in the context
of tradeoffs if you like.
But we really do
have to look at--
>> Isn't that the only context?
Isn't tradeoff the only
legitimate way to talk about it?
Because we-- isn't that
what we do everyday?
>> I think tradeoff
is a legitimate way,
but I'm not sure we're
making the right tradeoffs.
When we continue to
focus on national security,
quite often driven
by flagrant abuses,
we're conflating the issues,
or we're not pulling them apart.
It's not about national security
and what is the best way
to protect national security.
From my perspective, you know,
what our members
are actually upset
about is the flagrant
abuses and the lack
of transparency and the secrecy.
>> Yes, please.
>> And I think part of
the problem that I have
with hearing what they've said
for giving the reasons is
we haven't heard the reason
that makes sense.
It's-- why are you
collecting the data,
it's like the equivalent
of your mom
and dad saying well,
because I said so.
You have to give us something
behind that, you know, to--
is there a credible threat
at whatever level there is.
I think Danny is right.
I think, from
what we've read in previous,
you know, speeches
by the President,
the threat level is a
lot less than it was at 9/11.
Well, if it's a lot less
than it was at 9/11,
why do we need to expand this?
Give us something to look at.
But I think when I hear
them talk about these things
and give these reasons as
somebody who does this type
of metadata analysis
and, you know,
certainly in my own
infrastructure I
can do a lot of that analysis.
But there's an uplink to me.
And those people that
control that infrastructure,
my infrastructure,
your infrastructure,
they can do the same
type of analysis
and it trees all the way up.
It's-- So, why?
You have to give us
a credible reason.
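The kind of metadata analysis the speaker refers to, inferring who talks to whom from call records alone, can be sketched with a toy contact graph. The records and names below are invented for illustration only.

```python
from collections import defaultdict

# Hypothetical call-metadata records: (caller, callee) pairs only --
# no content, which is exactly what makes metadata analysis possible.
CALLS = [
    ("alice", "bob"), ("alice", "carol"), ("bob", "dave"),
    ("carol", "dave"), ("dave", "eve"),
]

def contact_graph(calls):
    """Build an undirected contact graph from call records."""
    graph = defaultdict(set)
    for a, b in calls:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def two_hop(graph, who):
    """Everyone reachable within two hops of `who` -- the kind of
    expanding neighborhood a hop query over metadata produces."""
    first = graph[who]
    second = set().union(*(graph[n] for n in first)) if first else set()
    return (first | second) - {who}

g = contact_graph(CALLS)
print(sorted(two_hop(g, "alice")))  # -> ['bob', 'carol', 'dave']
```

Expanding the neighborhood hop by hop is how "it trees all the way up": whoever controls the upstream infrastructure can run the same query over a much larger record set.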
>> But what's running through a lot
of your comments is a basic
mistrust of government.
I mean you've been told--
the public has been told by
the Intelligence Committees,
which have been briefed, that this
is a very valuable tool, that PRISM
and other surveillance are
very valuable.
A duly elected Democratic liberal
president has said it's a very
valuable tool, the
liberal chairman
of the Senate Intelligence
Committee has said it's a very
valuable tool, and every one
of you is doubting it.
So, what is the source of your
suspicions and your mistrust
of what you're being told?
>> So, I actually
don't, you know, when--
there's no way for us to
know if it's a valuable tool
or not a valuable tool.
So, you could take them at face
value that it's a valuable tool.
Collection of data
is a valuable thing.
It's just not the only value
in a democratic society.
And I think-- so, I mean, I--
and that's I think
the problem here,
we live in a democratic
society where we're supposed
to have proportional laws
that more than a small number
of people are able to know
about to assess that balance
between liberty and security.
We really don't have that.
I mean, if this was a discussion
that they were tasking
these companies
with specific request
based on articulable facts
about specific individuals
or even several hundred
individuals,
we might be having a different
discussion about the balance
between liberty and security.
But we're having a
discussion about the value
of basically collecting
the data potentially
on everybody in the world.
I mean, we don't know the set.
So, it's not about trust
of government or not.
You know, I think you could
probably, you know,
devise a program where you
know everything about everybody
in the world and claim
it makes you safer,
maybe it makes you safer.
But we no longer are
following either the values,
the constitution
or the norms of it.
>> Well, you say we're not
following the constitution.
These laws were passed by
democratically-elected Congress.
>> Well, having been there.
Yeah.
>> Yeah, right.
>> Having been there,
they didn't know what
they were passing
and they didn't know
what [inaudible].
>> So, [inaudible] to that, you
just said the issue wasn't trust
and now you just say, well,
they didn't know what
they were passing.
>> They didn't know--
they didn't know--
>> If you believe well-- well,
let me ask you the question.
If you believe in a democratic
system, now you're saying,
I don't believe
in the democratic process
that's passed these laws.
>> No.
>> That they were
misinformed and we know better?
>> No. The laws have been
after the fact, stretched
and manipulated in ways
the Congress didn't intend.
And that's where-- and that's
where we've gone off the rails.
Nobody who is in Congress
at that time, Section 215,
the metadata law
that we're talking
about, which is aimed
specifically at US citizens.
In making the changes they
were making, nobody believed
that relevant data meant
everybody in the United States.
>> Let me ask you--
>> So, you know, this is--
so this is ultimately--
yes, it's a question
of trust in so far
as the people implementing the
law have made it as elastic
as possible without it
completely exploding
and it might now be exploding.
>> John.
>> I've expressed no view
on whether it's desirable
or undesirable or whether
it's illegal or legal.
Neither of those questions
are really interesting to me.
When it comes to
the internet though,
the question is there's
a set of events going
on regarding surveillance,
and the internet is
global in nature.
We have allies and
trading partners
and organizations globally,
other governments who want
to understand what's going
on because they may
have the same desire.
There may be another government
that wishes for its reasons
to engage in surveillance of
communication in its country.
And it wants to understand
what is the framework
by which this is occurring
and how does it happen
in the internet, how is it
supported, how does it go
on because they have their
own national interests
and they may pass laws in their
region that are perfectly fine
and acceptable according
to their processes.
So, we now have this
framework that says,
there are circumstances during
which surveillance is
apparently an accepted part
of the architecture.
What we don't have
is a transparency
about where that's occurring and
how that occurs and what happens
if another 130 countries
also do it.
I believe that if we're to have
equitable internet governance,
it's necessary to have
a fully articulated
and transparent framework.
I do not know whether
or not everyone is going
to like what their country
chooses for what it does
with surveillance or not.
That's a different question.
That's a question of laws
and governance structure.
But if this is going to go
on, we need to understand
where it occurs and how
it occurs and the fact
that it could be occurring
in a lot of places.
And that should be understood
and documented as part
of a clearly transparent
recognition
that this is part
of the internet.
>> Let me ask you
all this question.
You said you're not
interested in legality.
But one of the key debates
here is the legal framework
for PRISM and the FISA law.
And there's not much
debate about whether
or what's been taking places
legal or not under the law,
but there's a big debate about
whether the law is the right law
and whether the framework
should be altered.
If you all were to testify
before an intelligence
committees and others
on the Hill and asked,
how should this law be changed
to advance the values
you're talking about?
What would be some
of your suggestions?
Go ahead.
>> I think we have to start
by walking back the changes
that we made in the
records law because we took
out all the words that
make it proportionate
and make it targeted
and transformed it
into a broad collection statute.
And, you know, I think
the way to walk it back is one
that allows substantial
collection on targets
and information related to
those targets, and backs it
out of this broad relevancy
standard that allows information
to be collected
on anybody. The change also
took out the whole
national security purpose.
Basically, you no longer
have to be collecting data
because of a particular
terrorist
or particular intelligence
activity.
You only have to be
collecting in order
to be protecting
the United States.
And that is simply too broad
and too likely to lead to abuse.
It is harder for me to
know what I would change in 702
because we still don't
understand what they're doing.
>> And [inaudible] going
to thought about these--
>> Except for the transparency--
except for greater transparency.
>> 'Cause this is-- this
is going to be debated
in the weeks and months ahead.
It's a very important question.
>> But before the haystack
law [phonetic] was put in place,
if you allow me to call it that,
the traditional methods
of law enforcement
worked, and they would work
with whatever the
internet had to provide.
You needed to identify somebody.
You usually identify them
through a non-technical aspect.
You had an informant
somebody tell you, "Hey,
I think that you were doing
something illegal," and that--
that puts the focus on you.
From there, you would use
your traditional tools whether
they're on the net or
not to find, you know,
to find out what's going on.
Collecting data when there's
no evidence of a crime is kind
of counterintuitive and
kind of counterproductive in a way
because it's just sitting there.
So, if it were me and I was an
adversary of the United States,
I would find where those--
that database was being stored
and I would go after it to
collect all that information
because I can do the same
type of metadata analysis
that could be done that way.
The third thing is--
the reason why we--
you've mentioned earlier that
maybe we've mistrusted is
because we know what can be
done with that data and we know
when somebody-- we know
when somebody has given
us a [inaudible] answer.
Because I know that if I
control my infrastructure,
I know the power that I
have sitting in my office,
I have to act responsibly
and within, you know,
the laws and all that.
But I know what can be done
if there's no oversight
on my position.
And that's why I
think you hear--
>> But there is oversight.
It is, you know, I mean they--
>> It would be if you were-- if
you were the oversight committee
for what I would do technically,
do you have the technical skills
to know what I'm doing?
[Inaudible Remark]
>> So I'm going to
leave the legal--
so the legal policy
questions, I largely agree
with what has been said.
I don't think that we
have adequate oversight.
And if I were talking
to Congress about what to do
to increase the trust
in this environment,
I would say that Congress
and the FISA court
needs a more effective
accountability mechanism.
There is no way that the FISA
court judges or the members
of the Intelligence
community are looking
at every single query--
>> Right.
>> -- that NSA analysts
are performing
on these enormous
amounts of data.
It is technically possible to do
that, to Mike Nelson's [assumed
spelling] earlier point,
to actually evaluate, when
Bob Litt, the general counsel
of ODNI, says, "Yeah, we have
the whole haystack
but we only ask questions
based on certain predicates."
I know Bob and I more
or less trust him
but I think it's outrageous
for the government to say
to its citizens,
"Don't worry, trust us."
We give the government
considerable authority
but we need mechanisms
to make sure it's being
used responsibly.
And none of the oversight
mechanisms that exist
that you've mentioned are
able to provide that kind
of accountability
other than to call
up a responsible
person like, you know,
Mr. Litt [assumed
spelling] and say,
"Are you following the rules?"
And he says, "Yes, we're
following the rules."
He doesn't even know what
every single analyst does
in the agency he's
responsible for.
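The per-query accountability described above as technically possible could, in the abstract, look like a gateway that refuses queries lacking an approved predicate and appends each query to a hash-chained audit log a reviewer can verify independently. Everything here, the class name, the fields, and the predicate labels, is hypothetical, not any agency's real interface.

```python
import hashlib
import json
import time

class AuditedQueryGateway:
    """Illustrative sketch: every analyst query must cite an approved
    predicate (e.g., a court-authorized selector) and is appended to a
    hash-chained log an overseer can recompute and verify later."""

    def __init__(self, approved_predicates):
        self.approved = set(approved_predicates)
        self.log = []          # list of (entry_json, chained_hash)
        self._prev = "0" * 64  # genesis hash for the chain

    def query(self, analyst, predicate, selector):
        # Refuse any query that does not cite an approved predicate.
        if predicate not in self.approved:
            raise PermissionError(f"predicate {predicate!r} not approved")
        entry = json.dumps({"analyst": analyst, "predicate": predicate,
                            "selector": selector, "ts": time.time()})
        # Chain the hash so deleting or altering an entry is detectable.
        self._prev = hashlib.sha256((self._prev + entry).encode()).hexdigest()
        self.log.append((entry, self._prev))
        return f"results for {selector}"   # placeholder for the real search

gw = AuditedQueryGateway({"RAS-2013-017"})
gw.query("analyst1", "RAS-2013-017", "+1-555-0100")   # logged
# gw.query("analyst1", "fishing", "+1-555-0101")      # raises PermissionError
```

A reviewer who holds a copy of the log can recompute the chain end to end and detect any deleted or altered entry, which is the kind of mechanism, rather than "trust us," that the panelists are asking for.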
>> Let me ask you about one of
the proposals that has been made
and is now being debated
in Washington.
And so, critics
have pointed out that one of the flaws
in the FISA process is there's
no adversarial mechanism
within the FISA system that the
government comes in and asks
for permission and there
is no counter voice.
There is no adversarial
proceeding.
And in most legal systems, there
is a mechanism for challenging.
So, a number of legislative
proposals have been advanced
in one form or another.
Is this one of the mechanisms
that Congress should be looking
at to try to reorder
this balance that many
of you think is out of whack?
>> Yeah. I actually think
this question of some kind
of a public advocate and
I think it's probably--
we figured out how to give
private lawyers the ability
to look at classified
information
in criminal proceedings.
We ought to be able to come up
with a stable of those people
who litigate the
national security state,
who can stand in for the public.
And, you know, I think it's one
of many potential points of sort
of strengthening oversight here.
I think it's-- I think
it's critically important.
I think different levels of sort
of reporting back
to the FISA court.
I think part of the
problem and the reason
that people are responding
is, I mean,
it's interesting FISA was
a civil liberties measure
when it was adopted in 1978.
And, you know, it's basically
my mentors were the civil
libertarians who
proposed this crazy idea.
And the world is really
different from the one of 1978.
And being in contact with
a foreign person, I mean,
you know, there was
an iron curtain.
If you were-- if they
could actually figure
out you were communicating
with a foreign person,
it probably had some
significance.
We now live in this
world where 20 percent
of us are first generation,
where our corporations
are global,
where we travel all the
time, and where we--
people don't identify
themselves in the same kind
of us-them way and
yet we have a law
and a designation
of foreign persons.
That really is a Cold
War-era concept and almost has
to be reconsidered
all by itself.
And the law can't
be working very well
because it's not
supposed to sweep
up American's communications.
And all you have to do is read
the minimization guidelines
to understand how much of our
communications [inaudible].
>> Hal Berghel wrote
an interesting article
that just came out in
IEEE Computer magazine.
It's called "Through
the PRISM Darkly".
And there's a quote
in there that says,
"The Foreign Intelligence
Surveillance Court has an
approval rate of 99.93 percent
of all surveillance requests.
While this might not meet
the strict definition
of a kangaroo court,
it seems to fall
within the marsupial family."
So--
[ Laughter ]
The point of it is that there
are, you know, two--
and he calls them cyber urban
myths-- that deal with this.
One is that you do need
all of this, you know,
information to do it,
and the other is that
there's sufficient oversight.
And I think we all agree,
there's not sufficient
oversight.
You wouldn't ask me to be on
a medical oversight committee
to review surgical procedures
because I'm not a
doctor, I'm not a surgeon.
I don't have that
expertise in that area.
And so, I think that's the--
I think that's the thing
that bothers the technology
people is we know it is a small
community of the actual people
that know the nuts and bolts
about technology security
from that standpoint.
And if we don't know
each other by--
personally, we know each other
by reputation and whatever.
And so when you look and
you query your other peers
and they have the same
misgivings that you do,
then it makes you
wonder what's going on.
>> I just want to make one
other point about the people
who are being swept up in
this-- in this surveillance.
I think one of the reasons
that more fine-grained
accountability is really
critical is we don't know
but I think it's a safe
assumption that, you know,
somewhere north of 90 percent
of the people who are on any
of these terrorism watch list
who were the targets of any
of this surveillance
are Arab-Americans
who have some connection
to the Middle East,
many of whom are
probably Muslims.
In this country, we don't
have a great history
of treating minorities
perfectly fairly.
And part of the reason we
have oversight mechanisms,
part of the reason we have open
courts, part of the reason we try
to have due process with public
visibility is to make sure
that we actually do
our job and are really
treating people
fairly and not discriminating.
And again, I don't--
I don't think--
I think it's really interesting,
you hear this panel,
you hear the public discussion.
I don't hear
that many people saying,
"Stop the surveillance."
People are not saying that. I was
in Boston during the
marathon bombings.
People were clamoring for the
installation of more cameras.
You know, the question
I really believe is not
about whether law enforcement
should have these tools.
It's about whether
we can feel confident
that they're using
them responsibly.
>> One of you made
the point that--
actually, this is
not a new issue--
that just a few years ago,
the Washington Post ran an exposé,
a very lengthy one, by Dana Priest
[assumed spelling] and others.
And there was not anything
like the same kind of reaction
that there's been lately.
And I'm wondering why you
think what's the difference,
why has there been such--
why are we having this panel
and not three years ago.
Why is there so much
conversation now
about this issue
when, in fact,
at least some of this
was made known through
the Post and other mechanisms
some years ago?
>> Smartphones.
>> We didn't have the numbers,
we didn't have the numbers.
I mean, from the Verizon
order, we now know.
It's-- I don't remember
how many millions
of people's records were--
tens of millions of people's
records were collected.
We had-- we have
general outlines
from the Post reporting,
from, you know, from all kinds
of things but we did-- it was
not dramatized with numbers.
And I think and now we know
it's basically everyone's data.
We never actually knew
that was happening.
>> Right. I think that's right.
I also think there was an uproar
when the warrantless
wiretapping was first revealed.
We had a very big
battle in Congress.
People were really
worried about it.
People who were not
following it closely,
when they passed an amendment,
might have actually thought
that what we wound
up with was some kind
of additional procedural
protections.
Instead, we wound up-- what we
wound up with was a ratification
of that program, immunity
for the providers,
and not much more.
And so I, you know, I
think a lot of us knew some
of this was going on but
I think we didn't know.
One of my colleagues, Jim
Dempsey who's on the Privacy
and Civil Liberties Oversight
Board wrote an article saying,
"Did they just approve
a vacuum cleaner?"
And lots of people then
responded with, "Oh,
you know, that's ridiculous."
Well, no, in fact, they
approved a vacuum cleaner
and we didn't know it.
>> Laura? Laura.
>> I agree with those points.
I think it is a matter
of scale in this case.
But I think there are
few other points, too,
that are a little
bit more subtle.
So, one issue is that this came
after Hillary Clinton's
internet freedom speech
and this entire meme
that has been created
and I think rightly
replicated around the world
about internet freedom.
So, this is a very important
issue and what this does is it--
it's providing an opportunity
for people to challenge
that notion of the internet
freedom and to point
out what they're
calling hypocrisy.
So, that to me is a problem.
It can also be an opportunity
if it's addressed appropriately.
Another issue is that individual
citizens are living their life
online to a much greater extent
than they did five
years ago even.
It's really amazing
if you think.
I mean, I'll just give you
an example I was talking
about today.
So, five years ago, I was--
I would read the paper
in the morning and now,
I'm completely connected to
my smartphone all the time
and I am posting information,
I'm communicating on it.
So, the public sphere
is what's at stake here.
The public sphere is online.
And I think that
that is much greater
than during these last
instances a few years ago
that were mentioned.
So: scale, the issue of
people being more cognizant
about how their internal
private life is lived
out in the digital
realm, and also the meme
about internet freedom
that has been fairly successful
has ensued
in these intervening years.
>> I think those were
brilliant points.
>> Yeah. [Multiple Speakers]
>> I was just going
to say quickly
that technology makes lots of
things possible and it was clear
at the time of those articles.
It's clear today.
It doesn't mean that
it's eminently minable
but that's done outside
of due process.
And I think those are the things
that are getting
people excitable.
It's not technology and it's
not what's technically possible
or even what's technically
feasible.
It's actually ultimately
the full mining
and then the lack
of public scrutiny.
>> What about the argument
that you hear from some people
that not only are we three
years farther away from 9/11?
Several of you made
the point that a lot
of these original
processes were passed almost
in a battlefield mentality,
one of you used that phrase.
And that there were extreme
circumstances with the value
of national security
in protecting the
homeland very much
in the forefront
of a public debate.
So you got three years later
but some people would argue--
there's almost a
contradiction here.
Certainly the intelligence
community would say, "Well,
actually people are more
relaxed now and less concerned
about the threat because
our systems have worked
and that you shouldn't
be dismantling systems
which have actually made
people more secure."
How would you answer that?
>> Let's concentrate on what
the President has said though.
I mean, what the
President has told us--
>> Although he has said
he's in favor of the system.
>> Yes. And he said that
the capability is necessary.
He's in no way precluded
additional oversight.
I think he's actually
welcomed the discussion of--
>> Sure.
>> -- of what kind
of oversight we need.
But he said we're living
in a lower threat level
and that we need to make
changes as a society
and as government
in response to that.
So, not the other way around.
>> Edward Snowden, to some
people he's a traitor.
To some people, he's a hero.
What do you think?
>> Everybody, on your clearance
you signed a document that says,
"If you disclose secrets,
there's a penalty."
So, from that standpoint,
he clearly violated
that document and, you know,
I don't have a problem
with him being prosecuted
because of that.
On the oversight--
>> Because it is contractual--
>> Because it is contractual.
I mean, it states, "Anybody
who has a clearance."
You have that-- it's in
your-- in the agreement,
and it says clearly that you,
you know, 25 years in prison
or whatever it is that--
>> You actually read that?
>> No, it's-- yeah.
It happens when you come from
a family of lawyers, you know.
>> It's privacy policy.
>> No. I mean, you look-- yeah,
I mean, if I'm going to go
to jail, I want to know
what it says, right?
And so I think
from that standpoint, yes.
I think the rest of the
community is still trying
to decide whether
or not the information
that was disclosed is a problem.
And some people go back kind
of like what I was saying
is there have been articles
about this for, you
know, a long time.
You know, you can go
back into the '70s
when NSA had the ECHELON
program, in the mid-'90s
with Omnivore and
Carnivore, you know,
all of those type of things.
You've had this type
of electronic surveillance
for quite some time.
So, not having seen all
that was disclosed, we're
not quite sure what it is
that he's telling the world
that we don't know already.
And so from that standpoint,
I think part of my community,
we're not sure about
the content.
But certainly he violated an
agreement and he, you know,
that's what they're going
to go after him for.
>> So I have mixed views
about Snowden because I think
that whistleblowing in a
free society and a right
to know is critically important.
And if you go back
to the Pentagon Papers,
terrible things did not
happen with the publishing
of the Pentagon Papers.
[ Pause ]
So-- so, part of--
>> I covered the
Pentagon Papers story and--
>> I'm sure--
>> -- covered Daniel
Ellsberg's trial.
So, I'm--
>> And Ellsberg's trial.
>> Yes.
>> And Ellsberg is very,
very out there right
now about Snowden.
I was kind of right with him
until he started giving away
the US cyber security strategy
with respect to China
while he was in Hong Kong.
And then I started to wonder
exactly what his motivations
were, you know.
And so, I do feel like
there's a level of recklessness
that worries me, although
it's hard for me to really know
if there's damage to national
security, because I've sat
in some of these meetings
with national security people
where I've held up the
minimization guidelines
and said, "Is there anything
in here that's really
secret or should be?"
And they said, "No."
The existence of
these programs has sort
of been broad-brushed;
some people knew.
So, if at the end of the day,
it's that the public
finally knows
that it's happening
'cause I'm not sure
if anything's been revealed
that will allow people
to somehow evade all of this,
that, I think, you know,
society has to make
room for whistleblowers.
I just wish we had
whistleblowers
who were a little less reckless.
>> Well, you know,
as a whistleblower,
he appears to have
delusions of grandeur.
>> Well, right.
>> I mean, the first
couple of days, he had--
he'd made statements
such as saying--
such as that he could order
a wiretap of the President
and he could, you know,
had all the locations
of all the NSA stations.
Pretty unlikely, that said,
I guess the, you know,
I think that his disclosures
have done a public service.
I think he's a whistleblower
in that sense.
I actually agree.
It seems like he
just broke the law,
and it doesn't seem
very complicated.
What I find confusing
is that somehow the
traditional notion
of civil disobedience seems
to have been lost track
of in what he's doing.
I mean, I wish-- I mean, look,
he obviously has put his
life at total risk and--
>> Although traditional
idea is you pay the penalty
for having broken the law.
>> You pay the penalty and
you [inaudible] having a trial
and you expose-- you expose the
things that you're concerned
about and you expose
them further. And look,
he's a civilian.
He wouldn't be tried in a
court martial the way Bradley
Manning is.
>> Right.
>> He could've staged
a public trial
and a big public discussion, but
he chose not to, and, you know,
again, it's not for me
to say how people should put
their lives at risk as he has,
but I think it's a-- I think
it's an incomplete act of
civil disobedience.
>> Anybody else want
it, got a view on that?
Let me ask you another question.
One of the things that
Dan said was a
very interesting analysis,
when he talked
about the internet not being
a creature of the state,
and how earlier forms
of communication, whether
telephones or telegraph,
and so many of them, grew up
under government.
Television spectrums,
I mean, they were sold
by the government,
right, and still are.
So, there was an inherent
regulatory legal organic
connection between earlier
forms of communication
and government regulation which,
as you point out is missing
in a large part in this
particular system we've all been
talking about.
>> But it's more than a legal
question, it seems to me,
it's also a cultural question,
in that a lot of you, being soaked
in this culture and committed
to this culture have evinced
skepticism of government
and of regulation and yet at
the same time, you're grappling
with the notion of how do you
bring some order to a system
that continues to cry out
for some kind of regulation.
I'm interested for you
to muse a little bit
about the contradiction between
sort of the historic origins
and the culture of this
system and the need
to bring some order
and regulation to it.
It seems to me there's an
inherent conflict that's worth
talking about.
Start.
>> I'll kick off the
discussion about that.
I think sometimes when
you go to a workshop
that discusses internet
governance,
it descends into a question
such as who should grab the keys
of internet governance, right?
So if questions were asked
such as, should the United
Nations control the internet,
or should the United
States control it,
or should Google control it?
Like those kinds of
questions don't make any sense
in their first instance.
And part of that is because
there is no monolithic system
of internet governance.
It's highly granular,
it's multi-layered.
There are-- there are
so many different levels
of internet governance.
I mean, I-- my book is very long
and I only get into
some of them.
But the issue here is
multistakeholder governance
and what that means.
So, internet governance is very
interesting because it is not
about governments necessarily
but that doesn't mean
that there's not a
role for government
in many different areas.
So, if I have identity theft,
I'd like the government
to step in and help me.
I expect antitrust enforcement.
I expect laws about--
some laws about privacy.
I expect some laws
about intellectual
property rights enforcement.
I expect certain national
statutes on any variety
of things related to
trade, child protection.
So, there are many places
for the government to be
in internet regulation
and governance.
But most functions
that are related
to the operational
stability and security
of the internet have
not been the purview
of traditional governments
but they have been enacted
by private industry.
And I think that that's
a very important point.
Now, tying this back
to the PRISM issue,
one concern that I have is
that this one separate area,
how the internet is
being used and the data
that can be collected
and gathered, will bleed
into this other area of
the operational stability
and security of the internet.
So, we can expect to see
this issue used as a proxy
for governments that are
interested in gaining control
in certain operational
areas of the internet.
I guarantee that
that will happen.
So, we'll see that in some
of the calls for
international treaties,
again, ones that leave
out private industry,
that leave out civil society.
We'll see it in additional
calls for bringing
in a more multilateral control
of some critical
internet resources.
So, I think one bleeds
into another.
But that it's really not
about private industry
versus governments
or civil society
versus private industry,
for example.
It's about the more
contextual question
of what governance is
necessary for what--
which particular area.
So, in certain cases, it's
completely appropriate
for only the private
sector to be involved.
In other cases, it's the
purview of government.
So, it has to be asked
in a contextual area,
and that's what multistakeholder
governance is.
It's not everybody in charge
of every granular area.
That's not it.
It's about what's
the appropriate form
of governance in
a particular area.
>> John?
>> So the one thing
we do know is
that the internet
evolves very quickly.
I was involved in 25 years of
it and it changes rapidly.
What was dial-up
modems and just Telnet
and FTP quickly became
circuits and high-speed fiber.
Yeah, OK, sorry.
[Inaudible] And the web--
>> My work here is done, yeah.
>> The fact is that that
evolution doesn't work well
with government regulation.
Having been involved in
the telecommunication side
of the industry as opposed to the
internet side, I also dealt
with a lot of [inaudible]
regulations.
And generally, we were dealing
with regulations that were three
to ten years behind the
times, trying to drag them
into current circumstances,
and it was inevitable.
The process of regulation
doesn't work well
with a forward-looking
rapidly evolving internet.
So, when we start
thinking about--
about the non-governmental
nature of the internet,
as the internet grows up, we
find ourselves in a conundrum
because the internet
is no longer optional.
The internet is entwined
in everyone's life.
Governments are looking at
their economies and going,
"Some percentage of my
economy is tied to this?"
You can say I have a choice
as to whether or not I'm going
to participate, but I don't.
I have to be involved.
So, governments are now
realizing they have no choice,
they need to be involved.
They look at their duties,
their responsibilities.
And they go, "How do I
effect what I'm required
to do in this new medium?"
And it's a big challenge,
folks, because the protocols
and the operational conventions
that hold the internet
together have to be global.
They have to interoperate
on a global basis.
But governments are
used to acting
on a national basis
occasionally regional,
occasionally bilateral,
multilateral, but generally,
they act on a national basis
and you can easily break
the internet by attempting
to regulate it without regard
for the global context
in which it operates.
This is really what's
been opened
up in the last five years.
In the last five years,
governments are realizing
they do need to get involved,
they don't understand
the framework
by which they can do so safely.
In general, this has resulted
in not much regulation,
but that's not necessarily going
to be the case going forward.
And the internet
community, and I say
that meaning the operators,
the various entities
that operate pieces
of the infrastructure,
isn't exactly forthcoming
to governments, saying, "Here,
this is how to safely
regulate the internet."
That guidebook has
not been written.
And so, without a meeting
of the minds on this topic,
we run the risk that governments
are going to make regulations
that make no sense or break
portions of the internet.
Until there's an understood
common framework
for this, we're all at risk.
>> Lynn, do you want to talk?
>> Yeah, very good comments
both from Laura and John,
and I just want to make sure
that we're not left with a myth here
that much of the internet
community believes
that there's no role
for governments.
We've been engaging
with governments
and certainly we could have
done it better over the years.
But, you know, for many,
many years when ISOC had a very
small staff of seven people,
I was going to [inaudible]
and spending weeks
because you don't go to a United
Nations meeting for two days,
you go for two
or three weeks.
That's a significant investment,
and we went because
we wanted people
to understand how the
internet actually worked.
And it's a slight
disagreement with John,
but nobody has the answer
for how we regulate a lot
of these new environments.
It's not that we think
there should be no regulation;
it's that we can't
figure out what sort
of regulation there should be.
And to John's other point,
it is because of the pace
with which the internet changes.
And it is due to the fact
that it has broken
so many global boundaries.
You can't do it within
any single context.
So, it is a very,
very complex world.
And we continue to advocate
for dialogue and discussion,
thoughtful, letting it take
the time it needs.
We need to pull apart lots
of different complexities
and let the right thing emerge.
That is not the same
as saying no role
for governments, but
it often gets--
>> And another dimension of
this is not all governments are
the same.
Because you can talk about China
being a pretty malign force
in terms of internet freedom,
to say nothing of Iran,
which uses the internet to roll
up whole networks of dissent,
whereas in Tahrir Square,
the internet is a
mechanism for expression.
So, part of it, you can--
part of the problem
is government even
as a concept is subject to
many different definitions.
Yeah.
>> It is, Steve, but that's
probably why, for western
governments that operate
in democracies with legal
and constitutional frameworks,
it's particularly
important.
And one of the reasons people
are particularly unhappy
about this particular
surveillance is if we're trying
to come up with this
nuanced role
about where government belongs,
we haven't exactly
sent the right message
to the rest of the world.
And I think that's part
of the problem here.
>> I just want to offer a
kind of a friendly amendment,
I hope to John's view, you know.
I think John correctly
points out that attempting
to regulate the internet
infrastructure whether it's the
institutions that
operate and design it
or the actual operators
is very [inaudible]
because it changes
very quickly and it has
to work on a global basis.
That was the experience of the--
of SOPA, the Stop
Online Privacy Act,
was that policymakers
thought they could reach in--
>> Piracy.
>> -- sorry, Piracy Act.
>> [Inaudible] is the stop--
>> That's right.
Thank you, thank you, thank you.
Right, you know, governments
thought they could reach
in to the internet
infrastructure
to solve a problem
that is really
about human behavior
and violating laws.
However, at the same time
there are plenty of legal rules
that actually work incredibly
well on the internet.
You know, the FTC is still using
the Fair Credit Reporting Act
from 1970, which was one
of our first privacy laws
in the United States, to stop very
advanced sophisticated online
services from harming people.
That's not causing
John any problems.
He's not-- he [inaudible]
even think about it.
And so I think
that it's when governments try
to do the sort of
shortcuts that they were used
to doing, as you were suggesting
with the broadcasters, you know,
saying, "You guys have to behave
this way in order for us
to achieve a broader
policy goal."
That's where we really
have problems
in the internet environment.
>> You know, a number of you have
used the word privacy as a value
and as a very curious
[phonetic] value.
But I want you to
play with the idea
that perhaps we're
actually really ambivalent
about the question of privacy
because in fact in many ways,
the wide availability of data is
actually an enormous convenience.
Every time you go to
a website and you put
in your e-mail address
and up comes your password
and all your credit
card information
and it's one-stop shopping
at Amazon, right?
We all love it but isn't that--
>> But we've made a decision
to share that with them, so.
>> Well--
>> Wait just a minute.
I want a-- I want a--
I understand that.
But I want to pose a question
about whether at some point,
aren't we really ambivalent
about this question
that we actually like people
to have information about us
when it serves our purposes of
convenience and accessibility.
And that there was a certain--
I understand your
point about voluntary.
But this will mark [phonetic]
a point here as well.
>> I think part of
the problem is--
is the public is
realizing that the people
that are storing the
information that we give
to them aren't doing a good
job of protecting it.
I could go to my local bank--
I grew up in Falls Church.
I used to go to the local
First Virginia Bank right there
in Falls Church.
I gave them all my
information when I opened
up my first account
when I was a teenager.
And I had a reasonable
expectation
that the only way they--
that anybody else was going to
get that information was
if they actually went in and
robbed the bank and stole the
file cabinet or whatever.
>> Or if anyone in the
government wanted it?
>> Or if anyone in the
government wanted it.
I mean that-- I mean
that, you know, that--
that's a reasonable thing.
But nowadays when we find
out that you have all these
massive data leaks, you know,
and when you examine
the root cause of what--
what the leak was from
a tactical standpoint,
it should have been
fixed 15 years ago.
You know, some of these attacks,
I do a talk now if they ask me,
you know, I guess it's because
I have gray hair and I've been
in the security business
for 20 years.
I think I noticed something
but I do [inaudible]
that the contract--
>> It's because of the ponytail.
>> I think it is.
You know, it's the-- it's the
same attack methods that we saw
in the early '90s are
still effective in 2013.
And so my question to the
technical community was,
what have we been doing
this last 20 years?
I mean, you know, in the SANS
Institute that I was a part of,
we drew up a top
ten threats list in 2001.
I pulled it out, and today
in 2013, every single one
of those top ten threats from
2001 is still effective.
So, what have we been
doing to protect ourselves?
And I think it's that type
of thing that's causing this
rumble about, you know, well,
private industry isn't
doing anything about it,
individuals aren't
doing anything about it.
So, that only leaves
a government
to do something about
it, you know.
And so I think that's why
we're seeing that push in
that direction, 'cause
historically when you look
at it, the same things
are still affecting us,
it's just that what's
happening now is back in 2001,
the scope of a data breach
was much less than the scope
of the data breach now.
>> Sure, sure.
>> And just one last
aside [phonetic].
The iPhone, I think,
I looked it up,
is five years old
today-- this year.
So, to answer your question
about why the difference from--
>> Hold on just a second.
John and Leslie, is
there a point you wanted to make
about the voluntary
submission of data?
>> I don't think so.
I think every point has been
made by everybody quite well.
>> OK. John?
>> You posited that maybe
people don't value privacy
over convenience, that
we give up privacy
and there's no outrage
about that.
Why is there an outrage
in some cases?
And I guess the question
that comes up is,
a lot of people voluntarily
give up privacy
because we want the convenience.
We want to click the button
and get the order done.
We want the website
to know who we are.
We want to fill out
all the information
on the airline profile because
it just makes traveling a little
easier and anything
that makes it easier--
>> And who can remember all
those damn passwords anyway,
right?
>> So there are times
we do that.
But the reality is that there
are also times when you consent
to losing your privacy
in situations that aren't
for your convenience, but
you're going to anyway.
Next time you're
going to your doctor,
pick up that HIPAA form
and look at it, OK?
It says, "We're going to
share your data with the CDC
because if there's an
outbreak of something,
we're going to do that, OK?"
It's not for my convenience
(hopefully, I'm not relevant),
but the fact is they're
going to do it,
and we consent because,
well, we knew about it.
We understand--
>> And there's a public
interest involved because--
>> There's a public
interest but also as annoying
as that form is, they
took the time to tell us.
I do think that's a
different question
when surveillance
is going on
and you don't know
about it at all.
And I think that might be the
angst behind a lot of this.
>> Leslie?
>> Well, I also think
that there's a difference
between surveillance
by the government
and surveillance by companies.
I mean I think they've
become more joined together
because they are now the
source of all the information
that the government
is collecting.
But I think, at least in
our constitutional system,
though not in Europe, we're
making a distinction.
Having a government that has
the capacity to impose penalties
on people and make choices about
who is going to be prosecuted,
et cetera, is an entirely different
matter from whether I like
or dislike the fact that, you know,
Google's profiles keep
sending me the same crib
for my pregnant daughter
over and over again.
I'm annoyed, I'm
extremely annoyed.
But I also think--
>> They knew she was
pregnant before she did, yes?
>> Yes.
>> Well, they probably
did, but they knew
that I did this thing once
and it's following the--
it's literally following
me around the world.
I mean it's become
kind of a joke.
But I find that annoying and
I'm somebody who knows how
to set my privacy settings.
So, you know, in one
circumstance it's a loss
of some measure of control;
we ought to have more control,
and it's one of the reasons we
ought to have a baseline bill
to give us some fair
information practices.
In the other, you know, the
government has the capacity
to make very important
decisions about us.
And we have a constitution that
says they have to follow rules.
And so we do react differently
to the crib and to the NSA.
>> Yeah.
>> The one thing I would
say about our tension
between convenience and
privacy is-- I just think that--
I think privacy has always been
a kind of a contested concept.
I mean, it's not an absolute.
I don't think there's
anyone hardly in the world
who thinks it's-- it's
an absolute condition.
But we know, as Leslie
is saying,
we have different
privacy expectations.
We want different privacy
results at different times.
And some of the-- some
of the privacy problems
that we have are [inaudible],
like seeing the same
crib too many times.
Others have real consequence
like if your credit
report is wrong
and you can't get a mortgage,
you're really, you know,
it really-- or you don't
get a job, you know,
it really makes a difference.
And I think that, in a way,
categorizing privacy
as a single right is what
makes the conversation a little
confused, because the
reality is it's a number
of concerns wrapped up into one.
>> But in this day and age,
isn't it a little naive to say,
I'm going to give my
information to Amazon?
Or I'm going to give my
information to Google
for my own purposes because I
choose to, and somehow I'm going
to be able to control--
>> Oh, yes.
>> -- all the use
of that information.
>> Yeah, that's not--
that's not doable,
and I don't think any
privacy law we would pass
in this country would
change that.
>> It is naive but I think--
I think the problem is I don't
mind giving you any information
you ask about me if you tell me
ahead of time what you're going
to do with that information.
So, what John was talking
about in the HIPAA thing
when they say, "We're going
to give it out to the CDC,"
and you tell me ahead of
time before I give you
that information, that's much--
that's a much different
environment than giving it here
and then buried in [inaudible]
fine print is a little click
thing that says,
"Oh, by the way,
we're going to just give
this to anybody we want."
>> We only have a few
minutes left, and I want to--
you've pointed out, a number
of you, the fact that we're
dealing with a very
complex system, as Laura said,
with a lot of stakeholders here.
And one set of stakeholders is
these providers, whether it's
Google, whether it's Yahoo,
who take that information.
They don't warn you in some
ways that they're going
to give it away because they're
not voluntarily giving it away.
They're being subpoenaed,
they're being ordered
to by the government.
And they are caught.
These huge institutions
are private institutions,
but as we've learned
through PRISM
and all the revelations,
they are
subject to government orders
and government warrants.
And many of them are fighting
this, at least fighting
to be able to be more
transparent, because they have
their stakeholders, too.
And-- and they are worried
that their brands are going
to be tarnished by
being swept up in this.
So talk a little bit
about the special role
of these
intermediary institutions,
the Apples, the Microsofts,
the Googles, who
on one hand are getting
pulled by the government
to release information.
On the other hand, they're being
pressured to keep it private.
>> Yeah.
>> Yeah, that's a really
important question.
And if you look at-- let me
just back up to before PRISM.
If you look at even just the
Google transparency reports,
which probably don't have
everything (even Google says
that not everything related to
national security is reflected
in the transparency reports).
But when you do look
at something like that,
you can see that there
is a very big disconnect
between what governments
are asking information
intermediaries to do and
what they actually do.
So, you can see that--
just to give an example.
Maybe they disclosed data about
an individual 37 percent
of the time
to the Brazilian government,
or 47 percent of the time
to another government.
Do you see what I'm saying?
There is a disconnect in that
differential between the request
to turn over user data
and the actual instances
of turning over the data.
That's where they have this
special governance role.
And that's an important
point to make.
But they also bear a burden
in carrying out a request
such as the more
recent revelations
because there is a
public relations hit.
There is the cost of hiring
numerous attorneys to deal
with the fallout from this.
And there's the possible
economic impact
of having trouble doing business
in other parts of the world
that might be suspicious
towards those companies.
>> Yeah, they might be thanking
Snowden for the revelations,
because, of course,
with the national security
letters,
they're not allowed
to let anyone know that
they collected the data.
So, now on the other
hand, you have the--
the Snowden revelations
which were saying, "Hey,
these guys had to
give out the data".
So, you know, and
now that's come out.
What they should have
said, instead of trying
to back [inaudible] and
trying to save their
reputation, is, "Look, you
got subpoenaed,
we know you got subpoenaed,
you gave up the data
because it's the law."
But I think that that
piece of it is the secrecy,
not being able to say why
something was collected,
you know, and not being able
to even say anything about it.
>> Leslie?
>> Well, the question raised
is this bigger question
when we talk about players
in the internet ecosystem.
The intermediaries
are a critical part.
And we have-- under our law, we
sort of provide them with a lot
of special protections
because they can't be liable
for everything that's going
on in their platforms.
And yet they're under
enormous pressure;
this national security
revelation is one small piece
of this very complex
environment where, when you ask
about governments and government
trying to regulate,
the first place that governments
go in trying to solve any
social problem is to
try to figure out how
to get the intermediaries to
take on that responsibility.
So, this has been sort of
a tension that's existed
since we actually had any
kind of powerful intermediary.
So, that's one side, and the
other side is the amounts
of power they have as
a governance entity
to set the rules of
[inaudible] for the internet.
And so--
>> And one of the
interesting variables here is,
I noticed just recently that
several of the big players,
Microsoft, Google, Yahoo,
I think three have
all filed suit--
>> Right.
>> -- in an attempt to be
able to be more transparent
about the role and the-- and
the requirements they're under.
I think in part because of
the point Laura was raising
that they themselves, given the
culture that they're part of
and the stakeholders they have,
they are taking a big public
relations and branding hit
for this kind of cooperation.
>> And they also have
real leverage here
which they have exercised
on occasion, you know.
Many of the big intermediaries,
Google, Twitter, and others,
have at times gone to
court to challenge subpoenas
and other kinds of court
orders they receive saying
that they're too broad.
I mean, Google had been
under order to turn
over large volumes
of search log data; this
was just a criminal case,
it's not [inaudible] cases.
>> Right.
>> And they've challenged
those, and so
the intermediaries have
a really important role
because at least until
there's a change in the law,
they are often the only ones
who can actually bring any kind
of challenge to the
scope of the authority that the
government is claiming.
>> This is nothing new
because in World War II,
the federal government used to
look at the telegraphs at RCA
and look at the phone system
for records and stuff like that.
I mean, what's changed
is the number of players,
because back then it was
just those two, I suppose,
plus maybe some minor players.
Now you have, you know,
multinational groups
all over the place.
But the technique, I mean,
the tactic of government going
to a provider and getting the
information they need--
it's [inaudible] been there.
>> I would assume the
Egyptian government did
that when the communications
were chiseled, you know,
letters chiseled
on stone tablets.
>> I'm going to-- We
have five minutes left.
I'm going to give each
of you one minute to--
we've had a long day,
very attentive audience,
lot of issues.
But I want to give each of you
a chance to make a final point
after this conversation.
You started with
your initial remarks.
But what do you want the
audience to be left with?
What-- in each of your mind
is something significant we've
talked about today, an issue
that you want to reinforce?
I want to give each of you
a chance to do that,
and since we've started
with you earlier,
I'll start with you this
way and we'll go down.
>> So, as contentious
as this process is,
I think it's incredibly
important,
the dialogue that's happening
in the United States.
Hopefully, we can do more than
just, you know, have battles
in the newspapers and on panels
like this and really look
at our legal system and make
sure that it has the kind
of accountability
that we need.
Laura said, you know,
the problem here is scale
and I really agree with that.
Courts have been good
at supervising electronic
surveillance [inaudible] one
wiretap on one phone number
or a handful of phone numbers.
The scale of intrusion here is
such that we can only manage it
and have proper oversight
by using exactly the same kind
of computational tools
and analytic tools
that are being used by the
intelligence agencies to
mine this data to begin with.
And the one other thing I
would just say is, this is--
as several people said,
this is a global discussion.
The US is obviously an important
part of the internet environment
but we are not the only
country on the internet
and we do not control the whole
internet and I think that just
as we are examining the behavior
of our intelligence agencies,
I think it's very important
that other countries
that have democratic values
that believe in transparency
and due process really look
very hard at their intelligence
and surveillance
practices as well.
>> Thanks, Dan.
Lynn.
>> I guess I'd start with saying
that everybody's voice is
extraordinarily important,
not only here in the US
but every individual
around the world.
And we need attention
and we need voice
to every issue whether it's
this particular set of issues
or it's the United Nations'
expressed interest
in some portions of
the internet ecosystem.
And I think specific
to this issue,
the only thing I would say is--
[inaudible] we're all
very interconnected
and very interdependent.
And when we think about
things such as security
or managing risk or trying
to manage those tradeoffs,
we actually need
to do that
with both sides of the dialogue.
One, actually looking at what
we can do to protect
and ensure the things that
support economic prosperity
and social development
while trading that off
against preventing
any perceived harm.
I think we're far too much
[inaudible] over on the side
of preventing perceived harm,
and we don't quite
have the balance right yet.
>> Laura.
>> I think it's important not to
take the stability and success
and the security of the
internet for granted.
So, we're used to it working for
the most part because of efforts
like John's organization and
others and it is working,
but we also have examples
of problems with denial
of service attacks, examples
of governments
cutting off access,
examples of interconnection
problems.
We have trends away
from interoperability
which I think is a
really big problem.
We have trends away
from anonymity
which changes the nature of
the technical infrastructure.
So, point one is that we
cannot take the stability
and security for granted.
It has required a
tremendous amount of effort
on the parts of many,
many people.
And it's something that concerns
me every day when I see some
of the trends away from
interoperability and security
and interconnection
security, basically.
So, that's point number one.
The second point
is if we believe
that we have the public sphere
in the online environment,
if we believe that there's
a technical mediation
of the public sphere,
then we have to think
about what that means.
What does democracy look
like when the public
sphere is now digital?
So, we know historically that
possibilities for anonymity,
or at least traceable anonymity,
have been closely linked
to democracy.
So, I think it's important
to just ask the question
of what does democracy look
like when we have this technical
mediation of the public sphere
and the privatization of
conditions of civil liberties
where private companies are
getting requests delegated
from governments or carrying
out their own governance
of these infrastructures.
So, don't take
internet security
and stability for granted.
Remember that we have
the technical mediation
of the public sphere
and the privatization
of conditions of
civil liberties.
And the final point is that
internet governance is something
that is-- that the
public can be engaged in.
We've seen examples of
civil society action,
of private action, of civil
liberties advocates involved
in decisions about
internet governance.
There are many avenues
to do that.
I do have an
engineering background,
but the technology
is not so hard
that people can't learn the
technology, learn the issues,
and get engaged in these
debates, which are very
important because as goes
internet governance,
so goes internet freedom.
>> Randy?
>> All of you here
attending this panel discussion,
you all have a stake in what
we've been talking about.
And so I would challenge you
not only to make sure
that your peers understand
what some of the pressures,
pitfalls, and disadvantages
of working with the
internet are.
But also, you guys are
probably going to be on a track
where you're going to be
working with legislators
and policy makers
and things like that.
And make sure that they
understand the implications
of whatever it is that they're
trying to write up or implement,
both on the legal side
and on the policy side.
That's, to me, what I see
as your contribution
to the whole thing.
It's going to be
very incremental.
It's going to take a long time.
You know, some people talk
about the fact that it's
like when the automobile was
first introduced at the turn
of the century, and it took us
25 years to come up with the set
of laws and procedures
to make things work,
when we didn't have that.
But you guys are the ones that
are going to be talking to
and down the road
influencing policy and laws.
And so, you know,
take the challenge,
do it and influence
what you can.
>> Leslie.
>> So I'm just going
to make one point.
And that's that those of us
who are Americans or people
in the United States have
a voice in this debate
that we need to exercise.
But we need to exercise
it with more
than our own rights in mind.
The interesting thing
about the internet and the lack
of government control is that
it's also very unclear
who's responsible
for the human rights of people
in the online environment.
And I think the NSA example
shows that we have data flowing
all over the world, while states
have obligations to people
within their borders and in
a few other kinds of places
that the law has developed.
We're not sure who has the
obligation for the rights
of internet users in
this kind of environment.
And it's a conversation
that really needs to happen,
because our lives,
some significant part
of who we are, are now online
and flowing through these wires,
and who ultimately is
responsible for making sure
that basic privacy and free
expression rights are honored
has become increasingly complex.
So, people need to
take that on as well.
>> Last word, John.
>> It's a global internet.
There needs to be an
equally global discussion
about the principles
by which we operate it.
And part of what's occurred
over the last few months will
help encourage one aspect
of that discussion.
So, it's probably
a good thing.
>> I want to thank all
of you for being here.
I want to thank the
Internet Society.
I want to thank the GW
Engineering Department
and their [inaudible]
Lance Hoffman's Institute.
Long afternoon, thanks
for your patience.
I hope you learned something.
Come back again.
Thanks a lot.
[Applause]