While the internet has undoubtedly been an unparalleled source of information and connection, it also has proven to be one of the most powerful tools for breaching privacy and security.
Understanding the balance between privacy and security — and how it affects liberty and democracy — was the theme of the fourth Princeton-Fung Global Forum held March 20-21 in Berlin.
About 450 industry experts, scholars and students, as well as 30 reporters and editors from German and American media outlets, gathered to hear 40 speakers discuss liberty in the digital age. The two-day forum, "Society 3.0+: Can Liberty Survive the Digital Age?" was organized by Princeton's Woodrow Wilson School of Public and International Affairs in coordination with campus partners.
The forum's keynote addresses and panel discussions focused on privacy, human rights and surveillance in an increasingly digital world. Specific panel topics included: privacy and human rights versus security; the benefits and risks posed by the internet of things (everyday devices connected to the internet); communication silos and "fake news"; open access to information versus blockage; and a vision for a global cooperative plan.
Welcoming participants to the opening session, Princeton University President Christopher L. Eisgruber said the balance between liberty and privacy in the digital age is both a challenging and urgent question.
"Crafting effective industry and public policy solutions to the challenge of accommodating both liberty and privacy in the digital age will require communication and collaboration across disciplines, across spheres of influence and across political viewpoints," Eisgruber said. "Princeton University is a place where that kind of interdisciplinary collaboration thrives, and we hope to have brought that spirit to bear in the organization of this forum."
Eight Princeton faculty members from such disciplines as computer science, engineering, public affairs and sociology served as panelists, joining other academics and tech industry leaders. Journalists from CNBC, Frankfurter Allgemeine Zeitung, The New York Times, Slate and Vox served as moderators. Cecilia Rouse, dean of the Woodrow Wilson School of Public and International Affairs, emceed the conference.
Thirteen Princeton students helped University staff with behind-the-scenes responsibilities of organizing the conference.
Vinton Cerf, vice president and chief internet evangelist at Google and a principal architect of the original internet, opened the conference with the first keynote address. He traced the history of the internet, offered insight into its current form and argued that today's internet requires critical thinking to be used properly.
"Can liberty survive the digital age? I am going to say yes, but only if we make it so," Cerf said. "There is going to be work ahead for us to answer that question positively. … We must exercise critical thinking if we are going to maintain liberty in the digital age."
Cerf's compelling argument quickly emerged as the theme of the two-day conference, which began with a look at how different countries and sectors view privacy and concluded with a global vision for Web 3.0. Speakers generally agreed that maintaining liberty requires striking a balance between privacy and security, and that the tradeoffs involved vary across sectors, industries and countries.
The lively discussion also extended online, with the hashtag #PrincetonFung trending on Twitter. Student volunteers used the University's Snapchat account to document their own experiences, and a Facebook Live video captured student reactions to Berlin and the conference.
Preserving privacy and liberty while maintaining security
Today, everyone is a digital citizen, regardless of one's online usage or persona. Tighter security measures can protect one's digital presence, but therein lies the rub: certain security measures collect enormous amounts of data, which seriously compromises privacy.
"There is no liberty without privacy. Privacy is a foundational right," said Microsoft President Brad Smith, Class of 1981, in his keynote address. "Before you act in public, you often need to plan and prepare in private. However, privacy is constantly challenged by new technology."
New technology has brought cheaper forms of surveillance, according to Harlan Yu, who received a Ph.D. from Princeton in 2012. Yu is a principal at Upturn, an organization that provides internet expertise to policymakers on a range of social issues.
"You don't need a great rationale to collect all of the data you can get your hands on," Yu said. "Not only is collection easy and cheap, but the tools that data collectors now have to use that data — the tools that are used to make predictions — have also gotten a lot more powerful."
Yet, few people understand what data is being collected and how it might be used.
"Are we regulating Facebook, Google and other companies in the same way we speak about the National Security Agency? Does that apply? Does it make sense? Do citizens really understand what data is being collected from them?" asked David Dobkin, Princeton's Phillip Y. Goldman '86 Professor in Computer Science.
Data collection is compounded by the fact that many citizens freely trade their privacy for convenience. Within the internet of things (IoT), for example, everyday objects like thermostats and appliances can send and receive enormous amounts of data, and voice assistants like Amazon's Alexa learn intimate details about a user's habits. The potential cost? One's privacy, and possibly one's security.
"We get so used to agreeing to give access to all our personal data, we don't even think about it anymore," said Björn Scheuermann, professor and chair of computer engineering at Humboldt University.
It remains unclear who owns this data and what private companies can do with it. Ambiguities also lie within the IoT device itself. "It's not just about who owns the data," said Nick Feamster, professor of computer science at Princeton. "It's about who owns IoT devices. Is it the consumer or the vendor?"
While some of these questions remain open, tools are available that can protect one's internet footprint. In his keynote address, Roger Dingledine, co-founder of the Tor Project, explained the underpinnings of Tor, software he helped develop that protects users by bouncing their communications around a distributed network of relays run worldwide by volunteers. Essentially, Tor prevents an entity (be it a person, government or company) from learning which websites a person visits and where that person is located; it also provides access to websites that might otherwise be blocked. Yet even software like Tor can be compromised.
"Anonymity serves different interests for different user groups," Dingledine said. "But unfortunately, there will always be malicious users that exploit this anonymity."
Preserving liberty in an age of fake news and filter bubbles
Today, 15 percent of Americans are not online. In contrast, 68 percent of Americans use Facebook, and 20 percent use Twitter. People adopt these communication platforms at different rates, and not everyone uses or understands the internet in the same way. There is no handbook or introduction to the internet, and so some users are unable to differentiate between vetted news and spam-bot messages and memes.
Panelists discussed how most digital users experience "filter bubbles," in which information is curated, tailored and delivered to viewers by the communication platform itself (such as Facebook). This filtering distorts how users see the world and can be as dangerous as "fake news," as the 2016 U.S. presidential election made evident. All of this challenges democracy in the digital age.
Understanding the difference between propaganda, "fake news" and vetted news stories is a challenge for many internet users. Likewise, the term "fake news" itself is problematic, said Eszter Hargittai, a 2003 Princeton Ph.D. recipient and a professor in the Institute of Mass Communication and Media Research at the University of Zurich, during a panel on communications.
"I don't think we have a very good definition of 'fake news,' unless that definition is that it's truly fake news. Usually when people use that term, they mean something broader, which would include a spin on that story they disagree with," Hargittai said. "We can't just group it all into that one term."
Gabriella Coleman, the Wolfe Chair in Scientific and Technological Literacy at McGill University, took it one step further. "'Fake news' is too cute of a term. We need to stop using it. It's propaganda."
However defined, "fake news" or "propaganda" is not only a problem of misinformation but also one of missing information, argued Ronaldo Lemos, director of the Rio Institute for Technology and Society. "The problem is that you don't know the authorship of what you're reading, where it comes from, who paid for it, the individual that's posting or company. I wonder if we are able to create better tools to increase the information that we get and read online?"
The situation is exacerbated by the fact that fewer people get their news from credible media outlets. "No offense to journalists, but politics is not an interest to all people. I think we need to be realistic about the average person's interest in politics — and where they get their news," Hargittai said.
Few oversight mechanisms exist for states to monitor "fake news" and filter bubbles, so these remain issues that experts continue to grapple with, argued Joel Reidenberg, a professor at Fordham University School of Law and former Microsoft Visiting Professor of Information Technology Policy at Princeton's Center for Information Technology Policy.
Throughout the forum, a consensus on how to combat "fake news" was not reached, but one theme echoed strongly: the importance of digital literacy, taught both inside and outside the classroom. "I strongly believe that every high school student should write a piece of software to better understand it," Cerf said.
Preserving liberty in the future: A digital Geneva Convention?
In many ways, cyberspace has created a new type of warfare, said Brad Smith, with governments around the world building large digital divides. And there is little uniformity among nations on how to treat the internet. That being said, can there be global regulation of Web 3.0?
"We did not talk about law very much today because we do not have legal solutions," said Niva Elkin-Koren, director of the Haifa Center for Law and Technology at the University of Haifa.
One solution many panelists offered was a digital bill of rights. Brazil's "Internet Bill of Rights," an effort directed by Lemos, governs use of the internet in that country. Panelists debated whether the approach could work elsewhere but did not reach a consensus.
Julia Pohle of the WZB Berlin Social Science Center cited the example of Germany's "fake news" bill, which threatens fines of as much as 50 million euros if social networks refuse to give users the ability to complain about hate speech. "In this case, the protection of basic rights is outsourced by the state."
Audience members were captivated when Smith returned to a solution he has proposed before: a digital Geneva Convention. Such an agreement would bring governments together to "pledge that they will not hack the accounts of journalists or other private citizens who are involved in the infrastructure of our democracy."
Even this solution comes with its own challenges, however. In the case of cyber warfare, it is difficult to determine where an attack originates; how would the treaty handle attribution? Additionally, the original Geneva Convention was driven not by states but by the International Committee of the Red Cross.
Digital Geneva Convention or not, most panelists agreed that global cooperation is needed to grapple with today's internet.
"Digitization has contributed to the fact that we now see the world as a globalized hub," said Neelie Kroes, former vice president and commissioner for Digital Economy and Society, European Commission, in her keynote address. "We need international cooperation between trusted partners to solve the great challenges we are all facing. Nationalism denies the reality we are facing. I think the best way forward is to educate people, involve people, be inclusive and give them information. I challenge you all to make a difference."
Kroes' final words summarized the themes discussed at the forum: "The internet is a powerful tool. We need to treat it like it is. I challenge you to make a difference."
The Princeton-Fung Global Forum
The Princeton-Fung Global Forum was established in 2012 as part of a $10 million gift from William Fung, a 1970 Princeton graduate and former University trustee who is group chairman of the Hong Kong-based company Li & Fung.
Visit the Fung Forum website for more information about the event. Videos of the panels and keynote speeches are available on YouTube.