After seeing the backlash over Facebook's Instant Personalization, many people have been nervous about approaching the subject. But inevitably, as we move into an increasingly data-driven society, personalization will become a larger and larger part of how we communicate with customers, site visitors, and consumers of online content. So the question is: how do you personalize content without making people feel violated and uncomfortable? Is it simply a matter of people's preferences changing over time as they "come around" to the idea of personalization, or is it a question of implementation? What degree of personalization is acceptable to most consumers? This panel will look at how to preserve users' trust while personalizing content for them. It will also discuss acceptable practices for tailoring content to individual users' data, and shifts in the societal acceptability of content personalization over time and across demographics.
by Phil Zimmermann
Philip R. Zimmermann, technology visionary and internet folk hero, says "it is sometimes better to take direct action to change unjust laws". He is an encryption guru and privacy innovator who has made huge personal sacrifices to create technology that protects people around the world. Phil is the creator of Pretty Good Privacy (PGP), an email encryption software package. Originally designed as a human rights tool, PGP was published for free on the Internet in 1991. This made Zimmermann the target of a three-year criminal investigation, because the government held that US export restrictions on cryptographic software had been violated when PGP spread worldwide. Despite the lack of funding, the lack of any paid staff, the lack of a company to stand behind it, and despite government persecution, PGP nonetheless became the most widely used email encryption software in the world.
Phil has recently focused on launching a secure VoIP protocol that allows people to make encrypted phone calls over the internet. He will discuss why encrypted phone calls are the next evolution in privacy, why easily wiretapped, insecure VoIP is bad for society and good for organized crime, and how a secure VoIP protocol will protect the criminal justice system. Other topics include the effects of pervasive surveillance technology on democratic institutions and the future of consumer authentication.
There's no topic with more buzz around it than the "cloud." Yet for all the aspects of our social and commercial lives we entrust to the cloud, we also surrender our data, and increasingly our memories and finances, to others. Who controls that data, who protects it, and who ensures our privacy? There are, however, possibilities for creating one's own cloud and retaining a measure of control over off-site data and services, both software and hardware based. We'll explore a number of approaches to building a personal cloud, and the trade-offs inherent in that choice.
No matter how narrow you think the use of your website or service will be, if it's successful, it'll be used in ways you'll never expect - including life-or-death fights over human rights in foreign countries. The design of your sketchy PHP code might make the difference between a free press and a government clampdown, between tortured dissidents and a bloodless coup. Twitter aids activists in Iran; Facebook helps the independent press in Ethiopia; World of Warcraft is policed for sedition in China. What is happening on your site that you don't know about? And how can you design it so you help the good guys?
by Jeff Jarvis
In our current cultural obsession with privacy, we risk losing the benefits of publicness - of the connections the internet enables. This discussion will consider the value of publicness in our lives and communities, in transparent government, and in truly public companies.
We will ask what privacy really means and examine its brief history (it was born out of fear of new technologies, especially the dastardly Kodak camera). We will discuss the ethics of privacy and publicness that should inform our decisions in social and business interactions: what we reveal, what we keep private, and why. We will look at different cultures' views of privacy (how the Germans, who get naked in saunas and public parks, care deeply about the privacy of everything ... except their private parts). We will ask what Facebook, Foursquare, Google, Twitter, government, and companies should do about privacy. We will claim ownership of the public sphere - what's public is owned by us, the public. And we will forge a bill of rights in cyberspace to protect the openness of the internet that is our tool of making publics.
Jeff Jarvis, author of What Would Google Do? and the upcoming Public Parts, will present his findings and views about publicness - and his own experience revealing his prostate cancer - and then lead a discussion with the entire room, Oprah-like, about the nature of privacy and why it worries us.
11th–15th March 2011