Transcript of Episode 8: Anne Bezancon on The Right to Be Forgotten
(This is a transcript of New Robot Overlords Episode 8: Anne Bezancon on The Right to Be Forgotten, first published April 21, 2013)
Myk Willis: My guest today is Anne Bezancon, Founder and President of Placecast, a location‑based marketing company she launched in 2005. She has a long history in communications and computing, dating back to the late 1980s when she pioneered the Minitel system in France. Anne, it’s great to have you on the show.
Anne Bezancon: Likewise, I’m very happy to be here with you Myk.
Myk: Our topic today is privacy in the digital age. You recently wrote a byline article in Forbes titled, “The Right to Be Forgotten, Protecting Digital Privacy.” In it you describe the very different attitudes toward digital privacy that exist around the world, highlighting a particular case in Germany where two convicted murderers, having completed their jail sentences, sued the Wikimedia Foundation to scrub their names from articles about the victim.
This “right to be forgotten” sounds dramatic to many American ears and especially, to American companies like Facebook and Google who have spoken out loudly against it. But, it appears it could soon become law across the European Union. Can you describe the right to be forgotten and what it entails?
Anne: The right to be forgotten is one of the many rights revolving around privacy, and it involves the deletion of the data that a company would have about you, basically removing any information that they have about you. There are several perspectives on this. The first one is that the company in question has collected that information about you in the first place, and then there’s the question of how they collected that information: did they collect it directly, or did they actually buy it from someone else?
Then, the second side of the issue is how to implement the systems that are going to be able to enable this kind of deletion of data, when that data is actually sitting in a lot of different databases.
Myk: At a really high level, you can see this argument that, for example in that German case you talk about in the article, once someone’s “debt” to society has been served, they should essentially be able to live a life free of stigma. Jeffrey Rosen of George Washington University, though, for one, has called this right to live in oblivion “the biggest threat to free speech on the Internet in the coming decade.” Why do you think a right to be forgotten is seen by so many to conflict with, at least, American principles of free speech?
Anne: I think we need to put this in the larger context of what is really at stake here. The emergence of this new technology called the Internet in the last couple of decades has created capabilities that weren’t there before. Clearly, constitutional rights, which were establishing principles at a time when none of this technology existed, have to be adapted to the situation that we are facing now.
And furthermore, think of the application of these principles with newer technologies; our focus at Placecast is on mobile. There’s a whole part of the conversation that is tied to the difference between [inaudible 00:03:37] PC and the interaction with online services on your phone, because the phone is a much more personal device.
I think there is an aspect of applying these principles to evolving technology that’s important to understand.
As far as free speech, I would challenge that it’s not initially an issue of free speech as much as it is an issue of free collection of data. From that perspective, we’ve always looked at the world from the perspective of the individual, the user, the consumer.
In a technology universe where data is collected about you without your knowledge for every click you make on any type of content or website, it’s really important that the individual realizes how much information is conveyed about them and their behavior, and has a sense of what this information is going to be used for. I’m not sure that it’s a freedom of speech issue.
I think it’s much more a question of adjusting the boundaries of what is considered private in society. With the technology available today, those boundaries are clearly challenged from how they were defined a century or two ago.
Myk: I think the free speech arguments follow from a line of reasoning like the one example I read in Jeffrey Rosen’s article: a photo that someone posted themselves and later wanted to, essentially, take down. Say that was mine, I took that, and I want to take it down. But since it was posted some place like Facebook, it had been replicated. Ten of their friends and twenty of their non‑friends had taken it and used it for something else. On the one hand, you can say that was mine, I own that data, and I want it taken off the web.
On the other hand, other people have now used it. Can you take it back from them? I think that is some of the free speech argument: once something enters the ether.
Anne: The other dimension implied in that argument is time. I can share my photo at a moment in time and change my mind later, but how can I take it back? It’s the cat-out-of-the-bag problem. How do you get it back in? That problem exists because the technology enables that particular piece of data to be disseminated in a lot of different places at, literally, the speed of light. It’s really complicated to frame the principles upon which an individual would be able to take that data back, irrespective of the technological aspect of this.
There’s a principle issue here: does this data still belong to me entirely the moment I’ve put it out there? What are the legal principles framing those decisions?
Myk: I recently interviewed Michael Chang, the CEO of Yes Video. They’re a company that takes old home movies and that type of thing and digitizes them, putting them into the digital domain. During that conversation, we talked about the fact that technology is close to taking us to the end of forgetting. We record more and more into the digital domain, and because the cost of storing data is approaching zero, it’s very likely that something you record today will live forever. Do you think it’s even possible for a legislative right to be forgotten to have any chance of overruling this kind of end of forgetting brought about by technology?
Anne: The right to be forgotten is just one of the challenges out of a whole set of issues around the collection, the retention, and the dissemination of personal data. Look at those three series of problems. The first one is about collecting data. If I contribute data myself, that is a different problem than data being extracted about me without my knowledge, or combined with inferred or derived data that I never actually contributed, right?
On the retention side, which has to do with your question around infinite capacity for storage: what are the rules applied to that retention, so that only appropriate access is given to the data that is stored? This is where we run into security breaches at credit card companies, for example.
Then the third piece is dissemination. If I put my picture on Facebook, clearly I intended it to be disseminated. But Facebook also collects thousands of pieces of data about me from all my interactions with their service, and packages and sells it, either as an instrument for advertisers to target me, or sometimes companies actually sell the data itself for other people to use.
Now we’re in a slightly different problem from what we were talking about earlier: the data is now replicated and used in all sorts of ways, and I have no idea what’s going on.
Myk: You make the case in your article that “the time has come to define and adhere to the rule of law in regard to personal data collection and use.” I wonder to whom you think the responsibility falls for defining the rule of law.
Anne: There are a number of organizations already engaged in helping that happen. Of course it’s a combination. There are the concerned citizens, organized through a number of non‑profit organizations, working toward defining the boundaries I was talking about earlier and establishing the correct language to describe the problem we’re trying to solve. Then of course there are the legislative bodies, and not just in the US. We have the experience of what’s been going on in Europe, where legislation has now been put in place with obligations that are fairly stringent by American standards, and companies will have to comply by 2015, which is tomorrow.
Legislation happens because a need appears. One way is to go at it with legislative tools as a consequence of misbehavior, regulating after the fact, which is often the way the Americans organize their criminal system, right?
You identify a problem, usually because citizens, whether corporate or individual, misbehave, and the need to regulate appears.
The European way is traditionally a Roman code approach, where you think of the problems, you establish the principles in books, and then everybody needs to comply with the rules that have been defined.
Whether the process is reactive or proactive with regard to how that legislation is going to come about, it seems pretty obvious that there are signs, like the Consumer Privacy Bill of Rights that the White House provided last year.
There was a decision last year as well from the state of California to provide guidelines on how to deal with individuals’ private data and personal information online, which has actually been taken up fairly recently by the state of Texas as well. In other countries you see a number of initiatives that much more closely follow the European approach.
There is a movement, and I think that movement is coming out of the realization that this problem needs addressing, and yes, it’s very complicated. There are a lot of different tentacles to the octopus.
Back to the boundaries and the definitions of what needs to be addressed: what type of rules do you want in order to define what’s acceptable and what’s not acceptable? And then of course, what punishment will there be if you don’t abide by the law?
It’s really interesting, for example, to look at what happened to Google in Europe, right? The press picked up on the few million dollars of fines, but actually their obligation is not just to pay money, it is to…
Myk: I’m sorry. You’re talking about the Street View case? Is that what you’re talking about?
Anne: The decision there was that Google agreed to commit to a week‑long privacy training every year for all their developers. That is a very serious commitment. That means a behavior change needs to occur within all the companies involved, in one way or another, with this new technology that enables them to collect, store, and disseminate this data, so that everybody understands what’s at stake and why it matters to actually build the software in a particular way in order not to be exposed to these kinds of issues.
Think of last year as well, or a little longer ago: Apple got into trouble because location data stored on their devices wasn’t encrypted and could be accessed by someone with a modicum of technical knowledge. It’s that same carelessness.
Carelessness is not necessarily deliberate. It’s sometimes just caused by ignorance. Starting by informing people of what the issues are, and why it matters to think of the problem in a certain way, is very important.
The European community decided to give a very strong framework to how to approach this, and the US, from that perspective, is much less regulated, if you will. There’s a lot of trust put in the industry itself trying to self‑regulate, and as in any other area of business, self‑regulation has its limits.
And back to what I was mentioning earlier, when you have excesses, look at how many [inaudible 00:16:00] we’ve had because of Facebook and how they have pushed the envelope. Facebook, and Google in parallel: I think those two companies are the poster children, if you will, of what the technology can do, and what new questions arise out of what the technology can do.
Myk: The power of these companies has to be pretty strong. When you look at a juggernaut like a Google or a Facebook and you pair them up against a European government, sometimes I wonder who’s going to win. These dominant US firms, like Google and Facebook, have very aggressive approaches in terms of what’s fair game for them to do from a privacy point of view. Are they causing a change in European attitudes toward privacy? Or is it maybe going the other way, where there’s a kind of backlash effect because of that laissez‑faire attitude?
Anne: I think it’s a combination of both. There is much more awareness of the type of problems that need to be dealt with, because these companies, and a lot of others behind them, are running up against a number of these new issues, irrespective of the “why.” Obviously, for business reasons they are behaving the way they are. The reaction of the European community is also very much a product of their history. As I was saying in the [inaudible 00:17:47], the American citizen has not had to deal, on their soil and in their blood, with the risks and consequences of having their privacy violated the way European countries have.
Be it what happened during the last world war, or spying on a massive, systemic scale in eastern countries: these are issues of life and death, so there is much higher sensitivity, culturally, to those issues than there is here. I think that’s why, when there is any form of abuse, in this case coming through corporations from America, there is a reaction.
Myk: It seems you must have a pretty interesting perspective on this. You’re a French citizen, I believe, and you’ve been in the US for 20 years, is that right?
Anne: That’s right.
Myk: You’re in San Francisco, which is essentially the epicenter of this move‑fast, break‑things, crazy, wild world of technology start‑ups, where it’s not a terribly deliberate process sometimes. They’re not thinking four years ahead about the policy ramifications of something they do. They just do it, see what happens, and then patch things up later as they go, running really fast. Is that a culture shock for you, or have you learned to reconcile that way of doing business with what seems to be a more deliberate approach from Europeans?
Anne: It’s been really interesting to assimilate, if you will. I think I embody, and our company embodies, a balance between the two. By cultural tradition, personally, I was always extremely sensitive to issues that were apparently not at the forefront for some of my colleagues in the start‑up universe. Clearly, there is an influence there to start with. I also think the business rationale of the decisions we have taken, which are indeed 180 degrees from how most other start‑ups in our universe behave, is that fundamentally, I believe, it’s good business.
The future of online technologies, in advertising at large and particularly on the mobile phone, relies upon a relationship of trust, a relationship of transparency, and a relationship of value exchange.
You look at the transaction, and this is the American side of things, the transaction piece, right? If I look at a transaction where I give my data in exchange for something of value to me, it’s a vastly different conversation than asking for permission from a hierarchical standpoint.
You change the framework of the relationship from what it is today, which is, “Oh, we need to get users to consent.” The very word “consent” implies a higher‑power relationship. We’re speaking of value exchange. We’re speaking of the consumer needing to know what they will get for what they will give.
Myk: Let me interrupt you for a second. I’d like to give you a chance to describe a little bit about what Placecast does. Then, maybe, we can talk about some of the specifics there and how they relate. First of all, tell us about Placecast, what business you’re in?
Anne: Placecast is a mobile marketing company that provides a service on an opt‑in‑only basis. Right there, it starts with the consumer choosing to receive marketing messages about companies that have products or services they choose to receive information about, when they are in the proximity of the stores where those products and services are available. It’s a preference‑driven, personalization‑driven approach, with expressed opt‑in rather than mere consent, where the mobile phone you carry with you becomes an instrument to receive useful information that you can act upon immediately. Hopefully it’s valuable to you; otherwise, you are not going to stay in the program.
Myk: This is a realization of that dream, because it was about 2004 or 2005 when people were dreaming about what LBS, location‑based services, would allow. Basically, you’re walking down the street, and you’ll be notified of something of interest to you in a nearby shop. It usually involved a Starbucks; I have no idea why. This is basically an embodiment and a commercialization of that, is that true?
Anne: That is true. The term “location‑based services” is rather ambiguous. In the industry it actually covers a much broader swath of activities than what we do, but we definitely fall into that category.
Myk: Talk to me a little bit about this. You’re a strong advocate for establishing standardized norms for personal data protection. Yet it sounds like your company builds its value on knowing exactly where a consumer is and what they’re interested in, so that you can deliver highly targeted advertisements to them. How do you reconcile the desire for increased personal privacy with what seems to be pretty highly advanced behavioral and location targeting on behalf of, presumably, advertising customers?
Anne: What’s most interesting about how we have approached the problem is that you do not need a lot of personal information in order to deliver a lot of value to a consumer. Real‑time location, in that context, is a way for us to derive [inaudible 00:24:35] relevance for you. The trade of value I was talking about is that you, as the consumer, are choosing very expressly to be part of this program, and to stay in this program, which is bringing you valuable information you can act on.
It’s not considered advertising from the perspective of the consumer, because the messages that come to you correspond to what you asked the system to deliver to you in the first place.
The second element is that the volume of messages you get is one a month or one a week. It is nowhere near comparable with the thousands of advertising messages you see online.
The reason it works from a business standpoint is that, on the marketer’s side, the advertiser’s side, they can now speak with and engage in a lasting relationship with a particular customer and continue to deliver them this value over time. If you don’t provide this value to the consumer, they simply opt out.
Myk: In an application like that it seems pretty clean. You, as a company, can make it very clear what you collect, how you’ll use it and give someone the opportunity to opt‑in to it. If they care not to, they don’t opt‑in and you, presumably, don’t have any information.
Anne: That’s right.
Myk: Now, over the long arc, consider Facebook; I tend to bring it up all the time. In a lot of parts of society, if you’re not on Facebook you miss out on a lot of the social communication that’s happening. With Facebook, it’s not as simple as saying, “Hey, I’m going to weigh the pros and cons of giving this company some information about me. If I’m comfortable with that, I’ll use the service. If I’m not comfortable with that, I won’t use the service.” There’s a lot more societal pressure making you use that service.
I wonder if this is an inevitable journey for us all, that we continue to erode our privacy or, at least, give up more of our personal information. Essentially, we’re either willingly trading it for convenience, a value exchange as you say, or the inertia of society is pushing us that way. Do you worry about that?
Anne: I would like to go back to something. The trading of your data for value is a question of whether you buy the product or you are the product. The difference with a credit card company, a telecom carrier, or your health insurance company, for that matter, to whom you provide vast amounts of personal information, is that you’re getting a service in exchange, and usually you are paying for it, right?
In that case, the data is already held by these companies, but they also have regulation telling them what they can and cannot do with that data.
In the case of Facebook, you’re not buying Facebook. You are sold by Facebook to advertisers. The way they do that is to provide you a service that, as you say, makes it feasible to [inaudible 00:28:32] with your network. The distinction between buying the product and being the product is an important one.
As for the long run, you’re already seeing very interesting stats about Facebook usage, and how younger generations are actually not on Facebook as much already and are using other services.
I do think that we are still in the very early stages of understanding all the ramifications of the new technologies. Whether Facebook will still exist in 10 or 20 years, is a question mark. Whether there will be, yet, new ways of enabling this horizontal communication between human beings is yet to be seen.
What I think is apparent is that there is an increasing sensitivity on the part of individuals about the use of their data and what they are trading it for. The awareness is slow, but these issues are definitely starting to be brought up more, written about more. We are having this conversation together; we wouldn’t have had it five years ago.
Myk: Yes, I sometimes wonder, though, if that’s a generation gap. I say often on the program that I might just be this grumpy old man who, essentially, doesn’t like the loss of privacy and the spread of information that seems to go hand in hand with the use of so much social media. And yet, at the last company I ran, the audience was quite young, and their attitudes toward sharing personal information seemed to be, essentially, anything goes.
They didn’t seem to have any restraint, and again, I sometimes wonder if that’s a generation gap. When me and my kind die out, the people who are left will be people who grew up, essentially, indoctrinated by Mark Zuckerberg.
If I could shift a little bit: in the S‑1 filing with the SEC, Facebook founder Mark Zuckerberg wrote, and this is a quote, “By giving people the power to share we believe that leaders will emerge across all countries who are pro‑Internet and fight for the rights of their people including the right to share what they want and the right to access all information that people want to share with them.”
Noticeably absent from this kind of empowerment mentality is the right to restrict what is shared with others. It would appear that restricting what is shared is at odds with Facebook’s mission of making the world more open and connected.
Given that they are such a global juggernaut, are they going to be a meaningful force in changing popular attitudes? Basically, might the whole privacy debate just become, essentially, an anachronism?
Anne: The real question here is not about the freedom to share; I am all for the freedom to share. The question is: how is my data used beyond my sharing of it with my network? That goes back to being the product versus buying the product. I don’t have the option to buy the Facebook service with no ads in it and my data not being used for any advertising purpose; that’s the problem. I can’t buy Google and not have any of my data shared for any advertising purpose. It is the primary function of Facebook to sell my data for advertising; that is the way they finance the free service they provide.
For the individual, there is an economic choice here, not just a data choice: “Would I like to have the option to pay for the Facebook service and not have my data used for any purpose other than the direct, express purpose of communicating with my network?”
What is the real [inaudible 00:33:11] in here? Of course, behavior is influenced by the availability of technology. Think back to the very beginning of the phone. Do you know who the first people to use the telephone as an instrument were? It was prostitutes.
People frowned upon having a telephone in their house because there was a stigma attached to it. We’ve made progress since then. My point being, there’s always a cultural environment and a time for when things happen, and it sort of takes care of itself over time.
Saying what our universe is going to look like in the realm of technology, access to data, and online services in five, ten, twenty years is really difficult, because so many things have been changing in such a short amount of time.
Think about when we started the company in 2005: the iPhone wasn’t even out; it came out two years later. At the time, nobody initially understood the implications of that much computing power being packed into a device you could carry with you everywhere.
Fast forward 20 years: what is that going to look like? Is it going to be implants? Is it going to be snapping a finger and having a virtual screen appear in front of your eyes? Look at Google Glass, right?
The availability of products and services is moving quickly, and our behavior is slowly catching up, always with a little lag.
Back to the beginning of the conversation: we are beginning to think from a principles standpoint, which means distinguishing what the problem is from how you solve the problem.
You see the European community having a very principled approach to the issues around personal data because they have much more vivid experience of what happens if you don’t pay attention.
Maybe it is going to sort itself out, and maybe it will be totally fine, culturally, societally, and politically as well (we haven’t even touched on that), to have your data available everywhere.
The reaction comes when there’s a problem. When there is a breach and millions of credit cards are used by thieves, there is regulation. What problem will need to occur before there is a reaction? We don’t know yet. We’re going to see what happens.
But back to what I was saying earlier, there’s an awareness to develop from the individual, the consumer, the user perspective, which is: what is the value of the service I get that justifies all that data being used the way it is?
Maybe there’s no problem. Maybe it’s a fair trade. Maybe the fact that I don’t pay anything to use Facebook is worth them having access to all that data. How does it hurt me, right?
Myk: It’s fascinating. I don’t know how it’s going to end. On the one hand, I think about technology a lot, and I see every day these exponential effects happening everywhere. We’re recording exponentially more data and storing it forever. Every aspect of our lives is becoming digitized. I always think of the Internet economy as the closest thing to a free market that we’ve ever had. I just wonder, in the chaos of the Internet economy, does it plot its own course into the future? Does technology choose where we go, because the other human organizations, governments, can’t move fast enough to keep up with it? I wonder. What do you think?
Anne: I’m wondering if this is the perfect conclusion to the conversation, which is: we are powerful citizens, each of us individually and collectively. This is my French side, where we have a habit of holding revolutions when we’re not happy with our government. It hasn’t happened in a long time, but there is a sense that you have to take on the weight of your own responsibilities and fight for the world you want to live in and want your children to live in.
Back to what I was describing earlier, I think the European approach to this is: I am electing people to represent me, and I want them to deal with the problems before they happen.
Here, there’s a much more laissez‑faire approach. You wait for the problem to occur, then you try to curb it. There is no good or bad; they’re just different ways of doing things.
But at the end of the day, you end up with a society where you have citizens who care more or less, and the more they care, the more organized they can be, and the more protected they can be. It’s a reflection also on the rule of law. One of the things that differentiates the US and Europe from other parts of the world is the strength of the rule of law.
How that law comes about seems to be a bit [inaudible 00:39:23] here and there, but at the end of the day you have boundaries that are established, controlling, and politically acceptable, for one. That’s what we’re trying to define around this whole new universe of the Internet.
We also have to think of all the good things that can happen in the world because you have access to that information and to this communication mechanism. There is an instant democratization of information. The transformation you see in the media, with the print and newspaper business slowly disappearing, is one illustration of this.
We are in a phase of massive transformation, and I think it’s just beginning. I certainly don’t have any firm answers to your question, but I trust humanity: we are cogent, we have instruments of knowledge, and ultimately we’re going to make, individually and collectively, the decisions that are best for us. That’s why I trust that it’s going to sort itself out, but it’s going to be interesting to see the vagaries on the way there.
Myk: My guest today has been Anne Bezancon. Anne, thanks so much for being on New Robot Overlords.
Anne: You’re very welcome, Myk. It was a pleasure. Thank you.