Below is a transcript of the Business Gets Personal session. It is loosely edited.
Jim Killock: So, our session today is looking at how businesses are using personal data, and how different sorts of business practices, good and bad, if you like, could affect the way our identities are stored and how our privacy may be affected in the future, as again technology is pushing this field very quickly. So, we have invited a range of speakers to give a short introduction to their perspectives on business and privacy, a number of different approaches.
Introductions
Peter Bazalgette: Thank you. As you said in your kind introduction, I suppose I have a couple of angles that I come from. One is that I have spent a life creating television content, and I'm currently very concerned about where the revenue is coming from to create the content that people will want to watch in the future. The second is that I have personally invested in two or three digital startup companies, companies that are trying to create the new online economy and are at the forefront of trying to turn the online world into business. And they are struggling with a lot of things, including what people's privacy and personal rights should be. Very quickly, I would like to point out three contradictions or paradoxes that I see at the moment:
First, Gordon Brown and Lord Carter are doing something they call Digital Britain at the moment. Their bet is this: they have a collapsing income from the financial sector and services, for reasons we all understand, and there has to be that digital dividend they are hoping to establish. They hope to get lots of revenue from the online world in the future to replace it. That's their dream, or their vision. And yet the truth is there is a hell of a lot of activity online at the moment but not a lot of revenue, and from a business point of view this concerns me a lot.
A second little paradox is that I watched Facebook users quite rightly get very concerned about things like Beacon, which was going to do me-too marketing using their personal data, and more recently getting annoyed about Facebook's alleged ownership of their personal data and the retention of their data. But at the same time, the paradox is that many of the users of Facebook are completely profligate with their own private data, wild and mad about it; they are not media-sensible. And they are pretty wild with other people's personal data too. Individuals abuse the net, and abuse online social networking, and so on. The paradox there is that people worry about the system but are actually abusing the system at the same time. One of the things that confronts business at the moment, and it confronts Facebook, is that Facebook is on the face of it rather a marvellous service, something that millions of people enjoy using and taking part in. But it has no financial model. And it isn't a charity. Also, a stake in it was sold for a lot of money a few years ago. It is not making money, and hasn't found a way of making money, and in the end things like Facebook have to function, and it has to find a financial model, not one that abuses people's rights, but it has to find one. And that's something we have to grapple with. In fact, no media company at the moment knows what its business model is going to be in ten years' time, so great is the level of digital disruption.
My third paradox of the morning, if you like, is that while Google is allegedly sensitive about data retention, it has actually reduced considerably over the last few years the time it holds personal data, you've got the Home Secretary wanting mobile phone companies and so on to hold personal data for much, much longer. So you have got some commercial organisations trying to be more responsible while the government is pulling completely in the opposite direction. Between those two points is a complete public policy vacuum. Privacy is only one issue, and I'd like to hear people in the debate this morning acknowledge that developing new revenue flows to fund the content and services online that we all enjoy is also important. We have to acknowledge, therefore, the commercial imperatives. I think in the future we will very often pay for content and services that we want with two commodities. One is our personal attention: 'I will give you my attention for thirty seconds watching an advertisement if you'll give me that download of that movie'. And secondly, we'll pay with our personal data, i.e. 'my personal data is valuable; what I'm interested in, if reported consensually with my permission, is valuable, because you could put targeted advertising towards me, and that targeted advertising could well be to my benefit'. So, these are the things we have to grapple with and acknowledge.
I was asked to mention Phorm very quickly. Phorm is a way of tracking your online activities and then delivering you targeted advertising. But according to Phorm, and I'm quite convinced by this actually, compared to some other services online it is not a bad system. Please see the significance of it: even if you don't like Phorm's system, there have to be systems like that if we're going to get content and services in the future, I'm arguing to you. We'll debate it. So I think in the future there may well also be a new definition of intellectual property. We know the music companies have not been able to protect their intellectual property online, and it somewhat destroyed their model. I think in the future perhaps the definition of intellectual property, of owning IP, will be not being able to control who accesses your content, but having a right to be told who accessed it. That way you'll still derive flows of revenue from your IP, and believe me, the online economy does need a method of deriving revenue flows from IP ownership. Otherwise you don't get an economy, you don't get economic growth, you don't get many of the pieces of entertainment and other things that you enjoy. So overall I'd say a bit less piety please and more practicality, a bit less outrage and a bit more cooperation, a little less posturing and better public policy. Thank you.
Caspar Bowden: Right, well after that excellent introduction from Peter, I wanted to clarify that I think I'm here, really, for two different reasons. Today I am Chief Privacy Adviser for Microsoft in Europe, the Middle East and Africa, but up until 2002 I was director of a small think tank called the Foundation for Information Policy Research, and during that time I did a great deal of analysis, campaigning and legislative activity, for example to try and curb the worst excesses of the Regulation of Investigatory Powers Act. So this means I've had quite an interesting experience of both sides of the fence, both as a privacy advocate and as somebody now working in a very large company where these issues are very much to the fore. So, you should assume when I'm talking that I am not talking as Microsoft unless I specifically say so.
So, advertising is the fundamental business model of the Web. It does seem that people won't subscribe to a service if there is a free service on offer paid for by advertising. But we have, as it were, a deep cultural expectation that, for example, when we borrow a library book or watch a television programme, there's nobody actually tracking what we read, page by page, or which channel we flip to. Yet that actually is pretty much the situation now on the Web: that information is being recorded as you visit each Web site. Now, the reason it is recorded is that if advertising is placed according to a profile that's built up, one which, as it were, comprehends your supposed interests, then that advertising slot is much more valuable to the advertiser than if the advertising were presented blindly. The trouble is that once one has accumulated this kind of profile information, with the consent of the user and to fulfil this business purpose, it becomes a tempting honey pot for the interests of state surveillance to move in and try to use that data for other purposes. And that is precisely what has happened over the years with mandatory systematic data retention by Internet service providers.
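The profile building described here can be sketched in a few lines. This is a hypothetical toy, not any real ad network's code; the topic tags, page visits and adverts are all invented for illustration:

```python
from collections import Counter

def build_profile(visits):
    """Accumulate topic tags from each visited page into an interest profile.

    Each element of `visits` is the list of topic tags attached to one
    page the user viewed; the counts that pile up are the 'profile'.
    """
    profile = Counter()
    for page_topics in visits:
        profile.update(page_topics)
    return profile

def pick_advert(profile, adverts):
    """Serve the advert matching the user's strongest inferred interest,
    falling back to an untargeted default if nothing matches."""
    for topic, _count in profile.most_common():
        if topic in adverts:
            return adverts[topic]
    return adverts["default"]
```

The point of the sketch is the economics Caspar describes: the advertiser pays more for the `pick_advert` slot than for `adverts["default"]`, which is why the visit log is worth recording at all.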
I have some other remarks which I will probably work in during the question and answer, but given the limitations of time, the second thing I want to talk about is Microsoft. So, my position is no to ID cards, but we do need forms of secure authentication online, because the services we access online hold stockpiles of very private information, and we're all going to need, particularly online, ways to manage that information so as to exclude others we don't want, whether they're hackers or whatever, from trying to access it.
So, there is a field of technology called privacy enhancing technology that has been pretty active now for 10 or 20 years. It is really the application of methods of cryptography, way, way more sophisticated than simple encryption, to the problem of, if you like, squaring the circle: getting the ability to personalise services, or to access services securely, without needing to reveal more information than you strictly have to. So, for example, it is technically possible to prove that you are a member of some group of entitlement, whether that's somebody who has paid a subscription to use a service or enrolled to use some online service, but without actually revealing who you are.
Now, the cryptographic methods underlying those sorts of capabilities are profoundly counterintuitive, but there is a particular strand of work here, which normally goes by the name of private credentials, that goes back about 10 years and has so far frankly failed to find a place in the market. So I'm very pleased indeed that, having carried a torch for this kind of technology for about 10 years, going back to my time at FIPR, Microsoft is now introducing this technology, called U-Prove, and building it into the fabric of the Windows infrastructure. So really any programmer or any business that wants to introduce the state of the art in cryptographic privacy protection will be able to do so, I hope, in about a year's time. Now, whether data controllers, whether those are organisations in government or business, will actually take up and, as it were, bear the costs of innovation in a new and complex area is not something that will necessarily happen through market forces, unless data protection regulation is strengthened to create a presumption that privacy enhancing technologies such as I've described must be introduced. Frankly, the evidence from market forces we've seen to date is that organisations don't bother.
The other barrier perhaps is the sheer complexity and counterintuitive nature of this technology. Personally I have, over many years, written half a dozen submissions to government consultation exercises, and I've personally briefed senior civil servants and politicians, and there is a communication problem here, because unless you have some background in technology, the danger is that what I talked about just goes woosh. And if you have an official, a policymaker, who has, in their own mind, battled with the issues that Shami and others outlined in the plenary, and who has, in their own mind, already sacrificed privacy and seen the evolution of government policy as making these increasingly intrusive surveillance methods acceptable to the public, then to such officials the methods I've described are not just counterintuitive, they're positively disruptive and unwelcome. I can see someone at the back, an old friend, who's nodding and who has particular reason to understand what I say. So, I'll leave my initial remarks at that and look forward to the questions. Thank you.
Wendy Grossman: I’ve been writing about privacy and related issues for most of the last 15 years. I first encountered them at a conference they run every year in the States called the Computers, Freedom and Privacy conference, and I think what we have is two fundamental problems. One is that with computers it’s almost easier to keep data than it is to throw it away, because if you throw data away you have to think about what you’re keeping and you have to be selective, whereas if you just keep everything you can assume that at some point in the future, if you need something, it will be there. The second problem is that privacy is invisible and intangible, and people don’t really understand what it is until they’ve lost it.
So, in the situations that we’re looking at with many of the online businesses, you mentioned Facebook, but it’s as true of Amazon or eBay or Google or Microsoft, the problem is that we don’t really know, we can’t really make informed decisions, because we have no idea what the data will mean in the future or how it will match up with other data. A friend of mine named Robert Schifreen, who was one of the people who inspired the writing of the Computer Misuse Act, used to talk about how he hacked into Prince Philip’s Prestel mailbox: he saw a phone number on a screen one year and a password on a different screen a year later and put them together. Well, this is the kind of thing that can happen to all of us now. I think of my eBay ID as private, but it occurred to me a few months ago that I happened to use the same ID on another system where I’m clearly identified, and somebody who cared to could match those together, see both my history on that particular Web board and my history of eBay purchases, and tie them to my actual real life identity.
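The matching Wendy describes can be illustrated with a toy example. All handles and records here are invented; the point is only that two datasets that are each pseudonymous on their own become identifying the moment they are joined on a shared handle:

```python
# Invented dataset 1: purchase history keyed by a 'private' handle.
ebay_history = {
    "bookworm_77": ["camera lens", "modem", "rare stamps"],
}

# Invented dataset 2: a Web board where the same handle is tied to
# a clearly identified person.
web_board_profiles = {
    "bookworm_77": "A. Writer, journalist, London",
}

def link_identities(handle):
    """Join the two datasets on the shared handle. If the handle
    appears in both, the purchase history is no longer anonymous."""
    if handle in ebay_history and handle in web_board_profiles:
        return web_board_profiles[handle], ebay_history[handle]
    return None
```

Neither dataset alone reveals much; the join is what does the damage, which is why reusing one handle across services quietly collapses the separation between them.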
It’s very, very difficult to interact in so many different ways, with so many different businesses, and see the whole picture; it’s easy to do it without being able to see the entire picture that you’re creating. That said, I think online businesses have lots of different reasons for keeping data, and sometimes it’s mutual benefit. eBay keeps our history, but that actually helps us, because it’s what gives us a reputation and makes it easier for us to trade online; eBay is more or less unusable if you have no reputation at all.
The social networks to me are the most fascinating thing, because people invade their own privacy on those networks to a degree that no business would ever dare ask them to. If you signed up for an online service and they gave you a questionnaire and said, please list all your friends, please upload photographs of all of them, please tag the photographs, you would run away screaming. But it’s insidious, it’s bit by bit by bit, and you’re sort of sucked in, and it’s all peer pressure. To me the biggest and most interesting mystery is Google, because unlike, say, telephone companies and ISPs and banks, there was originally no government mandate making them keep data (there is now), and there is no requirement that we use Google, there are actually other search engines, there is no peer pressure to stay on Google, and there’s really not even any benefit to us in their keeping all this data. And yet every day each of us, I probably make 100 Google searches a day, adds to the mountain that they collect. So we’re kind of seduced by the slickness of the interface and the convenience of getting these results, and every day we make the decision to hand over a little bit more information about ourselves.
It’s a difficult problem, because users can’t make informed decisions. Users hate reading privacy policies; there’s actually somebody at Carnegie Mellon who researched this, and what do you know, they hate reading privacy policies. And I don’t think we can rely on government to help us unless we force the issue, because, as Caspar has pointed out, they would like to have this data too. It’s in their interest, so there’s no reason for them to stop people from collecting all this information. So I think Facebook’s idea of a Bill of Rights is really interesting, but it’s still a private business and there is no reason why they have to stick to it, other than that users might revolt. So I think, in the end, events like this, political action, and campaigning organisations like the Open Rights Group and Privacy International, which really try to force change at the political level, are actually really important.
Iain Henderson: Yeah, Mydex is a social enterprise that we only set up three or four months back, and as a social enterprise we need to have a charter. Our charter is to help the individual realise the value of their personal information. That means realise in both senses. The first task we hope to tackle is to help people understand, help people just figure out what the hell is going on, specific to their situation rather than just generically across the data protection principles. If we succeed in that, we then move on to the other sense of realising: help me get a return on my information, whether that be save me time, save me money, save me hassle. The reason we’re doing that is that we believe the current ways of working around personal information are structurally broken. As Peter mentioned earlier on, organisations like Facebook, or the government, or BT, or whoever, have to behave the way they behave. They have shareholders and they have stakeholders: they have to make a profit. The way they can best do that is to gather up as much information as they possibly can, regard it as their asset and turn it back against the individuals they deal with.
We believe that we need to establish a different type of model, a more balanced and more respectful model that effectively puts the individual in charge of their information and allows them to selectively disclose it to the organisations they deal with. That model won’t replace the current model overnight; frankly I would still expect to be doing this in 15 or 20 years’ time, to end up with a much more balanced position. But roughly what we’re saying is: my view of me is vastly superior to any other person’s or any organisation’s view of me. An organisation’s view of me is transaction data: this person bought that thing at that time, for that price, at that outlet. That information is used to guess what they’re going to do next and to try and get a relevant message in front of them. My view of me is: I’m going to buy a new car in six months’ time, I pretty much know the process I’m going to follow, I know where I’m going to go for advice, and I hold back on that information, because if I give it all upfront, that reduces my bargaining position. So my view of me is better than anyone else’s, and when I sometimes need to prove who I am, or how much money I’ve got, or whatever, I can just as easily get that proof as any organisation can, with the right tools and processes.
Jim Killock: Could you just explain what Customer Relationship Management actually is?
Iain Henderson: Oh yeah, Customer Relationship Management, CRM, is the kind of tools that you deal with day in, day out: you phone up the call centre for British Telecom and hang on for half an hour while you get your service issues dealt with, or they try to sell you five or six more things that are probably irrelevant because you bought them last week from British Gas. And this is from someone who’s worked in CRM for 20-odd years and realises how broken it is.
So what we will say to the organisations, ultimately, when we’re up and running, is: you currently spend, say, £3 per year to access what is really a pretty poor quality record, one that’s pretty toxic because it carries all sorts of liabilities and data protection problems, when you could spend £2 per year to access a record in a different, more respectful way, through the tools that Mydex and plenty of other organisations will deploy. It uses the kind of technologies that Caspar mentioned. We call this concept volunteered personal information: it literally is volunteered by you of your own free will, and it will come with a contract, effectively, where the organisation signs your terms and conditions, rather than you having to wade through that 25-page privacy policy that you can’t actually do anything about. In designing those contracts we use very high end lawyers, who tell us it’s perfectly feasible, and very high end technologies that are secure. We won’t do anything radical, and there’s no desire to rush this, because it needs to be done carefully and slowly. That’s where we are.
David Smith: Thanks very much, it’s a real pleasure to be here today. We in the Information Commissioner’s Office spend a lot of time talking to businesses and not as much as we’d like talking to wider groups, and this year we’ve made a real effort to speak more to civil society organisations, have done that, and it’s been very productive. It’s with a little bit of trepidation that I speak to a different audience today, because I’m very interested in the views that everybody has. What I’ll say in just the few minutes I have is a little bit about what we do and how our powers are changing, and then I’ll throw out three challenges which I think we face as the regulator in this area, and on which we’ll be very interested to get your views. We at the Information Commissioner’s Office are the regulator for Freedom of Information and Data Protection law, and clearly it’s Data Protection law that we’re concerned with today. Our job is to protect the privacy of personal information, within the laws that are set by Parliament. We’re not about challenging the law; we’re about working within the legal framework that the government gives us, but we use the tools that we have, essentially the Data Protection Act, to deliver protection of personal information. That protection comes in two forms. The law places obligations on any organisation, whether state organisations, private sector organisations or Internet businesses, to keep information securely, keep it accurate, keep it up to date, and be open and transparent about how they use that information. And it gives me and you and all of us rights: rights of access to that information, and rights to compensation if things go wrong.
We at the Commissioner’s Office deal with a lot of complaints from individuals; we have a complaints handling function and we have enforcement powers. There are not many criminal offences in Data Protection law; our powers are largely to order organisations to change their ways, to change the way in which they operate. I wouldn’t say to you that we use those powers hundreds or thousands of times a year, but we have very much increased over the last few years the extent to which we use the powers we’ve been given.
We have been promised, more than promised: we now have legislation on the statute book which enables us to impose fines on organisations that seriously fail to meet their data protection obligations. It’s on the statute book, but we need the government to set the maximum penalty and to approve some guidance; we expect to have that additional power later this year. You heard in the opening session about section 152 of the Coroners and Justice Bill. The Coroners and Justice Bill has some good bits in it. There is a section in it, section 151, which is to increase our ability to carry out checks, to go into organisations and do inspections, which we can currently only do with the consent of the business concerned. The power in the Coroners and Justice Bill gives us that in relation to the public sector; we would like it in relation to the private sector as well, and we may talk a little bit about the difference between the two. But we are also promised by the government increased resources. We fund the work of our office through the fees we collect from businesses that use personal information, and essentially bigger businesses will pay a higher fee in future. And, yeah, we’re like all organisations: at the end of the day, what we can do depends on the resources. The powers and the tools help, but a lot of it is resource driven.
So, we come to the question: if we have those resources and those powers, how do we actually use them? Where do we concentrate our effort, where would you like us to work our hardest? If we have to make choices, we have to prioritise, and at the moment our strategy says, and you may agree this is right, that we concentrate on the public sector, because, as you’ve heard already about the increasing collection of public sector databases, we all have limited control, limited choice over what’s kept on those databases. If you look at the breaches we’ve had, the major losses of personal data, the private sector isn’t immune, but they are mainly, or certainly the ones we’re aware of are, in the public sector. And in the private sector, market forces have a part to play: you’ve seen Facebook withdraw changes to their privacy statement recently, not because of anything we did, but because of market pressure and the need to be seen to be responsible.
Wendy mentioned her Google searches. Google have reduced their retention time for search records, I think it’s now six months. We almost got into a sort of bidding war, Caspar might know something about Microsoft’s position, but a bidding war between search engines to reduce the time for which they retain records, in the interest of privacy, driven partly by the work we’ve done but also by competitive pressure to be seen to be responsible. So, do we need to be there? Other regulators operate in this area, and there are a lot of pressure groups that are active, particularly where the Internet is concerned. But, you know, there’s a question of whether this distinction between public and private is valid: lots of public services are contracted out now to private organisations. You heard already today about the communications database, where private communications data, collected by private service providers and communications providers, is available to the government for essentially policing and law enforcement purposes. A huge amount of information, like the tracking information on the Internet, could well be of interest to the state. So is the distinction valid, and where should we be concentrating?
The second challenge, already referred to, is this reliance on consent. The traditional data protection approach is one of what you might describe as notice and choice. An Internet site ought to tell you what it’s going to do with your information. Then you have a choice: you can go away and leave that site and go somewhere else if you don’t like what they tell you they’re going to do with your information, or they may give you a ‘click here if you don’t want your information passed on to other organisations’. And we have recently issued the draft of a code of practice to try and make those privacy notices simpler. What was said is right: nobody reads them, and if anybody does read them they don’t make a lot of sense; the aim is to turn them into some very simple messages that are meaningful to people. But is that what people want? The choices, and the information you need to make the choices, are increasingly complex. You already see it: do you want us to send you marketing material, and then you have a choice, by e-mail, by post, by text message, and in probably another year’s time there’ll be six other ways of sending messages to you. And then, do we want you to pass it on to other organisations, and can they send it to you? These are very complex choices, and when you get into things like Phorm, explaining to someone how their information will be used there, so that they can make an informed choice, is difficult. So, is this sort of telling people and giving them freedom of choice still the right model? You referred to the profligacy of people on Facebook. They have a choice; is it right that we should just leave them to make their choices? And I’m not sure that it’s correct to say that young people don’t know what they’re doing, that they put their stuff on Facebook in ignorance; I don’t know, there may be an element of that.
So, you know, to what extent should the regulator be a nanny, be part of what we might call the nanny state, or should we simply ensure people have information on which they can make free choices?
And thirdly, and lastly, the question of globalisation. The Internet obviously operates worldwide, but the legislation that we at the Information Commissioner’s Office enforce is UK-based legislation. Yes, it comes from a European directive, so we have roughly similar laws across Europe, but it becomes very difficult to enforce those on websites that are based outside Europe. Within the Asia-Pacific area they have a sort of data protection type framework, but there are big gaps, and the US is a gap to some extent, although the Federal Trade Commission, in terms of consumer protection, is very active in the Internet world. So, should we develop international law, international standards? There are risks there. It’s easy to say yes we should, but with privacy there are a lot of different cultures and different legal systems. China’s developing a Data Protection Law at the moment, but they have a rather different view of what, well, I mean, that illustrates it for you. There’s a danger, if we go to global standards, that we dumb down: we have to reduce the protection in order to get international agreement. I’m not sure that’s right, but there’s a risk there. So is the law actually the right tool to use? Should we leave it more to self regulation, to market forces? Is the Internet a world in which nationally based data protection regulators can be effective? I think we can be, but I think it’s also important to recognise there are some limits to what we can do. We can press for global standards, but they’re not magic overnight solutions. I throw those three questions out and will be fascinated to hear the views and the responses.