Data Broker Calls For New Ethics
By: Richard Beaumont | Wednesday, April 9, 2014 | Tagged: Privacy, Acxiom, Data Brokers, Privacy Impact Assessments, Do Not Track
Scott Howe is CEO of Acxiom, one of the largest data technology businesses in the world, often given the label ‘the largest company you’ve never heard of’. He makes his money by collecting and selling data about people: their preferences and interests.
The data broking industry has come under a lot of scrutiny in recent years, largely because its data, and the algorithmic segmentation built on it, drives the behavioural and targeted marketing activity at the centre of most privacy debates and concerns.
In a potentially ground-breaking article in Ad Age this week, Scott appears to have held his hands up on behalf of an industry and made the statement ‘we must do better’. In the article he calls for agreement on a new ethics of data use.
Whatever the motive for such a move (and I am sure there will be no shortage of commentators to provide one), this is a hugely public step towards not just acknowledgement of a problem, but a call for solutions. This makes it impossible to ignore.
The call to action came with an invitation to comment via a dedicated email: [email protected].
We decided to respond of course, but also to share our response publicly in an open letter. So here it is below:
I read your article in Ad Age today and have decided to respond to your call for ideas for the ethical use of data.
As a short background – I represent a company that helps businesses using data understand the impact of such use, and ensure they comply with appropriate regulations.
Being in the UK I believe we have a unique perspective to bring to this global issue. Culturally and economically we are closer to the US in business ethos than many other EU countries, yet our privacy outlook is heavily influenced by Europe, and of course we are bound by EU-originated rules. The UK led the EU in the application of the ‘Cookie Law’ – and our company has been a key player in helping businesses adapt to these new requirements.
I thoroughly applaud your call to action and the fact that it is coming from Acxiom gives me great hope for the outcomes. Change is indeed needed and if it is not embraced by the data industry, it will inevitably be forced upon it by outside agencies – whether these come in the shape of legal regulation or growing individual citizen action to pro-actively protect their own privacy. In reality a bit of both is likely.
I am a strong believer that the future for better data use must come through greater transparency and choice. Whilst some commentators are saying that this model is broken and not fit for purpose in the current and future data market, I disagree.
What has failed so far is that ‘notice’ is not transparency, and ‘choice’ is either absent or illusory.
Transparency means giving information in simple, consumer-friendly language, not dense privacy policies that only lawyers understand and that go unread as a result.
Choice means data collectors being pro-active in providing mechanisms for simple preferences to be expressed and acted upon with minimal interference in the user experience. This in particular is something we have learned in developing such mechanisms for Cookie Law compliance solutions.
Choice is definitely not the current model of ‘agree or leave our site’.
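The ‘choice’ model described above can be sketched in code. This is a minimal, hypothetical illustration (the names `ESSENTIAL` and `cookies_to_set` are my own, not from any compliance product): non-essential cookies are only set once the visitor has expressed a positive preference, and the default is not ‘agree or leave’.

```python
# Hypothetical sketch of a simple preference mechanism for cookie consent.
# Essential cookies (needed for the site to function) are always allowed;
# everything else waits for an explicit opt-in from the visitor.

ESSENTIAL = {"session", "csrf_token"}  # assumed names for strictly necessary cookies

def cookies_to_set(requested, consent):
    """Filter the cookies a page wants to set against the visitor's preference.

    `consent` is None until the visitor has made a choice; until then,
    and after an explicit refusal, only essential cookies are set.
    """
    if consent:  # explicit opt-in to non-essential cookies
        return set(requested)
    return set(requested) & ESSENTIAL  # default: essential cookies only
```

The design point is that the default state (no choice yet expressed) behaves the same as a refusal, so a visitor who ignores the prompt is never tracked by inaction.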
I think there are two obvious things the data industry can do better right now, which build upon work already started, and can help to bring back trust to consumers.
First is to complete agreement on the Do Not Track standard. This I believe has fundamentally stalled because the industry has tried to narrow the scope of what the request should mean to a point where it has no privacy benefit, and minimal impact on the collection of data.
If the industry were to agree with most privacy experts that Do Not Track should fundamentally mean ‘do not collect data’, then a major barrier to agreement would be overcome. Focus could then shift quickly to a model where user preferences could be pro-actively respected, and advertisers and publishers could switch to developing incentive models for users to opt back in to tracking – either globally, or for specific service providers.
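Under the strict ‘do not collect’ reading proposed above, respecting the real `DNT` request header server-side is a small amount of code. This is a sketch, not a reference implementation; the function name is my own, and the conservative choice of treating any unrecognised value as a refusal is my assumption.

```python
def should_collect_tracking_data(headers):
    """Decide whether to collect tracking data, given HTTP request headers.

    Interprets the DNT header under the strict 'do not collect' reading:
    DNT: 1 (or anything other than an explicit "0") means collect nothing.
    A missing header means the visitor has expressed no preference.
    """
    dnt = headers.get("DNT")
    if dnt is None:
        return True  # no preference expressed by the visitor
    # "0" is an explicit opt-in; "1" and any unrecognised value are
    # conservatively treated as 'do not collect'.
    return dnt.strip() == "0"
```

Opt-in incentive models would then simply be ways of persuading users to send `DNT: 0`, or to register a per-provider exception, rather than ways of working around the signal.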
Secondly and related to this is the overhaul of the functionality of the advertising opt-out system through the ‘blue icon’ programme.
The industry has done much to promote the programme, and touts its success by pointing to the very low opt-out rates. To privacy professionals, that same opt-out rate demonstrates its failure.
Surveys of consumer attitudes and the growth of browser plug-ins, privacy browsers, and identity masking technology – all show there is a greater appetite for opting out that the icon programme is not satisfying.
This is because the user experience is highly interruptive, the choices are not comprehensible, and the perceived impact of those choices is minimal. As with the proposals for DNT, opting out via the icon may prevent the display of behavioural ads, but it does not appear to stop the collection of data. It is therefore a fundamentally flawed model as far as the consumer is concerned: data is still collected, they just don’t see the benefits through better targeting.
These I think are the key changes that the industry could make happen relatively quickly, and would go a long way to restoring consumer trust.
However, much more could be done by a wider group of players, in which the ad-tech and data broker industries could take a leading role. Certain regulated industries and public bodies have long been required to be careful in the use of sensitive data – usually legally defined by various regimes in the US, Europe and increasingly beyond. As a result they often have to complete documentation processes such as Privacy Impact Assessments (PIAs), both to ensure compliance and to prove to regulators that they take privacy seriously.
In the UK and Europe there are already moves to encourage wider use of the PIA model for more general use of ‘personal data’, or PII as I know it is referred to in the US. The idea behind this, I believe, is to encourage all players – large and small – to think carefully about how they use data about people in all its guises. Such efforts are likely to lead to more responsible use of data, whether that is defined by legal requirements, ethical codes or self-regulation programmes. This in turn could lead to better security, reduced risk of breach, and lower instances of identity theft. All of this would increase consumer confidence, which not only increases sharing but also increases the quality of the data shared. Such an outcome would be of huge benefit to consumers in terms of better services, the data economy, and society at large.
I am under no illusions that such changes will take time, and will involve short-term cost and risk for many established businesses. It may result in some players, even some business models, disappearing or radically transforming. However, the potential long-term benefits are great enough that I believe such risk to be worth it.
I welcome your response and would be happy to elaborate further on these ideas if you would like to discuss them.
Head of Service Development, Cookie Collective