Michael McIntyre once joked that the fridge is the one appliance we are happy to leave switched on when we go away on holiday. In McIntyre’s world, it’s the apogee of trusted technology. The picture is very different when it comes to consumers deciding whether to trust government and big business platforms with their data. A better comparison would be the perfidious steam iron – no holiday can begin without an abortive departure, returning to check the iron hasn’t been left on (maybe that’s just my family).


If data is essential to the profitability of firms and the effectiveness of government departments, then being trusted with customer data is critical for many new business models to work. Most customers and citizens recognise that sharing their personal data results in better services and digital experiences. Some individuals don’t care how their data is used and shared, but they are in a minority – only 22% were classed as ‘oblivious’, according to a 2018 data sharing survey[1].

Increasingly data trust forms part of competitive advantage, with 75% of customers more willing to share data with brands they trust[2]. There are plenty of blogs now referring to data trust as ‘the new currency’ – a differentiator when price can be competed down and service models copied.

Technology leaders are also beginning to address customer perceptions in this space – whatever your views on Facebook and Apple, their latest developer conferences have both focused on new privacy measures. Regulators are also paying more attention – for example city authorities in San Francisco have banned facial recognition amid worries about who can be trusted with this technology. Open Data Manchester have recently launched a Declaration on Responsible and Intelligent Data Use, which Kainos have endorsed as part of our growing commitment to that city.

My last blog on data ethics was prompted by the probing questions of a candidate for interview at Kainos. In a competitive market for talent, being an organisation trusted with personal data is increasingly a selling point for those who prioritise working for an ethical organisation above financial reward. This is particularly the case with Millennials – and it’s a myth that data trust is less of an issue for younger generations; 78% of Millennials are now concerned about how organisations share their personal data with one another[3].


I was at a conference recently where a global systems integrator was showcasing an AI solution that listened to conversations at border crossings to identify illegal migrants. Without knowing the details of the platform, I’m not going to venture an opinion, but when challenged about how they handled the ethical considerations of this project, their answer was that the solution was GDPR compliant – which I’m not sure I find comforting. In terms of protections for data subjects, GDPR is laudable, but anecdotally firms are using ‘legitimate interest’ as a blanket lawful basis. In any case, being GDPR compliant isn’t a mark of trust; it’s just not breaking the law. GDPR also covers ‘hygiene’ factors like data security, which individuals have come to expect as a minimum. Regulation can, however, create opportunity and a spur to action.


Amongst other things, data trust is about:

  • Genuine openness and transparency – going out of your way to make it easy to understand how personal data is being used and signposting where this information can be found. (By comparison, many GDPR implementations are simply irritating, designed by lawyers rather than user experience professionals like those in the Kainos Experience Design capability.) It is important to address customers’ fears openly and avoid being evasive – for some organisations, this might even involve building customer and citizen data literacy, to help them make informed decisions and provide meaningful consent. 88% of consumers cite transparency as the key to trust[4]
  • Putting trust as a primary consideration into the design of digital experiences
  • Having strong data governance processes so that privacy measures are more than window-dressing
  • Using advanced de-identification techniques when releasing datasets or handing data to AI and analytics teams for experimentation
  • Publishing ethical principles around the handling of personal data, which stop you sleep-walking into an ethical position you never intended. These should be endorsed by senior leadership. The ODI’s Data Ethics Canvas is useful here, as it invites consideration of many ethical dimensions when initiating a new project. More difficult is embedding the culture change that means the organisation lives out these principles.
  • Working with trusted partners, being careful with whom you share sensitive data (a ‘trust food chain’, if you like)
  • Implementing fair exchanges in return for personal data collected. As Kainos have an office in Gdansk I regularly use the city airport, where it is possible to obtain free Wi-Fi access anonymously for 20 minutes or unlimited access in return for sharing a Google or Facebook profile. 70% of consumers would share more data if there was a perceived benefit, with greater online security and convenience at the top of the list[5]
  • Since GDPR is somewhat subjective (in the absence of significant case law at present), how you apply the GDPR tests is also important – for example are you really applying data minimisation as strictly as you could?
  • Personalisation that is only used to save consumers money on the basic unit price of a product (rather than inflating it if the algorithm determines you can afford it).
  • Streamlining the Subject Access Request process
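To make the de-identification point in the list above concrete, here is a minimal Python sketch of two common techniques – keyed pseudonymisation and partial masking – applied before data is handed to analytics teams. The secret, field names and sample record are all hypothetical; a real implementation would fetch the key from a managed secret store and run inside governed pipelines.

```python
import hashlib
import hmac

# Hypothetical secret – in production this lives in a key store, never in code.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Unlike a plain hash, a keyed hash resists dictionary attacks while the
    key stays secret, and the same input always maps to the same token,
    so joins across datasets still work.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_email(email: str) -> str:
    """Partially mask an email so staff can confirm it with a customer
    without ever seeing the full address."""
    local, _, domain = email.partition("@")
    visible = local[0] if local else ""
    return f"{visible}{'*' * max(len(local) - 1, 1)}@{domain}"

record = {"name": "Ada Lovelace", "email": "ada@example.com", "spend": 120.50}
released = {
    "customer_id": pseudonymise(record["email"]),  # stable join key, not reversible without the key
    "email_hint": mask_email(record["email"]),     # human-readable hint only
    "spend": record["spend"],                      # non-identifying attribute
}
```

Note that pseudonymised data is still personal data under GDPR – these techniques reduce risk; they do not remove the data from scope.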


These are some of the elements that make up your corporate reputation when it comes to data trust. Faced with the overwhelming task of managing consents across multiple platforms, customers and citizens will increasingly choose digital services from brands and government agencies which take a lead in data trust.

One example here in the UK is the Co-op group. Co-op have a strong focus on data trust and, appropriately enough, have been openly sharing their progress. One interesting finding comes from surveying Co-op members – 70% of members trust Co-op to share their data anonymously[6] if it benefits the local community (a cause that resonates with the Co-op brand). We helped Co-op develop a reference architecture for data which made provision for customer preferences, de-personalisation and fine-grained access control. It’s exciting to see those features taking shape in practical applications.
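A reference architecture like that can be hard to picture, so here is a deliberately simplified Python sketch of one ingredient – filtering a dataset by recorded member consent for a stated purpose, and dropping the direct identifier before release. The consent registry, purposes and field names are illustrative inventions, not Co-op’s actual design.

```python
# Illustrative consent registry: member id -> purposes the member agreed to.
CONSENTS = {
    "member-42": {"community-analytics"},
    "member-43": set(),  # no consent recorded
}

def filter_for_purpose(rows, purpose):
    """Release only rows whose subject has consented to this purpose,
    removing the direct identifier before the data leaves the platform."""
    released = []
    for row in rows:
        if purpose in CONSENTS.get(row["member_id"], set()):
            released.append({k: v for k, v in row.items() if k != "member_id"})
    return released

rows = [
    {"member_id": "member-42", "postcode_area": "M1", "spend": 30.0},
    {"member_id": "member-43", "postcode_area": "M2", "spend": 55.0},
]
community_view = filter_for_purpose(rows, "community-analytics")
```

The design point is that purpose is an explicit parameter of every data release, so “can we use this data?” becomes a question the platform answers mechanically rather than one left to individual judgement.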


This doesn’t sound like a blog from a technology company, and yet these are issues we run into all the time at Kainos. The importance of the user experience has been mentioned – we are the people behind the new NHS mobile app, which had a number of trust challenges to solve, including obtaining consent for the sharing of medical history information. We worked with NHS Digital and members of the public to ensure that informed consent could be obtained and that trust was not a blocker to adoption. Our cyber security and secure DevOps expertise sits behind many high-profile government services like Register to Vote, helping to secure citizen trust.

In my own area of Data & Analytics, we find that implementing strong data controls cuts across the engineering of data platforms, including proper categorisation and classification of data, fine-grained access controls, architectures that separate personal data, and privacy-enforcing functions. To take one example, our team at the credit card provider NewDay have built a high-performance vault architecture to protect sensitive customer data without compromising data throughput. In data science and machine learning, trust issues are vital – including data sourcing, unintended bias and considering whether permission has actually been given for the proposed use of personal data. The ethics of data science is a complex area and is more fully covered in my previous blog, but we strive to live by ‘responsible innovation’. Suffice to say that data trust pervades what we do at Kainos and touches all our core capabilities.
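For flavour, the idea behind a data vault can be sketched in a few lines of Python. This is an illustrative in-memory toy, not NewDay’s implementation: a real vault is a hardened, audited service with encryption at rest, and detokenisation is tightly restricted.

```python
import secrets

class TokenVault:
    """Toy tokenisation vault.

    Sensitive values live only inside the vault; every other system
    handles an opaque random token, so a breach of those downstream
    systems exposes nothing reversible.
    """

    def __init__(self):
        self._by_token = {}
        self._by_value = {}

    def tokenise(self, value: str) -> str:
        if value in self._by_value:      # idempotent: one token per value
            return self._by_value[value]
        token = "tok_" + secrets.token_hex(16)
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenise(self, token: str) -> str:
        # In a real vault this is a restricted, fully audited operation.
        return self._by_token[token]

vault = TokenVault()
token = vault.tokenise("4111 1111 1111 1111")
# Downstream analytics and reporting see only the token, never the card number.
```

Because the token is random rather than derived from the value, performance comes down to fast lookups in the vault – which is where the engineering effort in a high-throughput design goes.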


Our work with the Urban Data Project (UDP) has highlighted the importance of data trust when collecting data about public spaces in cities around the world. UDP obtains data from multi-purpose sensor devices and makes it available through the City Data Guardian platform, which Kainos are developing. The name is apt: the City Data Guardian protects the data the city has chosen to collect and filters it, so that cities can use it to improve services and share it publicly or with third parties. The filtering process enables fine-grained access control and includes techniques like k-anonymisation and Laplace noise to enhance data privacy. These protections are essential to establishing the trust of both the public and city authorities.
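To illustrate the two techniques named above, here is a short Python sketch: Laplace noise added to a counting query (the basic mechanism of differential privacy) and a simple k-anonymity check over quasi-identifiers. It is a teaching sketch with assumed parameters, not the City Data Guardian’s actual code.

```python
import random
from collections import Counter

def laplace_noise(scale: float) -> float:
    """Draw from Laplace(0, scale): an exponentially distributed
    magnitude with a random sign. For a counting query (sensitivity 1),
    scale = 1 / epsilon."""
    magnitude = random.expovariate(1.0 / scale)
    return magnitude if random.random() < 0.5 else -magnitude

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count: the true count plus calibrated noise,
    so no individual's presence can be confidently inferred from the result."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

def is_k_anonymous(rows, quasi_identifiers, k: int) -> bool:
    """A release is k-anonymous if every combination of quasi-identifier
    values appears in at least k rows."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())
```

The trade-off is explicit in both cases: a smaller epsilon or a larger k gives stronger privacy at the cost of less precise or less detailed published data.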


Trusted appliance though it may be, the fridge analogy runs out of road quickly: a fridge is an off-the-shelf solution to the well-bounded problem of reducing food spoilage. By contrast, standards and good practice in data trust are still emerging, so what is needed right now is expertise and insight.

I’d like to acknowledge the help of Kay-Anne Ng (head of our Birmingham Data Capability) in the preparation of this blog.       

[1] What type of data sharer are you? Experian, May 2018

[2] The Appropriate Use of Customer Data in Financial Services, World Economic Forum, September 2018

[3] Deloitte Global Millennial Survey, 2019

[4] Data privacy: What the consumer really thinks, DMA Group, February 2018

[5] Experian Global Identity and Fraud Report, 2019

[6] Speaking to our members about how their personal data is used, Co-op Digital Blog, May 2017