Data abstinence in the name of being an ethical data company is not a sound operating model

Alex Antra - Sep 28 '20 - - Dev Community

We currently exist in a dichotomy when it comes to how companies use our data.

At one end of the spectrum we have companies like Facebook and Google who have, in the lovely words of HIBP, "leaked, sold, redistributed and abused [our data] to our detriment and beyond our control." I call them unethical data companies.
At the other end of the spectrum we have companies like DuckDuckGo and Signal, who strive to be seen as the ethical alternative.

However, the thing that makes them ethical is the fact that they either don't store your information or they encrypt it to the point that not even they can look at it. That means even if a government agency had a legally enforceable warrant, or a rogue staff member wanted to go looking, it would not be possible.

What makes them ethical is not a comprehensive and transparent framework that explains exactly how the data is used, but rather that they have removed every possible avenue for themselves to breach your trust. It's like preventing cancer in your arm by amputating the arm.

Simply put this is the nuclear option, and like any nuclear option there will always be fallout.

Great data products require access to lots of good quality data, the more the merrier.

Facebook and Google have so much of your data they can swing a dead cat 180 degrees and accidentally release three great data products.

For example, Google's predictive typing feature is built off a decade of reading your emails.

Companies that don't collect enough data, or that obscure it through encryption as a blanket rule, are potentially pushing themselves out of the market. (Obfuscation where appropriate is, of course, still recommended.)

Without the data to make market-competitive products, their ability to ship said products at the same pace as the unethical companies is effectively the same as trying to win poker with a bad hand: you'll only succeed if the other person makes a mistake.

This potentially means that unethical data companies may be able to sink or even just wait out their more ethical competitors.


DuckDuckGo is celebrating good growth at the moment, but we've all seen market factors sink more mature companies. All it takes is for the next big thing to require the very data DuckDuckGo has hidden from itself, and they won't be able to compete.

What benefits the user, the industry, and these ethical data companies is to stop using abstinence as a data strategy and to move towards the center of the current spectrum.

We need our ethical alternative product companies to have access to the data they need to stay in the market, especially while juggernauts like Facebook still exist.

However to be able to access that data to stay competitive these companies must establish a data strategy that puts building customer trust front and center.

A data strategy that achieves this by learning how to thrive within the bounds of three principles: transparency, controls, and education.

To be transparent, you need to move beyond the vague, all-encompassing privacy policies that really serve as a 'get out of jail free' card. You need to explain to the user what kind of data you store, how and why you use it internally, how and why you use it externally (if you have those needs), and what sort of things you would be open to doing in the future that you don't quite do just yet. And you need to do this in a way the average user can understand.

You also need to pair that with some form of framework so your customers know the rules you're committed to playing within. Something like: 'we will never send your personal data to a third party, but we may share aggregated, non-identifying usage figures under the following circumstances.'

Say what you will do, why you’re doing it and what you won’t do. Sentences like ‘we may use your data to improve our products’ or ‘we may share data with third parties’ are just not good enough any more.
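To make the contrast concrete, here is a minimal sketch of what that kind of policy could look like as structured data rather than prose: each data category states what is done, why, and what is explicitly ruled out. All category and field names here are hypothetical, not drawn from any real company's policy.

```python
# Hypothetical sketch: a privacy policy as structured data, so every
# data category declares what we do, why we do it, and what we won't do.
POLICY = {
    "search_queries": {
        "we_do": "aggregate anonymised query counts",
        "why": "to rank autocomplete suggestions",
        "we_wont": "sell or share individual queries with third parties",
    },
    "email_address": {
        "we_do": "store it for account recovery",
        "why": "so you can reset your password",
        "we_wont": "use it for marketing without an explicit opt-in",
    },
}

def explain(category: str) -> str:
    """Render one category's commitments in plain language."""
    p = POLICY[category]
    return (f"For your {category.replace('_', ' ')}: we {p['we_do']} "
            f"({p['why']}). We will not {p['we_wont']}.")

print(explain("email_address"))
```

Because the commitments are structured, the same source can drive both the human-readable policy page and in-product settings screens, which keeps the two from drifting apart.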

Then once you’ve written a proper privacy policy you need to provide your end user with granular control.

We need to move away from this gun-to-the-user's-head approach of denying access to the product if they don't agree to every single one of your demands. It's blackmail and doesn't build trust; it just tests it until a competitor comes along.

Giving the end user an appropriate level of control is paramount to building that trusting relationship. If they can go into your product and choose to prevent their data from being used for X but still allow it to be used for Y you are not only building trust through control, but leading them down a path of making informed decisions.

You want to provide so much freedom and control in your own product that the user can’t help but notice how everyone else doesn’t live up to the same standards.
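That "allow Y but not X" control can be sketched as per-purpose consent, where every purpose defaults to opted-out and every data-handling code path checks consent first. The purpose names below are hypothetical placeholders, not a real product's settings.

```python
# Minimal sketch of granular, per-purpose consent. Nothing is assumed:
# every purpose starts opted-out until the user explicitly opts in.
from dataclasses import dataclass, field

@dataclass
class ConsentPreferences:
    granted: set = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        self.granted.add(purpose)

    def opt_out(self, purpose: str) -> None:
        self.granted.discard(purpose)

    def allows(self, purpose: str) -> bool:
        # Data-handling code calls this before touching the data.
        return purpose in self.granted

prefs = ConsentPreferences()
prefs.opt_in("product_improvement")       # the user allows purpose Y...
assert prefs.allows("product_improvement")
assert not prefs.allows("ad_targeting")   # ...while purpose X stays denied
```

The important design choice is the default: opting everyone in and burying the toggle tests trust, while defaulting to opted-out and earning each opt-in builds it.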

Then once you have those processes in place you need to keep communicating with your users and give them the space to engage back.

If you're changing how you're using the data, if you're collecting new data, or if you're about to do something controversial, tell your users early, give them the ability to opt out beforehand, and reward them for exercising their choice.

If you do this, you may find that people who opt out as a first response may opt-in at a later date. Either because they believe in your product enough to change their stance or because they spoke with others and realized that it may not have been as bad as they feared.

Also communicate about things happening around you and your customers. Actions taken by other companies may reflect negatively on you, and saying nothing will often let that perception stick.

If a rival company is using ethical-data lingo for its own benefit, like Facebook does with the end-to-end encryption of WhatsApp messages, you need to face into that and explain why you are still the ethical choice, especially if what that company is doing is lying to its users.

If the only thing keeping users on your product is the ethical sales pitch, then the second Google and Facebook appear to 'fix their ways', your user base will go straight back to them.

I love these new privacy-conscious companies. I've been using DuckDuckGo for over two years now and have found my ads less creepy and my data involved in fewer breaches.

However I fear the thing that makes them different might be the thing that puts them out of business. Building an entire strategy around doing the opposite of the bad guys is arguably the easy way around the problem and can potentially prevent them from being competitive or surviving.

It’s about time companies serious about being trusted by their users actually took the time to tackle that problem rather than chopping off every single possible avenue that could lose them user trust.


I currently work for a great company called Xero doing all sorts of fun data things. When you're reading my articles, I need you to understand that my words are my own: I'm not speaking on behalf of my employer, and if I'm talking about something negative in the field, it may not be indicative of Xero. I've worked many interesting roles and I read a lot about my field.

