Technology has improved our lives in many ways, but one thing that has gradually worsened over time is our privacy.
Companies like Meta, Amazon and Google know a staggering amount about us, from our favourite shows and films to our sexual preferences and our current location.
The goal of all this data gathering is simple: to make as much money as possible by selling targeted advertising and recommending products based on our preferences.
In 2018, the GDPR came into effect with the goal of protecting people's privacy. Although the regulation had good intentions, as many of us know, it placed a huge technological burden on small companies.
Hopefully these companies were already storing personal data securely, but they now also needed mechanisms for people to request their data and have it deleted, and to ensure it isn't kept longer than necessary.
The one thing that this has ensured is that companies think twice about what data they gather on people. Do you really need to know my date of birth, annual salary and marital status when signing up for a new account?
The regulation puts even stricter protection on sensitive data such as:
- race
- ethnic background
- political opinions
- religious beliefs
- trade union membership
- genetics
- biometrics
- health
- sex life or orientation
This, however, hasn't stopped large companies from gathering the information anyway, especially those based outside of Europe.
It was only last month that X (formerly known as Twitter) updated their privacy policy to include the collection of "biometric data". Why on Earth (or Mars) would Elon need to gather that information?
Technology has come far since GDPR was conceived, and we now have every company rushing to try and use AI, so they don't get left behind.
However, unlike a database, the large language models (LLMs) behind today's AI systems have no way to delete information. If your personal details are included in the training data, they are in there for good.
If that wasn't bad enough, Meta are now selling Ray-Ban smart glasses that look like normal glasses but allow the wearer to record everything going on around them.
Don't worry, Meta have your privacy covered with a little LED that lights up so you know someone is recording.
I know many people who don't like their photo being taken and will actively avoid people who look like they are taking a photo on their phone. However, even with a little light notifying you it's recording, it is still easy to miss.
You can buy them for around £300 which isn't an extortionate amount, so I can see these becoming quite popular, unfortunately.
Online Safety Act
Not to fear, the UK Government is here!
Last Thursday the government passed the Online Safety Act, which aims to make the internet a safer place for adults and children.
In theory this sounds great.
Unfortunately, the bill contains a number of questionable provisions that go completely against people's privacy.
It seems that the government's take on safety is to keep everyone under surveillance. Forget 2023; it is more like 1984, Orwellian style.
One of the key controversial points is that messaging services would need to use "accredited technology" (meaning government-approved technology) to scan all messages for child sexual abuse material (CSAM).
Tech firms have rightly opposed this, as it poses a fundamental threat to end-to-end encrypted messaging. The government hasn't backed down despite the controversy, but has admitted that Ofcom would only be able to intervene once scanning becomes "technically feasible".
Scanning people's files hasn't worked out well in the past, as this dad can attest after being flagged as a criminal for sending photos of his son to his doctor.
The other controversial measure in the bill concerns age verification. Websites that may contain adult content will need to verify the age of their users, and by the sounds of it this may include all social media websites as well.
When they say age verification, they are not talking about the pop-up that asks for your date of birth, as seen on many alcohol websites. No, they are talking about verification using your ID or some other form of personal information.
It seems we have gone from GDPR, where only essential information should be collected, to needing to scan your passport before visiting a website.
The main problem with all this is that most companies don't have a great track record when it comes to protecting our data. You only need to look at Have I Been Pwned to see where your data has been leaked.
Shockingly my data has been leaked by 15 different companies over the years.
So forgive me if I am not overjoyed by the thought of companies needing my passport or driving licence to make sure I am an adult.
What can we do as developers?
As software developers many of us will be directly involved with implementing some of the changes for the companies we work for.
Even if you are not based in the UK, there is a lot to be said about the state of data collection in other countries as well.
Luckily as software developers, we are in a good position to make companies a little bit more privacy conscious, especially if we are involved early on in the decision-making process.
It is worth discussing these questions when working on new features:
- Do we need to be collecting all this information from our customers?
- How are we going to make sure that this data is stored securely?
- What security measures have been put in place to ensure no unauthorised access?
- How long do we need to keep this data for?
- How are we going to automatically delete the data when it is no longer needed?
- Is this data being shared with anyone?
- How do we ensure the data that we are keeping is kept up to date?
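The retention questions above are the easiest ones to automate. As a minimal sketch (the table name, column names and 365-day policy here are all hypothetical assumptions, not from any real system), a scheduled job can delete records once they pass their retention period:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical policy: delete accounts with no activity for a year.
RETENTION_DAYS = 365

def purge_expired_accounts(conn: sqlite3.Connection) -> int:
    """Delete accounts whose last activity is older than the retention period."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    # ISO-8601 timestamps in the same format compare correctly as strings.
    cur = conn.execute(
        "DELETE FROM accounts WHERE last_active < ?", (cutoff.isoformat(),)
    )
    conn.commit()
    return cur.rowcount

# Demo with an in-memory database: one stale account, one active one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, last_active TEXT)")
now = datetime.now(timezone.utc)
conn.execute("INSERT INTO accounts VALUES (1, ?)",
             ((now - timedelta(days=400)).isoformat(),))
conn.execute("INSERT INTO accounts VALUES (2, ?)",
             ((now - timedelta(days=10)).isoformat(),))
deleted = purge_expired_accounts(conn)
print(f"purged {deleted} expired account(s)")
```

In a real system you would run something like this on a schedule (a cron job or a database TTL feature) rather than by hand, and log what was purged for your audit trail.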
Unfortunately, a lot of data breaches have been caused by developers not following security best practices. If you are building publicly accessible software, then you need to make sure that proper security is in place and that it is tested for weaknesses.
One such example comes from the greetings card company Moonpig. In 2015, a security flaw was made public which allowed anyone logged into their API to access every customer's details, including name, address, card details and orders. The access even allowed them to place orders on behalf of other customers.
The flaw was found by another developer who told the company, but they failed to fix it for 18 months, so he went public.
Clearly whoever implemented these features didn't understand the difference between authentication and authorisation.
As developers, we need to make sure that we understand the security basics, so we don't accidentally put customers data at risk.
❤️ Picks of the Week
📝 Article - Hopping instead of hustling. The StackOverflow developer survey shows that developers are still job hopping despite the current market with many developers actively looking for a change.
📝 Article - Approaching unconventional problems. Your friend loses their phone in the middle of nowhere with no signal. How do you find it? This is some great out of the box thinking.
📝 Article - Internet Artifacts. Take a trip back in time and revisit how the internet started. Neil has done a great job putting this together and it is fun to look through.
📝 Article - Base64 Encoding, Explained. Akshay has done a great job here explaining how Base64 works. Worth a read.
📝 Article - The Slow Death of Authenticity in an Attention Economy. This is my main issue with social media, everything just seems so fake. If I am going to connect to strangers on the internet, at the very least I want them to be themselves.
💬 Quote of the Week
The real point of doing anything is to be happy, so do only what makes you happy.
From Anything You Want (affiliate link) by Derek Sivers.
📨 Are you looking to level up your skills in the tech industry?
My weekly newsletter is written for engineers like you, providing you with the tools you need to excel in your career. Join here for free →