I had an interesting situation occur on a flight home yesterday that made me think about data privacy and where our responsibility lies as developers to protect end users from, well, being human.
I want to share this story with you all to inspire conversation and discussion around how people use what we create, and how we can make sure to create systems that are as secure as possible no matter what choices the folks using them make.
The scenario...
I was sitting next to someone on a plane, waiting for our flight to take off, who was working on their laptop. Without trying, I could clearly see that this person worked in healthcare and that the system they were in was likely meant for internal, non-public use. There was a lot of personally identifying information listed, including full name, age, diagnosis, etc. The person appeared to be filling out some reports.
I could see all of this with just a glance.
Here's a mockup from memory (created in Excalidraw). I do know the software's name, but I'm choosing not to reveal it. There are a ton of reasons this product could look the way it does, and I don't want to make assumptions or attack anyone. That's definitely not my point.
What did I do next?
Well, after trying to avoid looking in this person's direction and waiting for them to close a system I should not have been looking at, I decided to give them a suggestion to increase privacy while working in public. This led to an interesting conversation with a person who was, luckily, not offended by my curiosity and suggestions.
Here are some topics for discussion that came out of this for me; I would love to hear any input from the community.
This software displayed identifying information on the dashboard at all times, even though the person using it did not need it constantly visible. Is it our responsibility as developers to mask the UI better, even when a system is internal? Under what circumstances does data like this need to be displayed all the time? Even in an office setting, I would argue this is a poor design decision.
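To illustrate the kind of masking I mean, here is a minimal sketch in Python. The field shapes and helper names are hypothetical, not from the actual product; the idea is simply that a dashboard could render initials and truncated identifiers by default, revealing full details only on an explicit action:

```python
def mask_name(full_name: str) -> str:
    """Reduce a full name to initials, e.g. 'Jane Doe' -> 'J. D.'"""
    parts = full_name.split()
    return " ".join(p[0] + "." for p in parts if p)

def mask_id(record_id: str, visible: int = 4) -> str:
    """Hide all but the last few characters of an identifier."""
    hidden = max(len(record_id) - visible, 0)
    return "*" * hidden + record_id[hidden:]

# A dashboard row could show masked values by default:
print(mask_name("Jane Doe"))     # J. D.
print(mask_id("MRN-00123456"))   # ********3456
```

Masking at the presentation layer like this costs almost nothing, and the person working still has enough context to pick the right record without broadcasting it to the whole row of seats.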
This person told me there is some security for the application: it cannot be accessed via public Wi-Fi (though they implied they had definitely tried). They ended up using their hotspot. As developers, we should assume that if a person can connect to the internet and takes their computer outside the office, they will do this at some point.
The suggestion I gave this user was to get a privacy screen for their laptop, which limits the viewing angle of the display. They were completely unaware such a thing existed. How can we better help companies that train their employees about security pass along simple tips and tricks like this? While it may not be directly our job to do this kind of training, I do think developers are uniquely qualified to consider the scenarios in which an issue could arise.
I do realize there are so many people involved in a project from conception to delivery, and we, the developers, are only one stop along the way. The system this person was working on also screamed legacy, and I wouldn't be surprised if the dashboard design hasn't been reconsidered for the way people work today.
What I really hope you take away, if you have found this story interesting, is that the end user is often the weakest link in security. When you run into applications that can be improved in simple ways, like obscuring some data, speak up! And if anyone ever says to you, "that's an internal system, no one uses it in public," ask them whether it's possible to connect. If it's possible, it will happen; people are working in all kinds of places now.
I would love to hear your thoughts in the comments, and please connect with me on GitHub, LinkedIn, Twitter, and BlueSky: @amandamartin.bsky.social