Becoming a Consumer Again: Why Should You Care About Data Privacy?
In my previous post on the ethical concerns surrounding AI, I mentioned that the rise of social media has caused us to lose ownership of our data. As the saying goes, if the product is free, you're the real product. Companies such as Meta and Google rely on our data for their business. We can all probably remember a time when a single Google search, or a casual conversation with a friend about some product, somehow resulted in targeted advertisements for that very product. In the latter case it's almost certainly a coincidence, but it's still worrisome how much big corporations know about you. In Google's case, their Ad Center provides a breakdown of exactly which categories they think you belong to, based on your Google/YouTube searches, Chrome history, location, and so on. Ultimately, I'd rather not have Google know so much about me, and that's just one company. There are many more like it that know, or can guess, more about you than you realize.
Granted, this might not be enough to unsettle the average consumer. Maybe, from your perspective, sharing data about yourself is a small price to pay for a service that costs no money. Fair enough, but read on and you might understand why some people care more about their privacy.
"Why do I need privacy if I have nothing to hide?"
Privacy is not the same as anonymity or secrecy; the fact that you exist can still be on the web even if you care about privacy. Privacy is about restricting others' knowledge of yourself, and it's an innate human right as natural as freedom of association or freedom of speech, one that existed long before the Internet was even a thing. Control over our own data shapes our relationships with others. With people we're close to, or want to get closer to, we share more about ourselves in the hope that they do the same. With different groups of people, we might be more open about some kinds of information and less open about others, because the data that describes us is multidimensional and complex. Giving a corporation full access to the information that makes you you is like making it your closest friend, even though it couldn't care less about you. You don't need to be on the run from the law to care about what you share with others, because withholding information about ourselves is a negative right that's part and parcel of what makes us human.
Here are three reasons why you might want to care more about privacy.
Data Security (or lack thereof)
Your data is not safe. Data breaches are common, and who knows what kind of data about you is already publicly available. Of course, big corporations have large security teams on the cutting edge of cybersecurity, but in a field moving that fast, slip-ups in the arms race are inevitable. Do you want to risk malicious actors getting hold of your personal data? Corporations are (rightfully) hush-hush about their inner workings, which means you can never be fully confident in what they're doing to protect and secure your data (the same reason open-source software can be trusted more).
Data Misuse and the Loss of Autonomy
Today your data might be used for X, but in the future it might be used for Y. The trend so far doesn't inspire much optimism: successive privacy policy updates can let corporations you originally trusted use or share your data for purposes you never agreed to. Targeted advertisements are annoying, but data misuse becomes a real problem when your choices are restricted by value judgments corporations make from personal information that may be rationally irrelevant to the choice at hand. Worse yet, it might not even be a corporation but some AI making these life-changing decisions. Outside the digital space, imagine if Alice knew everything about Bob's life and pulled the right strings to ensure that his choices were limited in all matters personal and professional. Bob only had his privacy violated, and yet he finds himself in a frightening situation none of us would like to be in.
Data Permanence and Personal Growth
Personal growth is something we all aspire to. It wouldn't be controversial to say that most of us are constantly in the process of finding ourselves. Our views and opinions change, and our online activity follows. Do you want to be judged by your online activity from when you were (current_age - 5)? Probably not. It doesn't need to be anything bad; it could simply be things you feel no longer represent you as a person. While corporations are incentivized to keep their understanding of the current you as accurate as possible, that doesn't mean your older data gets wiped every so often. As long as your data is out there, it can't be truly forgotten, and it's subject to the same two issues described above. Data is bad enough on its own, but it's even worse when taken out of context, with your growth over time ignored. To allow space for second chances and personal growth, we need to protect the right to privacy.
Our personal data has value, even if it can't personally identify us. We should be in full control of it and of how much we're willing to give up. Of course, there are additional layers of nuance here; a good example is that privacy itself sits on a spectrum. How much data do you want to withhold, and from whom? It's easy to care about the basics, but maintaining digital privacy requires us to ask ourselves questions like these and put the answers into practice. In the next part, I'll go deeper into these issues and talk about my personal journey with data privacy.