by: Brad Koyak of Spectre Graphics
The ethics of big data is a topic that concerns us all. In my previous post, we discussed Big Data and how it is fundamentally changing the way businesses are run. Big Data is the greatest advancement in marketing since television and is without a doubt a disruptive technology, and as with any disruptive technology, it raises ethical considerations that must be taken into account. The ethics of Big Data is still in its infancy, yet it already shapes decisions that can have discriminatory consequences.
THE ETHICAL DILEMMA
The current philosophy driving data collection is to gather as much data as possible wherever possible. For example, when registering a Maytag dishwasher warranty some of the questions on the form include household income and the number of children in the home. How is this data relevant to a new dishwasher?
The concept of overcollection is starting to become an issue in the modern world. In April 2021, a former children’s commissioner for England launched a legal case against TikTok alleging that it illegally collects the personal information of its child users.
In most cases, data collection policies are described within a company’s terms and conditions or privacy policy. This leaves many consumers feeling like their personal data is part of the price paid for goods or services. In addition, organizations generally do not disclose the sources of the big data they collect unless specifically asked. For instance, a 2012 study conducted by Facebook tested users’ emotions without their consent, and without them even knowing they were participating. Many would consider this practice a breach of privacy.
The FBI currently has at least two ongoing investigations related to personal data collected by hospitals being held for ransom. In 2016, hackers encrypted the personal data held by a Los Angeles hospital and demanded 9,000 bitcoin, roughly $3.6 million, to decrypt the files.
When considering ethics in big data there is also the question of data ownership. Who is the rightful owner of your data? For instance, your data could potentially be collected and used by courts to build a case against you. In many cases, your data is collected, bought, and sold daily without you ever knowing about it.
ETHICAL ISSUES IN BIG DATA
There are many ethical questions raised by Big Data.
- How much and what data will be collected?
- Will the bare minimum data be collected, or more?
- Can customers choose what data to provide, while still getting access to the product or service?
- What level of disclosure will be made to ensure that consent for collection is truly informed?
- Is any publicly disclosed information “fair game” for collection and use?
- Once data is collected, how long is it appropriate to store it?
- How will the data be protected?
- Are companies spending enough to keep our data safe from hackers?
Typically, companies do not dispose of or delete the data they collect. In addition, they often continue to collect your data long after your initial interaction with them. In many jurisdictions, few regulations or laws govern these decisions.
Massive troves of data must live somewhere. Companies either deploy their own local or in-house data centers or outsource storage and handling to a third-party provider. Either way, the collected information sits on remote, internet-connected servers, where it remains exposed to unscrupulous groups.
In the event of a large data breach, which happens often these days, that information and data can quickly become compromised. Sometimes the root cause is lax security. Other times it is through no fault of the company that was collecting said data — though they are certainly still responsible for it. This compromised data is affecting thousands, maybe even millions, of people.
A widescale data breach comes with many consequences and repercussions.
- Identity theft
- Reputation or social damage
- Financial or personal issues
The companies that owned the data can face legal and financial punishment as well.
DATA USAGE & REUSAGE
When discussing ethics in big data, there are also few guidelines, regulations, or laws governing how our data is used. In many cases, our data is sold to the highest bidder for the purposes discussed in our last post. There are likewise few rules about who may use collected data or for what purpose. The typical user agreement simply states what data will be collected; rarely does a company explain how it will be used. More often than not, your data is sold to other companies and reused, and no secondary consent is required for a company to do this.
Much like the problem with algorithmic profiling, big data can be — and currently is — a source of both intentional and unintentional discrimination in all forms, not just race and nationality.
A Google study, for instance, revealed that men and women see different online job ads, with men more often shown ads for higher-paying jobs and better opportunities.
It’s not always due to direct targeting from a specific individual or group, however. Facial recognition has a tough time identifying those who are not white or fair-skinned. On personal devices like a smartphone, it’s frustrating. Yet, when companies or law enforcement agencies use the technology on a greater scale for activities like criminal identification and profiling, it’s not only unethical but puts vulnerable people at risk.
Given the vast amount of data collected, how does the business know, for example, that you prefer strawberry over vanilla? With smaller details that are less consequential, this isn’t an issue, nor is it scary. It can be silly to see how many odd things advertisers suggest for you based on something like your past browsing habits or recent purchases.
But what about ethics in big data when the information is much more crucial to your life or opportunities? For example, Australia’s automated debt recovery system has been the target of complaints, many of which claim the system wrongly targets people who owe nothing or who are vulnerable. Similarly, welfare cuts in the U.S. driven by big data systems have come under fire for harming innocent people in Indiana, Florida, and Texas due to inaccuracies.
The improper visualization of data is also possible through collection or processing errors. This can lead to distorted or corrupted data that has unintended consequences for both your audience and organization.
POLITICAL AND SOCIAL REPERCUSSIONS
The more people and organizations know about you, the more influence they can have on your decision-making process. For example, the rise of bots that circulate false information online has been a frequent topic of recent media coverage in regards to the effect they have on political polarization. This spread of misinformation causes both social and political harm. In this way, propaganda has taken on a new form in the modern digital age.
The Oxford Internet Institute published the results of a study on this issue, taking a closer look at how misinformation on social media can be weaponized to manipulate the public.
As we have discussed data is used by every single company, every single day to make decisions in regards to their practices. Behind these decisions are the analysts deciphering and interpreting the data. What biases might be introduced when factors are selected and built into or excluded from algorithms? In addition, we must ask ourselves if there are limitations to the analytical tools being used. If so, are those limitations considered when data analytics are used to make business decisions? In some cases, analytics may accidentally result in what is called improper profiling.
It is common practice for analysts to merge and study the details collected to identify more complex targets; this is called algorithmic profiling. If you followed my Digital Marketing series, this is how user personas are built: algorithmic profiling is used to develop in-depth, exclusive personas.
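At its core, algorithmic profiling joins records from unrelated data sources on a shared identifier. The sketch below illustrates the idea; every source name, field, and value here is hypothetical, not drawn from any real system.

```python
# Illustrative sketch of algorithmic profiling: merging records from
# separate data sources on a shared key to build a richer user persona.
# All sources, field names, and values below are invented.

purchases = {"user123": {"last_purchase": "running shoes", "avg_spend": 85.0}}
browsing  = {"user123": {"top_category": "fitness", "visits_per_week": 12}}
warranty  = {"user123": {"household_income": "75k-100k", "children": 2}}

def build_persona(user_id, *sources):
    """Merge every record found for user_id into one profile dict."""
    persona = {"user_id": user_id}
    for source in sources:
        persona.update(source.get(user_id, {}))
    return persona

profile = build_persona("user123", purchases, browsing, warranty)
# Details collected for unrelated purposes (a warranty card, a browsing
# session) now sit side by side, ready to be segmented, scored, or sold.
```

The point of the sketch is how little machinery is required: once data sets share a key, combining them into a detailed persona is trivial.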
The ethical dilemma here is that data profiling can enable some extremely dangerous and morally concerning practices. For instance, it is possible to compile lists of the names and contact information of rape victims, the addresses and locations of domestic violence shelters, people with specific illnesses, and more. Data brokers can then get their hands on these lists and sell them to interested businesses.
HOW ARE WE ADDRESSING THESE CONCERNS
How are ethics in big data being addressed? On April 10th, 2018, Mark Zuckerberg, founder and CEO of the largest social media platform on earth, testified before the US Congress. The result was two days, ten hours, 600 questions, and even more confusion surrounding data collection.
The year prior a whistleblower named Christopher Wylie told The New York Times and The Guardian/Observer that Cambridge Analytica had purchased Facebook data on tens of millions of Americans without their knowledge. The goal was to build a “psychological warfare tool,” which Cambridge Analytica then unleashed on US voters to help elect Donald Trump as president. This of course was not the first time Big Data had been used to run a political campaign. It may surprise you to learn that targeted marketing was the key component of the Obama Campaign.
When asked by Illinois Democrat Richard Durbin whether he would be comfortable sharing aloud the name of the hotel he stayed at on Monday night, or the names of the people he had messaged that week, Mr. Zuckerberg simply said, “No. I would probably not choose to do that publicly here.” As Mr. Durbin noted after the question, this is the very root of the issue behind big data.
“The limits of your right to privacy. And how much you give away in modern America in the name of connecting people around the world.”
What use is our Facebook data? In 2013, two University of Cambridge researchers published a paper explaining how they could predict people’s personalities and other sensitive details from their freely accessible Facebook likes. These predictions, they warned, could “pose a threat to an individual’s well-being, freedom, or even life.” Cambridge Analytica’s predictions were based largely on this research.
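The mechanism behind such research is simpler than it sounds: page likes become features, and a statistical model maps them to a trait score. Here is a minimal, logistic-regression-style sketch; the pages, weights, and the trait itself are invented for illustration and are not from the Cambridge paper.

```python
# Toy sketch of predicting a personal trait from page likes.
# Hypothetical learned weights: how strongly each liked page correlates
# with an invented trait score (positive = more, negative = less).
from math import exp

weights = {"SkydivingFans": 1.2, "PartyPlanning": 0.9,
           "ChessClub": -0.7, "QuietReading": -1.1}

def predict_trait(likes, weights, bias=0.0):
    """Sum the weights of liked pages, then squash to a (0, 1) score."""
    score = bias + sum(weights.get(page, 0.0) for page in likes)
    return 1.0 / (1.0 + exp(-score))  # logistic function

high = predict_trait({"SkydivingFans", "PartyPlanning"}, weights)
low = predict_trait({"QuietReading"}, weights)
```

With real training data in place of these made-up weights, innocuous-looking likes become a proxy for traits a user never chose to disclose, which is exactly the concern the researchers raised.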
You might expect that, after being called before Congress, Facebook would have ceased these data collection practices. However, shortly after the hearings, it came out that Facebook had given partner companies access to users’ private messages. If you sent a message on Facebook, you agreed to this in the terms and conditions required to use the Facebook app.
THE HARM BEHIND IT ALL
Big data collected by Facebook is used to develop persuasive design techniques like push notifications and the endless scroll of your newsfeed. It creates a feedback loop that keeps us glued to our devices. A 5,000-person study found that higher social media use correlated with self-reported declines in mental and physical health and in life satisfaction.
Social media advertising allows anyone to reach huge numbers of people with phenomenal ease. Poor ethics in big data can give bad actors the tools to sow unrest and fuel political divisions. The number of countries with political disinformation campaigns on social media has doubled in the past two years.
Algorithms can promote content that sparks outrage and hate, and they amplify the biases in the data we feed them. 64% of the people who joined extremist groups on Facebook did so because the platform’s algorithms steered them there.
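The dynamic is easy to see in miniature: a feed ranked purely by predicted engagement will surface the most provocative items first. The sketch below makes that concrete; the posts and engagement scores are invented, and real ranking systems are far more complex.

```python
# Toy sketch of engagement-based ranking: sorting a feed purely by
# predicted engagement pushes outrage-bait to the top.
# The posts and scores below are hypothetical.

posts = [
    {"title": "Local park cleanup this weekend", "predicted_engagement": 0.10},
    {"title": "You won't BELIEVE what they did next", "predicted_engagement": 0.85},
    {"title": "City council publishes its annual budget", "predicted_engagement": 0.05},
]

def rank_feed(posts):
    """Order posts by predicted engagement, highest first."""
    return sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

feed = rank_feed(posts)
# The provocative headline lands at the top of the feed, regardless of
# its accuracy or civic value, because it is what people click on.
```

Nothing in this objective cares whether content is true or harmful; optimizing a single engagement metric is enough to tilt the feed toward whatever provokes the strongest reaction.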
HOW DO WE FIX IT
Obsessing over engagement metrics can lead you into the trap of assuming you are giving people what they want when you may in fact be preying on their vulnerabilities. Outrageous headlines make us click even when we know we shouldn’t. Seeing that other people are together without us makes us feel left out and depressed. The deeper problem is that false information, once we believe it, is extremely difficult to dislodge.
Instead, we can take a more value-driven approach to ethics in big data. We can use the same metrics and align them with the specific values we intend to create with our product. We can measure our success by including qualitative research and utilizing more outsiders or third-party consultants.
When given the opportunity, humans are highly capable of accomplishing goals, connecting with others, and doing many of the other tasks technology seeks to help with. Instead of making design choices that drive or steer these tasks, we can focus on design choices that support humans in making their own decisions. An example of this is the popular app Meetup, which connects people who share interests and encourages in-person get-togethers.
Ideally, your organization would understand the harms it creates and would incentivize reducing them. In all actuality, these harms are complex, fluid, and hard to understand. It is important to build an empathetic connection between product teams and the users they design for. Many modern practices use personas or focus groups to gain empathy for the user. However, humane technology requires that you internalize the pain your users experience as if it were your own. This mindset leads to a deeper understanding and caution.
In addition, we can help people make choices that are informed, thoughtful, and aligned with their values by being more mindful of how we frame information. The same information lands differently when presented in a relatable, intuitive context, and appropriate framing empowers people to make wise choices.
As the world of big data continues to expand in the absence of government regulation, it falls upon us to make ethical decisions. As designers, managers, and even frontline employees, we can make sure Big Data is used in ways that benefit and propel people rather than discriminate or misinform.