A number of this week’s ethical issues dealt with technology and setting boundaries. What happens when algorithms become too efficient and lead companies to actions that may be “too human” or inadvertently racist?
- The scummiest of moves (updated): I am adding the ethical violation I just saw because a) it strikes close to home and b) it is the scummiest of scummy moves. A Maryland company was just fined for selling hand sanitizer to mass transit agencies and to my town’s school, claiming it could stop the spread of COVID even though it contained 0% alcohol. This appears to be a pure case of lying and unethical behavior.
- Stereotyping Asians – One of the students in my BU Ethics class shared a great article that looks at the backlash Estee Lauder is facing over its racist behavior toward consumers in Korea, driven by algorithms and consumer data. A consumer ordered foundation online but received a product completely different from her original order, along with a note from the store explaining that it sent a different item because the color she chose “does not fit Asians.” As my student notes, this is clearly inconsistent with Estee Lauder’s brand message, which values the diverse beauty of each woman. The case also shows unethical, discriminatory behavior that assumes everyone living in Korea is Asian and that all Asians have the same skin color.
- Authenticity and Techlash – Building on the algorithm topic, SFWeekly ran an interesting article that looked at both techlash and the ethical issues that arise when technology sounds too human or lets people automate what used to be personal tasks (like “I love you” texts). I think it highlights two important ethical questions: 1) Just because you can do something, should you? 2) What are the ethical considerations of technology in our daily lives?
- Privacy and Facial Recognition – Another article shared by one of my students reports that Singapore will become the first country in the world to use facial verification in its national ID scheme. Millions of people will be able to access government agencies, banking services and other amenities with a quick face scan. But what are the ethical boundaries around this data, which can be used to track and profile people?