Facial Recognition (Structured Discussion 2)
Facial Recognition Data
Facial recognition technologies pose complex ethical and technical challenges. Neglecting to unpack this complexity (to measure it, analyze it, and then articulate it to others) is a disservice to those, including ourselves, who are most impacted by its careless deployment.
About Face: A Survey of Facial Recognition Evaluation, presented at the AAAI 2020 Workshop on AI Evaluation
Some of you may already be familiar with some of the issues that have been raised about facial recognition systems. (If not, scroll down…) Since we’ve now spent some time looking at how data affects a machine learning algorithm, let’s talk about how the data used to train facial recognition systems might matter.
Optional pre-discussion activity: Check out the film Coded Bias (2-minute trailer), which focuses on facial recognition and a few other areas that many of you have already highlighted in forum posts. I have not yet watched it, but I suspect it will be provocative in a way that’s relevant to this class. It’s not yet released publicly, but there are two screenings in the next week!
- Thursday, March 18, 7-8:30pm, Indie Lens
- PBS on March 22 (10pm)
Shalom in Data

We saw a few weeks ago how God put so much good data in the created world, and created humans in his image, able to see that data rightly and act accordingly. But sin has corrupted both the data we get from the world and the actions we take based on it. Let's look at one example of how data can be corrupted (on the input) and lead to corrupted actions (on the output).

Reading and response activities:
- Please skim this paper on facial recognition evaluation and Tech Review's coverage of the paper.
- Choose one aspect that stood out to you and read it in more detail. (Maybe follow one of the citations to learn more.)
- Make a post on the Moodle discussion forum. You might choose one or more of:
  - What quote from the paper or article stood out to you?
  - What did you find interesting or surprising?
  - What questions do you have?
  - Did you notice anything you disagree with or an assumption you want to question?
  - These readings are from a secular perspective; as you read, did you notice any places where a Christian might go deeper or reach different conclusions?
General Reflection Questions (optional for this activity)

As we look at several different technologies throughout this semester, we will ask some of the same questions about each of them. Here's the list so far. You don't need to specifically engage them for this activity, but they may be helpful for prompting your thinking:

- How does it work?
- What resources does it need? What resources does it produce?
- What value does it produce for an organization that uses it?
- Besides its primary use, what are other consequences (within and outside the organization) of its deployment? Think of people affected, resources consumed or produced, value generated, etc.
- What are comparable or alternative non-AI products / technologies? What are real-world analogies for this technology (in terms of purpose or function)?
- What ways of looking at or thinking about people does it emphasize? De-emphasize?
- What are its limitations? Which limitations are most fundamental?
Facial Recognition Background
If you haven't yet engaged with what facial recognition does and what might go wrong with it, get some background before diving into the activities above. Here are a few resources (DATA 202 students will recognize these):
- GenderShades summary video
- Project Green Light (Detroit)
- Tawana Petty interview (second video on this page)
- Hill, K. (2020, July 24). Wrongfully Accused by an Algorithm. The New York Times.
- Portland's facial recognition ordinance: coverage by The Hill
- https://www.nytimes.com/2019/07/10/opinion/facial-recognition-race.html
See the UMich ESC Project Green Light site for some other articles.