Generative AI in Teaching and Learning: Biases and Risks

The purpose of this online resource is to help faculty, instructors, and graduate students develop an understanding of the potential benefits, complexities, and dilemmas associated with the use of generative AI tools in higher education, and to show how this understanding can be applied in their own teaching.

> How do we define 'generative AI' and its impact on higher education?

> What policies exist around the use of 'generative AI' tools?

> What kinds of 'generative AI' tools exist for teaching and learning purposes?

> What is the impact on course design when considering generative AI tools?

> What are a few examples of assignment design that employ generative AI?

> How do we address assessment concerns around generative AI tools in higher education?

Instructors must carefully weigh the benefits and costs of generative AI tools, as these tools offer both an opportunity to enhance student learning and a risk of deeply undermining it.

Generative AI tools, as they have been designed and developed, have the well-documented potential to reproduce biases, reinforce discrimination, and amplify stereotypes. These tools are trained on large amounts of data from the internet and do not distinguish between reliable and unreliable data in the same way as a human researcher.

A clear risk is that students will use these tools to replace rather than augment their development as critical thinkers. While the use of generative AI tools in the classroom puts pressure on how we define the ways in which students develop crucial skillsets, this use should never replace the development of those skills. "Closing the loop," ensuring that human meaning-making stands as the final arbiter in the use of generative AI tools, is crucial. For example, in its "Statement on Artificial Intelligence in Writing Flag Courses," the Center for Skills and Experience Flags at UT-Austin states:

"Writing is an iterative process requiring drafting, feedback, and revision. Human feedback, specifically, is required for writing skills development, because writing is a process of communication between human subjects, with social and emotional components. AI might be used to augment some of these processes in more formulaic writing, serving as something like a template for a human writer (for example, to generate a draft of a letter that indicates the type of content that might appear in the letter but requires human discernment to evaluate those choices, add specific ideas and detail, and confirm the letter's accuracy), but it should never replace them. Because good writing requires human input, writing that lacks such input will usually fall short of college-level standards for success."

Beyond the "hallucination" effect these tools produce (confidently asserting facts or sources that do not exist), they can also reinforce larger patterns of social, cultural, gender-based, and ethnic discrimination. In the graphic below, Rebecca Sweetman (2023) highlights some of the harms that need consideration, including environmental, economic, and epistemic harms. Click on the image to access an interactive version of the graphic.

[Graphic: Rebecca Sweetman (2023), harms of generative AI]