Supporting Academic Integrity: Ethical Uses of Artificial Intelligence in Higher Education
The text on this page was adapted from the Academic Integrity Council of Ontario’s (2023) “Supporting Academic Integrity: Ethical Uses of Artificial Intelligence in Higher Education Information Sheet,” which was authored by Miron et al. and edited by Steeves et al. Changes made to the text include excerpting only portions of the text, changing the order of particular passages, and adapting some passages to better reflect the context of Trent University. The material is licensed under a Creative Commons Attribution-Noncommercial 4.0 International License (CC BY-NC 4.0).
Overview
In an educational context, Popenici and Kerr (2017) define artificial intelligence as “computing systems that are able to engage in human-like processes such as learning, adapting, synthesizing, self-correction and use of data for complex processing tasks” (p. 2).
Artificial intelligence systems such as GPT-3 and BERT are large language models (LLMs) that use algorithms to generate original text. Additionally, some artificial intelligence applications can create images (e.g., DALL-E), music, mathematical computations, and code. The sophistication and capabilities of artificial intelligence are progressing rapidly.
As the capabilities of these tools develop and evolve, affecting organizations in different ways, we should carefully consider how to preserve academic integrity in our pedagogical approaches across all educational contexts. For instance, our teaching practices and ways of assessing learning may need to be revised and adjusted. It is important to consider how artificial intelligence affects our students and their learning, and how our courses, activities, and programs are designed to support that learning.
Considerations for the Ethical Use of Artificial Intelligence in the Higher Educational Learning Environment
Artificial intelligence may allow students to offload academic work or the practice of academic skills; it is crucial to be clear with students about when use of this technology is acceptable and still consistent with the established learning goals. Consider the following with regard to the use of artificial intelligence in your learning environments (please note that this list is not exhaustive):
- Explore the various software applications that students may use in their courses. Speak with students about artificial intelligence and let them know you are aware of these applications.
- Communicate with students and teaching assistants/demonstrators about how artificial intelligence may be used ethically and honestly to support learning in courses (or programs), and ensure that course content, page content, and assessment instructions clearly and consistently spell out what counts as appropriate use and what would be considered unauthorized use or reportable as academic misconduct within your course or program.
- Think about how artificial intelligence applies to course content and learning outcomes. Consider the appropriateness of introducing artificial intelligence as a learning strategy if it is leveraged in the academic disciplines and/or professions that students are working toward.
- Discuss artificial intelligence in the context of academic integrity and learning. Include integrity pledges in which students explicitly state whether they completed their assignments with or without the use of artificial intelligence.
- Model the ethical use of artificial intelligence by being transparent about how to use it in the learning environment. Demonstrate best practices when using artificial intelligence with students (e.g., citing and referencing its use in class).
- Establish expectations for citing, referencing, or acknowledging the use of artificial intelligence applications in course work and assessments through assignment descriptions, rubrics, and syllabi.
Points to Consider when Clarifying Assessment Expectations
It is important that we provide clear and timely explanations to our students regarding assessment expectations, including what they are and are not permitted to use while completing assessments.
Consider the following as guidance when communicating assessment expectations with your students:
Are you allowing students to use cognitive offloading tools during this assessment?
Examples of Cognitive Offloading Tools: calculators; translation tools; paraphrasing tools; course notes; PowerPoint slides; search engines (e.g., Google Search); artificial intelligence (e.g., ChatGPT, DALL-E).

| NO | YES |
|----|-----|
|    |     |
|    |     |
Adapting Assignments as a Result of Artificial Intelligence
Many faculty members may choose to adapt their assignments to make it more difficult for students to successfully use artificial intelligence apps. Here are some suggestions to consider:
- Document the research process: Ask students to keep and turn in their research notes, or to keep a log documenting their research process. Alternatively, ask students to discuss their research process orally, in small groups or through a recording.
- Centre assignments around texts: Artificial intelligence generally cannot (yet) pull information from particular sources. Asking students to incorporate ideas/evidence from particular sources will make it more difficult for them to successfully use artificial intelligence in their work.
- Ask for comparative reflections: Artificial intelligence platforms do not know what happens in your particular class. When relevant, asking students to compare ideas, events, or sources to concepts discussed in class will make it more difficult for them to successfully use artificial intelligence in their writing.
Opportunities to Integrate Artificial Intelligence into the Classroom
The thoughtful use of artificial intelligence may enhance the learning environment and support learning. It is important to first think about whether artificial intelligence use makes sense in your course. Explore the following links to learn more about how you might use artificial intelligence in your learning environments:
- Center for Engaged Pedagogy. (n.d.). Generative AI & the college classroom. Barnard College. https://cep.barnard.edu/generative-ai-college-classroom
- Center for Innovative Teaching & Learning, Indiana University Bloomington. (2023, January 25). How to productively address AI-generated text in your classroom. https://citl.indiana.edu/teaching-resources/academic-integrity/AI-Generated%20Text.html
- Eaton, S. E., & Anselmo, L. (2023). Teaching and learning with artificial intelligence apps. Taylor Institute for Teaching and Learning, University of Calgary. https://taylorinstitute.ucalgary.ca/teaching-with-AI-apps
- Georgian College. (n.d.). Artificial intelligence-assisted work. https://www.georgiancollege.ca/ctlae/academicintegrity/#ai
- Mills, A. (2022). AI text generators and teaching writing: Starting points for inquiry. WAC Clearinghouse. https://wac.colostate.edu/repository/collections/ai-text-generators-and-teaching-writing-starting-points-forinquiry/
- Mollick, E., & Mollick, L. (2022). New modes of learning enabled by AI chatbots: Three methods and assignments. University of Pennsylvania & Wharton Interactive. 1-21. https://ssrn.com/abstract=4300783
- Monash University. (n.d.). Using artificial intelligence. https://www.monash.edu/learnhq/build-digitalcapabilities/create-online/using-artificial-intelligence
- Prochaska, E. (2023, January 23). Embrace the bot: Designing writing assignments in the face of AI. Magna Publications. https://www.facultyfocus.com/articles/course-design-ideas/embrace-the-bot-designing-writingassignments-in-the-face-of-ai/
- Contact North | Contact Nord. (n.d.). AI in higher education resource hub. Teachonline.ca. https://teachonline.ca/ai-resources
- York University. (n.d.). AI technology and academic integrity. https://www.yorku.ca/unit/vpacad/academicintegrity/ai-technology-and-academic-integrity/#leveraging
Limitations of Artificial Intelligence
While artificial intelligence apps offer opportunities across the higher education landscape, they do have limitations. Large language models (e.g., ChatGPT) are currently trained on large amounts of data that are entered by humans and pulled from the internet. The information available to them is therefore limited, as these applications do not scrape the internet regularly the way a search engine does. Their output can thus be prone to errors, omissions, or biases. The quality of results depends on the quality of the prompts entered into the app, the information the system can access, and whether the system has an adequate filter.
Further, the technology lacks emotional intelligence and human nuance, and artificial intelligence applications may generate false information. While some artificial intelligence is currently available for open use, much of it has been monetized, which creates barriers to equity, diversity, and inclusion.
How to Cite Artificial Intelligence
It is recommended that you check the appropriate citation manuals for the most up-to-date guidance on referencing artificial intelligence. Other links you may access are listed below.
It is important to establish clear guidelines for students about how to acknowledge the use of artificial intelligence in their academic work. Review the “develop rules for accountability” section of the article by van Dis et al. (2023) for more ideas: https://www.nature.com/articles/d41586-023-00288-7
American Psychological Association (APA):
Modern Language Association (MLA):
https://style.mla.org/citing-artificial-intelligence/
https://style.mla.org/citing-source-code/
References
Mount Royal University Library. (2023, January 23). Artificial intelligence. Mount Royal University. https://library.mtroyal.ca/ai
Popenici, S. A. D., & Kerr, S. (2017). Exploring the impact of artificial intelligence on teaching and learning in higher education. Research and Practice in Technology Enhanced Learning, 12, Article 22. https://link.springer.com/content/pdf/10.1186/s41039-017-0062-8.pdf?pdf=button
Van Dis, E., Bollen, J., Zuidema, W., van Rooij, R., & Bockting, C. (2023, February 3). ChatGPT: Five priorities for research. Nature. https://www.nature.com/articles/d41586-023-00288-7