Instructional Technology has been working closely with the Faculty Development Center and faculty groups to understand how faculty, departments, and colleges want to engage with AI in their instructional activities. We recognize that, given limited tools and time, many faculty have not yet had the chance to use AI. We also recognize that generative AI can be misused and create academic misconduct issues. At the same time, we have been working with individual faculty on small pilots that leverage AI to better support students in their classes. Below are a few examples of how AI can support teaching and learning.
Equity of Access
- UMBC offers three generative AI tools to faculty, staff and students:
- Google Gemini and Microsoft Copilot to all campus users (faculty, staff, and students)
- Amplify AI from Vanderbilt University to faculty and staff. To request an Amplify account, fill out the DoIT Support – AI Tools form and select Amplify.
- When you sign into any of these UMBC-supported AI tools using your UMBC account, your data is protected and not used for future training.
- Students will need to learn how to navigate a world in which AI is a commonplace feature, if not a job requirement (Lakhani, 2023). Learning new skills, such as how to use large language models, along with the benefits, failures, and ethics of doing so, is an important component we can include in our courses and internships (ChatGPT, 2023; Mollick & Mollick, 2023c).
- In an age where misinformation is rampant, understanding AI can help students discern between genuine information and AI-generated content, whether news articles or deepfake videos (Kreps, McCain, & Brundage, 2020; Hsu & Thompson, 2023). Additionally, since AI depends on vast amounts of data for its training models, including personal information, students need to understand how AI can be biased and harmful (Kertysova, 2018; Dwivedi, et al., 2023).
- Today’s cloud-based tools from Google and Microsoft now include AI as a native feature (Warren, 2023; Vincent, 2023). These embedded tools provide generative AI across productivity applications to help users visualize, organize, and write as they work. In the near future, it will be very easy for students to use AI in some measure.
Tips for Teaching in an AI World
- Survey your students and then invite your class to have a conversation about AI. Share your thoughts and ask your students for their perspectives. Set clear expectations for using generative AI tools, including how to cite or reference them appropriately, if permitted (Mollick & Mollick, 2023).
- Consider your course schedule and how much content is in your course, and how that might impact whether students may be tempted to use generative AI tools to take shortcuts (Alexander, 2022; Watkins, 2022).
- Demonstrate how AI tools work: show students the benefits and limitations of using AI tools in your course and discipline. Clarify when you would permit AI (e.g., brainstorming, grammar checking) and when you would not (e.g., writing the assignment itself).
- Have students critique what generative AI tools output — code, text, lesson plans, etc. — and determine if it is accurate, inclusive, accessible, etc. for your discipline or course level. AI can be a beneficial tool when used appropriately.
- Incorporate AI into active learning to help students learn how to appropriately use AI technologies by writing prompts, analyzing AI outputs, and refining follow-up interactions (Watkins, 2022).
Assessments and AI
- For non-timed assessments, ask students to compose their assignments in Google Docs and share the document so you can review its version history. The history shows the overall composition process, including whether large blocks of text were pasted into the document. If assignments are composed in Word (in Microsoft 365 online), the document version history can be viewed there as well.
- Scaffold major assessments. Build steps, such as outlines and drafts, so students show their writing progress. Have them share their notes and other preparatory work that contributes to the final product.
- Include reflection components for each assessment, especially the writing process, so students provide personal insight into the process of developing their ideas (Borup, 2023). Reflections can also be created using multimedia with Panopto or VoiceThread.
- Consider peer assessment, group work, and other collaborative activities where students can learn from each other and share responsibility for the assessment process.
- If students do use AI, they should keep their prompts readily available so they can explain how they used the tools and describe their process. For example, ChatGPT includes a share function that generates a link to the conversation, which the instructor can review.
- Leverage Respondus LockDown Browser, which prevents students from opening anything except the assessment environment. This works best for timed assessments. Respondus Monitor can also record the session if that level of security is absolutely imperative.
Academic Integrity & AI Detection
- A number of AI detection tools are available; some are built on older ChatGPT APIs and are unreliable for that reason alone. Because generative models are constantly improving, it may never be possible to reliably detect whether text is AI-generated. Passages from the Bible, the U.S. Constitution, and Macbeth have all been flagged as written by AI.
- Researchers argue that AI detectors are not reliable (Sadasivan, Kumar, Balasubramanian, Wang, & Feizi, 2023) and that they consistently misclassify non-native English writing as AI-generated (Liang, Yuksekgonul, Mao, Wu, & Zou, 2023).
- In a study of fourteen AI detection tools, all scored below 80% accuracy, and only five scored over 70% (Weber-Wulff, et al., 2023). When submissions are edited or paraphrased to obfuscate AI-generated text, detection accuracy drops even further.
- The Washington Post found that some AI detectors falsely flagged students 50% of the time (Fowler, 2023). Similarly, Rolling Stone noted that a professor failed students due to inaccurate AI detection in their work (Klee, 2023).
- AI detectors perform poorly because the quality of AI-generated text is constantly improving, and students can adapt the output to trick detection tools (Wiggers, 2023; Coffey, 2024).
- Educators may use AI detection tools to test their assessment design and determine how easily AI can complete an activity, but should “assume students will be able to break any AI-detection tools, regardless of their sophistication” (Lee & Palmer, 2023).
Note: UMBC does not license any AI detection tools at this time. The institution engages with the broader higher education community to stay informed about the latest trends, technologies, and resources related to artificial intelligence and AI detection in academic environments.
AI and Course Design
- The Blackboard AI Course Design Assistant can:
- Generate a suggested course structure, including learning module organization
- Search for or generate thumbnail images for learning modules and course banners
- Generate questions for a test or question bank, as well as rubrics
- Generate prompts for assignments, discussions, and journals
- The Blackboard AI Conversations tool provides opportunities for students to engage in Socratic or roleplaying scenarios with a virtual partner — designed and specified by the instructor.
- Generative AI tools can help improve course and unit learning objectives, including aligning those objectives with assessments, instructional materials, and activities (Quality Matters 2.1, 2.2, 3.1, 4.1, 5.1, and 6.1).
AI and Accessibility
- Generative AI tools can help create long descriptions for complex images. (Discover more in this presentation, Leveraging AI for Enhanced Visual Content to Support Accessibility.)
- Anthology Ally provides limited support for generating descriptions for some images using generative AI, but instructors should review each description for accuracy.
- Handwritten essays and oral exams may disadvantage students with disabilities or create a non-inclusive environment (Folts, 2023). Google Assignments and other collaborative tools can help track student work.
UMBC Research on AI
2024-25 FLCs Related to AI
The Faculty Development Center (FDC) sponsored two faculty learning communities related to AI in AY 2024-25; these communities are a good way to connect with fellow faculty and learn from their discussions.
- Developing Students’ AI Literacy Within and Across Disciplines (Facilitated by Mariajosé Castellanos, CBEE, and Bill Ryan, IS).
- Using AI to Enhance and Expedite Teaching (Facilitated by Diane Alonso, PSYC, and Neha Raikar, CBEE).
Instructional Technology Presentations on AI
- Hawken, M., & Biro, S. (2023). Ask Janet: Leveraging AI for Course Design, Instruction and Learning. OLC Accelerate Conference, Washington, DC.
- Penniston, T. (2023). Toward a New Paradigm: Learning Analytics 2.0. 25th International Conference on Human-Computer Interaction, HCI International 2023. Copenhagen, Denmark.