Use of AI tools in Academics

Sean van der Merwe

Introduction

Topics

  • AI tools are not new, but are suddenly popular
  • Types of AI tools and how to use them
  • Uses of AI for lecturers
  • Uses of AI for students
  • Abuses of AI by students
    • And how to spot them
  • Improvements we can make as lecturers

History

  • AI is not new; it has been in use for decades
    • Think of robotics and gaming
    • Google revolutionised search by bringing in statistical methods and gradually making it smarter
    • Tools for writing English essays have been around for years
    • Wolfram Alpha is an amazing AI tool for mathematics, and there are specific tools for statistical modelling like Stan
  • Late 2022 saw the public release of ChatGPT (3.5), with newer versions following in 2023
    • General purpose usefulness and ability to interpret plain language
    • This is when the media took notice

Growth

My (Sean’s) thoughts

AI tools are good at expanding and contracting text.

  • They summarise complicated topics in plain language
  • They explain simple things clearly
  • So they are like Wikipedia page generators
    • They make something like a Wikipedia page on the fly based on what you ask
  • BUT without any accuracy checks!
    • They have no concept of right or wrong, correct or incorrect
      • They are language models trained on general internet text, so they have the same biases that the internet has

Chatbots

  • ChatGPT 3.5 is free and useful
    • Superficial and not cutting edge, limitations on use
  • ChatGPT 4 is $20/month and has all the best plugins
    • Still superficial unless you use the right plugins and the topic is popular
  • New Bing AI
    • Requires Edge browser but no login (private Microsoft account recommended for better performance)
    • Based on ChatGPT 4, but with a different interface
    • Gives the most superficial responses, but is still very useful because it is so flexible
  • Google Bard
    • Main competitor to ChatGPT (OpenAI, the maker of ChatGPT, is backed primarily by Microsoft)

Note that ChatGPT was not trained specifically on academic papers, so don’t expect academic rigour.

Research tools

The tools below are far less flexible, but far more rigorous, and based on peer reviewed papers:

  • Elicit
    • Free signup, single step to use
    • Loosely based on the older GPT-3, but it is not a chatbot
    • Finds and summarises (briefly) top papers matching search
  • Consensus
    • Same as above
    • Has three summary modes:
      • First it just presents the top papers to you neatly (similar to Elicit)
      • Then it has a button to generate a consensus summary of these papers
      • If you pay then the consensus summary will be generated using ChatGPT 4 instead of the older models

Examples

Uses for lecturers

  • These tools are fantastic for creating assessments
    • Use them to generate word problems; they are great at that!
      • But don’t ask them to solve statistical problems
      • These tools are bad at math, statistics, and probability, as those require precision
  • They are good at translating, though
    • That includes translating into programming code 😀
      • But note that the code they generate almost always has errors (see the sketch below)
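
As an illustration of why generated code must always be checked, here is a minimal, hypothetical Python sketch (the marks and variable names are invented) of a typical subtle mistake: using the population standard deviation when the exercise asks for the sample standard deviation.

    import statistics

    marks = [54.0, 61.5, 47.0, 72.5, 58.0]  # invented example marks

    # A chatbot often reaches for the population formula (divides by n)...
    population_sd = statistics.pstdev(marks)

    # ...when the exercise asks for the sample formula (divides by n - 1).
    sample_sd = statistics.stdev(marks)

    print(f"Population SD: {population_sd:.3f}")
    print(f"Sample SD:     {sample_sd:.3f}")

Both versions run without any error message, which is exactly why generated code has to be checked against the intended method rather than merely executed.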

More lecturer uses

  • They are great at answering basic student questions about the work
    • Copy the question into Bing and copy the answer back to the student
    • Example: a student asks what a line of code means; paste the line into the chatbot and ask it to explain it (see the sketch after this list)
  • Use it to draft a difficult email
    • Or a motivating Blackboard post to get students excited about a topic
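
To make the “explain this line of code” example above concrete, here is a hypothetical Python snippet; the line and the marks are invented, and the comment paraphrases the kind of plain-language explanation a chatbot typically returns.

    marks = [38, 52, 67, 45, 74]  # invented marks out of 100

    # The line a student might ask about:
    passed = [mark for mark in marks if mark >= 50]

    # A chatbot would typically explain it roughly as: "this is a list
    # comprehension that loops over `marks`, keeps only the values of 50
    # or more, and stores them in a new list called `passed`."
    print(passed)  # [52, 67, 74]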

Lecturer issues

  • One of my assignments this semester was created entirely by ChatGPT from one short prompt
  • As a follow-up, I asked it for a rubric and then used that rubric as-is (with acknowledgement)
  • This seemed to work great until I had to mark it 😮
  • The rubric did not account for the students using ChatGPT to answer the assignment

When students can use AI to answer, the rubrics/memos/mark allocations must adapt

Student uses

  • Students around the world are using these bots as cheap alternatives to tutors
  • They ask the bots to explain the notes to them
    • Summarising the parts that are too long, and expanding the parts that are too short
  • The bots can give easy introductions to topics
  • They can even quiz the student
    • Ask the bot to give you a multiple-choice question on a topic, wait for your response, and then assess your answer

Adoption in my course

  • Students used ChatGPT extensively in my honours course this past semester,
    • asking it to explain concepts to them,
    • and to generate code to get them started with their answers
  • They used it in nearly all assessments, including all summative assessments like semester tests

I promoted this because they are encouraged to use it in the workplace, and I want to prepare them for that.

Student abuses

  • I’ve had students submit assignments where almost everything is just copy-pasted from ChatGPT
    • But, in my view, ChatGPT is not the problem; copy-pasting is
  • Most students used it as a starting point, and then reworked the text as they saw fit
    • Nothing wrong with that if the reworking is thorough and transformative
    • Like Wikipedia, ChatGPT is a good starting point but a terrible endpoint
  • ChatGPT can hallucinate and talk nonsense
    • It can also make up fake references

How to adapt

Step 1: Use it yourself

  • Only by using AI yourself will you learn:
    • What it can do
    • What it can’t do
    • How it presents things
    • How to identify it

Step 2: Spotting issues

  • The most obvious sign is when a student simply copy-pastes:
    • If students’ answers contain questions,
    • or words that are a clear response to a prompt,
      • Like “Sure! Here is an explanation of…”, or “Certainly. The …”
    • or end with a lead-in to more information that never comes
  • Another clue is if the language style does not match that of the student in general
    • ChatGPT’s language is generally better than that of our students

Step 3: AI detection

  • Turnitin has an AI-detection tool that tries to pick up AI-generated text.
    • If it says 100% then you know there’s a problem
    • If it says 0% then it’s probably all good
    • Anything in between means the student probably used a little bit of AI text but didn’t rely on it entirely
      • This is sometimes fine, provided it is not excessive, and sometimes not
  • If you want to enforce rules then put those rules in writing
    • Explain to students in advance how to properly credit AI chatbots
    • You could tell students to state the prompts they used and quote anything copied without alteration

Step 4: Adjust the mark allocation

  • Standard ChatGPT language does not normally include real references
    • So increase the mark allocation for providing real references
    • Give serious credit for integrating real references into the flow of the text
  • ChatGPT responses tend to be superficial
    • So raise the cognitive level of required responses
    • Or change the focus or style of questions to require depth

Conclusion

The more things change…

  • There is extensive teaching literature on how to
    • promote deep learning and
    • combat academic misconduct
  • Most of that knowledge still applies and is useful with AI chatbots
    • The university has many programmes in place to help lecturers in this regard
    • Contact your T&L manager or CTL representative for more information

My personal favourite approach is randomised assessments. See my award-winning presentation from the 2022 Learning and Teaching Awards if you are interested.

Thank you for your time and attention.

This presentation was created in the Reveal.js format with Quarto, using the RStudio IDE. Fonts and line colours follow UFS branding, and the background images were combined from various AI sources, such as Midjourney and Bing AI (DALL-E), using the image editor GIMP.