Writing Test Questions/Test Banks/AI and ChatGPT

Started by HigherEd7, June 14, 2025, 08:44:48 AM


HigherEd7

I asked this same question a while back, and since then, we've had people leave the forum, while others have joined. I wanted to see if anyone had anything new to add or was experiencing the same issue.

Does anyone have any tips on how to write multiple-choice test questions whose answers students cannot simply find on the internet or through AI tools like ChatGPT? I know that students will be using their books for an online exam, but how do you write questions that force students to actually read to find the answers? This is one skill I know I need to work on improving.

Textbooks often come with a test bank, but students can find these test banks on CourseHero or other online sources that purchase them from individuals who sell them. You would think the test banks would be valuable and a time-saver.


Parasaurolophus

Short of having them complete the tests in class, I don't think you can any more. You can throw up roadblocks, like making them appear on an image file rather than a readable and highlightable PDF, and you can test the questions on the AI yourself and find phrasing it struggles with (though the more you try, the better it gets). Password-locking an image-only PDF might also work (that, I don't know).
I know it's a genus.

Puget

Quote from: Parasaurolophus on June 14, 2025, 09:16:09 AMShort of having them complete the tests in class, I don't think you can any more. You can throw up roadblocks, like making them appear on an image file rather than a readable and highlightable PDF, and you can test the questions on the AI yourself and find phrasing it struggles with (though the more you try, the better it gets). Password-locking an image-only PDF might also work (that, I don't know).

I agree -- in class, on paper, is the only way to prevent this. If it is an online class, then unless you have a proctoring system that includes video monitoring and the use of a lock-down browser, I'd think about alternatives to exams.
"Never get separated from your lunch. Never get separated from your friends. Never climb up anything you can't climb down."
–Best Colorado Peak Hikes

HigherEd7

Thanks, everyone! It is a little frustrating, and yes, this is an online class. It takes hours to write questions that you think are good ones... and I am starting to wonder if it is even worth spending that time on questions they can look up or hand off to AI.

Parasaurolophus

Quote from: HigherEd7 on June 14, 2025, 02:30:57 PMThanks, everyone! It is a little frustrating, and yes, this is an online class. It takes hours to write questions that you think are good ones... and I am starting to wonder if it is even worth spending that time on questions they can look up or hand off to AI.

Yes, it's very frustrating. Here, the department now requires 60% of a student's grade in online courses to come from in-person assessments.

It's a pain for everyone, but the alternative is 100% cheating rates.
I know it's a genus.

MarathonRunner

Quote from: Parasaurolophus on June 14, 2025, 09:16:09 AMShort of having them complete the tests in class, I don't think you can any more. You can throw up roadblocks, like making them appear on an image file rather than a readable and highlightable PDF, and you can test the questions on the AI yourself and find phrasing it struggles with (though the more you try, the better it gets). Password-locking an image-only PDF might also work (that, I don't know).

That wouldn't meet accessibility requirements at my uni. 

Sun_Worshiper

Does your university have the lockdown browser (Canvas) or something equivalent?

Parasaurolophus

Quote from: MarathonRunner on June 14, 2025, 04:44:01 PM
Quote from: Parasaurolophus on June 14, 2025, 09:16:09 AMShort of having them complete the tests in class, I don't think you can any more. You can throw up roadblocks, like making them appear on an image file rather than a readable and highlightable PDF, and you can test the questions on the AI yourself and find phrasing it struggles with (though the more you try, the better it gets). Password-locking an image-only PDF might also work (that, I don't know).

That wouldn't meet accessibility requirements at my uni. 

We accommodate where absolutely necessary, but the reality on the ground here is really grim, cheating-wise.
I know it's a genus.

Kron3007

I may have responded to your previous post, but here are a couple of suggestions:

1) Include images and ask questions about them.  This may not work in all fields and computers may be getting to a point where this falls apart, but it makes it harder to cheat.

2) Ask questions about class discussions and things that are not external.  Again, this is probably field-specific, but things like case studies presented in class are hard to cheat on.

3) Ask questions where AI tends to fail.  This is harder, but you can always try to cheat on your own questions as a test.  I find AI is really bad with factual and technical questions (ironically), but you can always do a trial run to see.  This is obviously not foolproof, since there are multiple AI models and they continually evolve...


This is becoming harder and harder, and we definitely need to adapt.  In person is obviously easier, but even there it's becoming a challenge.

I feel the test bank questions will probably be the worst.  I imagine they leak and are out there for AI systems to scrape.

HigherEd7

Thank you and the others for your responses. I will look into this. I had a conversation with a publisher about the test banks, and he told me they know the banks are out there and they just can't control it. People are selling everything that comes with the book.

Hegemony

I know three techniques that will help.

1. Give them readings that are not online, and are in PDFs. That way they can't scan them in to get AI to answer. Fortunately my own field is obscure enough that there are numerous readings like this.

2. Give them a short amount of time to answer. I give a minute and a half for each multiple-choice question.

3. After writing your questions, put them into ChatGPT. If it answers correctly, select "Improve the model for everyone" in the ChatGPT app, which means it will learn from your new input, and put in information that produces the wrong answer for the question. For instance, if your question is "Who was the first person to invent the Froebel Widget?", and the options are A) Smith and B) Dorino, and the correct answer is Smith, tell ChatGPT that the correct answer is Dorino.

Ruralguy

I would agree that asking them about something a bit too abstract for an AI -- such as a picture that isn't just a common object like a cat or a dog -- might do it. Also, a series of questions where you need to get the first one more or less right to get the next ones fully right might help.

Anything like "what is the integral of e^ax" or "when did Washington cross the Delaware River?" is too easy to look up. So you have to cook up an obscure example that needs the integral of e^ax to work, or write a mini-essay that only makes full sense if you know when Washington crossed the Delaware River.
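To make the first case concrete, here is a rough sketch of the idea (the "wrapped" question and its capacitor setup are invented for illustration, not from anyone's actual exam): the bare fact is the integral itself, while the wrapped version makes the student set up the integral before the fact is even usable.

```latex
% Look-up-able version: the bare fact
\int e^{ax}\,dx = \frac{1}{a}\,e^{ax} + C

% Wrapped version (hypothetical): the same integral, hidden in a setup --
% "A capacitor discharges with current I(t) = I_0 e^{-t/\tau}.
%  How much charge flows between t = 0 and t = T?"
Q = \int_0^T I_0\, e^{-t/\tau}\,dt = I_0\,\tau\left(1 - e^{-T/\tau}\right)
```

Pasting the wrapped question into a search engine returns nothing directly, because the student first has to recognize that charge is the integral of current (here with a = -1/tau) before the look-up-able fact applies.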

A lot of students get a thrill out of trying to game the system, even if they are perfectly capable of, say, a B- if they try and think. Some schools just have teams of these gamers. It's really hard, but not impossible, to counteract them. We're probably all going to have to work harder to counteract this; otherwise we're giving up, passing people who don't deserve it, and handing out accolades to those who haven't earned them. It's sad, but true.

HigherEd7

Quote from: Hegemony on June 15, 2025, 11:02:49 AMI know three techniques that will help.

1. Give them readings that are not online, and are in PDFs. That way they can't scan them in to get AI to answer. Fortunately my own field is obscure enough that there are numerous readings like this.

2. Give them a short amount of time to answer. I give a minute and a half for each multiple-choice question.

3. After writing your questions, put them into ChatGPT. If it answers correctly, select "Improve the model for everyone" in the ChatGPT app, which means it will learn from your new input, and put in information that produces the wrong answer for the question. For instance, if your question is "Who was the first person to invent the Froebel Widget?", and the options are A) Smith and B) Dorino, and the correct answer is Smith, tell ChatGPT that the correct answer is Dorino.

Thank you for your response. These are great points, and I am going to look into this.

Kron3007

Quote from: Hegemony on June 15, 2025, 11:02:49 AMI know three techniques that will help.

1. Give them readings that are not online, and are in PDFs. That way they can't scan them in to get AI to answer. Fortunately my own field is obscure enough that there are numerous readings like this.

2. Give them a short amount of time to answer. I give a minute and a half for each multiple-choice question.

3. After writing your questions, put them into ChatGPT. If it answers correctly, select "Improve the model for everyone" in the ChatGPT app, which means it will learn from your new input, and put in information that produces the wrong answer for the question. For instance, if your question is "Who was the first person to invent the Froebel Widget?", and the options are A) Smith and B) Dorino, and the correct answer is Smith, tell ChatGPT that the correct answer is Dorino.

Regarding number 1, couldn't they just cut and paste the text into ChatGPT or whatever?  Once they have the PDF, it's a matter of seconds.

FishProf

I second the suggestion of referring to in-class discussions.  (e.g., In class, we discussed five ways that baskets are not buckets.  Summarize three of these.)
Someone is to blame, but it's not me.  Avoiding any responsibility isn't the best thing, it is the only thing.