New inventive ways of cheating

Started by Hegemony, March 18, 2023, 06:13:14 AM

RatGuy

Students are starting to submit final essays, and Turnitin has been flagging some material as AI-generated. The hilarious part? The AI, if indeed that's what it is, is inventing quotations. It'll say something like "Her countenance was perfect, desolate misery, but the firmness of her mind was a spirit no chains could fetter." That might sound like 18th-century literature, but nothing like that shows up in the novel, much less on the page cited. Oh, and nothing turns up in a Google search. It doesn't feel to me like a paraphraser, so I wonder if the AI is just inventing something that sounds 18th-century.

Sun_Worshiper

Honestly, ChatGPT is going to win the battle and the war if your approach is to try to detect its use. Better to (1) change assignments so that they are relatively cheat proof and (2) teach students how to use GPT programs and their strengths and limitations.


history_grrrl

Quote from: RatGuy on April 24, 2023, 10:58:22 AM
Students are starting to submit final essays, and Turnitin has been flagging some material as AI-generated. The hilarious part? The AI, if indeed that's what it is, is inventing quotations. It'll say something like "Her countenance was perfect, desolate misery, but the firmness of her mind was a spirit no chains could fetter." That might sound like 18th-century literature, but nothing like that shows up in the novel, much less on the page cited. Oh, and nothing turns up in a Google search. It doesn't feel to me like a paraphraser, so I wonder if the AI is just inventing something that sounds 18th-century.

Oh, yes. It fabricates quotations, citations (now being called hallucinations), facts, what have you. (Interthreaduality; I posted about this on the ChatGPT thread.) It's amazing, and not in a good way. Basically it produces giant piles of doodoo. I'm somewhat flummoxed at how much awe there seems to be around it.

Sun_Worshiper

Quote from: history_grrrl on April 24, 2023, 01:55:16 PM
Quote from: RatGuy on April 24, 2023, 10:58:22 AM
Students are starting to submit final essays, and Turnitin has been flagging some material as AI-generated. The hilarious part? The AI, if indeed that's what it is, is inventing quotations. It'll say something like "Her countenance was perfect, desolate misery, but the firmness of her mind was a spirit no chains could fetter." That might sound like 18th-century literature, but nothing like that shows up in the novel, much less on the page cited. Oh, and nothing turns up in a Google search. It doesn't feel to me like a paraphraser, so I wonder if the AI is just inventing something that sounds 18th-century.

Oh, yes. It fabricates quotations, citations (now being called hallucinations), facts, what have you. (Interthreaduality; I posted about this on the ChatGPT thread.) It's amazing, and not in a good way. Basically it produces giant piles of doodoo. I'm somewhat flummoxed at how much awe there seems to be around it.

It is actually a super helpful writing tool, but people have to learn to understand its limitations as well as its strengths.

history_grrrl

Oops; that should have read "hallucitations."

smallcleanrat

Quote from: history_grrrl on April 24, 2023, 09:42:34 AM

I saw an online discussion about this recently; a student had referred in her paper to - get ready to guffaw - Steve Occupations.


This one took me a minute.

Caracal

Quote from: Sun_Worshiper on April 24, 2023, 03:03:59 PM
Quote from: history_grrrl on April 24, 2023, 01:55:16 PM
Quote from: RatGuy on April 24, 2023, 10:58:22 AM
Students are starting to submit final essays, and Turnitin has been flagging some material as AI-generated. The hilarious part? The AI, if indeed that's what it is, is inventing quotations. It'll say something like "Her countenance was perfect, desolate misery, but the firmness of her mind was a spirit no chains could fetter." That might sound like 18th-century literature, but nothing like that shows up in the novel, much less on the page cited. Oh, and nothing turns up in a Google search. It doesn't feel to me like a paraphraser, so I wonder if the AI is just inventing something that sounds 18th-century.

Oh, yes. It fabricates quotations, citations (now being called hallucinations), facts, what have you. (Interthreaduality; I posted about this on the ChatGPT thread.) It's amazing, and not in a good way. Basically it produces giant piles of doodoo. I'm somewhat flummoxed at how much awe there seems to be around it.

It is actually a super helpful writing tool, but people have to learn to understand its limitations as well as its strengths.

I don't doubt that. We're just stuck in a weird moral and technological panic that keeps us from being able to evaluate any of this in a reasonable way.

the_geneticist

ChatGPT is fine for stringing together a list of things coherent enough to pass a first look.  But the program has no concept of making an argument or a product.  Folks who knit and/or crochet are asking it for patterns and the results are hilarious.  Like a sock that technically has all of the required parts to be a sock (cuff, leg, heel, instep, toe), but is WAY out of proportion and wouldn't actually fit anyone's foot.
So, get ready for a lot of "but it has an intro, some supposedly supporting info, and a conclusion" essays that technically score well on a rubric, but are just as much an overall failure as the AI sock patterns.

MarathonRunner

Quote from: the_geneticist on April 25, 2023, 07:27:17 AM
ChatGPT is fine for stringing together a list of things coherent enough to pass a first look.  But the program has no concept of making an argument or a product.  Folks who knit and/or crochet are asking it for patterns and the results are hilarious.  Like a sock that technically has all of the required parts to be a sock (cuff, leg, heel, instep, toe), but is WAY out of proportion and wouldn't actually fit anyone's foot.
So, get ready for a lot of "but it has an intro, some supposedly supporting info, and a conclusion" essays that technically score well on a rubric, but are just as much an overall failure as the AI sock patterns.

Yes! Especially since crochet can't be done by machine, whereas knitting can. So if you buy a crocheted item cheap, the actual crocheter has not been paid for their work. Knitting can be reproduced by machine, so AI also handles it better. As a PhD candidate and sometime crochet designer, I find that ChatGPT can't give crochet instructions that make sense most of the time. Knitting, since it can be done by machine, gets better results, though often still poor ones.

Sun_Worshiper

Quote from: Caracal on April 25, 2023, 06:18:37 AM
Quote from: Sun_Worshiper on April 24, 2023, 03:03:59 PM
Quote from: history_grrrl on April 24, 2023, 01:55:16 PM
Quote from: RatGuy on April 24, 2023, 10:58:22 AM
Students are starting to submit final essays, and Turnitin has been flagging some material as AI-generated. The hilarious part? The AI, if indeed that's what it is, is inventing quotations. It'll say something like "Her countenance was perfect, desolate misery, but the firmness of her mind was a spirit no chains could fetter." That might sound like 18th-century literature, but nothing like that shows up in the novel, much less on the page cited. Oh, and nothing turns up in a Google search. It doesn't feel to me like a paraphraser, so I wonder if the AI is just inventing something that sounds 18th-century.

Oh, yes. It fabricates quotations, citations (now being called hallucinations), facts, what have you. (Interthreaduality; I posted about this on the ChatGPT thread.) It's amazing, and not in a good way. Basically it produces giant piles of doodoo. I'm somewhat flummoxed at how much awe there seems to be around it.

It is actually a super helpful writing tool, but people have to learn to understand its limitations as well as its strengths.

I don't doubt that. We're just stuck in a weird moral and technological panic that keeps us from being able to evaluate any of this in a reasonable way.

I felt that way at first too, but when I started playing with it I realized the usefulness (as well as the shortcomings).

And the reality is that this stuff is here to stay and its use will be the norm in the white collar work world. We can teach students to use it properly, with an understanding for what it can and cannot do for them, or we can try to fight it and they will just learn on their own without proper guidance.

Caracal

Quote from: Sun_Worshiper on April 25, 2023, 11:36:17 AM

I felt that way at first too, but when I started playing with it I realized the usefulness (as well as the shortcomings).

And the reality is that this stuff is here to stay and its use will be the norm in the white collar work world. We can teach students to use it properly, with an understanding for what it can and cannot do for them, or we can try to fight it and they will just learn on their own without proper guidance.

I think that's true. It's a tool, in the same way that a search engine is a tool, or Wikipedia is a tool. When I try to teach students how to do research, it would be really silly to tell them not to google stuff, or not to use Wikipedia. I use Wikipedia all the time when I'm doing research. It can be very useful when you run into something you've never heard of and you just need some context. The references can also be quite useful as a starting point for further research. I've found tons of sources I never would have otherwise found just by googling some name I run into in the course of my research.

Of course you can misuse Google, and my students do it all the time. Some of them always use it as a way to avoid doing any actual research and they give me a paper based on something random they found on the internet, or they cite Wikipedia despite all my instructions and explanations. That doesn't mean I should tell them not to use these things. The problem isn't that they use them; it's that they misuse them.

Same thing for Chat. It can be a powerful tool, but that doesn't mean it's a good way to write the first draft of your history paper. It's going to produce a bunch of sensible-sounding blather without much of a point and you're going to get a bad grade. Probably the best way to teach that is to actually show students what it generates and then look at whether it's actually useful or not.

Sun_Worshiper

Quote from: Caracal on April 30, 2023, 07:15:34 AM
Quote from: Sun_Worshiper on April 25, 2023, 11:36:17 AM

I felt that way at first too, but when I started playing with it I realized the usefulness (as well as the shortcomings).

And the reality is that this stuff is here to stay and its use will be the norm in the white collar work world. We can teach students to use it properly, with an understanding for what it can and cannot do for them, or we can try to fight it and they will just learn on their own without proper guidance.

I think that's true. It's a tool, in the same way that a search engine is a tool, or Wikipedia is a tool. When I try to teach students how to do research, it would be really silly to tell them not to google stuff, or not to use Wikipedia. I use Wikipedia all the time when I'm doing research. It can be very useful when you run into something you've never heard of and you just need some context. The references can also be quite useful as a starting point for further research. I've found tons of sources I never would have otherwise found just by googling some name I run into in the course of my research.

Of course you can misuse Google, and my students do it all the time. Some of them always use it as a way to avoid doing any actual research and they give me a paper based on something random they found on the internet, or they cite Wikipedia despite all my instructions and explanations. That doesn't mean I should tell them not to use these things. The problem isn't that they use them; it's that they misuse them.

Same thing for Chat. It can be a powerful tool, but that doesn't mean it's a good way to write the first draft of your history paper. It's going to produce a bunch of sensible-sounding blather without much of a point and you're going to get a bad grade. Probably the best way to teach that is to actually show students what it generates and then look at whether it's actually useful or not.

When I first started to play with it, I told it to write me an essay about an environmental policy strategy. I explained in a sentence or two what the policy strategy was and how I wanted it to be framed. In seconds, it spit out a 1200-word essay that explained the policy strategy, contextualized it in the framework that I requested, and provided several examples. The essay itself was quite good - certainly better than what the average undergrad would write - but the examples all turned out to be wrong. If I had cared to polish it up, I would have dug up new examples and rewritten, restructured, and edited the essay quite a bit. With that extra bit of polishing (which is no small task, but also easier than starting from scratch), I could have turned this into a very good essay.

So, in fact, it is not necessarily a bad tool for writing your first draft (although probably not great for history, specifically, since the facts it gives are often wrong*), if you come into it with an argument, writing capabilities of your own, and a willingness to fact check the content it provides.

Does this mean students should use it to write their first drafts of their papers? Perhaps not. These students don't have the decades of writing and research experience that we do. But it can be used quite effectively in this way.

* For Bing AI Chat, which uses GPT-4, you can adjust how "creative" you'd like it to be. More creative = less accurate.


Caracal

Quote from: Sun_Worshiper on May 01, 2023, 09:10:49 AM


So, in fact, it is not necessarily a bad tool for writing your first draft (although probably not great for history, specifically, since the facts it gives are often wrong*), if you come into it with an argument, writing capabilities of your own, and a willingness to fact check the content it provides.


The problem for a history essay isn't the facts, but the argument. Even if you told it what to argue, it wouldn't really argue it; it would just sort of circle the drain, restating the claim again and again. I can see how it might do a better job with something like a policy strategy where there are clearer frameworks it can draw on.

Sun_Worshiper

Quote from: Caracal on May 02, 2023, 11:23:22 AM
Quote from: Sun_Worshiper on May 01, 2023, 09:10:49 AM


So, in fact, it is not necessarily a bad tool for writing your first draft (although probably not great for history, specifically, since the facts it gives are often wrong*), if you come into it with an argument, writing capabilities of your own, and a willingness to fact check the content it provides.


The problem for a history essay isn't the facts, but the argument. Even if you told it what to argue, it wouldn't really argue it; it would just sort of circle the drain, restating the claim again and again. I can see how it might do a better job with something like a policy strategy where there are clearer frameworks it can draw on.

We're going in circles here. It can't create an argument, but it can write an argument if you give it proper guidance. From there, you have to do editing/rewriting.

fizzycist

Quote from: Sun_Worshiper on May 02, 2023, 05:35:37 PM
Quote from: Caracal on May 02, 2023, 11:23:22 AM
Quote from: Sun_Worshiper on May 01, 2023, 09:10:49 AM


So, in fact, it is not necessarily a bad tool for writing your first draft (although probably not great for history, specifically, since the facts it gives are often wrong*), if you come into it with an argument, writing capabilities of your own, and a willingness to fact check the content it provides.


The problem for a history essay isn't the facts, but the argument. Even if you told it what to argue, it wouldn't really argue it; it would just sort of circle the drain, restating the claim again and again. I can see how it might do a better job with something like a policy strategy where there are clearer frameworks it can draw on.

We're going in circles here. It can't create an argument, but it can write an argument if you give it proper guidance. From there, you have to do editing/rewriting.

I'm not sure where you are going with this, Sun_Worshiper. Are you saying ChatGPT is helpful for faculty to produce quality scholarship?

I dunno about your field, but in mine I probably type 50-100 new pages of scholarship per year in papers, proposals, thoughtful emails, etc. The vast majority of my time is spent thinking about what I've learned and want to say. A relatively small portion is actually typing. By the time I've figured out what I want to say, the typing is pretty fast. Maybe I could disrupt my flow by feeding ChatGPT a bunch of prompts to try to do the easy part for me even faster, but it's hardly an essential tool. For me, editing the text in subsequent drafts is very time-consuming, but I don't think ChatGPT is particularly useful here--the reason it takes so long is that I am refining my thinking.

Anyway, I think ChatGPT is useful for writing code in a language you're not great with or for generating boilerplate text that is not going to be scrutinized much.  And it's probably quite good at generating a B- essay in a lower-division undergrad class when you are strapped for time and/or haven't read the assigned texts.