A whole new ballgame in cheating. Introducing ChatGPT

Started by Diogenes, December 08, 2022, 02:48:37 PM


Wahoo Redux

Quote from: marshwiggle on January 09, 2023, 05:16:28 AM
Quote from: onthefringe on January 09, 2023, 03:34:30 AM
Quote from: Kron3007 on January 08, 2023, 06:27:43 PM
To be honest, I may see if I can use it to draft some tedious sections of my next grant proposal.  AI is coming (already here in many ways).  Either use it, or be left behind....

Several months ago I applied for a grant that required me to include a description of the proposed work, ideally written at a tenth grade reading level. They had a reading-level program that assessed it, and you couldn't submit until the program said you were at most at a twelfth grade level. My first attempt came out at a 16th grade level, and when I thought I had hugely simplified it I got to a 14th grade level. Getting that paragraph down to a 12th grade level was really hard. I recently gave ChatGPT my original paragraph and asked it to rewrite it for a high schooler. It did a ... really good job.

I've actually thought that for my students' project reports it would be great if a tool could produce smooth, readable prose, so I can focus on checking that the reports contain all of the required information. Since the projects are unique, the underlying student work will still need to be done, and the students will need to verify that the report explains it correctly.

This is how I suspect the machine will be useful! 
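(Side note: the reading-level gate onthefringe describes is usually something like the Flesch-Kincaid grade formula. Here's a rough sketch of that kind of check in Python; which formula the grant portal actually used is an assumption, and the syllable counter is deliberately crude.)

Code
import re

def count_syllables(word):
    # Very crude: count runs of vowels, with at least one syllable per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    # Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(round(fk_grade("The committee will evaluate the proposed work for feasibility."), 1))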
Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.

Caracal

Quote from: Kron3007 on January 09, 2023, 04:27:27 AM
Quote from: Caracal on January 09, 2023, 04:10:34 AM
Quote from: Kron3007 on January 08, 2023, 06:27:43 PM
I am in a STEM field and when I played around with this it did a pretty good job.  A student could easily use it, add some citations to support the claims, and walk away with a decent grade.  Perhaps it could not do the whole project, but could definitely be used effectively to do some sections.

To be honest, I may see if I can use it to draft some tedious sections of my next grant proposal.  AI is coming (already here in many ways).  Either use it, or be left behind....

I would say that if the thing you are supposed to write is so rote that this could do a good job of it, you might as well go ahead. Ideally it might cause some reevaluation of some of the conventions. In theory, the point of a lit review or whatever is to give the reader a sense of what people have done previously so they can evaluate what the new work contributes and how it fits into previous scholarship. If these sections are just tiresome descriptions of previous studies and a bot could write the thing, can't we just put in some footnotes or endnotes so people who are interested can go look at the studies themselves?

Well, grants are generally reviewed by non-experts so there is a bunch of general intro that is not overly technical.  This is especially true for many of the interdisciplinary grants out there, which seem to be more and more common in my world.  I feel an AI program could do a reasonable draft of this.  It would obviously need revision, but could save time.

Likewise, for my grants, there are always sections about how the research benefits society, etc.  These are not technical sections, and could likely be partially automated.  In fact, the AI may identify some points I would miss, who knows.  There are several sections of proposals like this.

Obviously AI will not write the whole proposal for me, but it could spit out some draft paragraphs to build on.  I have not tried this, so it could be a waste of effort, but it may save time.

In that sense, it really isn't that different from using Endnote or another program to manage citations. You take something that used to be an extremely time-consuming and laborious task and it becomes an automated process. Farhad Manjoo had a good piece in the Times where he argued that despite the predictions, there's no sign that AI is making human workers redundant. The example he used was radiology, where AI is increasingly being used as a tool by radiologists. It can make them more efficient and less likely to miss something, but you still need the people there to make sense of what the AI is flagging and to see things that it misses. The model we seem to use when we think about AI is HAL, but it might be more helpful to just think of it as a jazzed-up version of spellcheck: a really useful tool that can save us a lot of time and prevent dumb mistakes, but not a substitute for proofreading.

marshwiggle

Quote from: Caracal on January 09, 2023, 06:27:10 AM

In that sense, it really isn't that different from using Endnote or another program to manage citations. You take something that used to be an extremely time-consuming and laborious task and it becomes an automated process. Farhad Manjoo had a good piece in the Times where he argued that despite the predictions, there's no sign that AI is making human workers redundant. The example he used was radiology, where AI is increasingly being used as a tool by radiologists. It can make them more efficient and less likely to miss something, but you still need the people there to make sense of what the AI is flagging and to see things that it misses. The model we seem to use when we think about AI is HAL, but it might be more helpful to just think of it as a jazzed-up version of spellcheck: a really useful tool that can save us a lot of time and prevent dumb mistakes, but not a substitute for proofreading.

I think you've gotten at the way that society ultimately reacts to any new labour-saving technology; people quickly identify what it does well, and what things humans still need to do. There will always be a distinction between those two, even though the distinction will change over time.
It takes so little to be above average.

Kron3007

Quote from: Caracal on January 09, 2023, 06:27:10 AM
Quote from: Kron3007 on January 09, 2023, 04:27:27 AM
Quote from: Caracal on January 09, 2023, 04:10:34 AM
Quote from: Kron3007 on January 08, 2023, 06:27:43 PM
I am in a STEM field and when I played around with this it did a pretty good job.  A student could easily use it, add some citations to support the claims, and walk away with a decent grade.  Perhaps it could not do the whole project, but could definitely be used effectively to do some sections.

To be honest, I may see if I can use it to draft some tedious sections of my next grant proposal.  AI is coming (already here in many ways).  Either use it, or be left behind....

I would say that if the thing you are supposed to write is so rote that this could do a good job of it, you might as well go ahead. Ideally it might cause some reevaluation of some of the conventions. In theory, the point of a lit review or whatever is to give the reader a sense of what people have done previously so they can evaluate what the new work contributes and how it fits into previous scholarship. If these sections are just tiresome descriptions of previous studies and a bot could write the thing, can't we just put in some footnotes or endnotes so people who are interested can go look at the studies themselves?

Well, grants are generally reviewed by non-experts so there is a bunch of general intro that is not overly technical.  This is especially true for many of the interdisciplinary grants out there, which seem to be more and more common in my world.  I feel an AI program could do a reasonable draft of this.  It would obviously need revision, but could save time.

Likewise, for my grants, there are always sections about how the research benefits society, etc.  These are not technical sections, and could likely be partially automated.  In fact, the AI may identify some points I would miss, who knows.  There are several sections of proposals like this.

Obviously AI will not write the whole proposal for me, but it could spit out some draft paragraphs to build on.  I have not tried this, so it could be a waste of effort, but it may save time.

In that sense, it really isn't that different from using Endnote or another program to manage citations. You take something that used to be an extremely time-consuming and laborious task and it becomes an automated process. Farhad Manjoo had a good piece in the Times where he argued that despite the predictions, there's no sign that AI is making human workers redundant. The example he used was radiology, where AI is increasingly being used as a tool by radiologists. It can make them more efficient and less likely to miss something, but you still need the people there to make sense of what the AI is flagging and to see things that it misses. The model we seem to use when we think about AI is HAL, but it might be more helpful to just think of it as a jazzed-up version of spellcheck: a really useful tool that can save us a lot of time and prevent dumb mistakes, but not a substitute for proofreading.

Yes, AI has become a basic part of everyone's lives whether we know it or not.  If you use Google Maps, or many other software programs, you are already using it to make life easier. 

I think there are jobs that will be lost, or will require fewer people to do the same amount of work, but that has always been the case and should make us more efficient overall. 

RatGuy

Quote from: Wahoo Redux on January 09, 2023, 05:52:48 AM

Prompt: "murikami barn burning"

Quote
"Barn Burning" is a short story by Japanese-American author Haruki Murakami. It was originally published in The New Yorker in 1983 and later included in Murakami's short story collection "The Elephant Vanishes." The story is about a young man named Kikuji who is struggling with his identity and his relationship with his father, who is a violent and abusive man. The story takes place in Japan and deals with themes of family, loyalty, and the consequences of one's actions. Do you have any other questions about the story?

Maybe it doesn't need to be said, but that summary blends the plot of the Faulkner story with some of the details from Murakami's. The character's name isn't even in that particular story, though it may appear in other stories in the collection (I only teach "Barn Burning"). I've had only one response like that, but I think it's interesting that the AI couldn't stick with just the given author here.

I don't often see students synthesize terms or concepts across their humanities courses, even though my classes are specifically interdisciplinary. So if a student uses a concept in a specific way that differs from the way we've learned it in class, I'm assuming it's a cheat of some sort. Not that I get much of it anymore -- my school saw such an explosion of cheating in Fall 2020 that they've implemented a number of protocols. I assume this latest is on the list to address.

aprof

It's amusing to see the people here stating: "Text generation software will never pass muster in my class.  I have already caught the student who used AI to create their last writing assignment."

What about the students who did use AI for all or part of their submission but were undetected? Reddit and other places are filled with students' accounts of using the tool for their assignments to varying degrees. How can you be certain it was just one case?

Also, like a few others have noted, I have considered using ChatGPT to generate summaries or other rote bits of text for my own work.  It quickly creates a nice skeleton that can then be edited.
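
For anyone curious what "generate a skeleton" looks like in practice, here's a minimal sketch using the OpenAI Python client (the v1+ interface); the model name and prompt are placeholders, not a recommendation.

Code
# Minimal sketch: ask the API for a rough first draft to edit by hand.
# Assumes the openai package (>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def draft_summary(notes: str) -> str:
    """Return a short lay-summary draft based on some rough notes."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; use whatever model you have access to
        messages=[
            {"role": "system", "content": "You draft plain-language summaries of research."},
            {"role": "user", "content": f"Write a one-paragraph lay summary of: {notes}"},
        ],
    )
    return response.choices[0].message.content

print(draft_summary("how soil microbes affect drought tolerance in wheat"))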


Caracal

Quote from: aprof on January 09, 2023, 06:54:45 AM
It's amusing to see the people here stating: "Text generation software will never pass muster in my class.  I have already caught the student who used AI to create their last writing assignment."

What about the students who did use AI for all or part of their submission but were undetected? Reddit and other places are filled with students' accounts of using the tool for their assignments to varying degrees. How can you be certain it was just one case?

Also, like a few others have noted, I have considered using ChatGPT to generate summaries or other rote bits of text for my own work.  It quickly creates a nice skeleton that can then be edited.

I'm sure I miss plagiarism all the time. I grade a lot of stuff, and a lot of it is confusing and bad. When something makes me suspicious, I do a little investigating, but if nothing turns up, I get on with the grading. I'm a teacher, not an academic fraud investigator. I try to assign papers that aren't easy to plagiarize; that might make it easier to spot cheating, but it also just means that students who lift stuff off the internet are probably going to do something that doesn't meet the assignment parameters and doesn't make much sense. I'm sure I miss some of this stuff because plenty of students who aren't plagiarizing also turn things in that don't make any sense.

I'm pretty sure most of the time students aren't getting decent grades when they plagiarize, but I'm sure I've missed more sophisticated, better-crafted plagiarism. I think that's rare because plagiarism is mostly a tool of the lazy and the desperate, but there are exceptions. I tend to think these students are mostly just cheating themselves and probably could have written an equally good paper in the time it took them to cheat well.

ChatGPT doesn't change this basic dynamic. I'm sure some students might use it and it won't be obvious in a way that I can detect. Most of them will get bad grades anyway. It would be a pretty inefficient way to try to write an actually good paper; with enough work and skill someone could manage B-range work, but those students would probably be better served to take that time and write their own mediocre paper.

MarathonRunner

So I asked it a question in my field. It did great at defining the concept, less great at options we can use to address the problems arising from this concept. It brought in a lot of extraneous information that doesn't really address the core problem that arises from this concept.

At some point I'll have to throw a case study at it. I use those a lot when teaching in my field.

marshwiggle

Quote from: MarathonRunner on January 09, 2023, 08:01:49 AM
So I asked it a question in my field. It did great at defining the concept, less great at options we can use to address the problems arising from this concept. It brought in a lot of extraneous information that doesn't really address the core problem that arises from this concept.

It just occurred to me that someone could use this as a teaching technique. Give the prompt to the AI, and then go over the response with the class, highlighting what it got right, as well as what it got wrong and/or what sort of analysis it missed.
It takes so little to be above average.

Kron3007

Quote from: MarathonRunner on January 09, 2023, 08:01:49 AM
So I asked it a question in my field. It did great at defining the concept, less great at options we can use to address the problems arising from this concept. It brought in a lot of extraneous information that doesn't really address the core problem that arises from this concept.

At some point I'll have to throw a case study at it. I use those a lot when teaching in my field.

Yes, they definitely have their limits (for now) but could still be useful for some sections of a written project.  I would also note that the output will depend heavily on the input.  Just as some people are better at internet searches because they know what terms will work best...


Wahoo Redux

I used to teach business writing, which has a lot of formulaic genres (letters, reports, memos, brochures, one-pagers).  Blackboard has an app that can search for plagiarism when students turn their work in on that CMS.  The plagiarism app is not a great detective, but it cuts down on my work, as AI should.  Because it is not great, though, it flags all sorts of "internet source" material that could come from a great many places----often the wording is very close to, but not verbatim from, a website or two.  I have generally assumed that students were just being lazy and editing their cut-n-pastes, a gray area but probably okay, maybe even appropriate, for actual, real-world business writing.  Weirdly, the app will also find an "institutional source"----another student paper also turned in on Blackboard----with very similar wording.  I have only had one confirmed, unequivocal plagiarism case: a brother and sister who both turned in the same letter generated by AI.  It turns out they were confused and used a program linked to an internet example I had posted on Blackboard; they saw the link and thought they were supposed to use the program to write the letter.  Duh. 

But now I am wondering whether a number of assignments were written, or helped along, by some kind of AI, particularly the ones with similar but not exact wording. 
Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.

Kron3007

Quote from: Wahoo Redux on January 09, 2023, 08:42:37 AM
I used to teach business writing, which has a lot of formulaic genres (letters, reports, memos, brochures, one-pagers).  Blackboard has an app that can search for plagiarism when students turn their work in on that CMS.  The plagiarism app is not a great detective, but it cuts down on my work, as AI should.  Because it is not great, though, it flags all sorts of "internet source" material that could come from a great many places----often the wording is very close to, but not verbatim from, a website or two.  I have generally assumed that students were just being lazy and editing their cut-n-pastes, a gray area but probably okay, maybe even appropriate, for actual, real-world business writing.  Weirdly, the app will also find an "institutional source"----another student paper also turned in on Blackboard----with very similar wording.  I have only had one confirmed, unequivocal plagiarism case: a brother and sister who both turned in the same letter generated by AI.  It turns out they were confused and used a program linked to an internet example I had posted on Blackboard; they saw the link and thought they were supposed to use the program to write the letter.  Duh. 

But now I am wondering whether a number of assignments were written, or helped along, by some kind of AI, particularly the ones with similar but not exact wording.

The real challenge with AI-generated text is proving it, assuming you can detect it.  From what I have seen of AI text, the wording and structure would seem a little odd for a student in my class, so perhaps my gut would tell me there is something off (or I could completely miss it), but that would be the end of it since there is no clear way to test it.  Plagiarism is easier since you can simply place the original next to it and clearly show that it was lifted.  AI-generated text is indeed original, and I don't see how you would even be able to prove that it wasn't written by the student. 

There have been text "re-writers" around for a while.  These programs simply re-word a paragraph to avoid plagiarism detection.  Students are allowed to re-word text and include it in a paper to some extent (this is the basis for a review paper).  Is using AI to re-word a section actually breaking any rules?   


RatGuy

I wonder: if multiple students gave ChatGPT the same prompt -- say, the prompt from the assignment sheet -- would the responses be similar enough to be flagged by Turnitin or other plagiarism detection? Even if the students subsequently attempted to personalize the AI-generated response, would the AI, given similar prompts, provide significantly similar answers?

the_geneticist

Quote from: RatGuy on January 10, 2023, 11:59:55 AM
I wonder: if multiple students gave ChatGPT the same prompt -- say, the prompt from the assignment sheet -- would the responses be similar enough to be flagged by Turnitin or other plagiarism detection? Even if the students subsequently attempted to personalize the AI-generated response, would the AI, given similar prompts, provide significantly similar answers?

We can test it as a team on the Fora :)
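
If anyone wants a quick offline check first, here's a minimal sketch that compares saved responses pairwise with Python's difflib. The file names are made up, and this is plain string matching, not whatever Turnitin actually does.

Code
# Minimal sketch: pairwise similarity of several responses to the same prompt.
# File names are placeholders; difflib's ratio runs from 0.0 (no overlap) to 1.0 (identical).
from difflib import SequenceMatcher
from itertools import combinations

files = ["response_a.txt", "response_b.txt", "response_c.txt"]
texts = {name: open(name, encoding="utf-8").read() for name in files}

for (name1, text1), (name2, text2) in combinations(texts.items(), 2):
    ratio = SequenceMatcher(None, text1, text2).ratio()
    print(f"{name1} vs {name2}: {ratio:.2f}")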

Wahoo Redux

How about this prompt:

"What is Microsoft's AI online help system so bad?"

Quote
There are a few reasons why some users might find Microsoft's AI online help system to be lacking. One reason could be that the system may not always provide accurate or helpful responses to users' questions. Additionally, the system might not be able to understand the user's intent or question due to the complexity of natural language. It also might be hard for the user to find specific information or feature that they are looking for in the documentation.

Another reason could be that the system might not be designed to be user-friendly, making it difficult for users to navigate and find the information they need. Additionally, the system may not be kept up-to-date with the latest product information, so users may not be able to find answers to their questions.

That being said, it's important to note that many people find Microsoft's AI online help system to be useful and it is actively improved by Microsoft. If you are experiencing issues and not satisfied with the answers you are getting it may worth to reach out Microsoft support team or to see if there are any alternative resources available.

Well, this is interesting.  There is no "support team."  MS "help" is all automated online.

Somebody is not up to date...
Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.