A whole new ballgame in cheating. Introducing ChatGPT

Started by Diogenes, December 08, 2022, 02:48:37 PM


Stockmann

Quote from: the_geneticist on May 01, 2023, 12:22:11 PM
... the program didn't watch the video, look at the dataset, read the protocol, etc

So it's already caught up with some students.


Quote from: quasihumanist on April 29, 2023, 09:07:48 AM
Quote from: Stockmann on April 29, 2023, 06:28:07 AM
Quote from: Caracal on April 29, 2023, 05:13:12 AM
This is a good piece. https://www.washingtonpost.com/opinions/2023/04/24/artificial-intelligence-consciousness-thinking/

I especially liked this part: "Chat is like a person who is barely paying attention, which is understandable because — being unconscious — Chat is definitely not paying attention. It couldn't possibly be paying attention." This really helps to explain why Chat actually writes like many of my students. Many of them are also not paying attention.


And if those students were paying attention, perhaps they'd worry that AI is doing the same thing they do - but cheaper and faster. It used to be that skilled white-collar workers mostly benefited from automation (although PCs could replace basic secretarial work), but that era seems to have come to an end, and things will not be better by the time these students graduate. Combine that with grade and degree inflation (the US used to teach Latin in HS instead of having remedial English in college), and college costs, and the weaker students should be really, really worried.

I think we'll soon have a society where half of us are disabled.

Maybe that will happen in the medium term. In the short term, there are plenty of crappy jobs - it's the in-between, solidly middle-class or upper-middle-class white-collar jobs that seem to be getting hollowed out right now. Low-ranking white-collar jobs may well go the way that repetitive jobs requiring lots of physical strength but no real skill or dexterity went during the early Industrial Revolution.

Caracal

Quote from: Stockmann on April 29, 2023, 06:28:07 AM
Quote from: Caracal on April 29, 2023, 05:13:12 AM
This is a good piece. https://www.washingtonpost.com/opinions/2023/04/24/artificial-intelligence-consciousness-thinking/

I especially liked this part: "Chat is like a person who is barely paying attention, which is understandable because — being unconscious — Chat is definitely not paying attention. It couldn't possibly be paying attention." This really helps to explain why Chat actually writes like many of my students. Many of them are also not paying attention.


And if those students were paying attention, perhaps they'd worry that AI is doing the same thing they do - but cheaper and faster. It used to be that skilled white-collar workers mostly benefited from automation (although PCs could replace basic secretarial work), but that era seems to have come to an end, and things will not be better by the time these students graduate. Combine that with grade and degree inflation (the US used to teach Latin in HS instead of having remedial English in college), and college costs, and the weaker students should be really, really worried.

Maybe, but it's hard to predict how this stuff will actually work out.

In terms of teaching, the thing I am planning to do differently in my upper level class in the fall is to just fail more papers and offer rewrites. I assign essays that require students to engage with primary sources they find themselves in particular databases. I always get students who just completely fail to understand the assignment and produce something that isn't at all what I asked for - usually just a mess of stuff vaguely related to the topic, barely mentioning the primary source, and mostly drawn from various crummy websites (with citations of some sort, so not a plagiarism issue). My standard approach to this is to just throw up my hands and give them a C-. After all, they produced something...

I'm not defending this on pedagogical grounds. I teach a lot of students, and they come into my upper level classes with wildly varying levels of preparation. That's hard to manage when I'm teaching courses that are designed to focus on content, not on teaching writing skills. The problem is that these crummy essays, unresponsive to the instructions, are exactly the kind of thing that AI programs could create. And even if the students are creating these things themselves, do I really want to give them credit and allow them to pass the course just because they put some words on the page that have some relation to the topic? If you can't produce something that I can distinguish from the work of a bot that didn't read the instructions, is that worth any points? So I think I'll just start failing this stuff and allow students to rewrite it with as much feedback and help as they request. I doubt I'm going to get anything particularly good from these rewrites, but I want to make sure that the only things earning passing grades are at least bad versions of what I actually asked for on the assignment.

apl68

Quote from: Caracal on May 03, 2023, 05:42:28 AM
I'm not defending this on pedagogical grounds. I teach a lot of students, and they come into my upper level classes with wildly varying levels of preparation. That's hard to manage when I'm teaching courses that are designed to focus on content, not on teaching writing skills. The problem is that these crummy essays, unresponsive to the instructions, are exactly the kind of thing that AI programs could create. And even if the students are creating these things themselves, do I really want to give them credit and allow them to pass the course just because they put some words on the page that have some relation to the topic? If you can't produce something that I can distinguish from the work of a bot that didn't read the instructions, is that worth any points? So I think I'll just start failing this stuff and allow students to rewrite it with as much feedback and help as they request. I doubt I'm going to get anything particularly good from these rewrites, but I want to make sure that the only things earning passing grades are at least bad versions of what I actually asked for on the assignment.

Students are really going to need their teachers to hold the line on this.  If they are allowed to slide through to a passing grade with bot-generated junk responses to assignments, then, as Stockmann says, they're going to be in big trouble when they get into the world of work.  And if universities allow that to happen on a large scale, the complaints from employers that new college grads don't have the skills they want--and the skepticism about the value of colleges and college education--are going to grow.

the_geneticist

Quote from: marshwiggle on May 02, 2023, 05:28:57 AM
Quote from: the_geneticist on May 01, 2023, 12:22:11 PM
I've found at least two students using ChatGPT on their online makeup lab assignments. That's an easy 0 for the grade book. 
The mismatch between the questions and answers makes it pretty darn obvious since the program didn't watch the video, look at the dataset, read the protocol, etc

But, to be fair, it probably didn't spend the evening before the assignment was due drinking and smoking weed.

I don't think the students were doing those things. I think they gambled on ChatGPT being a faster way to complete the assignment, thought the answers seemed reasonable, and learned a lesson the hard way.

I teach freshmen.  They see using things like Chegg, ChatGPT, and Discord as 'collaborating' or 'using a resource' rather than cheating.  They can't tell that the answers they are getting are wrong/incomplete/don't actually answer the question.
They don't know enough to catch factual mistakes and are easily fooled by the quasi-professional prose.


Kron3007

Quote from: the_geneticist on May 03, 2023, 07:24:28 AM
Quote from: marshwiggle on May 02, 2023, 05:28:57 AM
Quote from: the_geneticist on May 01, 2023, 12:22:11 PM
I've found at least two students using ChatGPT on their online makeup lab assignments. That's an easy 0 for the grade book. 
The mismatch between the questions and answers makes it pretty darn obvious since the program didn't watch the video, look at the dataset, read the protocol, etc

But, to be fair, it probably didn't spend the evening before the assignment was due drinking and smoking weed.

I don't think the students were doing those things. I think they gambled on ChatGPT being a faster way to complete the assignment, thought the answers seemed reasonable, and learned a lesson the hard way.

I teach freshmen.  They see using things like Chegg, ChatGPT, and Discord as 'collaborating' or 'using a resource' rather than cheating.  They can't tell that the answers they are getting are wrong/incomplete/don't actually answer the question.
They don't know enough to catch factual mistakes and are easily fooled by the quasi-professional prose.

Well, the reality is that AI is now a resource, so they are not wrong.  As mentioned upthread, calculators help you cheat as well, but they have become integral tools in math classes across the world.

This is not much different, and as with calculators, you still need to understand the material to ensure the numbers/words the tool spits out make sense for the problem at hand.  We will need to shift teaching to recognize this new reality, since this is a tool they will have access to moving forward.

Caracal

Quote from: Kron3007 on May 07, 2023, 02:29:32 AM
Quote from: the_geneticist on May 03, 2023, 07:24:28 AM
Quote from: marshwiggle on May 02, 2023, 05:28:57 AM
Quote from: the_geneticist on May 01, 2023, 12:22:11 PM
I've found at least two students using ChatGPT on their online makeup lab assignments. That's an easy 0 for the grade book. 
The mismatch between the questions and answers makes it pretty darn obvious since the program didn't watch the video, look at the dataset, read the protocol, etc

But, to be fair, it probably didn't spend the evening before the assignment was due drinking and smoking weed.

I don't think the students were doing those things. I think they gambled on ChatGPT being a faster way to complete the assignment, thought the answers seemed reasonable, and learned a lesson the hard way.

I teach freshmen.  They see using things like Chegg, ChatGPT, and Discord as 'collaborating' or 'using a resource' rather than cheating.  They can't tell that the answers they are getting are wrong/incomplete/don't actually answer the question.
They don't know enough to catch factual mistakes and are easily fooled by the quasi-professional prose.

Well, the reality is that AI is now a resource, so they are not wrong.  As mentioned upthread, calculators help you cheat as well, but they have become integral tools in math classes across the world.

This is not much different, and as with calculators, you still need to understand the material to ensure the numbers/words the tool spits out make sense for the problem at hand.  We will need to shift teaching to recognize this new reality, since this is a tool they will have access to moving forward.

I'm with you, and I actually don't think it requires too much of a shift in the teaching of writing. If I were teaching writing now, I would absolutely be generating essays with AI bots for students to critique. I always did lots of workshopping of essays in class, but students tend to focus on the sentence-level grammar and clarity problems in each other's essays instead of the bigger argument and structure issues that are present in even solid first drafts. It's sort of like coming upon a car smoking on the side of the road and walking around it pointing out dings and scratches and saying "that'll hurt the resale value." It might be quite useful to give students essays that don't have technical mistakes but are still terrible. And, of course, it would help them understand the limits of AI writing.

But the problem is that outside of elite schools, there just isn't enough investment in teaching writing. There's nothing that complicated about teaching academic writing. You just need classes that are focused on the writing, not the content, which means students can write a bunch of papers, and for each one of those papers they have to submit a series of drafts and revise them. But without detailed feedback from the instructor on all of these drafts, it won't work, and you can't expect anyone to do that if you don't allocate the resources.


quasihumanist

Quote from: Caracal on May 07, 2023, 05:16:30 AM
But the problem is that outside of elite schools, there just isn't enough investment in teaching writing. There's nothing that complicated about teaching academic writing. You just need classes that are focused on the writing, not the content, which means students can write a bunch of papers, and for each one of those papers they have to submit a series of drafts and revise them. But without detailed feedback from the instructor on all of these drafts, it won't work, and you can't expect anyone to do that if you don't allocate the resources.

Also, outside of elite schools, students come in with worse writing skills and perhaps worse writing aptitude, so you need even more resources.

You also need more resources from the student (as in they need to put in more time and effort), so some of the students may decide it's not worth putting in those resources themselves.  Then your university has to decide between losing the students or having inadequate standards, even if the university can and wishes to put in the resources.

Hibush

Quote from: quasihumanist on May 07, 2023, 04:40:26 PM
... your university has to decide between losing the students or having inadequate standards, even if the university can and wishes to put in the resources.

Do you consider choosing the latter to be academic misconduct on the part of the university? Even if the trustees consider the former to be fiscal misconduct.

quasihumanist

Quote from: Hibush on May 08, 2023, 05:02:23 AM
Quote from: quasihumanist on May 07, 2023, 04:40:26 PM
... your university has to decide between losing the students or having inadequate standards, even if the university can and wishes to put in the resources.

Do you consider choosing the latter to be academic misconduct on the part of the university? Even if the trustees consider the former to be fiscal misconduct.

The median degree from a median university has always been meaningless.  I don't know why we pretend otherwise.

But, to answer your question - yes, I consider choosing the latter to be academic misconduct.  Then again, if we ran the world according to my ethical standards, everyone, myself included, would've already committed hara-kiri.

apl68

Had an interesting AI experience yesterday in Sunday School, of all places.  Our teacher had left his phone on beside him while he conducted the lesson using print materials.  He had left his Siri assistant turned on.  During the lesson, the word "Syria"--ancient Aram--came up a couple of times.  Twice his mentions of Syria activated his phone's Siri.  "She" kept calling out that she was responding to his queries for information regarding whatever the last few words he had spoken happened to be.  A nice reminder that artificial intelligence is certainly artificial, but not at all intelligent.

Wahoo Redux

Quote from: Caracal on May 07, 2023, 05:16:30 AM
But the problem is that outside of elite schools, there just isn't enough investment in teaching writing. There's nothing that complicated about teaching academic writing. You just need classes that are focused on the writing, not the content, which means students can write a bunch of papers, and for each one of those papers they have to submit a series of drafts and revise them. But without detailed feedback from the instructor on all of these drafts, it won't work, and you can't expect anyone to do that if you don't allocate the resources.

My wife's school is looking to cut one of the freshman comp classes, going down to one writing class instead of two.  The English faculty suggested a writing-across-the-curriculum tenure line, but it is not clear the admin will go for that.

Stockmann

Quote from: quasihumanist on May 08, 2023, 08:28:29 AM
Quote from: Hibush on May 08, 2023, 05:02:23 AM
Quote from: quasihumanist on May 07, 2023, 04:40:26 PM
... your university has to decide between losing the students or having inadequate standards, even if the university can and wishes to put in the resources.

Do you consider choosing the latter to be academic misconduct on the part of the university? Even if the trustees consider the former to be fiscal misconduct.

The median degree from a median university has always been meaningless.  I don't know why we pretend otherwise.

It does seem to make a difference to employers for white-collar positions. But I expect that won't be the case forever, or at least that the employability gap between the median degree from the median college and a HS diploma will narrow - that employers will increasingly focus on other qualifications and experience instead of treating a college degree as a requirement. You can't have falling standards and increased tolerance for snowflakery on the one hand, and soaring costs on the other, forever, and not expect changes.

spork

Quote from: jimbogumbo on May 05, 2023, 02:28:51 PM
https://www.cnn.com/2023/05/05/investing/chatgpt-outperforms-investment-funds/index.html

Index funds also outperform human fund managers, but that story reminds me of 1) the rise of automated trading in stocks and then bonds/derivatives, which put a lot of human traders out of work (no big loss, in my biased opinion), and 2) the financial industry's reliance on very complex but faulty risk models that most people didn't understand -- which contributed to the crisis of 2008.

It's terrible writing, used to obfuscate the fact that the authors actually have nothing to say.