A whole new ballgame in cheating. Introducing ChatGPT

Started by Diogenes, December 08, 2022, 02:48:37 PM


Zeus Bird

Quote from: ciao_yall on May 11, 2023, 08:28:15 AM
Quote from: Wahoo Redux on May 11, 2023, 06:54:23 AM
CHE: Why I'm Excited About ChatGPT

Lower Deck:
Quote
Here are 10 ways ChatGPT will be a boon to first-year writing instruction, Jennie Young writes.

Holy sheep.

The IHE author gives the ball game away in her item #1: existing and undeniable socioeconomic inequalities are not simply recognized as the context in which faculty do their teaching. Rather, according to her, these inequities should actually determine our academic assessment of students' work. I find this attitude increasingly common among those composition instructors who think that all courses should be de facto composition courses.

secundem_artem

I attended a webinar on ChatGPT the other day and saw some applications for it that I would not have thought of. Afterwards, I asked it to create an outline for a course on clog dancing. It did so - 10 modules with topics provided for each module. I then asked it to take one module and turn it into a three-week mini-course. Same deal. I asked it to recommend a textbook - got 5 reasonable recommendations. Said no textbooks, peer-reviewed articles instead - again, reasonable recommendations.

I doubt this thing will be a game changer, but it could well save some time.  With minimal input, it can write LORs, long emails, and who knows what else.  The webinar leader had it writing case studies.
Funeral by funeral, the academy advances

permanent imposter

Latest article from the Chronicle: "I'm a Student: You Have No Idea How Much We're Using ChatGPT."

I can't get past the paywall right now, but I'm curious about the initial paragraph of this article as it seems to contradict what has been said here:

Quote
There's a remarkable disconnect between how those with influence over education systems –– teachers, professors, administrators –– think students use generative AI on written work and how we actually use it. As a student, the assumption I've encountered among authority figures is that if an essay is written with the help of ChatGPT, there will be some sort of evidence –– the software has a distinctive "voice," it can't make very complex arguments (yet), and there are programs that claim to detect AI output.

That is, of course, assuming the article goes on to say that chatbots can make complex arguments. Maybe it doesn't. All of this just makes me feel very exhausted, and I dislike the title of the article already. Why not just... not use it? Just honor the work? Do work that is meaningful and helps you learn? (I know the answer to these questions.)

Hibush

Quote from: permanent imposter on May 12, 2023, 12:52:43 PM
Latest article from the Chronicle: "I'm a Student: You Have No Idea How Much We're Using ChatGPT."

I can't get past the paywall right now, but I'm curious about the initial paragraph of this article as it seems to contradict what has been said here:

Quote
There's a remarkable disconnect between how those with influence over education systems –– teachers, professors, administrators –– think students use generative AI on written work and how we actually use it. As a student, the assumption I've encountered among authority figures is that if an essay is written with the help of ChatGPT, there will be some sort of evidence –– the software has a distinctive "voice," it can't make very complex arguments (yet), and there are programs that claim to detect AI output.

That is, of course, assuming the article goes on to say that chatbots can make complex arguments. Maybe it doesn't. All of this just makes me feel very exhausted, and I dislike the title of the article already. Why not just... not use it? Just honor the work? Do work that is meaningful and helps you learn? (I know the answer to these questions.)

The article is by a Columbia student - the motivated, competitive type, so perhaps with different motivations from the non-learners. He uses it to find references, check out different ways to engage with the argument, and test out different ways to put together good paragraphs. All the time thinking and learning. Pretty much using it as a learning aid, the way we hope it works out.

Sun_Worshiper

Quote from: secundem_artem on May 12, 2023, 08:35:22 AM
I attended a webinar on ChatGPT the other day and saw some applications for it that I would not have thought of. Afterwards, I asked it to create an outline for a course on clog dancing. It did so - 10 modules with topics provided for each module. I then asked it to take one module and turn it into a three-week mini-course. Same deal. I asked it to recommend a textbook - got 5 reasonable recommendations. Said no textbooks, peer-reviewed articles instead - again, reasonable recommendations.

I doubt this thing will be a game changer, but it could well save some time.  With minimal input, it can write LORs, long emails, and who knows what else.  The webinar leader had it writing case studies.

Yes, it is a helpful tool for streamlining all sorts of big and little tasks.


permanent imposter

Thanks for reading the article, Hibush, glad I got worked up for nothing :)

Caracal

Quote from: permanent imposter on May 12, 2023, 12:52:43 PM

That is, of course, assuming the article goes on to say that chatbots can make complex arguments. Maybe it doesn't. All of this just makes me feel very exhausted, and I dislike the title of the article already. Why not just... not use it? Just honor the work? Do work that is meaningful and helps you learn? (I know the answer to these questions.)

I suspect that even a quite good undergraduate might not really be able to understand what ChatGPT is actually doing. What it mostly does is lift arguments from somewhere on the internet without much discernment, because, well, it can't discern - it's a bot. As an advisor of mine once said about a book we read in class, "the parts that are true aren't interesting and the parts that are interesting aren't true."

ciao_yall

In my experience, students took it for granted that computerized resources such as autocorrect, Wikipedia, Google Translate, etc., were correct and reliable. So if they tried to use ChatGPT, I'm not sure how many would think critically about its product, much less feel they could correct, improve, or adapt it in any way.



Kron3007

Quote from: ciao_yall on May 14, 2023, 11:14:08 AM
In my experience, students took it for granted that computerized resources such as autocorrect, Wikipedia, Google Translate, etc., were correct and reliable. So if they tried to use ChatGPT, I'm not sure how many would think critically about its product, much less feel they could correct, improve, or adapt it in any way.

Sounds like a great teachable moment! This is exactly why tools like this should be integrated rather than banned.

ciao_yall

Quote from: Kron3007 on May 15, 2023, 03:14:07 AM
Quote from: ciao_yall on May 14, 2023, 11:14:08 AM
In my experience, students took it for granted that computerized resources such as autocorrect, Wikipedia, Google Translate, etc., were correct and reliable. So if they tried to use ChatGPT, I'm not sure how many would think critically about its product, much less feel they could correct, improve, or adapt it in any way.

Sounds like a great teachable moment! This is exactly why tools like this should be integrated rather than banned.

Agreed.

During my doctoral program I critiqued an article that my program director had recommended. He turned beet red. "It's peer-reviewed." So... I'm not allowed to question the method or findings?

apl68

The announcement of the closing of Medaille University on the "Dire Straits" thread prompted me to do a quick online search for information on that school.  A local radio station report about Medaille's closing struck me as one of the worst-written news articles I've ever seen.  The reporter came across as barely literate.  Was that a case of somebody using an AI to write the article, or would an AI have at least gotten it all grammatically correct?
For our light affliction, which is only for a moment, works for us a far greater and eternal weight of glory.  We look not at the things we can see, but at those we can't.  For the things we can see are temporary, but those we can't see are eternal.

arcturus

Quote from: apl68 on May 16, 2023, 08:19:09 AM
The announcement of the closing of Medaille University on the "Dire Straits" thread prompted me to do a quick online search for information on that school.  A local radio station report about Medaille's closing struck me as one of the worst-written news articles I've ever seen.  The reporter came across as barely literate.  Was that a case of somebody using an AI to write the article, or would an AI have at least gotten it all grammatically correct?
My experience with AI-composed "student" work is that it is grammatically correct, but not necessarily logically correct. For example, consider a critique of a famous artwork (not the actual assignment). The student's response started by describing item A and morphed in the middle of the sentence into describing item B. The sentence was constructed correctly, but was complete nonsense in terms of content. While I don't have proof that this was the work of ChatGPT, no human would have made such an error. This is along the lines of the stories from the early days of GPS-assisted directions, which would occasionally send drivers into dead ends, not recognizing the impassable objects between them and their destination. Improvements in the algorithms will eventually resolve these issues too.

downer

Have there been any estimates of what proportion of college and high school students use ChatGPT to do their writing for them, whether as a first draft or as a final draft? I guess that in the next few years, entering classes of college students will have large proportions of students who have mainly used ChatGPT for their writing.

I was looking on Reddit today at a post by someone who was planning on using ChatGPT to write a memorial for their dead father.
"When fascism comes to America, it will be wrapped in the flag and carrying a cross."—Sinclair Lewis

spork

Quote from: apl68 on May 16, 2023, 08:19:09 AM
The announcement of the closing of Medaille University on the "Dire Straits" thread prompted me to do a quick online search for information on that school.  A local radio station report about Medaille's closing struck me as one of the worst-written news articles I've ever seen.  The reporter came across as barely literate.  Was that a case of somebody using an AI to write the article, or would an AI have at least gotten it all grammatically correct?

Gell-Mann amnesia. Most journalism is trash.
It's terrible writing, used to obfuscate the fact that the authors actually have nothing to say.

Caracal

Quote from: ciao_yall on May 15, 2023, 08:22:44 AM
Quote from: Kron3007 on May 15, 2023, 03:14:07 AM
Quote from: ciao_yall on May 14, 2023, 11:14:08 AM
In my experience, students took it for granted that computerized resources such as autocorrect, Wikipedia, Google Translate, etc., were correct and reliable. So if they tried to use ChatGPT, I'm not sure how many would think critically about its product, much less feel they could correct, improve, or adapt it in any way.

Sounds like a great teachable moment! This is exactly why tools like this should be integrated rather than banned.

Agreed.

During my doctoral program I critiqued an article that my program director had recommended. He turned beet red. "It's peer-reviewed." So... I'm not allowed to question the method or findings?

That's very strange. In my field, what you're mostly learning to do as a grad student is criticize everything. It eventually becomes a bit of a tic - you don't have to tear everything down - but it's a crucial part of learning how to do your own work.