A whole new ballgame in cheating. Introducing ChatGPT

Started by Diogenes, December 08, 2022, 02:48:37 PM

downer

I realized that ChatGPT will be able to help me write my multiple choice questions, and I'm happy about that.

None of the places I teach has really done anything about AI or cheating. Not all students will cheat, because some have integrity and some are not capable of working out how to. Cheating has always been with us, but at this point my guess is that it is more widespread, especially in online courses. I imagine there will soon be some statistics that will get headlines about how prevalent it is.

There are steps we can take to reduce it, but as proving cases of cheating becomes ever more time consuming, it just won't be feasible to pursue them. Maybe AI will be able to work as a cheating detector too, as Turnitin says it can, but it remains to be seen how good it will be.

Quite a few professors will not make any big changes to how they teach, and will just keep their heads in the sand about the issue, especially when there's no external pressure or incentive to change.

At some point, especially if we can't get AI to fight cheating effectively, there is going to be a crisis of confidence about any work students do outside of the classroom. This will hit some disciplines harder than others. It may simply no longer be feasible to have online courses or online work at all.

Then there's the more distant future: why will we need college education when the robots will be doing everyone's jobs anyway?
"When fascism comes to America, it will be wrapped in the flag and carrying a cross."—Sinclair Lewis

Caracal

Quote from: Parasaurolophus on March 24, 2023, 10:35:45 PM
Quote from: jerseyjay on March 24, 2023, 05:15:53 PM

Well, yes, but isn't that the point? Each field has its own "technical terms with precise meanings" that do not really align with the way other fields use the words. To say that only my field understands something is somewhat strange. (Unless the argument--as it were--is, only my field understands my field, which is pretty much a tautology.)


I think that's right when different fields have different definitions of some term. So, for example, a 'disjunction' in logic is a truth-functional connective that translates the ordinary language term 'or' (we then go on to give the truth-conditions for inclusive and exclusive disjunctions). In linguistics, however, it's an adverbial adjunct that expresses unnecessary content.
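A quick way to see the difference between the two disjunctions is to tabulate them; this Python snippet (purely illustrative, not from the post) prints both truth tables:

```python
# Inclusive vs. exclusive disjunction, as truth tables.
from itertools import product

def inclusive_or(p, q):
    # True whenever at least one disjunct is true ('and/or').
    return p or q

def exclusive_or(p, q):
    # True only when exactly one disjunct is true ('either...or, but not both').
    return p != q

for p, q in product([True, False], repeat=2):
    print(f"{p!s:5} {q!s:5} inclusive={inclusive_or(p, q)!s:5} "
          f"exclusive={exclusive_or(p, q)}")
```

The two connectives agree everywhere except the row where both disjuncts are true.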

But that's not what's going on with 'argument'. It's not that English/literature, history, sociology, philosophy, etc. all have different definitions of what an argument is. It's that most disciplines operate with an intuitive sense of what the word means, rather than adhering to its definition. And, unfortunately, those intuitions are driven by use (and misuse), which don't track its actual meaning. Its actual meaning is preserved in philosophy because we have an entire subfield (logic) which is the formal study of argumentation, and because everyone who gets a philosophy degree of any stripe has to pass some sort of logic class. And, indeed, because the whole point of the discipline is the construction and evaluation of arguments.

What is an argument? Quite simply, it's a set of statements, at least one of which (a premise) is offered in support of at least one other (the conclusion). A statement is any truth-evaluable sentence (any sentence that can be either true or false; in other words, a declarative sentence). As far as I (/we) can see, that just is what an argument is, for everyone. It's just that it's a commonly misused term, and the reason it's misused by others but not by us is just that we have significant disciplinary reasons to care about it, whereas others have stronger reasons to care about other stuff. It's not that we have special access to this special term. We certainly do have plenty of such terms, but 'argument' is a different case.
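That definition is concrete enough to model directly; here is a minimal sketch (the class name and the example are mine, not the poster's):

```python
# An argument: premises offered in support of a conclusion,
# where each statement is a declarative (truth-evaluable) sentence.
from dataclasses import dataclass

@dataclass
class Argument:
    premises: list    # declarative sentences offered in support
    conclusion: str   # the statement they are offered in support of

# The stock example:
socrates = Argument(
    premises=["All men are mortal.", "Socrates is a man."],
    conclusion="Socrates is mortal.",
)
```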

The reason I'm exercised about it here is dialectical. The claim was that "Postmodernism has probably lowered peoples' standards for what counts as an 'argument' that all kinds of drivel will pass." And my point was that if you want to talk about 'what counts as an argument', you're shifting to a level of discourse where the term's actual meaning is quite important. But then, if you look around, you'll find that very few people actually know the term's meaning and how to apply it correctly (seriously, I have hundreds of intro students to prove it!), and you have to concede that this situation has nothing to do with postmodernism. It's just that most of the time we use the term unreflectively. And most of the time that's perfectly fine, because nobody is applying strict standards to our discourse. And that's true across most of academia, because most disciplines are more concerned with other stuff.

That's fair. I suppose I do the thing where I describe something as "not an argument" when what I really mean is that it's not a good argument or not a useful argument.

For an academic argument to have any purpose in my field, it needs to be something that people could dispute. Many arguments fail this test because they are too vague.
"The Civil War continues to raise troubling questions"
"The rise of Jackson can be explained by social and economic factors."
"Theories of learning have influenced how society thinks about education since the dawn of time."

Other arguments fail this test because they make a statement that almost nobody could disagree with.
"Gettysburg was an important battle in the Civil War"
"Women have made many contributions to physics."
"Plato was an influential figure in the field of philosophy."

You should be able to see the counterargument very easily or it isn't an argument worth making. Probably it's a bit sloppy to call them non-arguments, but in effect they are. When I see theses like these, I know that the paper is just going to sort of wander around, vaguely pointing to pieces of evidence without going anywhere at all. A writing teacher I worked under called this the "museum tour" essay. "On the right you'll see Plato talking about the cave, which was influential, and on the left you'll see Plato talking about his definition of the good life, which is also important."

ChatGPT loves these kinds of non-arguments as much as my students do, I suspect for some of the same reasons. The bot would get itself in trouble quite quickly if it tried to make actual claims that you could argue against. What if the claim it's making is offensive? What if it's very controversial in a field? What if it implicitly attacks the work of some influential scholar or public figure? Much safer to just deal in vague generalities. Even setting that problem aside, arguments are hard to craft, and chatbots don't really seem designed to actually think. Stay vague and there's no logic to pick apart, but if they actually tried to make a case for something, I suspect it would be obvious they can't do it. That's what students are often doing too: they are trying to be vague and hoping that nobody notices they have nothing in particular to say.

Wahoo Redux

Quote from: marshwiggle on March 23, 2023, 11:47:20 AM
Quote from: Wahoo Redux on March 23, 2023, 11:30:20 AM
Quote from: marshwiggle on March 23, 2023, 10:38:36 AM
Postmodernism has probably lowered peoples' standards for what counts as an "argument" that all kinds of drivel will pass.

I don't believe there is such a thing as "Postmodernism"-----Pomo is the simple evolution of the modernist revolution of the early 1900s.

But what are you talking about, Marshy?

The Sokal Affair

Grievance studies affair

These are a couple of examples of how right-sounding drivel got accepted for publication.

The trouble with the "drivel these days" is that there has always been hokum. 

https://www.pbs.org/wgbh/aso/databank/entries/do53pi.html

https://skeptoid.com/episodes/4617?gclid=Cj0KCQjwt_qgBhDFARIsABcDjOcxyu5MjkoKX3_nue-D1Q_hXqfAu6XVBIgVbULoCX7tBNFxD4gPoE8aAsqxEALw_wcB

And the trouble with chiding disciplines for "drivel" is that it happens to the best of us.

https://retractionwatch.com/

Then again, critics of the idea of the "Postmodern" have pointed out that it simply describes what people have been doing since, like, forever----there is nothing new here, folks, move along.
Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.

Wahoo Redux

AI in relationship to academia is getting a great deal of press.

This might indicate, as predicted, something big.

Or maybe AI is writing its own press?

The Walrus: Will ChatGPT Kill the Student Essay? Universities Aren't Ready for the Answer

Daily Mail: The Great God Gates Loves AI
Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.

Stockmann

Quote from: downer on March 25, 2023, 04:00:14 AM
Then there's the more distant future: why will we need college education when the robots will be doing everyone's jobs anyway?

For the weaker students, I don't see what's distant about it. We've known for a while that a lot of college students don't get much out of it in terms of cognitive gains, and depending on college and major, many don't get much out of it in terms of hard skills. A lot of employers may still see a cv with any college degree very differently than one without, and for the worst students at mediocre or worse institutions that would seem to be the sole advantage of going to college, not any marketable skills that AI couldn't largely already do faster, cheaper and better.

downer

Quote from: Stockmann on March 25, 2023, 11:21:12 AM
A lot of employers may still see a cv with any college degree very differently than one without, and for the worst students at mediocre or worse institutions that would seem to be the sole advantage of going to college, not any marketable skills that AI couldn't largely already do faster, cheaper and better.

That's mostly true now. But for how much longer?
And once it stops being true, then going to college will need some different justification.
"When fascism comes to America, it will be wrapped in the flag and carrying a cross."—Sinclair Lewis

Sun_Worshiper

I chatted about ChatGPT with one of my techie colleagues the other day. He was very excited about how it was helping him with grading and certain service tasks, which he is apparently able to do in no time. He also alluded to using it as a writing shortcut, although he did not say so explicitly. After that conversation, as well as chats with my colleagues in industry who are using it to dramatically increase their productivity, I'm thinking about ways to use it to enhance my own productivity - or, more accurately, to streamline work that I don't like doing so I can focus on the things I do enjoy (like writing, which is ironically the thing this tool is probably most useful for streamlining).

This feels like a "get on the bus or get run over" kind of moment.

Parasaurolophus

Well, I've found it's pretty good at generating T/F questions, okay but not great at generating multiple-choice questions, and very bad at generating essay prompts (I could push it to generate better prompts, but it's a lot less work for me to come up with them myself).
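For anyone tempted to script that kind of question generation, here is a minimal sketch against the chat-completions HTTP endpoint; the model name and the response shape are assumptions to check against the current API documentation before relying on it:

```python
# Hedged sketch: ask a chat model to draft true/false quiz questions.
import json
import os
import urllib.request

def build_prompt(topic, n=5):
    """Compose the instruction sent to the model."""
    return (f"Write {n} true/false quiz questions about {topic}. "
            "After each question, state the correct answer.")

def generate_questions(topic, n=5):
    """POST the prompt to the (assumed) chat-completions endpoint."""
    payload = {
        "model": "gpt-3.5-turbo",  # assumed model name
        "messages": [{"role": "user", "content": build_prompt(topic, n)}],
    }
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Either way, the raw output still needs human vetting before it goes on an exam.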

I've already automated as much of my marking as I can, and wouldn't trust it for the rest.
I know it's a genus.

Liquidambar

A coworker said that when he reviews grant proposals, he's found ChatGPT useful for the obligatory one paragraph summary of the proposal that the review starts with.  Of course he checks to make sure ChatGPT is summarizing accurately.

I haven't thought of anything ChatGPT could help me with.  I asked it to generate some word problems on a particular topic that would use a particular technique, but it couldn't stick to both the topic and the technique simultaneously.
Let us think the unthinkable, let us do the undoable, let us prepare to grapple with the ineffable itself, and see if we may not eff it after all. ~ Dirk Gently

dismalist

I asked ChatGPT to explain the Heckscher-Ohlin model ['ya know, labor and capital, who wins and who loses from free trade]. It did a decent job, let's say a standard job, with the words. I then asked it to explain the multi-factor version of H-O ['ya know, unskilled labor, skill A type labor, skill B type labor, capital type I, capital type II, and so on]. Needless to say, the latter is mathematically more challenging than the former. Nevertheless, there are some, if somewhat hard to interpret, results. ChatGPT answered by inserting the term "multi-factor" before words in its previous answer!
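For readers outside economics: the two-factor H-O model that the bot handled adequately reduces, on the cost side, to a pair of zero-profit conditions plus full employment (the notation here is mine, following standard textbook treatments, not the post):

```latex
% Zero-profit (price = unit cost) conditions for goods 1 and 2,
% with wage w, rental rate r, and unit input requirements a_{ij}:
p_1 = a_{L1}w + a_{K1}r, \qquad p_2 = a_{L2}w + a_{K2}r
% Full employment of labor L and capital K, with outputs y_1, y_2:
a_{L1}y_1 + a_{L2}y_2 = L, \qquad a_{K1}y_1 + a_{K2}y_2 = K
```

The multi-factor version replaces these with larger systems whose comparative statics are, as dismalist says, genuinely harder to interpret, which is exactly what a word-substitution answer glosses over.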

This is just another search engine. I have nothing against it. I understand why AI is called "artificial", but I don't understand why it's called "intelligence".



That's not even wrong!
--Wolfgang Pauli

Sun_Worshiper

You all who are commenting are using the subscription-based ChatGPT 4? Or reminiscing over things you asked the free version to do several months ago?

downer

Quote from: Sun_Worshiper on March 26, 2023, 02:40:58 PM
You all who are commenting are using the subscription-based ChatGPT 4? Or reminiscing over things you asked the free version to do several months ago?

I used the latest version, through copy.ai. I have not bought a subscription yet, because it costs a fair amount compared to, say, Netflix or Amazon Prime. I'm not sure how much I would use it. I might buy a monthly subscription, use it for a month or two, and then cancel.
"When fascism comes to America, it will be wrapped in the flag and carrying a cross."—Sinclair Lewis

dismalist

Quote from: Sun_Worshiper on March 26, 2023, 02:40:58 PM
You all who are commenting are using the subscription-based ChatGPT 4? Or reminiscing over things you asked the free version to do several months ago?

The free version, of course. A central idea in selling information is called "versioning". Some versions are free, so one can get to know the product and decide whether to buy.

The free version has convinced me not to buy.
That's not even wrong!
--Wolfgang Pauli

Caracal

Quote from: Liquidambar on March 26, 2023, 12:11:54 PM
A coworker said that when he reviews grant proposals, he's found ChatGPT useful for the obligatory one paragraph summary of the proposal that the review starts with.  Of course he checks to make sure ChatGPT is summarizing accurately.

I haven't thought of anything ChatGPT could help me with.  I asked it to generate some word problems on a particular topic that would use a particular technique, but it couldn't stick to both the topic and the technique simultaneously.

It might do fine with the sort of reference letters I get asked to write where I have nothing in particular to say about the student and don't really know anything about whatever they are applying for, but I agree to write them because the student obviously just needs a third reference, would have asked someone else if anyone were available, and nobody really cares what I write anyway.

There are writing tasks that are basically rote: you aren't trying to do anything remotely original, and as long as the result doesn't have inaccuracies, it will be fine. If I'm actually trying to express an idea, I wouldn't want to outsource the first attempt at it, because even if the program produces something that is OK, I'm trying to figure out what I actually think by writing the thing.

Wahoo Redux

Come, fill the Cup, and in the fire of Spring
Your Winter-garment of Repentance fling:
The Bird of Time has but a little way
To flutter--and the Bird is on the Wing.