SAT Drops Essays and Subject Tests

Started by namazu, January 19, 2021, 10:46:22 PM

mythbuster

Back in the 1980s, the subject tests were called either the SAT II or subject achievement tests. They were one-hour exams, and you could take up to three in one test sitting.
Many competitive schools required three scores in addition to the regular SAT. As I remember, they often required an English Composition (more grammar than the old vocabulary-heavy Verbal section), a Math (there were two levels of math you could take), and one of your choosing. Your scores on some of these exams, such as foreign languages, could be used for placement purposes as well.
Even at the time, taking another English and Math ETS exam felt redundant. The math exams were both more advanced than the regular SAT math section. I'm not surprised that they have fallen out of use.

hmaria1609

Besides the SAT, I took three AP tests (US History, European History, and English Language & Composition) before I graduated from high school. My public high school was picked to pilot in-school SAT prep, and I appreciated that opportunity.

I've seen a variety of SAT subject test prep books at the library; they're cataloged and shelved by subject (for example, an SAT II History prep book goes in the 900s). Every so often, one of the titles will go out for a hold.

kaysixteen

Given that we'd all agree that 'critical thinking' is a good thing, and a very good thing for an entering college frosh to have, how do we evaluate its presence in a college applicant, and how does coming from a culture that greatly emphasizes deference to elders detract from critical thinking, or the ability to demonstrate crit thinking skills?

marshwiggle

Quote from: kaysixteen on February 23, 2021, 11:49:19 PM
Given that we'd all agree that 'critical thinking' is a good thing, and a very good thing for an entering college frosh to have, how do we evaluate its presence in a college applicant, and how does coming from a culture that greatly emphasizes deference to elders detract from critical thinking, or the ability to demonstrate crit thinking skills?

Looking for logical fallacies and unconsidered interpretations in a report or article would be tests of critical thinking skills that shouldn't be hampered by deference to elders. Students always have peers, no matter what the culture, so if one is assigned the task of evaluating something ostensibly from a peer to see if it is well thought out and the arguments are supported by evidence, that shouldn't be a problem.
It takes so little to be above average.

Puget

Quote from: kaysixteen on February 23, 2021, 11:49:19 PM
Given that we'd all agree that 'critical thinking' is a good thing, and a very good thing for an entering college frosh to have, how do we evaluate its presence in a college applicant, and how does coming from a culture that greatly emphasizes deference to elders detract from critical thinking, or the ability to demonstrate crit thinking skills?

One of the international application readers tried to pull this one on her boss (whom we were working with), claiming she couldn't possibly evaluate Chinese applicants for critical thinking because it isn't a "value" there. This got a pretty swift and negative reaction: a country that is one of the world's major economies and regularly produces cutting-edge science and technology is clearly quite capable of critical thinking, and values it. What you need to do is get outside your own cultural biases and stop defining critical thinking as openly disagreeing with your elders. It's not hard; you just have to examine your cultural biases a tad.
"Never get separated from your lunch. Never get separated from your friends. Never climb up anything you can't climb down."
–Best Colorado Peak Hikes

spork

Cognitive scientists and psychologists point to the lack of evidence for "critical thinking" as a distinct process in the human mind. In other words, it doesn't exist in the way that people think it does. The phrase is meaningless.
It's terrible writing, used to obfuscate the fact that the authors actually have nothing to say.

Puget

Quote from: spork on February 24, 2021, 07:13:15 AM
Cognitive scientists and psychologists point to the lack of evidence for "critical thinking" as a distinct process in the human mind. In other words, it doesn't exist in the way that people think it does. The phrase is meaningless.

As a cognitive scientist and psychologist, I wouldn't exactly put it that way. You are right that it needs to be operationally defined much more clearly to be meaningful (which also applies to many other constructs, like IQ), but as for being a "distinct process in the human mind," no complex aspect of cognition is that: they are all emergent properties of multiple neural mechanisms working together.

So I'd say it is fine to use "critical thinking" as shorthand, but we have to do a good job of defining what it is shorthand for, and how we're going to measure that.

If we define it as the ability to make decisions based on careful consideration of the evidence and multiple perspectives on and interpretations of same, that's a pretty culturally-neutral definition and one where you can begin to think through where in an application you might find evidence for that. On the other hand, if we define it as going against the grain and challenging authority, that's obviously a pretty culturally loaded definition and I'd say also doesn't capture what we actually think contributes to student success.

If you're using it as a criterion for admissions, you've really got to define it for your application readers and provide them with guidance and training on what constitutes evidence and where and how to look.
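
Purely as an illustration of what "operationally defined" could look like in practice (this sketch is mine, not anything proposed in the thread; every anchor, name, and number in it is invented), here is one hypothetical way a rubric tied to the evidence-based definition above might be written down for readers, along with a crude check that two trained readers actually agree:

```python
# Hypothetical sketch only: one way to turn an evidence-based definition of
# "critical thinking" into a reader rubric. Every anchor, name, and number
# here is invented for illustration.

# Rubric anchors tied to observable evidence in the application file,
# not to "openly disagreeing with elders."
RUBRIC = {
    1: "Asserts conclusions with no supporting evidence or reasoning.",
    2: "Cites evidence but considers only a single interpretation.",
    3: "Weighs evidence and at least one alternative interpretation.",
    4: "Weighs multiple perspectives and explains which is best supported and why.",
}

def percent_agreement(reader_a: dict[str, int], reader_b: dict[str, int]) -> float:
    """Share of applicants two readers scored identically -- a crude check
    that the rubric and the reader training are actually working."""
    shared = reader_a.keys() & reader_b.keys()
    if not shared:
        raise ValueError("No applicants were scored by both readers")
    matches = sum(reader_a[app] == reader_b[app] for app in shared)
    return matches / len(shared)

if __name__ == "__main__":
    for level, anchor in RUBRIC.items():
        print(f"{level}: {anchor}")
    # Toy scores for three hypothetical applicants, for illustration only.
    reader_a = {"applicant_001": 3, "applicant_002": 2, "applicant_003": 4}
    reader_b = {"applicant_001": 3, "applicant_002": 3, "applicant_003": 4}
    print(f"Exact agreement: {percent_agreement(reader_a, reader_b):.0%}")
```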
"Never get separated from your lunch. Never get separated from your friends. Never climb up anything you can't climb down."
–Best Colorado Peak Hikes

Aster

Perhaps a better way of articulating what "critical thinking" is would be to define what it certainly isn't.

It's not memorization.

marshwiggle

Quote from: Puget on February 24, 2021, 09:27:25 AM

If we define it as the ability to make decisions based on careful consideration of the evidence and multiple perspectives on and interpretations of same, that's a pretty culturally-neutral definition and one where you can begin to think through where in an application you might find evidence for that. On the other hand, if we define it as going against the grain and challenging authority, that's obviously a pretty culturally loaded definition and I'd say also doesn't capture what we actually think contributes to student success.


Isn't the technical term for that "the Terrible Twos"?
It takes so little to be above average.

Ruralguy

As far as I recall from the 1980s, achievement tests (ETS SAT II Subject Tests) were mostly used for admission into specialized programs. That is, in addition to SATs for general admission, the school could say whether you were a "go" for a special pre-med program (6-year med programs were a thing at the time... maybe still are?), engineering, or what have you. I don't believe they were generally used for credit per se (as APs are), but that was probably a school-dependent kind of thing.

mamselle

Quote from: Aster
Perhaps a better way of articulating what "critical thinking" is would be to define what it certainly isn't.

It's not memorization.


Well....but....,

Doesn't a student exercise critical thinking in deciding what to memorize? Or an instructor in deciding what is worth the time and effort to do so?

And if one hasn't memorized (or, maybe better, internalized, which I think can include memorization but does not consist entirely of that process--informed critique appreciated) certain criteria, how can one have a basis for making informed critical decisions?

If I hadn't memorized the "times tables" in 3rd grade, I wouldn't have been able to figure out how much my students owed me last month (since it's the number of lessons times the fee per lesson); if I hadn't memorized the subtraction facts in 1st grade (and so didn't have to count on my fingers), I couldn't tell how to apply what they paid me extra last month in error to this month's bill.

Etc.
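
To make that billing arithmetic concrete, here is a minimal sketch; the lesson count, fee, and overpayment figures are made up for illustration only:

```python
# A minimal sketch of the billing arithmetic described above.
# The lesson count, fee, and overpayment are made-up numbers.
lessons_last_month = 4     # lessons taught
fee_per_lesson = 30        # hypothetical fee per lesson
overpaid_last_month = 15   # credit from last month's accidental overpayment

amount_due = lessons_last_month * fee_per_lesson - overpaid_last_month
print(f"Amount due this month: {amount_due}")  # 4 * 30 = 120, minus 15 = 105
```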

There are huge errors (which I propose to right, single-handedly, of course!) in dance historical and ethnographic writing because people didn't memorize certain dates by which to "peg" certain other events, and so they get the stylistic causality backwards because they've got the chronology in a tangle. (I'll avoid a couple of pet peeves to show this...I'll spend that time writing the rest of the article on it, instead.....)

I noted in another thread a while ago how serious this is becoming in newer programs like "Dance Studies" or "Visual Studies," which by their titles seem to strip out the chronological rigor and render themselves as "Dance-lite" or "Art History-lite," with a nice dollop of predigested lit-crit assumptions thrown in for bad measure. (I blame Sir James Frazer, and Jung, for some of this, even more than the misappropriated Lacan and Derrida that sends folks into raptures... sigh.)

If my music students don't memorize their pieces, they never reach the point of playing interpretively and with an awareness of their playing while they're doing it. You can't play jazz without memorizing the chord structure, because everyone is riffing on it at 60 beats a minute or more and you don't have time to get lost trying to find your place in the music... your fingers and your mind are working immediately on that memorized base, using other memorized motifs that you fit into the chord structure as you go along, in response to the other players, the tempo, the audience, everything.

You can't be free to think about the visual arts without having a bank of memorized pieces in mind as comparanda. When did the use of tertiaries come in? Is this work earlier or later than that one, based on its use of reverse perspective? Can I date this manuscript page to the 14th c., or is it really from the 12th c.?

If the nurses on the floor don't know their bilirubin norms, they may flag (or fail to flag) a patient who needs urgent attention based on the blood results the lab just sent up.

If a chemist doesn't know, or forgets, that putting two particular reagents together can cause a rapid expansion of gases in the container, you might have a problem of a different kind...

I do think memorization has something to do with critical thinking. It may not be the be-all or end-all, but it's an important part of the start.

M.
Forsake the foolish, and live; and go in the way of understanding.

Reprove not a scorner, lest they hate thee: rebuke the wise, and they will love thee.

Give instruction to the wise, and they will be yet wiser: teach the just, and they will increase in learning.

Aster

Quote from: mamselle on February 24, 2021, 10:09:43 AM
Quote from: Aster
Perhaps a better way of articulating what "critical thinking" is would be to define what it certainly isn't.

It's not memorization.


Well....but....,

Doesn't a student exercise critical thinking in deciding what to memorize? Or an instructor in deciding what is worth the time and effort to do so?

In my field, no. Nearly all modern textbooks now show students exactly what to memorize with a bold-faced font. And through the modern use of electronic teaching tools (e.g., PowerPoint), professors have learned to use the same practice in the classroom. Nowadays, and for at least the last 20 years in my field, it takes either a noob or an *extremely* old fart professor not to use direct mechanisms for displaying everything, or nearly everything, that a student should memorize.

Quote
I do think memorization has something to do with critical thinking. It may not be the be-all or end-all, but it's an important part of the start.

Yes, it has something to do with it in the sense that one may choose to use memorization as a stepping-stone to critical thinking. Or not. Memorization sensu stricto is not obligatory for processing information; there are other ways to remember key factoids besides mere mental picture-taking. That newfangled New Math is an example of that. And for me personally: most sailors have their port and starboard sides memorized, but I never memorized that correctly and instead have to mentally process that "port and left both have the same number of letters." Now, arguably, a philosopher could make the argument that the use of that mental phrase is itself a form of memorization, but I'd probably hit them over the head with a pillow bat at that point.

financeguy

I find it difficult to have a good faith argument on what attributes are most appropriate for finding good applicants when we all know that if those criteria, once determined, do not result in the desired racial configuration when applied, they will be ignored.

The entire idea of being "smart" has been attacked. IQ tests, SAT scores, the GRE, LSAT, MCAT, and other tests don't measure someone's favorite ice cream flavor or shoe size. They measure, with quite a bit of predictive validity, cognitive ability. Someone's letter of recommendation from their uncle's friend, a personal statement about the loss of a pet, and other subjective materials do not do this. Relying on soft aspects means ignoring, or at a minimum reducing, the value of objective data. "Holistic" is just a euphemism for "we also look at other stuff!" Yet this is the undisputed norm in college admissions at all levels.
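
For concreteness, "predictive validity" is usually reported as a correlation between the test score and a later outcome such as first-year GPA. Here is a minimal sketch of that calculation; the score and GPA figures are toy numbers standing in for real applicant data:

```python
import statistics

def predictive_validity(scores: list[float], outcomes: list[float]) -> float:
    """Pearson correlation between test scores and a later outcome
    (e.g., first-year GPA). Requires Python 3.10+ for statistics.correlation."""
    return statistics.correlation(scores, outcomes)

if __name__ == "__main__":
    # Toy numbers for illustration only; real validity studies use large
    # samples and correct for range restriction among admitted students.
    sat_scores = [1100, 1250, 1300, 1400, 1500]
    first_year_gpa = [2.8, 3.0, 3.2, 3.5, 3.7]
    print(f"r = {predictive_validity(sat_scores, first_year_gpa):.2f}")
```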

No one thinks the numbers should be automatic. The test score of 99 from a recently paroled ex-murderer is probably not as good an indication of "success" as the 98 from the guy whose list of hobbies does not include homicide. I just find it disappointing that a field of academics is so inclined to substitute their own personal preferences and assumptions for the objective data.

marshwiggle

Quote from: financeguy on February 24, 2021, 11:55:04 AM

No one thinks the numbers should be automatic. The test score of 99 from a recently paroled ex-murderer is probably not as good an indication of "success" as the 98 from the guy whose list of hobbies does not include homicide. I just find it disappointing that a field of academics is so inclined to substitute their own personal preferences and assumptions for the objective data.

At a more basic level, the underlying assumption that there is some motherlode of previously unidentified geniuses whom the current system has missed is, at best, unproven. Given that for decades institutions have been trying to find academically solid students who may have been off the radar, it's most likely that the vast majority of the places where such students could previously have hidden have now been discovered.

On the other hand, since those scores were used to identify unsuitable students, the vast majority of those who don't make the required scores will still, in fact, be unprepared.

Hand-sorting people's recycling will occasionally find a ring or coin that fell in by mistake, but it's not likely to happen often enough to pay for the labor.
It takes so little to be above average.

Ruralguy

We wish we could get some more of those high scoring murderers.