
Colleges in Dire Financial Straits

Started by Hibush, May 17, 2019, 05:35:11 PM


Puget

Quote from: Hibush on July 24, 2020, 09:01:55 AM
Quote from: Wahoo Redux on July 24, 2020, 07:53:45 AM
I looked up some of the colleges I am familiar with on Galloway's spreadsheet -- recent news from these places seems to contradict Galloway's analysis. One SLAC which has admitted to teetering on the verge and recently laid off faculty is listed as "survive," while another SLAC which has always been healthy and wealthy, and is considered one of the best in the same region, is listed as "perish." Unless Galloway has some sort of insider knowledge, I wouldn't give much credence to his analysis -- which mostly tells us what we already know anyway.

Apparently Galloway's spreadsheet doesn't match his graphic either. The comments on Galloway's own post hit him really hard for unserious analysis with serious consequences.

This looks egregious -- obviously wrong things like averaging in-state and out-of-state tuition for publics, as if they enrolled equal numbers of each, and using sticker price rather than actual average cost of attendance. Plus people in the comments are pointing out lots of simply wrong data, including mixed-up institutions with somewhat similar names.
Then he does median splits on his variables to categorize colleges -- never good statistical practice, since it throws away information and imposes arbitrary cutoffs. Who knows what other bad statistical practices he's using?
I'm not sure this guy can do math, let alone appropriate statistical modeling.
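For what it's worth, the median-split problem is easy to demonstrate with made-up numbers (the "colleges" and scores below are purely hypothetical, not from Galloway's data):

```python
# Hypothetical "financial health" scores for four made-up colleges.
scores = {"A": 49.9, "B": 50.1, "C": 50.2, "D": 95.0}

# Median of the four values (mean of the two middle ones).
vals = sorted(scores.values())
median = (vals[1] + vals[2]) / 2

# A median split forces half the sample into each category...
labels = {name: ("survive" if s > median else "perish")
          for name, s in scores.items()}

# ...so B (50.1) and C (50.2) land in opposite categories despite being
# nearly identical, while C and D (95.0) share a label despite a huge gap.
print(labels)
```

The cutoff is an artifact of whatever sample you happened to include, not an objective threshold -- which is exactly why median splits get hammered in the methods literature.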
"Never get separated from your lunch. Never get separated from your friends. Never climb up anything you can't climb down."
–Best Colorado Peak Hikes

apl68

Quote from: Puget on July 24, 2020, 12:55:32 PM

Sounds like his bid to draw attention to his work has pretty much destroyed his credibility, then. 
And you will cry out on that day because of the king you have chosen for yourselves, and the Lord will not hear you on that day.

spork

He never had credibility with me as an analyst of higher ed. As I wrote either in this or some other thread, he's been repeating stuff first written back around 2012. I can't remember if I linked to it in whichever of my posts I'm thinking of, but this is an example of a credible analysis. And perhaps this, though Mills is still operating.
It's terrible writing, used to obfuscate the fact that the authors actually have nothing to say.

Hibush

Quote from: spork on July 24, 2020, 10:12:27 AM

I have never been impressed by marketers. Especially those who teach marketing in business schools.

Marketers have their value, but it is not in the search for truth.

Nevertheless, they can be impressive in getting resources to those who are engaged with the search for truth.

dismalist

Quote from: Hibush on July 24, 2020, 06:11:01 PM
Nevertheless, they can be impressive in getting resources to those who are engaged with the search for truth.

Marketeers can be impressive in getting resources to those who are engaged in the quest for profits! :-)
That's not even wrong!
--Wolfgang Pauli

kaysixteen

So why exactly do ostensibly reputable journals publish stuff like this without having scientists or mathematicians check the data and the methodology?

spork

Quote from: kaysixteen on July 24, 2020, 11:34:04 PM
So why exactly do ostensibly reputable journals publish stuff like this without having scientists or mathematicians check the data, and the methodology?

Galloway's blog is an exercise in self-promotion, not a reputable journal.


Hibush

Quote from: dismalist on July 24, 2020, 06:42:58 PM
Marketeers can be impressive in getting resources to those who are engaged in the quest for profits! :-)

If marketers are not working for you, they are working only for the people who are taking away the resources you need. Then you end up in Dire Financial Straits.

kaysixteen

Yeah, that blog is just self-promotion, but many real journals publish crap that they could have avoided had they done a bit of homework.

tuxthepenguin

Quote from: quasihumanist on July 17, 2020, 11:31:09 AM
Quote from: polly_mer on July 17, 2020, 08:06:29 AM
60-80% of budget being salaries, benefits, and similar is pretty typical with teaching places having a higher percentage and research places having a lower percentage.

Things get really interesting at small enough places that infrastructure and bureaucracy costs don't scale with enrollment so that the full-time CPA with up-to-date software shows up as a noticeable non-instructional expense.

We got to answer a lot of questions at Super Dinky on how we were an undergraduate-only institution with no research and somehow our budget was only about half instructional expenses and directly related support.

Well, five full-time security guards (i.e., one on duty 24/7 to answer mundane calls like unlocking doors and jumpstarting cars while allowing vacation and sick leave) is a substantial fraction of the 25-35 full-time faculty.

Having a business office with a CFO, a full-time CPA, and three clerks to deal with paying the bills and collecting revenue is another five people who aren't teaching.

The three people in the financial aid department and the two people in the registrar's office also aren't teaching, but we can't run the college without someone doing the job.

Another five people in facilities including custodians for cleaning aren't teaching, but we can't run without them.

The 3-5 people in the IT department aren't teaching, but we can't run as an organization without them.

All told, faculty were about a quarter of the employees, and we didn't have much administration. One title was along the lines of provost and dean of faculty and overseer of athletics and overseer of co-curricular activities and enforcer of all student academic issues and first responder to parental fires and representative to most community groups.

It's worth noting that, when we read about staffing at most colleges and universities 100 years ago, being part-time security or part-time custodian or part-time groundskeeper or part-time librarian or part-time clerk would have been part of the job duties of faculty.

Then you have a custodian teaching college classes. I can't comment on 100 years ago, but if you go back 50-70 years, in the aftermath of WWII, you'll find that "quality of instruction" wasn't high on the list of priorities. It wasn't uncommon for a "lecture" to consist of someone standing in front of a group of students reading out of a textbook for 50 minutes. No questions. No office hours. You knew that was coming if the "professor" was also a coach.

jonadam

His analysis does sound weird, notably because it lists the College of St. Benedict as "perish" but St. John's in Minnesota as "struggle." If you know the two schools, they're very closely tied together up here in MN.

They share classes and community, though their residence halls (among other things) are single-sex.

Puget

Because I'm a big stats nerd, I took another look at the comments on Galloway's blog post (https://www.profgalloway.com/uss-university)-- people have done a pretty good job of peer review there, pointing out numerous analysis problems including:

1. Restriction of range: he only includes nationally ranked institutions, which are actually the least likely to "perish"

2. He then uses median splits to assign this restricted range of institutions to categories, so definitionally 25% are going to end up in each category. That is, there is no objective cutoff for "perish," "struggle," etc. -- it's just the top or bottom half of his restricted range.

3. Some of the input variables are garbage. E.g., he includes search traffic as a "reputation" variable, which (a) doesn't distinguish between good and bad reasons for searching, (b) weights heavily toward big institutions (he doesn't weight by institution size), and (c) weights heavily toward places with big sports teams. Other data are out of date, or downright wrong (mis-entered, or in some cases the wrong institution with a similar name).

4. Uses sticker price rather than actual cost of attendance for ROI calculations (this just makes no sense at all)

5. Public tuition rates take the simple average of in-state and out-of-state, rather than being weighted for % in-state and out-of-state students (again, this makes no sense)

6. Uses endowment rather than more complete metrics of financial health, punishing institutions that rely less on endowments.

7. Is almost exclusively focused on undergrad metrics. This does not accurately characterize research universities.

8. The idea that large public universities will be allowed to "perish" is just out of touch with reality. A lot of the other labels also just don't pass face validity-- top-ranked SLACs with huge endowments aren't going anywhere either. In both cases their metrics are also likely being seriously distorted by all of the above. If the results of your model aren't face-valid, that's a strong indicator you need to check your data and code.

Really, given the extent and obviousness of these problems (I do stats, though not on these types of data, and the problems were glaringly obvious to me, and obviously to lots of others who commented), I can only conclude that either he is super incompetent or he just doesn't care, because the point was to get media attention and reach his pre-conceived conclusions (done and done).
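To make point 5 concrete, here's a toy calculation (the tuition figures and in-state share are invented, just to show the size of the distortion):

```python
# Hypothetical public university (all numbers made up for illustration).
in_state_tuition = 10_000
out_state_tuition = 30_000
frac_in_state = 0.80  # most publics enroll far more in-state students

# Simple average, as in the spreadsheet: pretends enrollment is 50/50.
simple_avg = (in_state_tuition + out_state_tuition) / 2

# Enrollment-weighted average: what the student body actually pays.
weighted_avg = (frac_in_state * in_state_tuition
                + (1 - frac_in_state) * out_state_tuition)

print(simple_avg)    # 20000.0
print(weighted_avg)  # 14000.0
```

With these assumed numbers the naive figure overstates average tuition by over 40% -- and that's before the sticker-price-vs-net-price problem in point 4 compounds it.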

mamselle

Quote from: Puget on July 29, 2020, 08:50:54 AM

It's good to see a clear analytical critique of this.

It runs in my mind that Octoprof did a lot of this kind of work a while back -- where is she when we need her!!??

You and she could collaborate on an article...

   (No, I realize, we all have so much on our plate at present....)

M.
Forsake the foolish, and live; and go in the way of understanding.

Reprove not a scorner, lest they hate thee: rebuke the wise, and they will love thee.

Give instruction to the wise, and they will be yet wiser: teach the just, and they will increase in learning.

spork

Quote from: Puget on July 29, 2020, 08:50:54 AM

Absolutely love this.