We have done some fairly involved analysis of data available from university Common Data Sets, yearly statistical snapshots released by most universities covering enrolment and admissions. A graphical narrative of our story can be found here. (Note this tale is most relevant for the 40-50 American schools that are, for some reason, most popular among domestic and (especially) international students. The overwhelming majority of American universities are excellent, and not competitive to get into.)
The basic narrative is as follows:
- A small number of American universities were becoming harder and harder to get into as a slow but steadily building race toward perceived "quality" took hold among applicants.
- Covid happened and all schools shifted to test-optional policies, as they couldn't require a test that kids couldn't take.
- This led to an appreciable increase in the number of applications to said schools as many students who would previously have been reluctant to apply due to low test scores went ahead and had a try.
- Universities largely did not increase enrolment and hence the admit rate among these schools dropped even faster than pre-Covid trends.
- Admit rate is a key metric in the many university ranking systems that have far more influence than is merited.
- The majority of students enrolling in these schools did submit test scores, though the actual admit rates with and without scores are not public.
- Universities know that if they start requiring tests then there will be a drop in application numbers, a rise in admit rates, and a fall in their rankings - perhaps leading to a cycle of even lower application numbers, rinse, repeat.
- A few universities have reverted to requiring tests, based on several fairly weighty studies indicating that test scores are a better predictor of collegiate success than GPAs, which have been further inflated by Covid. These have generally been schools that don't need to worry about application volume as their high rankings are assured (Harvard, Yale, MIT, Caltech, etc.).
- Universities in the test-optional camp often list test scores as "important" in admissions decisions yet continue to have test "optional" as their stated policy. Are test scores "important" in the same way that good grades, academic rigor, and CCAs are also important?
- High schools do not know which students submit test scores and which do not, and universities do not reveal this. This makes the admissions process more opaque and confusing for all concerned, as it turns the question "do I submit scores?" into one that is sometimes impossible to answer.
To help families better understand the admissions process, we ask that universities reveal the following information:
- International, in-state, and out-of-state admissions rates (many are now revealing this!)
- Admit rate for students that submit scores and for those that do not.
Universities will not start doing this based solely on the words of a test prep provider. We call on all students, parents, and university advisors to ask every single university representative they meet to reveal this information. This will help families make better sense of the admissions process and make more informed decisions.
Domestic vs. International Admit Rates
Some universities have started including international admit rates in their Common Data Sets! This is exciting to data nerds like us and confirms what we already know: admit rates for international students are lower than for domestic students, sometimes much lower. It would also be very interesting to know how many of the 229 accepted international students at Princeton were helped by "institutional priorities", such as athletics.
We haven't included all the data here, but some schools receive thousands of applications from international students and, as the data show, admit fewer than 4% of them. Drumming up applications from international students who realistically have a small chance of admission is a great way to lower headline admit rates, as well as a decent source of revenue at US$100 per application or so.
Sadly, many schools do not report this data as the Common Data Set initiative is technically optional. Please ask every university representative you speak to from a school that doesn't report, "why are you being disingenuous (SAT word!) and not revealing data like your peers do?" Hopefully more will report in future, but they will certainly need to be pushed.
More on Testing Trends
The number of applications to the most rejective American universities has been growing for the last 20 years. Interestingly, the number of Americans going to university is reaching a demographic peak, but the number of applications has been growing as students apply to more universities, and more students from abroad also apply - many of whom might not necessarily enrol.
Universities will never tell a student not to apply. The more applications received, the lower the admit rate, and the lower the admit rate, the higher ranked a school will be. These silly, subjective, flawed rankings have led to an arms race of sorts where universities seek applications from around the globe to "keep up" with the school down the road that is doing the same.
Covid resulted in universities largely adopting a "Test Optional" policy as they couldn't realistically ask for a test that students couldn't take. This led to many students applying to MORE schools than before, including those that they probably have zero chance of getting into. Test scores used to act as a rough filter of sorts to reduce frivolous applications, but if the test is "optional" then why not send in an application from a straight A student? (A majority of students at many schools are "straight A students" now...)
This accelerated the growth in application numbers further, with applications to the most popular schools surging upwards of 20% from 2020 to 2023. The number of places has stayed fixed, so the result is even lower admit rates; the number of applications to schools like NYU has more than doubled in 10 years. All the while, as previously reported, the majority of those that get into said schools are submitting SAT scores. Test "optional" is actually test "preferred".
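To make the arithmetic concrete, here is a minimal sketch with made-up numbers (not actual Common Data Set figures) showing how a fixed class size combined with growing applications pushes the admit rate down:

```python
# Illustrative only: hypothetical numbers, not actual Common Data Set figures.
admits = 5_000  # class size held roughly constant year over year

for year, applications in [(2019, 50_000), (2021, 60_000), (2023, 72_000)]:
    admit_rate = admits / applications
    print(f"{year}: {applications:,} applications -> {admit_rate:.1%} admit rate")

# 2019: 50,000 applications -> 10.0% admit rate
# 2021: 60,000 applications -> 8.3% admit rate
# 2023: 72,000 applications -> 6.9% admit rate
```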
Switching away from test optional would result in a sudden dip in the number of applications received, a rise in the admit rate, and a fall in the rankings. Only a select few schools have been brave enough to adjust their policy to reflect the reality on the ground - Cornell being the latest on April 22nd. We expect more to follow, but it is a collective action problem, with universities effectively side-eyeing their peers. It is too late for any other changes in the coming 2024-2025 admissions cycle, but more schools are sure to follow after that. Perversely, some schools might switch because doing so would group them with the very rejective schools that have already done so and perhaps help with application numbers!
An excerpt from our related research drawn from university Common Data Sets with charts and graphs can be found below.
We will be mining this data for more thoughts in the future. For now, please look at schools outside of some arbitrary ranking table or those that are "popular" at your school - only about 75 schools in America are actually hard to get into!
Testing Trends
We have been mining the publicly available Common Data Sets of the most popular 40 or so universities in America. The trend pre-Covid was a concentration of applications toward perceived quality, with the direct result of lower admit rates as the size of entering classes has remained largely static.
Covid led to nearly all schools adopting a “Test Optional” policy that had the unintended side effect of greatly accelerating this trend.
This is best seen when we aggregate all the categories of universities.
The large private schools have enjoyed a particularly large jump in applications, with Northeastern and NYU more than doubling in just 10 years.
And of course the number of applications to schools in a specific regional athletic conference also has continued to grow.
With admit rates trending towards historical lows.
The University of California system is one of the few to reveal the admit rate overall and also for international students, and the differences for Berkeley and UCLA are stark. A good number of the international kids are athletic recruits…
This has been the case across the board, with more applications submitted across all the most popular schools. As a result of test-optional policies, many students who might not really have much of a chance of gaining admission are having a punt and just submitting their applications with no test scores.
Students withholding weaker scores has also accelerated the increase in average SAT scores among admitted students, since only the stronger scores ever get reported.
All the while, the overwhelming majority of successful applicants are indeed submitting test scores:
| Schools by admit % | Submit rate | Admit rate |
|---|---|---|
| Top 20 | 69.6% | 6.5% |
| Top 40 | 62.7% | 9.3% |
| Top 60 | 61.3% | 12.8% |
| Top 100 | 57.8% | 23.4% |
| Top 150 | 55.4% | 35.7% |
| Top 200 | 54.3% | 47.5% |
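For anyone curious how figures like these can be pulled together, here is a minimal sketch assuming a hand-compiled CSV of Common Data Set fields (the file name and column names below are hypothetical, not an official CDS export):

```python
import csv

# Hypothetical CSV compiled by hand from each school's Common Data Set
# (admissions counts from section C1, score-submission counts from section C9).
# Assumed columns: school, applications, admits, enrolled, enrolled_with_scores
with open("cds_2023.csv", newline="") as f:
    rows = list(csv.DictReader(f))

applications = sum(int(r["applications"]) for r in rows)
admits = sum(int(r["admits"]) for r in rows)
enrolled = sum(int(r["enrolled"]) for r in rows)
with_scores = sum(int(r["enrolled_with_scores"]) for r in rows)

print(f"Admit rate:  {admits / applications:.1%}")
print(f"Submit rate: {with_scores / enrolled:.1%}")
```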
We group the schools as follows:
Small-mid Privates
Caltech
CMU
Duke
Emory
Georgetown
MIT
Northwestern
Stanford
Tufts
Vanderbilt
Wash U - St.L
Publics
CU Boulder
Georgia Tech
UIUC
Purdue
UC Berk
UC Davis
UC Irvine
UCLA
UCSB
UCSC
UCSD
Michigan
UT Austin
UVA
Large Privates
BC
BU
GWU
Notre Dame
Northeastern
NYU
Tulane
USC
The Test is Meant to be Hard
There has been some chatter that the SAT a few weeks back was very difficult.
This is the first time the new test went "large" and was administered in the States as well as internationally. By making the questions a bit harder, College Board has a bit more room to play with at the top of the curve in order to keep the scoring comparable to past tests - a core objective of any metric. If the test is objectively hard then it will be scored on a more lenient scoring table - a bit like the handicap/slope rating of a golf course.
If the test was really easy then loads of kids would have had full marks on the math and you can't have 10% of the kids get a perfect 800. The test being "hard" is kind of the point - if most kids got 90% correct you would end up with a terrible metric, akin to an inflated GPA. While the scoring on the new test is indeed a bit of a black box, the headline scores will be statistically massaged to be directly comparable with past iterations. Even last year the complaint was that the test seemed harder than the practice tests but most scores came in as expected.
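As an illustration of the idea (the conversion numbers below are invented for the example, not College Board's actual tables), a harder form simply pairs each raw score with a more generous scaled score, so the same underlying ability lands on roughly the same final number:

```python
# Invented conversion tables purely to illustrate equating; not real College Board tables.
# Keys are raw scores (questions answered correctly); values are scaled section scores.
easier_form = {40: 800, 39: 780, 38: 760, 37: 740}
harder_form = {40: 800, 39: 800, 38: 780, 37: 770}  # harder test, more lenient table

# Two students of similar ability: one misses 2 questions on the easier form,
# the other misses 3 on the harder form - their scaled scores come out close.
print(easier_form[38])  # 760
print(harder_form[37])  # 770
```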
For any of our many students who sat this test, let us know how you did when the scores come out.