Matt’s Past SAT/ACT News Update:

Matt O'Connor

Feb 09, 2018

 
As Scott Jaschik of Inside Higher Ed writes, a recently published book criticizes test-optional admissions, but two of the book's editors currently work for the College Board.

[Excerpts]:

The chapter attracting the most discussion among admissions experts so far is one that argues that test-optional admissions do not actually add to diversity, and could even hurt efforts to diversify. The argument is summarized in an essay published in The Conversation and goes like this: test-optional colleges continue to use the SAT or ACT for students who submit scores, but only those applicants with good scores submit, so average scores go up. This makes colleges more competitive (and they go up in the rankings). Becoming more competitive means that fewer low-income or minority applicants are admitted. These conclusions came from an examination of liberal arts colleges, a type of institution where test-optional policies have been popular.

 
Both comments posted by readers below the article are valuable. The first points out problems with the University of Georgia study so often cited to question the motives and impacts of test-optional admissions; the second, from Steve Syverson, provides information about a follow-up study he is conducting with Bill Hiss and Valerie Franks.

[Steve Syverson's comment follows]:

"Bill Hiss, Valerie Franks, and I are just completing a follow-up to Bill's and Val's 2014 study, exploring the impact of adopting a Test-Optional Admission policy on the applicant pools of 28 colleges and universities that vary in size, selectivity, and many other characteristics. The data include slightly less than a million individual student records representing the entire applicant pools from each institution for two cohorts prior to adoption of the test-optional policy and two cohorts post-adoption. The outcomes are not uniform and illustrate some of the diversity among colleges as well as among the implementation of test-optional policies. We expect our report will be available for NACAC to publish by April.

Bill was invited to contribute a chapter to the book, but declined, because it felt so heavily biased toward the testing agencies."

 
Recently, Inside Higher Ed published an article by Jim Jump, who wrote an acerbic piece last year about the College Board's sudden zeal for test prep (Khan Academy style). This time, Jump addresses the same College Board-connected book discussed in the item above.

[Excerpts]:

...there are certainly skeptics who wonder if the new book will turn out to be a balanced study of the issues related to standardized testing or pro-testing propaganda. The skepticism derives partly from the fact that two of the three editors of the new volume are employees of the College Board, while the third, who recently left his position as senior vice president at the College Board for a similar post at the American Institutes for Research, led the redesign of the new SAT.

The other cause for skepticism is an article that appeared last fall on the website of The Atlantic Monthly as “sponsored content,” a nicer description than “infomercial” or “advertorial.” The sponsor of the article, “When Grades Don’t Show the Whole Picture,” was none other than the College Board.

The Atlantic Monthly advertorial lays out a two-legged argument, both legs of which lead to the inevitable conclusion that standardized testing, and in particular the new and improved SAT, is an essential tool for admissions officers and for students.

The first leg of the argument is that grade inflation in high schools makes it harder for admissions officers to “fairly differentiate” among students applying to college. It quotes research from Michael Hurwitz of the College Board and Jason Lee from the University of Georgia that high school grade point averages are higher than ever.

While we’re on the subject of inflation, what about score inflation? When the College Board introduced the new SAT two years ago, it argued that it was a different test and that scores from the old and new SAT shouldn’t be compared. If that was the case, why keep the 200-800 scale? Is that part of the brand?

The other leg of the argument is that the new SAT is an engine of opportunity for the very students standardized testing has previously been criticized as disadvantaging -- students of color, first-generation students and those from low-income households. It’s a fascinating -- and convoluted -- attempt at persuasion.

 
The New York Times covers the jump in the number of NYC students taking the SAT in 2017, a result of the citywide contract with the College Board that provided a free SAT to all juniors during a school day. A total of 61,800 high school juniors took the SAT, a 51% increase over 2016. NYC paid a discounted rate of $36 per student. The contract also produced a dramatic increase in the percentage of black juniors taking the SAT, from 47% to 75%.

 
A bill has been introduced in the Arizona House of Representatives to provide the SAT or ACT for free to all juniors. The plan would discontinue the current requirement that all juniors take the AZMerit test combined with a science test for federal accountability purposes.

 
Here is a 5-minute audio interview with the Arizona state rep who introduced the bill.

 
Arizona's House Education Committee has approved spending of up to $800,000 to create an online system to help students prepare for the SAT or ACT. The proposal still has further legislative hurdles to clear.

 
This Orlando Sentinel article reports on a study, ordered by the Florida legislature, examining whether the SAT or ACT could be used in lieu of Florida's statewide high school exams. The study concluded that the college entrance exams would not be appropriate for that purpose.

[Excerpts]:

Neither of the national college admissions exams meets all of Florida’s academic standards for algebra 1 or for 10th-grade language arts, meaning schools might need to give additional exam questions if they used the ACT or the SAT, adding “cost and complexity” to testing plans, the study said.

The two national exams produce different results than the Florida Standards Assessments, so it would not be fair to allow some Florida school districts to use the ACT or the SAT while others used the FSA, but have them all judged by the school-grading system, it added.

The authors said they had “serious doubts on the interchangeability of the three tests” and they felt it was “not fair to compare schools that use the state tests in their accountability system to those that use the alternate tests.”

Finally, they said they doubted such a system would pass muster under federal law. The federal Every Student Succeeds Act allows local school districts to pick a national test in lieu of a state high school test, if they meet certain criteria.

The study, however, considered only a system under which some schools would use the ACT or SAT and others would stick with state exams. The study did not examine what would happen if Florida abandoned its high school exams completely and used either the ACT or the SAT instead.

 
Catherine Gewertz of EducationWeek has written an article centered on the Florida study mentioned above. There are many interesting quotes.

[Excerpts follow. Edward D. Roeber, quoted below, is the lead author of the study.]

"States seem to have this belief that, well, we can just drop our current high school exam, whether it's PARCC, Smarter Balanced, or a custom-developed test, and we can get a two-fer by using one of these college-entrance tests. But I'm not sure they've studied it carefully enough," Roeber said.

It's not that it's impossible to use the SAT or ACT to measure mastery of state content standards, Roeber said. It's possible. But states shouldn't assume the switch will work for them. They must conduct diligent alignment studies that will identify how well a college-entrance exam covers their academic standards. Since standards differ from state to state, each state must conduct its own alignment study, or it can't claim that the SAT or ACT is fully "aligned" to its standards, Roeber said.

As of a year ago, a dozen states were using the SAT or ACT as their official high school achievement test for accountability purposes. Since the SAT has been redesigned, states that use it have not yet gone through the federal peer-review process. But at least a couple of states that use the ACT got letters from the U.S. Department of Education last year asking for more evidence of alignment to state standards and/or a deeper dive into accommodations policies.

"Everything in assessment is a choice," [Executive Director of the Center for Assessment Scott] Marion said. "Do you use a test like the SAT or ACT, and focus on predicting whether students will be successful in college? Or do you use a test designed to cover academic standards? Which one are you trying to measure?"

Using a college-entrance exam without verifying that it does a good job covering a state's standards risks sending confusing messages to schools, Marion said. Teachers will wonder whether they should be teaching the content of the standards their state adopted, or the material covered in a college-entrance exam, he said.

"The College Board has conducted studies demonstrating the alignment of the new SAT with the current standards in all 50 states," [College Board spokesman Zach] Goldberg said. "The SAT strongly aligns with Florida's own standards. We stand ready to support states who want to use the SAT for accountability."

[End excerpts]

We were unable to locate any of the studies Mr. Goldberg mentions, and in any case it hardly seems sufficient to rely solely on the College Board's own assessments of the SAT's alignment to state educational standards.

 
This article underscores how much greater the prospects of graduating are for students admitted to more selective colleges: 6-year graduation rates at many historically black colleges and universities are below 20 percent.