In the midst of legal and political battles over preferences given to different kinds of college applicants, news broke that the College Board has been experimenting with providing college admissions officers with what some have called an “adversity score” for each applicant to, potentially, give admissions boosts to students from disadvantaged backgrounds.
The experiment has stirred up many controversies, in part because of its secrecy. Students and their families cannot see the numbers the College Board assigns them. In addition, the new data tool appears to have added heat to the ongoing debate over the role a student’s race can or should play in admissions. Some critics objected to the College Board’s failure to consider race as a potential adversity factor. And some of those who oppose the use of race as an admissions factor were suspicious that the new number could be used to get around state bans on affirmative action.
To help journalists cover this issue, the Education Writers Association asked College Board officials to sit down for an on-the-record explanation of exactly what this experiment involves.
The following is an edited transcript of a conversation that took place June 25, 2019.
Swati Guin, Education Writers Association: There’s been a lot of controversy over what some people are calling a new “adversity score” that the College Board is starting to provide some colleges. Can you tell us what you are doing?
David Coleman, CEO of the College Board: The Environmental Context Dashboard (ECD) provides general data or information about your neighborhood and your high school. The number would be the same for every person who is in your neighborhood and in your high school.
It is not a “score.” And it is not about you, specifically. The only “score” is the SAT score.
Connie Betterton, vice president for higher education access and strategy of the College Board: The dashboard provides indicators about each student’s environmental context. For the student’s home (Census tract) and high school we provide data like median income, average educational attainment and college trajectory. (The College Board has posted a fuller explanation of the data.)
Coleman: The dashboard is free. We have no intention of charging for it. It’s equally useful for the SAT and the ACT. It’s not meant to advantage the SAT in any way. We just think context matters no matter what achievement you’re looking at. We are considering changes and efforts to make the dashboard more transparent. So stay tuned: I don’t know on what timetable.
Guin: Why isn’t race taken into account?
Betterton: I’ll take that in two parts. The first is the fact that individual characteristics are not included in the dashboard. We felt it was important to keep a very firm line: the dashboard is about general contextual information; the student’s application itself contains all the individual level data. Second, there are a good number of states where colleges are not allowed to consider race in admissions. So the decision we made was not to include race as an input into the tool.
Kim Clark, Education Writers Association: I want to clarify. You provide a number for each student that is on a 0-100 scale? But you object to using the word “score” to describe it?
Coleman: An SAT score reflects your performance on a test of a small set of reading, writing and math skills. It is something that you can change through practice. A score is something you earn. So when you say, “What’s your adversity score?”, it implies, “I could improve it if I worked at it.”
But we are describing the environment. We are not “scoring” it. I feel using the word [score] confuses people.
Clark: Why have you objected to the use of the word “adversity” when describing this new tool? What words do you think are more appropriate?
Coleman: We might use the word “challenge,” but if it measures anything, it is “resourcefulness,” or “achievement in context.” It’s really about finding strength, rather than anatomizing adversity. What it illuminates is the resourcefulness of young people.
Guin: For what purpose was the dashboard created?
Coleman: The entire point of this tool is to give general information that might cause college admissions officers to take a second look at someone’s achievements – achievements that might not, on their face, seem that impressive until you consider the context in which they occurred.
On that second look, by the way, you should consult, of course, all of the student’s individual data and application. In other words, you should still read, of course, the recommendation letters, the essay, any interview. Because all students – wealthy or poor – can face adversity in their lives. And that all still remains in the individual application.
Guin: What were the pressures that caused you to create the tool?
Coleman: An achievement test is good, but admissions officers don’t just look at raw achievement. What they actually do is try to look at your achievement and get some sense of what you do with what you’re given. Maybe you don’t have a lot of AP courses but, guess what, there weren’t a lot of AP courses at your high school.
Looking at achievement in context has long been done. But it’s been done in an informal way. College admissions officers visit certain schools, but not others. They have really nice profiles for some high schools but some high schools don’t have profiles.
Researchers showed us that the way admissions officers were looking at context was too, how can I put this, inconsistent. They (admissions officers) – who are our members – recognized this also and asked us for a more consistent tool.
The dashboard allows for a common format so all schools can be seen equally. The dashboard also gives colleges some insight into neighborhoods. (We are) trying to simply make what they already do today more consistent.
And by making it more consistent, we find that they take a second look at a lot more poor children. They take a second look at a lot more rural kids. And they make small but important gains in those kinds of diversity in their class.
Guin: Which colleges are using the dashboard?
Betterton: So far we’re still in a pilot phase. We started with a small operational pilot in the 2017-18 academic year. Then this last year we expanded from the original 15 schools to 50 schools. We’re aiming to expand the pilot to between 100 and 150 schools this year (the fall of 2019). We’re thinking about a broader launch after that.
Our intent going forward is that any college that wants to benefit from our information would be a part of it. They would have to sign a participation agreement about how they use it, and to share some data with us. We’ve been super intentional and careful about how broadly this tool might be used because we’re committed to having it be used to screen students in, not out, in the admissions process.
Guin: What guidance or rules do you have for the pilot schools?
Betterton: Currently, colleges must commit to use the data to screen students in and not out, because that’s an obvious concern. They also agree to share outcome data with us so that we as the College Board and the community of pilot users can see the impact it is having on admissions, to prevent it from causing admitted classes to be less socioeconomically diverse. Users also commit to doing robust training for admissions officers. We developed a series of guides, supports, webinars and train-the-trainer kinds of things. We want to continue to evolve this and make it more robust going forward. We’re also working on a research agenda to understand how the tool is being used or interpreted by users.
Guin: One big concern seems to be the mystery about the number. Why can’t test-takers see the dashboard? Are you considering making this more public or transparent?
Betterton: A good amount of the controversy is a product of articles having come out before we were ready to go broader. Our intent is to be as transparent as we can. We are thinking through right now when and how to do that.
Guin: What are you doing to address the concerns that parents might try to game the system?
Betterton: We’re aware of those considerations. We want to make sure folks understand the role this information plays within admissions. This is a few pieces of contextual information that sit alongside 100 other data points in an application.
Guin: So far, have there been any surprising findings?
Betterton: The biggest one is that pretty spontaneously the schools started to use the tool beyond admissions, for things like selecting students to attend bridge programs over the summer before they start their freshman year. And they have also started to pilot use of the contextual data with academic and other advisors on campus in order to try to make sure they are connecting students with resources, programs, clubs, and other things to support success in college. That happening so quickly and spontaneously was a surprise, a good surprise, but a surprise.