The Tangled Web of College Admissions, Data, and Privacy
If a college has only one available slot and receives two applications from students who appear to be equally qualified on paper, should the college admit Student A, who has never shown any interest in attending that particular college? Or should it choose Student B, who has opened every email, thoroughly explored the college’s website, and contacted the admissions office to ask questions about financial aid?
Colleges are facing increased scrutiny about how they collect and use data in their admissions process. As a Washington Post article discussed this month, it feels invasive for colleges to track applicants’ online activities, especially when that tracking isn’t clearly disclosed. However, we must disentangle the conversation about fairness in admissions from the conversation about how colleges use data in admissions, because more privacy-protective data practices cannot resolve unfairness in the admissions process.
A selective college’s admissions process can be anxiety-inducing. Prospective students are often asked to provide a small set of materials, including test scores, grades, letters of recommendation, and a writing sample—and then they wait. These students can only guess at their likelihood of admission based on public information about the grade point averages and SAT or ACT scores of previously admitted students.
When applicants are rejected, they receive only a form letter with no explanation. This lack of transparency breeds distrust of the process. That’s why federal courts are hearing cases that aim to make affirmative action unconstitutional, why civil rights groups oppose the use of SAT scores in admissions, and why there is deep skepticism about legacy admissions.
It’s also why so many people found the #AdmissionsGate bribery scandal so compelling. There is a national sentiment that the most qualified applicants aren’t the ones admitted to selective institutions. While people’s political persuasion might influence their feelings about legacy admissions, affirmative action, and over-reliance on standardized testing, most people agree that the college admissions process, along with higher education generally, is broken.
Using Technology to Track Demonstrated Interest
Beyond reviewing a student’s grades, test scores, and recommendation letters, college admissions officers often try to gauge students’ “demonstrated interest.” There is no one definition of demonstrated interest, but the term generally means any outward sign of enthusiasm or curiosity about a school during the application process. In other words, how likely is it that a particular student will enroll if accepted?
In years past, colleges rated students’ demonstrated interest by how often they contacted the admissions office, whether they submitted supplemental materials in their application packets, whether they attended school events, or whether they visited campus.
Today, digital technology allows colleges to know how often students open their emails and how much time they spend on the school’s website. Colleges can consider these factors, along with other traditional admissions information, to predict the likelihood that a student will accept an admissions offer.
Colleges must manage their enrollment. If they admit too few students, they won’t collect enough tuition revenue to cover their costs. If they admit too many, they might not have the facilities to accommodate everyone. Colleges use early admissions and waiting lists to manage some of these issues, but both methods have downsides. Early admissions favor students who can commit to a school without knowing the aid package they will receive, and waiting lists can leave students in the precarious position of not knowing which college they will attend until a few weeks before the semester starts.
Colleges also often use demonstrated interest to keep their acceptance rates low. Colleges are often rated based on their exclusivity, both actual and perceived. One way to measure exclusivity is to calculate the percentage of applications that are accepted for admission. The lower the percentage of acceptances, the more exclusive the school. Encouraging more students to apply while maintaining the same number of accepted students lowers the percentage and boosts exclusivity.
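The arithmetic behind this incentive is simple. A minimal sketch (with invented numbers, purely for illustration) shows how enlarging the applicant pool alone makes a school look more exclusive without admitting a single extra student:

```python
def acceptance_rate(accepted: int, applications: int) -> float:
    """Acceptance rate as a percentage of applications received."""
    return 100 * accepted / applications

# Hypothetical figures: the same 2,000 acceptances look twice as
# "exclusive" once the applicant pool grows from 10,000 to 20,000.
print(acceptance_rate(2000, 10000))  # 20.0
print(acceptance_rate(2000, 20000))  # 10.0
```

The denominator is the only lever a college needs to pull, which is why marketing to students who will never be admitted can still pay off in the rankings.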
Unfortunately, higher education enrollment has been declining, and the decline shows no signs of stopping. It’s unlikely that many colleges will be able to increase the applicant pool, and colleges don’t want to grant any more admissions than absolutely necessary to fill a class. Gauging demonstrated interest means that colleges don’t waste acceptances on students who are unlikely to attend. And it is not unreasonable for colleges to prefer students who have shown that they want to attend.
However, using demonstrated interest as an application factor has downsides, too. Lower-income applicants or first-generation college students may not have the time, resources, or guidance to know that they must prove their interest to colleges. The application process does not make it clear that this is necessary. Demonstrated interest, like many other admissions metrics, can thus often disadvantage the students who would benefit most from higher education. But that problem has very little to do with the privacy issues raised by tracking technologies such as cookies and metrics.
Ethical Problems with Tracking
Consumer-facing companies have used tools such as cookies and pixels for years to track users and match people’s identities on the internet. While such tools are commonly used for analytics and marketing, it is no surprise that colleges leverage this technology to predict which students will actually attend.
This technology can become especially problematic when it’s used to do more than just observe how much a prospective applicant interacts with a college’s website, such as predicting whether an applicant will drop out. Colleges are judged on whether students graduate from their programs. If schools have low graduation rates, they not only drop in the rankings but also risk losing federal funding. Lower-income students are especially at risk of dropping out, often due not to academic difficulties but to financial ones. Because low-income students are less likely to have a safety net, such as personal savings or parental support, any financial setback means they might have to leave college. If a school is concerned about its graduation rate, it could attempt to screen out students who are more likely to leave before graduation, rather than provide better financial services to students. It would not be difficult to use previous students’ data to determine the likelihood of students dropping out.
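To see how little sophistication such screening would require, consider a toy sketch. The feature names and weights below are invented for illustration and do not describe any college’s actual model; a real system would fit weights to historical student records, but the mechanics would be no more complicated than this:

```python
# Toy illustration of data-driven dropout-risk screening.
# Flags and weights are hypothetical; a real model would derive
# them from previous students' records.
RISK_WEIGHTS = {
    "needs_full_aid": 0.4,      # relies entirely on financial aid
    "no_family_support": 0.35,  # no parental or savings safety net
    "works_over_20_hrs": 0.25,  # heavy work schedule alongside classes
}

def dropout_risk_score(applicant: dict) -> float:
    """Sum the weights of the risk flags present for this applicant."""
    return sum(w for flag, w in RISK_WEIGHTS.items() if applicant.get(flag))

applicant = {"needs_full_aid": True, "no_family_support": True}
print(round(dropout_risk_score(applicant), 2))  # 0.75
```

Note what the score actually measures: every flag is a proxy for being poor. A school worried about its graduation rate could sort applicants by such a score and quietly pass over the top of the list, which is precisely the ethical problem the tracking debate tends to obscure.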
This kind of data-based decision-making can have massive consequences for students and colleges alike. Large public research institutions already use data and targeted advertising to market themselves to wealthy out-of-state students. These students pay higher tuition than in-state students do, and since they are wealthy, they require less or no aid to attend. This provides the university with a stable source of income, because the wealthier students are less likely to drop out or need additional aid to remain in school. But this also reduces the number of admission slots for resident taxpayers who would benefit from in-state tuition.
Where Do We Go From Here?
There are no current federal laws that prevent colleges from collecting and using data in the ways described above. The Family Educational Rights and Privacy Act (FERPA) only prevents colleges from disclosing information in an education record without student consent and does not cover applicants. Even for current students, website or email tracking data is unlikely to qualify as part of an education record unless schools use that information for a specific educational purpose, such as the admissions process.
However, FERPA allows schools to use data from an education record to build an algorithm to predict future applicants’ graduation rates. The school official exception allows schools to share data with outside entities if those entities perform functions that a school would traditionally perform. Grading or rating admissions applications is certainly one of those functions.
Unfortunately, as long as ranking systems such as the US News and World Report rankings exist, colleges will have an incentive to game them. Congress could alleviate that pressure by providing high-quality, easily accessible data to prospective students about colleges and their programs. The proposed College Transparency Act would give students useful information about topics such as graduates’ earnings and loan repayment. Unlike the appearance of exclusivity, these data points are harder for colleges to game. But congressional action won’t address the growing competition among colleges for students from wealthy families, especially as state legislatures continue to cut funding for public universities.
So what can be done? Unfortunately, college applicants cannot do much. In this situation, colleges hold all the power. But colleges can and should do more:
- They should be transparent about the admissions process. This includes providing clarity and transparency about the categories they consider for applicants, the relative weight of those categories, and comprehensive definitions of the categories. Schools that use automated decision-making or scoring in the admissions process also need to disclose those activities.
- All automated decision-making or algorithmic scoring must be audited to ensure it is not discriminatory. This evaluation should cover not only protected classes such as race and gender but other important categories such as first-generation college students and veterans. While colleges might not have a legal responsibility to those types of students, they have an ethical responsibility to ensure their doors are open to them.
Colleges should not be penalized for using data. Understanding the prospective student body is necessary to provide services, set long-term goals, and address weaknesses in recruiting. But as with any data initiative, colleges should collect only what is absolutely necessary, observe a retention schedule, and when in doubt, delete. As colleges continue to use innovative and expanding technology to modify their admissions processes, they should take stock of their admissions programs. Do they provide advantages to wealthy students? Do they presume that applicants will have a dedicated guidance counselor to help them through the process? How easy is it for applicants to ask questions?
Yet, auditing algorithms and practicing transparency won’t fix an unfair system. College administrations must modify their programs to create a fair admissions process.