FPF’s Education Privacy Newsletter – February 2020
As always, the team at the Future of Privacy Forum is keeping a close watch on a variety of proposals and policies affecting children’s privacy and data security around the nation. February’s newsletter covers the latest legislative efforts in Washington, D.C., explores questions about managing FOIA requests without running afoul of privacy laws, and examines the latest news on facial recognition technology in schools.
BUT FIRST… New at FPF
- FPF released an animated Student Privacy 101 video series on Safer Internet Day, along with an introductory blog post outlining three key areas of student privacy: legal compliance, mitigating risks, and transparency. The videos are creative commons (CC BY-NC-SA), so feel free to use, share, and/or remix them for training and student privacy awareness.
- FPF recently announced a first-of-its-kind award recognizing privacy-protective research collaboration between a company and academic researchers. Nominations are open through March 12, 2020!
Regulating and protecting children's and students' privacy and safety remains a focal point for federal policymakers.
- Representative Kathy Castor (D-FL) recently introduced the Protecting the Information of Our Vulnerable Children and Youth Act, which proposes significant changes to the Children’s Online Privacy Protection Act (COPPA), including extending protections to teens up to age 18 and creating new enforcement mechanisms, such as a private right of action.
- Federal legislators previously introduced another bill to amend COPPA in early January, the PROTECT Kids Act.
- A bipartisan cybersecurity bill was introduced on February 13 that would provide funding to state and local government agencies to help secure their networks and create state and local committees that would communicate government needs to the Cybersecurity and Infrastructure Security Agency (CISA).
What else should the FTC explore as part of its COPPA review? A group of privacy attorneys urged the FTC to consider brain-computer interface technology, such as the “headband” technology previously used in some schools in China.
- Researchers suggested that the FTC employ a “holistic approach to children’s ‘best interests’” that equitably distributes the responsibility for protecting children’s privacy online beyond the parent and child–similar to the approach taken by the UK’s ICO. Check out this quick comparison of the ICO’s Age Appropriate Design Code to COPPA.
Following reports that the University of Missouri is using location trackers to monitor student attendance, privacy advocates and stakeholders raised concerns about the implications for student privacy. Although location tracking in schools isn’t new, a growing number of schools are under fire for using digital tools to track attendance, including VCU, Syracuse, UNC, and even schools in Australia.
- Joel Schwarz breaks down privacy considerations underlying the use of attendance trackers, and the University of Missouri responded to concerns stating “the only thing [the app is] doing is recording when a student is in attendance in a particular classroom at a particular time.”
- On the other end of the spectrum, a Louisiana school board filed a lawsuit against a parent to prohibit her from monitoring her child’s location using a location tracker–as prescribed by a doctor–while her child is in school. The school board claims that the device is capable of “illegally intercept[ing] communication[s] of other students, faculty, and staff.”
Facial recognition technology continues to stir controversy in both K-12 and higher ed. The New York Times profiled the Lockport saga; the reporter later tweeted that, based on her briefings, Lockport seems to believe it is ahead of the game on school safety. However, there is little evidence that facial recognition technology will help keep schools safe.
- As the New York State Senate considers a bill creating a moratorium on facial recognition technology use in schools, Lockport School District’s superintendent requested an exemption if the moratorium is enacted.
- UCLA, one of the first college campuses to consider the technology, decided against implementing it after pushback from the campus community.
- “Stop Facial Recognition on Campus,” a campaign started by Fight for the Future and Students for Sensible Drug Policy, developed a scorecard “ranking” universities based on their willingness to use facial recognition technology. It also organized open letters to higher ed administrators, co-signed by more than 40 organizations as well as faculty and staff, urging a ban on facial recognition on college campuses, and announced March 2nd as a national day of action.
- Michigan’s Oakland Community College has repeatedly barred students from organizing under the campaign, and only “partially relented” once the ACLU and Fight for the Future raised concerns about whether the school’s actions violate the First Amendment.
Recent news stories highlight that schools and districts may need more training on how FERPA and state sunshine laws interact.
- The Arizona Department of Education reportedly failed to properly redact the personal information of more than 7,000 families seeking vouchers to finance private education. The disclosure was made in response to an open records request submitted by an organization that campaigned against the voucher program. The Department issued a public statement that they will work with the Department of Education’s Privacy Technical Assistance Center to develop better policies.
- The Illinois Attorney General’s Public Access Counselor issued an informal opinion that redacting only a student’s name is insufficient when responding to open records requests.
- The Illinois AG is on to something–researchers recently found that industry methods of de-identifying or anonymizing information are largely insufficient as well.
- This detailed post regarding FERPA and Arkansas’ open records law provides insight into how student privacy and open records laws can interact and impact educators.
- Connecticut’s Freedom of Information Commission (the regulatory body that oversees the state’s open records law) recently ruled that records regarding staff misconduct are subject to public disclosure even if they include student information because they are “employee disciplinary records.”
FYI: Quick Hits
- New Mexico’s Attorney General filed a lawsuit against Google, alleging that Google’s G Suite for Education is collecting student information without parental consent in violation of COPPA. Google, however, called the claims “factually wrong.”
- The Indiana University school system provided students with a GPA calculator that inadvertently allowed students to look up the grades of other students, across all nine campuses. The tool was quickly disabled, but not before more than 250,000 students were affected.
- Privacy advocates fear that developers of technology like AI-powered baby monitors are fear-mongering and “fueling a dependence in parents feeling they need to check on their kids at all times.” Meanwhile, poor security protections in the parental monitoring app KidsGuard exposed sensitive information about its users on the internet. In another case, law enforcement convinced a child to share details of his father’s location, available through a family monitoring application, as evidence that the father had committed a crime.
- A high school in Detroit launched a pilot program to put sensors intended to identify vaping, smoking, and THC use in bathrooms. However, the sensors also include audio analytics that can detect abnormal noise levels to pick up on “aggressive behavior” in unsupervised areas where cameras cannot be installed.
- Two South Carolina school districts have replaced metal detectors with “weapon detecting AI” body scanners, similar to those used at airports. Privacy advocates worry there is little evidence these scanners will increase school safety.
- University of California San Diego’s student government has voiced concerns about the city of San Diego implementing smart street lights–street lights equipped with sensors and cameras that collect a range of data, from pedestrian movement to “environmental conditions.” Though law enforcement claims there is no real-time monitoring, students demand more transparency and community involvement in the process.
- The city of Bristol, England is employing algorithms to assign “risk scores” to help determine how to effectively distribute social services to disadvantaged youths. Some information included in the calculus comes straight from schools–such as attendance records.
- TikTok is trying to keep kids off the app; there is a new age gate, and the company is banning advertisements geared towards children and rejecting content that is too “juvenile.”
- Facebook introduced new privacy-protective features to Messenger Kids, its messaging application tailored for children under 13.
- Teenagers are taking privacy into their own hands and “hacking” Instagram–they’ve begun to share access to their Instagram accounts with friends so that instead of leaving behind one “digital footprint,” they leave behind many.
- EdWeek examines what role, if any, voice technology should have in the classroom.
- New York’s new student privacy regulation took effect on January 29, 2020. Check out this breakdown of what edtech companies, schools, and districts are now required to do.
- The American Association of Collegiate Registrars and Admissions Officers released recommendations for how criminal history and disciplinary information should be used in college admissions.
- This year’s Computers, Privacy, and Data Protection Conference in Brussels included a recorded panel on “Children’s Privacy in the Digital Age.”
- The University of Oxford and Alan Turing Institute published findings that urge ethical oversight when machine learning is used in children’s social care initiatives.
Interested in staying updated? Subscribe to our newsletter here.