Case Study: A California Parent Caught Off-Guard by Chromebooks
Katherine W. was seven years old, in the third grade, when her teacher first issued Google Chromebooks to the class. Katherine’s father, Jeff, was concerned. Jeff feared that use of Chromebooks and G Suite for Education might come at the cost of his daughter’s privacy. He negotiated with his daughter’s teacher so that she could use a different computer and would not have to use a Google account. But as third grade came to a close, the district made clear that no exception would be made the following year.
Under the Family Educational Rights and Privacy Act (FERPA), the data that students often use to log into Google services—like name, student number, and birthday—can’t be shared with third parties—including Google—without written parental consent.
But the district never sought written consent from Jeff or his wife. The district provided no details about the types of devices students would be required to use or the data that would be collected on them. Rather than allowing Jeff to sign his daughter up for the Chromebook program, the district consented on his behalf, making the device mandatory for Katherine—with no ability to opt out. This means that the school requires Katherine to use Google services with a personalized Google account. The moment she clicks away from G Suite for Education, Google can create a profile of her—that is, a dossier of information that vendors collect on users for advertising, market research, or other purposes—and use it commercially.
Jeff went through several emails and a tense meeting before the district agreed to provide Katherine with a non-Google option for fourth grade—but once again declared that such an accommodation would not be possible for fifth grade.
That’s when EFF reached out to the district. Our legal team drafted a letter to the district to outline the privacy concerns associated with school-issued Chromebooks. The letter urged the district to permit “all students—if their parents so decide—to use alternative devices, software, and websites, for the upcoming school year and every year.”
For Jeff, the biggest concern isn’t just the data Google collects on students. It’s the long-term ramifications for children who are taught to hand over data to Google without question.
As Jeff explained it, “In the end, Google is an advertising company. They sell ads, they track information on folks. And we’re not comfortable with our daughter getting forced into that at such an early age, when she doesn’t know any better.”
3. Parent Concerns About Data Collection and Use
When parents’ questions went unanswered, they were left with serious data concerns, particularly when devices and ed tech programs came home with students. Parents who responded to the survey were particularly concerned about personally identifiable information (PII) that could be used to identify a specific student, such as first and last name, birth date, student ID, graduation date, and address.
One Utah public school parent summed up a range of concerns:
Schools should not require students to use tools that involuntarily, or without express parental permission, collect data on students. This includes internal processing of data in order to “improve products,” understanding user behavior to promote advertising, and sharing data with third parties.
A parent from a Maryland public school had suspicions about data collection, retention, and eventual use by ed tech companies:
They are collecting and storing data to be used against my child in the future, creating a profile before he can intellectually understand the consequences of his searches and digital behavior.
Parents were also conscious of the possibility that their children’s data would be shared, sold, or otherwise commodified in the “untapped industry of selling students’ information for advertising and profiling.” The details were generally unclear, as school privacy policies said “not a word about how our kids’ learning is essentially becoming Google’s data.” One Maryland parent wrote:
The school system does not even acknowledge that our child’s data is being collected and possibly sold.
Within schools themselves, respondents observed practices that threatened to reveal students’ PII on a smaller scale. Poor login and password management practices using PII were of particular concern. One California public school used students’ birthdates as passwords. According to another parent:
The passwords are defaulted to student ID. Students are not allowed to change these passwords, and they have received emails stating that students are to stop attempting to change passwords. The student ID numbers are printed, unredacted, on schedules handed out to students and, per my child, “follow a pattern that is easily guessed.”
When students came home with their school-issued devices and online homework, parents’ data concerns extended from students’ data to the family’s home networks and devices. In addition to imposing surveillance on students at home as well as in the classroom,[14] ed tech had the potential to make other members of the household feel vulnerable. One public school parent in Pennsylvania wrote about their student accessing ed tech services on a personal device:
I have no idea how to find out the extent of information they [ed tech providers] have access to on our personal computers.
Another parent in a Virginia public school was concerned about their student using a school-issued device at home:
The students are required to use the laptops at home for assignments, but that could expose our home networks to the school system.
Parents’ concerns above highlight the extent to which student privacy violations may go beyond the classroom. Student data—or, more broadly, data collected on students in the course of educational activities at school, at home, and elsewhere—may interact with advertising, drive inferences and profiles about individual students, or be shared with third parties.