
Privacy and Paternalism: The Ethics of Student Data Collection


Published on Aug 26, 2022


Educational software is pervasive in contemporary high school classrooms, yet many students are unaware that learning platforms monitor their activities, reporting concerning documents and messages to school administrators and law enforcement. The scope of student data collection and analysis raises challenges for existing privacy laws like the Family Educational Rights and Privacy Act (FERPA). Developing new privacy policies for student data would require addressing fundamental philosophical questions about the autonomy and rights of children.


Keywords: user data privacy, student data, contextual integrity, educational technology, children’s rights, surveillance.

Kathleen Creel
Khoury College of Computer Sciences and Department of Philosophy and Religion, Northeastern University

Tara Dixit
Chantilly High School, Chantilly, Virginia

Learning Objectives

  • Understand the role of educational technology and monitoring software in K-12 schools.

  • Identify implications of educational technology and monitoring software for student privacy and well-being.

  • Analyze the moral status of children using philosophical frameworks.

  • Apply the contextual integrity theory of privacy to understand the ethics of collecting data about students.

  • Discuss strengths and weaknesses of current student privacy laws.

1. Introduction

For many high school students in the United States, classwork and homework begin the same way: by opening a laptop and logging into a learning platform. Students spend hours of their days composing essays, practicing math problems, emailing teachers, and taking tests online. At schools with policies governing in-class phone use, students message friends on school-provided devices that come to feel like their own. But many do not realize that schools and third-party companies can collect their personal communications as data and, in some cases, as evidence.1

Schools want to protect students from cyberbullying by peers and from harming themselves or others. To address these issues, administrators install monitoring software. A single word in an email, instant message, or search bar indicating bullying (“gay”) or self-harm (“suicide”) can trigger an alert to school administrators and law enforcement. In September 2021, monitoring software in a Minneapolis school district sent over 1,300 alerts to school administrators stating that students were viewing “questionable content.”2 Each alert extracted flagged text from students’ messages, emails, or school documents and sent it to school administrators for review.3 In other districts, monitoring software escalated students’ mentions of suicide to police.4 A national survey of parents of school-aged children found that a majority did not want their child’s educational data shared with law enforcement.5 Yet despite widespread and increasing parental concern over how and with whom student data is shared, the near-ubiquitous use of learning platforms in the classroom leaves students and parents without practical ways to opt out of engagement with companies whose data collection policies and practices put students’ privacy at risk.

In addition to protecting students, education researchers and makers of education software also want to use the rich trove of data students generate to improve students’ education. Artificial intelligence (AI)-based educational technology aims to “meet students where they are” by using the student’s data to track progress and create personalized learning plans.6 Makers of educational software hope to understand how students are interacting with software in order to improve future offerings,7 while education researchers hope to use student data to understand learning and educational disparities.8

However, data collection comes with costs. In 2016, the Electronic Frontier Foundation studied 118 education technology software services and apps commonly used by schools. Of these, 78 retained data after students had graduated, and only 48 encrypted personal information about students. Some school districts require educational software providers to disclose their data collection practices to parents and allow them to opt their child out, yet only 55 percent of parents surveyed had received a disclosure and 32 percent said that they were unable to opt out.9 This lack of data encryption, transparency, and choice is concerning. Despite policies that aim to provide both transparency and access, most students and parents are unaware of what data is being stored and who has access to it.

Adults are routinely subjected to similar invasions. Yet the special moral status of children makes the tension between their protection and education and the protection of their privacy especially fraught. On one model of childhood privacy, the paternalism view, students are children to be protected and educated by responsible adults who make decisions on their behalf. Parents and guardians may succeed in maintaining their child’s privacy from others by using laws like the Family Educational Rights and Privacy Act (FERPA) to shield their child’s student records.10 However, the child has no privacy from her guardians and educators: parents and school administrators may read her social-emotional–learning diaries and text messages at any time. The child is a “moral patient”: a being deserving of protection, but lacking the agency to make their own choices without oversight. Protecting her from a chance of harm from herself or others is worthwhile, no matter how small the chance of harm, because there is no countervailing duty to protect her privacy.11

On another view, the essential role of the child is not to be protected but to “become herself.”12 Through exploration and play, the child develops her own agency, autonomy, and ability to freely choose. To the extent that a child is aware of being monitored in her experimentation and aware that her decisions and explorations may be interpreted out of context and punished accordingly, she cannot explore freely. The child also needs privacy in order to develop herself in relation to others. Respecting the privacy of communications between children allows them to develop genuine friendships, as friendship is a relationship between two people that cannot survive constant surveillance by a third party.13

Respecting digital privacy also builds relationships of trust between children and adults, as philosopher Kay Mathiesen has argued.14 Trusting a child gives her an “incentive to do the very thing” she is trusted to do,15 building her capacity to behave in a trustworthy manner. By contrast, the act of surveillance can decrease both the child’s trust in the adult16 and the adult’s trust in the child,17 even if no malfeasance is discovered. Researchers have expressed concern that digital surveillance may also decrease a student’s civic participation18 and their willingness to seek help with mental health issues.19 As such, on this view, parents and school officials are obliged to respect the privacy rights of children unless there is a significant reason to violate them; perpetual background surveillance without cause is inappropriate.

On some issues, both perspectives align. Neither students nor their parents would choose to trigger false alarms that send law enforcement knocking or knowingly allow learning platforms to resell identifiable student data to commercial vendors. But the broader dilemma remains. Knowing when to treat children “as children” and when to treat them as responsible agents is, as philosopher Tamar Schapiro has argued, the essential predicament of childhood.20 We cannot wait for adult data privacy to be settled before tackling it.

2. Privacy and Contextual Integrity

Before the use of computers in education, students communicated primarily in person, which prevented most third parties from knowing the contents of their conversations, and did their schoolwork on paper or the ephemeral chalkboard. What data they did produce was confined to paper records locked in filing cabinets. Such data was difficult to aggregate: if shared at all, it was mimeographed and mailed or read aloud over the telephone. A third party aspiring to collect data on students would be stymied by the friction of acquiring the data and the expense of storing it.

Today, students generate electronic data every time they send email or messages, take quizzes and tests on learning platforms, and record themselves for class projects. The volume, velocity, and variety of the data they generate has shifted dramatically.21 As cloud computing prices fall, analytics companies become incentivized to collect and store as much data as possible for their future use.22 Educational software and websites are no exception. Companies analyze and mine data about students’ day-to-day activities to find useful patterns for their own purposes, such as product improvement or targeted advertising.23

Without additional legal protection, the trend toward increasing student data collection, storage, and reuse is likely to continue. Given the contested moral status of children, how should their privacy and persons be protected? What policies or laws should we adopt? In order to choose one policy over another, we need first a reason for the choice—a justification of why a certain norm of privacy is correct based on a broader justificatory framework. We will seek it in Helen Nissenbaum’s classic analysis of privacy as “contextual integrity.”24

2.1. Contextual Integrity

Contextual integrity suggests that every social context, from the realm of politics to the dentist’s office, is governed by “norms of information flow.” Each social context carries different expectations regarding what people in different roles should do or say.25 For example, it would be appropriate for a dentist to ask a patient his age but unusual for a patient to reverse the question.26 In addition to norms of appropriateness, each social context has norms governing the proper flow or distribution of information. Thus, privacy is defined as the “appropriate flow of information” in a context, not as secrecy or lack of information flow.27

According to Nissenbaum, there are five parameters that define the privacy norms in a context: Subject, Sender, Recipient, Information Type, and Transmission Principle.28 Any digression from norms typical for a context constitutes a violation of contextual integrity. However, the principle is not fully conservative: new practices can be evaluated in terms of their effects on “justice, fairness, equality, social hierarchy, democracy,” and autonomy as well as their contribution to achieving the goals relevant for the context.29 On these grounds, the new privacy norm may be chosen above the old.30

The contextual integrity model can be used to evaluate reliance on educational software in schools. Any learning platform can be analyzed in terms of the appropriateness of its privacy policy to the privacy norms of the classroom. Consider Gaggle’s privacy policy. Gaggle, an online platform designed for use in the classroom that seeks to replace communication tools such as blogging software and email clients with similar software equipped with content filters, states that, “Gaggle will not distribute to third parties any staff data or student data without the consent of either a parent/guardian or a qualified educational institution except in cases of Possible Student Situations (PSS), which may be reported to law enforcement.”31

Imagine that a student sent a message to another student at 8 p.m. on a Saturday and Gaggle flagged it as a potential indicator that the student is depressed. Analyzing this scenario according to the contextual integrity norm, the five parameters would be:

  • Subject: Student 1 (the author of the message) and Student 1’s mental health concerns

  • Sender: Student 1 (the author of the message)

  • Recipient: Student 2 and Gaggle. If Gaggle alerts Student 1’s parents, school administration, or law enforcement of student activity, they also become recipients.

  • Information Type: Student data (in this case, a message and its associated metadata, such as sender, recipient, timestamp, and location)

  • Transmission Principle: The recipient will not share the student data with third parties without the consent of the parent/guardian or educational institution, except in cases of Possible Student Situations (PSS).
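The five parameters above can be sketched as a simple data model. The following is an illustrative Python sketch, not part of any real framework or library: all names (`InformationFlow`, `violates_norm`, the example flows) are hypothetical, and real contextual-integrity analysis involves normative judgment that no equality check can capture.

```python
from dataclasses import dataclass

# Hypothetical model of Nissenbaum's five parameters, so a concrete
# information flow can be compared against a context's entrenched norm.
@dataclass(frozen=True)
class InformationFlow:
    subject: str
    sender: str
    recipient: str
    info_type: str
    transmission_principle: str

def violates_norm(flow: InformationFlow, norm: InformationFlow) -> bool:
    """A flow breaches contextual integrity if any parameter departs from the norm."""
    return flow != norm

# Norm of pre-digital student friendship: the message stays between the friends.
friendship_norm = InformationFlow(
    subject="Student 1's mental health",
    sender="Student 1",
    recipient="Student 2",
    info_type="personal message and metadata",
    transmission_principle="no disclosure to third parties",
)

# The same message routed through a monitoring platform adds recipients
# and changes the transmission principle.
monitored_flow = InformationFlow(
    subject="Student 1's mental health",
    sender="Student 1",
    recipient="Student 2, platform, school administrators, law enforcement",
    info_type="personal message and metadata",
    transmission_principle="disclosure to school officials on flagged content",
)

print(violates_norm(monitored_flow, friendship_norm))  # → True
```

On this toy model, the monitored flow differs from the friendship norm in its Recipient and Transmission Principle parameters, which is exactly the divergence the analysis below identifies.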

The desire to protect students and intervene to help them when they struggle with depression or anxiety is laudable and may initially appear to justify Gaggle’s new norms for classroom communication. However, the context of childhood friendship before the introduction of digital messaging was one in which it was possible for a student to discuss their feelings of sadness with another student on a weekend, outside of the classroom, without being overheard and without school intervention. Whether in person or on the telephone, the context of the interaction between Sender and Recipient was one of friendship mediated by the transmission principle of the telephone, which permitted information flow without disclosure to a third party. Pre-Gaggle messaging likewise presumed a disclosure-free channel for communication between friends.

Given these changes, the introduction of Gaggle meaningfully alters both the transmission principle and the set of recipients, thereby violating the preexisting privacy norms. In the contextual integrity framework, in order to argue that the new privacy norm is beneficial, proponents could appeal to its positive contribution to “justice, fairness, equality, social hierarchy, democracy,” or autonomy. If the new privacy norms do not contribute to these goods, they must contribute instead to the goals of the context. In order to evaluate whether the privacy norms reshaped by Gaggle are beneficial, we must determine what goals, and whose goals, should be considered relevant to the context.

3. Student Privacy Laws

The foregoing analysis assumes that the relevant context of communications between children who are friends is that of friendship: namely, that the norms of privacy that apply to adult friendships and adult message communications should also apply to childhood friendships and childhood message communications. However, if the first, guardian-centered viewpoint presented above is correct, it may be that the relevant context of analysis is primarily that the sender and recipient are both children, not that they are friends. Legally, that is the case: in the United States, a child has no right to privacy from their parents.32 Parents may monitor the online communications of children as they please.

Privacy from school officials or other adults is dependent on school policy and on the wishes of the parents or guardian until the child reaches the age of 18. There are three primary federal laws in the United States that aim to protect student privacy and security: the Family Educational Rights and Privacy Act (FERPA), the Children’s Online Privacy Protection Act (COPPA), and the Protection of Pupil Rights Amendment (PPRA).33 Although each attempted to meet the information security needs of its day, the three collectively fall short in protecting student privacy from contemporary data collection.34

FERPA provides the strongest federal privacy protection for student data, but has not been updated in the past decade.35 FERPA gives parents three rights: the right to access their child’s education records, the right to a hearing in which they may challenge inaccuracies in those records, and the right to prevent personally identifiable information (PII) from their child’s record from being disclosed to third parties without their written consent.36

FERPA typically allows school officials to share student data only if both direct identifiers, such as student names, and indirect identifiers, such as a student’s date or place of birth, are removed. However, school officials may access PII data provided they have a “legitimate educational interest” in doing so.37 This privilege is extended at the school’s discretion to educational technology providers who take over roles previously performed by school officials, are under “direct control” of the district, and agree to be bound by the same provisions against reuse and reselling as a school official would be. Since most educational technology providers fall under these provisions, they are permitted to collect and store personally identifiable information about students.

Other laws similarly allow third-party software providers to collect data without requiring transparency as to the uses of that data. COPPA protects the personal information of children under the age of thirteen by requiring “operators of websites and online services” to gain parental consent in order to “collect, use, or disclose” the PII of their children.38 However, schools “may act as the parent’s agent”39 and can allow online software providers to collect data for students under the age of thirteen and use it for educational purposes without parental consent.40 For K-12 students, PPRA requires parental consent before a school may require that students respond to a survey about topics such as their political affiliation, religious practices, or income.41

The patchwork of federal student privacy laws, supplemented by state legislation such as California’s Student Online Personal Information Protection Act (SOPIPA),42 has changed little in the last decade. But as the social practices to which the laws are applied change—as student data becomes more voluminous, as its transmission becomes easier, and as educational software providers take on activities once performed by school officials—the informational norms that privacy laws sought to protect are violated. The contextual integrity framework helps us understand why this is.43 Even if the prevailing social context (a high school), the subject of the data (students), the sender (school officials), many of the recipients (school officials, school districts), and the laws governing transmission remain the same, the addition of recipients such as the providers of educational software and the changes in the principle of transmission (paper to email, or email to centralized learning platform) generate a violation of contextual integrity and therefore of privacy.

In order to illustrate how a change in educational technology can violate contextual integrity without violating FERPA, consider the case of InBloom. InBloom was an ambitious nonprofit initiative launched in 2011 to improve the US education system by creating a centralized, standardized, and open source student data-sharing platform for learning materials. Educators and software professionals from several states started building the platform. Although the platform complied with FERPA, it meaningfully changed the transmission principle under which student data was transmitted. Before, student data had been stored only locally, at the school level, and in the fragmented databases of educational technology providers. Now it would be pooled at the state and national levels, granting both school officials and their authorized educational software providers access to a much larger and more integrated database of student data, including personally identifiable information and school records. This is a violation of the contextual integrity of student data, and the backlash InBloom faced from parents, activists, and local school officials was on the grounds of the changes it would prompt in the transmission and storage of data.44 The InBloom incident highlights the need for updated student privacy legislation, and perhaps for legislation that incorporates principles of contextual integrity. While InBloom shut down in 2014, many of the parent and activist criticisms of its data pooling would apply equally to the for-profit educational technology companies that continue to collect and store data in its absence.45

4. Privacy and Justice

Another factor relevant for the evaluation of contextual integrity is the social identities of the actors involved and how they interact with the roles they inhabit. Student privacy concerns can be compounded when combined with existing biases and discrimination, including those based in class, sexuality, race, and documentation status.46 According to a study by the Center for Democracy & Technology (CDT), low-income students are disproportionately subjected to digital surveillance.47 This is because many schools distribute laptops and other devices to low-income students, a side effect of which is increased access to student online activity. In some cases, school officials can view the applications a student opens and their browsing history in real time. Privacy concerns have been exacerbated by virtual learning, as schools expanded laptop distribution significantly during the COVID-19 pandemic.48

The privacy of LGBTQ students is also threatened by school surveillance software. In a Minneapolis school district, there were several incident reports of Gaggle flagging LGBTQ-specific words like “gay” and “lesbian” because they may signify bullying, which led to at least one LGBTQ student being unintentionally outed to their parents.49 Fear of outing may cause LGBTQ students to curtail their online discussions, which could be especially impactful since queer youth often seek meaningful connection to other queer youth online in order to understand their identities.50

Learning platforms may exacerbate existing systemic racism and bias in school-based disciplinary measures as Black and Hispanic student suspensions, expulsions, or arrests have been greater than for White students for similar offenses.51 Existing teacher biases, often associated with school-based disciplinary actions, may be embedded into AI-based education software, resulting in adverse impacts on marginalized students. Researchers at the University of Michigan studied the sociotechnical consequences of using ClassDojo, a data-driven behavior management software for K-8 students. ClassDojo allows teachers to add or subtract “Dojo points” from students for behaviors such as “helping others” or “being disrespectful.” The researchers found that use of ClassDojo had the potential to reinforce teacher biases, as when teacher stereotypes about which students were likely to be more “disrespectful” or “disruptive” were seemingly substantiated by ClassDojo behavior records gathered by the teachers themselves, and also found that ClassDojo had adverse psychological effects on students.52

Violations of student privacy also disproportionately impact undocumented students. FERPA prohibits school officials from sharing student information, including immigration status, directly with government agencies such as Immigration and Customs Enforcement (ICE). However, ICE can access this data in other ways. The “third-party doctrine” holds that if individuals “voluntarily give their information to third parties” such as banks, phone companies, or software vendors, they no longer retain a “reasonable expectation of privacy.”53 When the makers of educational websites or purveyors of student surveys sell student data to brokers, they make it possible for ICE to buy data about undocumented students.54 This can affect the enrollment of undocumented students. As students realize that their documentation status is no longer private, they may withdraw from school out of concern that their family members may be targeted for deportation.55

5. Addressing Privacy in Context

In addition to the planned and routine violations of contextual integrity that may occur when educational software providers resell supposedly anonymized data or school officials aggregate student data across contexts, there are the accidents. Large databases, or “data lakes” as they are evocatively called, are prone to rupture and spill. The US Government Accountability Office analyzed 99 data breaches across 287 school districts, from July 2016 to May 2020. According to their report, thousands of students had their academic records and PII compromised.56 Bigger and more affluent school districts that used more technology and collected more student data were impacted most. The report states that compromised PII like social security numbers, names, addresses, and birth dates can be sold on the black market, causing financial harm to students who have otherwise clean credit histories. Compromised records containing special educational status or the medical records of students with disabilities who are on an Individualized Education Program (IEP) can lead to social and emotional harm.57

The government’s awareness of the frequency of data leaks and their negative consequences for student privacy, financial well-being, and medical confidentiality establishes a context for legislative solutions. Many data leaks flow from third-party educational software providers who are not following information security best practices. For example, nearly 820,000 students’ personal data from the New York City public school system was compromised in early 2022. Before the leak, the school district was unaware that the educational software provider responsible had failed to take basic measures such as encrypting all student data.58

In addition to encryption, other best practices include anonymizing data when possible and using techniques like differential privacy. Although anonymizing data by removing direct identifiers of students provides some measure of privacy, Latanya Sweeney’s work has shown that when such data is linked to other data sources, it can be possible to re-identify individuals, a technique known as a linkage attack.59 Most individual Americans can be identified by their “anonymized” Facebook accounts or their health care information.60
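The mechanics of a linkage attack in the style Sweeney describes can be sketched in a few lines: a data set stripped of names is joined with a public record on shared quasi-identifiers such as ZIP code, birth date, and sex. All of the records and field names below are fabricated for illustration.

```python
# "Anonymized" records: direct identifiers (names) removed, but
# quasi-identifiers retained.
anonymized_records = [
    {"zip": "02138", "dob": "1965-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1971-02-14", "sex": "M", "diagnosis": "flu"},
]

# A public data set (e.g., a voter roll) containing names alongside the
# same quasi-identifiers.
public_roll = [
    {"name": "Jane Doe", "zip": "02138", "dob": "1965-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "dob": "1971-02-14", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def link(records, roll):
    """Join the two data sets on quasi-identifiers, re-attaching names."""
    index = {tuple(v[q] for q in QUASI_IDENTIFIERS): v["name"] for v in roll}
    matches = {}
    for r in records:
        key = tuple(r[q] for q in QUASI_IDENTIFIERS)
        if key in index:
            matches[index[key]] = r["diagnosis"]
    return matches

print(link(anonymized_records, public_roll))
# → {'Jane Doe': 'asthma', 'John Roe': 'flu'}
```

Because each (ZIP, birth date, sex) triple is unique in this toy example, every “anonymized” record is re-identified; Sweeney’s point is that such triples are nearly unique for most Americans in practice.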

To protect against linkage attacks and further ensure privacy, differential privacy introduces statistical “noise” into sensitive data, slightly modifying select data fields so that individuals are not identifiable. Differentially private individual data may be inaccurate; however, the aggregate results remain fairly accurate. Even if the data set is accessed, individuals’ privacy is less likely to be compromised.
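A minimal sketch of this idea is the Laplace mechanism, a textbook way to make a count query differentially private. The function names, the toy student records, and the query below are all illustrative assumptions; real deployments, such as the Census Bureau’s, are far more involved.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Count matching records, plus noise calibrated to the query's sensitivity.

    Adding or removing one student changes a count by at most 1, so the
    sensitivity is 1 and the Laplace noise scale is 1 / epsilon.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Toy query: how many students are flagged for extra reading help?
# (Fabricated data: every fourth student is flagged, so the true count is 50.)
students = [{"id": i, "reading_help": i % 4 == 0} for i in range(200)]
noisy = private_count(students, lambda s: s["reading_help"], epsilon=1.0)
print(noisy)  # close to 50, but masks any individual's presence
```

The design trade-off mentioned above is visible here: smaller epsilon means more noise and stronger privacy, and the relative error of the noisy answer shrinks as the data set grows, which is why the technique suits large data sets better than small ones.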

Differential privacy is used by researchers to secure their data, within companies that hold PII, and by the US Census Bureau.61 The statistical algorithms that implement differential privacy are most effective on large data sets, since noise degrades the utility of small data sets.62 Thus, school systems with large and sensitive data sets may increase privacy with this technology. Yet differential privacy has not been widely adopted within educational technology. In addition to the complexity of implementing differential privacy compared to data anonymization, many educational technology systems need the ability to identify specific students and track their progress. Implementing differential privacy internally, as opposed to only when releasing data to researchers, could impede these pedagogical functions.

Data stored long enough is likely to leak. Introducing requirements that stored student data be encrypted and anonymized would protect data subjects from reidentification when it does. But not all student data can be anonymized and still remain useful. For data that cannot, one proposed solution is the establishment of “information fiduciaries.” A fiduciary is a representative who is required to make decisions in the best interests of their clients. Stipulating that schools and the educational software providers with whom they contract must act as fiduciaries when they handle student data would confer additional legal obligations to act in the best interests of the students and their privacy.63

5.1. The Value of Data Collection

Requiring that an information fiduciary act in the best interests of the students, however, returns us to our original questions: what are the best interests of students, how should their interests be weighted, and who should have the authority to decide? The paternalism perspective advocated for protecting students, even at the expense of their privacy, while they develop the capacity to make autonomous decisions. The second perspective suggested that prioritizing students’ privacy is in their best interests so that they may develop autonomy and maintain trusting relationships with the adults in their lives. Many adult caretakers take up a perspective between these two, changing their comparative valuation of children’s protection and autonomy as children age. However, people acting from either perspective would take themselves to be acting in the best interest of the child. Establishing an information fiduciary does not solve the substantive moral question of which of the child’s interests the fiduciary should prioritize.

One factor difficult to weigh is the educational value of big data collection for K-12 students. Since K-12 students are evaluated so frequently and thoroughly, completing daily or weekly homework assignments and in-class assessments, sophisticated analytics typically are not needed to determine when they are struggling academically or what skills they need to focus on, as they are in the university setting.64 Some studies show benefit from the use of educational data to personalize learning,65 although comprehensive reviews show that the hype surrounding personalized education often far outstrips its current benefits.66 The rhetoric of personalized learning was that “adaptive tutors” created with the help of big data analytics “would be more efficient at teaching students mastery of key concepts and that this efficiency would enable teachers to carry out rich project-based instruction.” Instead, the primary use of adaptive online learning is in providing students with ancillary practice in math and early reading.67 The primary value of educational data collection has been for researchers: both academic researchers interested in education and educational disparities68 and educational technology researchers interested in improving the products they sell.

6. Conclusion

Educational software will continue to be adopted by school systems for its perceived value in increasing access to high-quality and personalized education.69 As a result, student privacy issues will escalate. Current federal privacy laws such as FERPA require updating in order to meet these challenges, as they do not hold school districts or educational software providers to the government’s own standards of student privacy and information security. Yet ensuring that school officials and educational software providers respect the contextual integrity of information transmission for student data, or adopt policies that represent a student-centric rather than guardian-centric perspective on children’s rights and privacy, might require more than a simple update.

Discussion Questions

  1. FERPA and COPPA were passed before the advent of contemporary educational software. Given the contextual shifts represented by the new portability and combinability of digital student data, what steps should lawmakers take when updating or designing new educational privacy laws? How could principles of contextual integrity be incorporated into privacy regulation?

  2. Should schools employ Gaggle, which claims to prevent incidents of bullying and suicide, if it means student data may be shared with school officials and potentially with law enforcement? What information or context might be relevant to making this decision?

  3. Here is a statement from the US Department of Education Student Privacy Policy Office: “Under FERPA, a school generally may not disclose PII from a student’s education records to a third party unless the student’s parent has provided prior written consent.”70 Applying the contextual integrity framework, what privacy norms are represented in this statement?

    a. Subject:

    b. Sender:

    c. Recipient:

    d. Information Type:

    e. Transmission Principle:

Discuss the norm represented by this transmission principle. Then discuss the norm represented by the modified principle according to which schools may act on behalf of parents and disclose PII to a third party at the school’s discretion. When using educational software, who should have the right to consent to share students’ data: schools, students, or parents?
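For readers who think best in code, the five contextual integrity parameters can be sketched as a simple data structure. This is an illustrative exercise aid, not an answer key: the `Flow` class and the particular parameter labels below are our own hypothetical rendering of the FERPA statement, and filling in Question 3 for yourself may yield different labels.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One information flow, described by Nissenbaum's five parameters."""
    subject: str        # whom the information is about
    sender: str         # who transmits the information
    recipient: str      # who receives it
    info_type: str      # what kind of information flows
    transmission: str   # the principle governing the transfer

def violates_norm(flow: Flow, norm: Flow) -> bool:
    """A flow breaches contextual integrity when any parameter departs from the entrenched norm."""
    return flow != norm

# One hypothetical reading of the FERPA statement as an entrenched norm:
ferpa_norm = Flow(
    subject="student",
    sender="school",
    recipient="third party",
    info_type="PII from education records",
    transmission="prior written parental consent",
)

# The modified principle: the school discloses at its own discretion.
discretionary_flow = Flow(
    subject="student",
    sender="school",
    recipient="third party",
    info_type="PII from education records",
    transmission="school's discretion",
)

print(violates_norm(discretionary_flow, ferpa_norm))  # True: only the transmission principle changed, yet the norm is breached
```

Note that the two flows differ in only one of the five parameters; on the contextual integrity view, that single change in transmission principle is enough to make the disclosure a privacy violation.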


Acknowledgments

The authors are grateful to the series editors for their generous guidance throughout the submission and revision process, and to four very helpful reviewers for their comments.


Bibliography

Archard, David William. “Children’s Rights.” Stanford Encyclopedia of Philosophy. Stanford University, December 14, 2018.

Berendt, Bettina, Allison Littlejohn, and Mike Blakemore. “AI in Education: Learner Choice and Fundamental Rights.” Learning, Media and Technology 45, no. 3 (2020): 312–24.

Bommasani, Rishi, Drew A. Hudson, Ehsan Adeli, Russ Altman, Simran Arora, Sydney von Arx, Michael S. Bernstein, et al. “On the Opportunities and Risks of Foundation Models.” Preprint, submitted August 16, 2021.

Bowie, Liz. “Baltimore City Student Laptops Are Monitored for Mentions of Suicide. Sometimes, the Police Are Called.” Baltimore Sun, October 12, 2021.

boyd, danah. It’s Complicated: The Social Lives of Networked Teens. New Haven: Yale University Press, 2015.

Brayne, Sarah. Predict and Surveil: Data, Discretion, and the Future of Policing. New York: Oxford University Press, 2021.

“SB-1177 Privacy: Students.” California Legislative Information. Accessed June 14, 2022.

Center for Democracy & Technology. Student Data and Information Privacy: A Survey of Parents of K-12 Students, September 2020.

“Children’s Privacy.” Federal Trade Commission, February 11, 2022.

“Children’s Online Privacy Protection Rule; Final Rule.” Federal Register. Federal Trade Commission (January 17, 2013): 3972–4014.

“Complying with COPPA: Frequently Asked Questions.” Federal Trade Commission, August 9, 2022. Schools section.

Connor, Carol McDonald. “Using Technology and Assessment to Personalize Instruction: Preventing Reading Problems.” Prevention Science 20, no. 1 (2017): 89–99.

Davis, Andrew. “Do Children Have Privacy Rights in the Classroom?” Studies in Philosophy and Education 20, no. 3 (2001): 245–54.

Dee, Thomas, and Mark Murphy. “Vanished Classmates: The Effects of Local Immigration Enforcement on Student Enrollment.” NBER Working Paper Series no. 25080 (September 2018).

Elsen-Rooney, Michael. “Data of 820,000 NYC Students Compromised in Hack of Online Grading System: Education Dept.” New York Daily News, March 25, 2022.

Fourcade, Marion, and Kieran Healy. “Seeing Like a Market.” Socio-Economic Review (December 23, 2016): 1–21.

Gandy Jr., Oscar H. The Panoptic Sort: A Political Economy of Personal Information. New York: Oxford University Press, 1993.

Garfinkel, Simson. “Differential Privacy and the 2020 US Census.” MIT Case Studies in Social and Ethical Responsibilities of Computing (Winter 2022).

Gebhart, Gennie. “Spying on Students: School-Issued Devices and Student Privacy.” Electronic Frontier Foundation, February 15, 2018.

Gondara, Lovedeep, and Ke Wang. “Differentially Private Small Dataset Release Using Random Projections.” Proceedings of Machine Learning Research 124 (2020): 1–10.

Goodman, Joshua, Julia Melkers, and Amanda Pallais. “Can Online Delivery Increase Access to Education?” NBER Working Paper Series no. 22754, October 2016.

Gray, Mary L. Out in the Country: Youth, Media, and Queer Visibility in Rural America. New York: New York University Press, 2009.

Hankerson, DeVan L., Cody Venzke, Elizabeth Laird, Hugh Grant-Chapman, and Dhanaraj Thakur. “Online and Observed: Student Privacy Implications of School-Issued Devices and Student Activity Monitoring Software.” Center for Democracy & Technology, 2021.

Hellman, Deborah. “Big Data and Compounding Injustice.” Journal of Moral Philosophy, Virginia Public Law and Legal Theory Research Paper No. 2021-27 (forthcoming).

“InBloom Inc. Launches to Enable Personalized Learning through Easier Access to Information and Technology.” CISION PRNewswire, February 5, 2013.

Jones, Kyle M., Alan Rubel, and Ellen LeClere. “A Matter of Trust: Higher Education Institutions as Information Fiduciaries in an Age of Educational Data Mining and Learning Analytics.” Journal of the Association for Information Science and Technology 71, no. 10 (2020): 1227–41.

Kant, Tanya. “Identity, Advertising, and Algorithmic Targeting: Or How (Not) to Target Your ‘Ideal User.’” MIT Case Studies in Social and Ethical Responsibilities of Computing (Summer 2021).

Keierleber, Mark. “Democratic Lawmakers Demand Student Surveillance Companies Outline Business Practices, Warn the Security Tools May Compound ‘Risk of Harm for Students.’” The74, October 4, 2021.

Keierleber, Mark. “‘Don’t Get Gaggled’: Minneapolis School District Spends Big on Student Surveillance Tool, Raising Ire after Terminating Its Police Contract.” The74, October 18, 2020.

Keierleber, Mark. “Exclusive Data: An Inside Look at the Spy Tech That Followed Kids Home for Remote Learning - and Now Won’t Leave.” The74, September 14, 2021.

Kramer, Roderick M. “Trust and Distrust in Organizations: Emerging Perspectives, Enduring Questions.” Annual Review of Psychology 50, no. 1 (1999): 569–98.

Lu, Alex Jiahong, Gabriela Marcu, Mark S. Ackerman, and Tawanna R. Dillahunt. “Coding Bias in the Use of Behavior Management Technologies: Uncovering Socio-Technical Consequences of Data-Driven Surveillance in Classrooms.” Designing Interactive Systems Conference 2021 (2021): 508–20.

Mathiesen, Kay. “The Internet, Children, and Privacy: The Case against Parental Monitoring.” Ethics and Information Technology 15, no. 4 (2013): 263–74.

Molnar, Alex, and Faith Boninger. Sold Out: How Marketing in School Threatens Children’s Well-Being and Undermines Their Education. Lanham, MD: Rowman & Littlefield, 2015.

Nissenbaum, Helen. “Contextual Integrity Up and Down the Data Food Chain.” Theoretical Inquiries in Law 20, no. 1 (2019): 221–56.

Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford, CA: Stanford University Press, 2009.

Nissenbaum, Helen. “Privacy as Contextual Integrity.” Washington Law Review 79 (2004): 119–57.

Patgiri, Ripon, and Arif Ahmed. “Big Data: The V's of the Game Changer Paradigm.” 2016 IEEE 18th International Conference on High Performance Computing and Communications; IEEE 14th International Conference on Smart City; IEEE 2nd International Conference on Data Science and Systems (HPCC/SmartCity/DSS) (December 2016): 17–24.

Pettit, Philip. “The Cunning of Trust.” Philosophy and Public Affairs 24, no. 3 (1995): 202–25.

“Protecting Student Privacy While Using Online Educational Services.” Privacy Technical Assistance Center (PTAC), US Department of Education, February 2014.

Reich, Justin. Failure to Disrupt: Why Technology Alone Can’t Transform Education. Cambridge, MA: Harvard University Press, 2020.

Rhoades, Amy. “Big Tech Makes Big Data Out of Your Child: The FERPA Loophole EdTech Exploits to Monetize Student Data.” American University Business Law Review 9, no. 3 (2020): 445–73.

Rocher, Luc, Julien M. Hendrickx, and Yves-Alexandre de Montjoye. “Estimating the Success of Re-Identifications in Incomplete Datasets Using Generative Models.” Nature Communications 10 (2019): 3069.

Rotenberg, Ken J. “The Conceptualization of Interpersonal Trust: A Basis, Domain, and Target Framework.” In Interpersonal Trust during Childhood and Adolescence, ed. Ken J. Rotenberg, 8–27. Cambridge: Cambridge University Press, 2010.

Royal Society. Protecting Privacy in Practice: The Current Use, Development and Limits of Privacy Enhancing Technologies in Data Analysis, March 2019.

Russell, N. Cameron, Joel R. Reidenberg, Elizabeth Martin, and Thomas Norton. “Transparency and the Marketplace for Student Data.” SSRN Electronic Journal 22, no. 3 (2019): 108–56.

Schapiro, Tamar. “What Is a Child?” Ethics 109, no. 4 (1999): 715–38.

Schneier, Bruce. Data and Goliath. New York: W. W. Norton & Company, 2016.

Schouten, G. “On Meeting Students Where They Are: Teacher Judgment and the Use of Data in Higher Education.” Theory and Research in Education 15, no. 3 (2017): 321–38.

Shmueli, Benjamin, and Ayelet Blecher-Prigat. “Privacy for Children.” Columbia Human Rights Law Review 42 (2011): 759–95.

Strauss, Valerie. “Student Privacy Concerns Grow over ‘Data in a Cloud’.” Washington Post, January 3, 2014.

“Student and Staff Data Privacy Notice.” Student Safety That Saves Lives. Gaggle.Net, Inc. Accessed June 10, 2022.

Sweeney, Latanya. “K-Anonymity: A Model for Protecting Privacy.” International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10, no. 5 (2002): 557–70.

United States Government Accountability Office. Data Security: Recent K-12 Data Breaches Show That Students Are Vulnerable to Harm. Report no. GAO-20-644, September 2020.

“A Parent Guide to the Family Educational Rights and Privacy Act (FERPA).” Protecting Student Privacy. United States Department of Education, July 9, 2021.

“What Is the Protection of Pupil Rights Amendment (PPRA)?” Protecting Student Privacy. U.S. Department of Education. Accessed June 10, 2022.

“Who Is Most Affected by the School to Prison Pipeline?” American University, February 24, 2021.

Xu, Xu. “The Social Costs of Digital vs. In-Person Surveillance.” SSRN Electronic Journal (2019): 1–44.

Zeide, Elana. “Student Privacy Principles for the Age of Big Data: Moving beyond FERPA and FIPPS.” Drexel Law Review 8 (2015): 339–94.
