Legal Considerations for AI in Education — FERPA, COPPA, and GDPR
Most school leaders who adopt AI tools are not intentionally ignoring privacy law — they simply don't know which laws apply, what those laws require, or where the line falls between legally safe and legally risky AI use. The legal landscape for AI in education involves at least three major federal/international frameworks (FERPA, COPPA, GDPR), a growing patchwork of state student privacy laws, and a collection of vendor agreements that most administrators sign without fully reading.
A 2024 Center for Democracy & Technology (CDT) survey found that 42% of school districts using AI tools had not executed Data Processing Agreements with their AI vendors, and 31% of administrators could not identify which federal law governs student data privacy. These gaps don't exist because administrators don't care — they exist because the legal framework is genuinely complex and the guidance is written for lawyers, not principals.
This guide translates privacy law into school leader language. It tells you what each law requires, when it applies to AI tools, what compliance looks like in practice, and where the most common legal mistakes occur.
Important disclaimer: This article provides general educational information about privacy law. It is not legal advice. Consult your district's legal counsel for guidance specific to your situation, jurisdiction, and planned AI implementations.
FERPA: The Foundation
What FERPA Covers
The Family Educational Rights and Privacy Act (1974) governs access to and disclosure of student education records in schools that receive federal funding — which is virtually all public schools and many private schools.
| FERPA Element | What It Means | How It Applies to AI |
|---|---|---|
| Education records | Records directly related to a student that are maintained by the school or an agent of the school | AI tools that store student work, grades, behavior data, or identifiable information create education records |
| Disclosure restrictions | Schools cannot disclose education records without parent consent (with specific exceptions) | Sharing student data with an AI vendor is a disclosure — it requires legal authorization |
| School official exception | Schools can share education records with "school officials" (including contractors) who have "legitimate educational interest" | AI vendors can qualify as school officials through a DPA — this is the primary legal mechanism for AI tool access |
| Directory information | Basic information (name, address, grade level, etc.) that schools can share with fewer restrictions | AI tools that access only directory information have lower FERPA burden, but parents can opt out of directory disclosure |
| Parent rights | Parents can inspect records, request amendments, and control disclosure (rights transfer to students at 18) | Parents can ask what AI tools access their child's records — schools should be able to answer |
When AI Tools Trigger FERPA
```
Does the AI tool access or store student education records?

Education records include:
  ✓ Student names linked to academic information
  ✓ Grades, test scores, assessment results
  ✓ Student work samples (essays, assignments)
  ✓ Attendance records
  ✓ Disciplinary records
  ✓ IEP or 504 documentation
  ✓ Student-generated content in the AI tool
  ✓ AI-generated analysis of individual student data

NOT education records:
  ✗ Teacher-generated content with no student data
  ✗ De-identified aggregate data (no reasonable basis to identify individuals)
  ✗ Directory information (with proper notice)
  ✗ Teacher's personal notes not shared with others

IF the AI tool accesses education records:
  → FERPA applies
  → DPA required (school official exception)
  → Vendor must have legitimate educational interest
  → Vendor must be under school's direct control regarding data use
  → Data use limited to authorized purpose
  → Re-disclosure prohibited

IF the AI tool does NOT access education records:
  → FERPA does not apply to that specific use
  → Other laws may still apply (COPPA, state laws)
```
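For districts that keep a tool inventory, the decision tree above can be encoded as a quick triage helper. This is an illustrative sketch, not legal advice; the category names and the `ferpa_review` function are hypothetical labels for inventory records, not statutory terms.

```python
# Illustrative sketch only — not legal advice. Categories mirror the
# decision tree above; the labels are hypothetical, not statutory terms.

# Data categories that typically make a record an "education record" under FERPA
EDUCATION_RECORD_DATA = {
    "student_names_with_academics", "grades", "test_scores",
    "student_work_samples", "attendance", "discipline",
    "iep_504_docs", "student_generated_content", "per_student_ai_analysis",
}

def ferpa_review(tool_name: str, data_accessed: set[str], has_dpa: bool) -> str:
    """Return a rough FERPA posture for an AI tool based on the data it touches."""
    triggers = data_accessed & EDUCATION_RECORD_DATA
    if not triggers:
        return f"{tool_name}: FERPA not triggered (check COPPA and state law)"
    if has_dpa:
        return f"{tool_name}: FERPA applies; DPA in place — verify purpose limits"
    return f"{tool_name}: STOP — education records ({', '.join(sorted(triggers))}) without a DPA"

print(ferpa_review("LessonBot", {"teacher_prompts"}, has_dpa=False))
print(ferpa_review("GradeHelper", {"grades", "student_work_samples"}, has_dpa=False))
```

A triage helper like this does not replace counsel review; it only flags which tools need the DPA conversation before deployment.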
The DPA: Your Legal Shield
A Data Processing Agreement is the legal instrument that makes AI vendor access to student records FERPA-compliant. Without a DPA, sharing student data with an AI vendor is a FERPA violation — even if the vendor has excellent security practices.
Essential DPA provisions for AI tools:
| Provision | What It Should Say | Red Flag If Missing |
|---|---|---|
| Purpose limitation | "Vendor processes data solely for the educational purpose authorized by the school" | Vendor can use data for any purpose |
| No secondary use | "Vendor shall not use student data for targeted advertising, marketing, or model training" | Student data may train AI models |
| Re-disclosure prohibition | "Vendor shall not disclose student data to third parties except as required by law" | Data may be shared with partners or affiliates |
| Data deletion | "Upon contract termination, vendor shall delete all student data within 30 days and provide certification of deletion" | Data may persist indefinitely after contract ends |
| Security obligations | "Vendor shall implement and maintain reasonable security measures" (specify encryption, access controls) | No security commitments |
| Breach notification | "Vendor shall notify school within 48 hours of any data breach" | No breach notification obligation |
| Audit rights | "School reserves the right to audit vendor's data practices" | No accountability mechanism |
COPPA: When Students Are Under 13
What COPPA Requires
The Children's Online Privacy Protection Act (1998, updated 2013) regulates the online collection of personal information from children under 13. Unlike FERPA, COPPA applies to the operator of the online service — the AI vendor — not the school directly. But schools get involved because they can consent on behalf of parents in educational contexts.
| COPPA Element | What It Means | AI Implication |
|---|---|---|
| Applies to children under 13 | If students under 13 interact with the AI tool, COPPA applies | Elementary and middle school AI tools almost always trigger COPPA |
| Verifiable parental consent | Vendors must obtain consent before collecting personal information from children | Schools can consent on behalf of parents for educational use — but only for educational use |
| School consent limitation | Schools can consent only for use "for the benefit of the school" — not for commercial purposes | AI vendors cannot use under-13 data collected under school consent for commercial purposes (advertising, model training, profiling) |
| Minimal collection | Vendors must collect only information "reasonably necessary" for the activity | AI tools should not require unnecessary personal information from young students |
| Parental access | Parents can review and request deletion of their child's information | Schools and vendors must be able to respond to parent requests |
COPPA in Practice for AI Tools
```
Is the AI tool used BY STUDENTS (not just by teachers)?
├─ NO  → COPPA does not apply (teacher-facing tools don't
│        trigger COPPA because teachers are adults)
└─ YES → Continue

Are any students under 13?
├─ NO  → COPPA does not apply
└─ YES → Continue

Does the tool collect personal information?
(Name, email, username, photo, voice, persistent identifier,
geolocation, or any combination that could identify a child)
├─ NO  → COPPA may not apply (rare — most tools collect some identifier)
└─ YES → COPPA APPLIES

SCHOOL MUST:
1. Verify the vendor is COPPA-compliant
2. Provide consent on behalf of parents (for educational use only)
3. Ensure vendor uses data only for the educational purpose —
   NOT for commercial purposes
4. Notify parents that the school has consented
5. Maintain ability to review and delete student data upon parent request
```
Practical tip for schools: If your AI tool is used only by teachers — for lesson planning, content generation, assessment creation — COPPA does not apply because no child's personal information is being collected. This is one reason that teacher-facing AI tools like EduGenius carry lower legal complexity than student-facing AI tools — the teacher interacts with the AI, and students interact with the teacher-reviewed output.
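The three-gate flow above reduces to a simple predicate: COPPA is triggered only when all three conditions hold. A minimal, illustrative sketch (not legal advice; the function name and flags are hypothetical):

```python
# Illustrative sketch — encodes the three-gate decision flow above.
# Not legal advice; parameter names are hypothetical.
def coppa_applies(student_facing: bool, any_under_13: bool,
                  collects_personal_info: bool) -> bool:
    """COPPA is triggered only when all three gates are passed."""
    return student_facing and any_under_13 and collects_personal_info

# Teacher-facing lesson-planning tool: no child interacts with it
assert coppa_applies(student_facing=False, any_under_13=True,
                     collects_personal_info=True) is False

# Elementary student-facing tutor that requires a student login
assert coppa_applies(student_facing=True, any_under_13=True,
                     collects_personal_info=True) is True
```

The second case is the one that obligates the school to verify vendor compliance and record its consent on behalf of parents.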
GDPR: International and EU-Connected Schools
When GDPR Applies to Schools
The General Data Protection Regulation applies to schools in two situations:
- Schools located in the EU/EEA: GDPR is the primary data protection framework
- Schools outside the EU/EEA that process data of people located in the EU: for example, a school offering remote enrollment to students in the EU. GDPR's reach turns on where the data subject is located, not on citizenship. Separately, an EU-based AI vendor is itself subject to GDPR and may pass GDPR-derived terms through to the school by contract
| GDPR Principle | What It Requires | School Application |
|---|---|---|
| Lawful basis | Must have a legal basis for data processing (consent, legitimate interest, legal obligation, etc.) | Schools typically rely on "legitimate interest" or "public interest" for educational data processing |
| Purpose limitation | Data collected for one purpose cannot be used for another | AI vendors cannot repurpose student data collected for education |
| Data minimization | Collect only what's necessary | AI tools should not require excessive personal data |
| Right to erasure | Data subjects can request data deletion | Parents/students can request deletion of data from AI tools |
| Data protection impact assessment (DPIA) | Required for "high-risk" processing — which AI often qualifies as | Schools using AI for profiling, automated decision-making, or large-scale data processing should conduct DPIAs |
| International transfers | Data transfers outside the EU require specific legal mechanisms | Using a US-based AI tool from an EU school requires Standard Contractual Clauses or equivalent |
For U.S. schools: GDPR typically doesn't apply unless you process personal data of students located in the EU (for example, through remote or exchange programs). However, GDPR's principles — purpose limitation, data minimization, deletion rights — represent best practices that strengthen privacy regardless of legal obligation.
State Student Privacy Laws
The Patchwork Problem
Beyond FERPA and COPPA, approximately 40 U.S. states have enacted their own student data privacy laws, creating a complicated patchwork:
| State | Key Law | Notable Requirement Beyond FERPA |
|---|---|---|
| California | SOPIPA (2014) | Prohibits targeted advertising based on student data; prohibits selling student data; requires deletion upon request |
| New York | Education Law 2-d (2014, updated 2020) | Requires DPAs for all third-party vendors; parents' bill of rights; data inventory reporting |
| Illinois | SOPPA (2021) | Requires breach notification within 30 days; prohibits targeted advertising; requires data governance plan |
| Colorado | Student Data Transparency and Security Act (2016) | Requires publicly available data inventory; transparency portal |
| Connecticut | Student Privacy Act (2022) | Prohibits AI-generated profiles used for non-educational purposes; requires operator compliance |
| Texas | Student Privacy Act (2017, updated 2023) | Prohibits selling student data; requires annual security audit; operator certifications |
Action item: Check your state's specific student data privacy requirements. Many states participate in the Student Data Privacy Consortium (SDPC), which provides standardized DPA templates that address both FERPA and state-specific requirements. Using the SDPC DPA template for your state simplifies compliance.
AI-Specific Legal Risks
| Risk Scenario | Legal Issue | Prevention |
|---|---|---|
| Teacher pastes student names/grades into ChatGPT | FERPA violation — disclosure of education records to an unauthorized party | Clear policy: no student PII in general-purpose AI tools without DPA |
| AI tool uses student data to train models | FERPA (purpose limitation), COPPA (commercial use), state laws (selling/commercial use) | DPA provision prohibiting model training; verify in terms of service |
| AI tool makes automated decisions about students (grading, placement, discipline) | Due process concerns; potential IDEA/Section 504 violations for students with disabilities | All AI-assisted decisions reviewed by humans; never delegate protected decisions to AI |
| Student-facing AI collects voice data | COPPA (personal information from under-13); state biometric laws (Illinois BIPA, Texas CUBI) | Verify what data the AI collects; obtain appropriate consent; prefer text-only interaction |
| AI-generated reports shared with parents contain errors | Privacy (if wrong student's data included); liability (if recommendations based on incorrect analysis) | Human review of all AI-generated parent-facing output before distribution |
| Data breach at AI vendor | FERPA breach notification; state breach notification laws; potential liability | DPA with breach notification clause; vendor security requirements; cyber insurance |
Compliance Checklist for School Leaders
```
AI TOOL LEGAL COMPLIANCE CHECKLIST

Before deploying any AI tool:

□ FERPA
  □ Determine if tool accesses education records
  □ If yes: execute DPA with vendor
  □ Verify DPA includes: purpose limitation, no model training,
    breach notification, deletion clause, re-disclosure prohibition
  □ Document legitimate educational interest

□ COPPA (if student-facing, students under 13)
  □ Verify vendor is COPPA-compliant
  □ Provide school consent for educational use
  □ Verify vendor does not use data commercially
  □ Notify parents about tool and school consent

□ STATE LAW
  □ Check state student privacy requirements
  □ Use state-specific DPA template if available
  □ Verify any state-specific prohibitions (targeted advertising,
    data selling, biometric data)
  □ Comply with state breach notification requirements

□ GENERAL
  □ Terms of Service reviewed (not just clicked through)
  □ Privacy policy reviewed — specifically for AI model training
    and data sharing
  □ Tool added to data governance inventory
  □ Staff trained on appropriate use
  □ Parent communication about tool deployment
```
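Districts that track compliance per tool can keep the checklist above as structured data in their governance inventory. A minimal sketch, with item names paraphrased from the checklist (the `CHECKLIST` structure and `open_items` helper are illustrative, not any standard):

```python
# Illustrative sketch — the checklist above as a structure a district could
# keep in its data-governance inventory. Item names are paraphrased.
CHECKLIST = {
    "FERPA": ["education-records determination", "DPA executed",
              "DPA covers purpose/training/breach/deletion/re-disclosure",
              "legitimate educational interest documented"],
    "COPPA": ["vendor COPPA compliance verified", "school consent recorded",
              "no commercial data use", "parents notified"],
    "STATE": ["state requirements checked", "state DPA template used",
              "state prohibitions verified", "breach notification mapped"],
    "GENERAL": ["ToS reviewed", "privacy policy reviewed",
                "tool inventoried", "staff trained", "parents informed"],
}

def open_items(completed: set[str]) -> list[str]:
    """List every checklist item not yet marked complete for a tool."""
    return [f"{section}: {item}"
            for section, items in CHECKLIST.items()
            for item in items if item not in completed]

remaining = open_items({"education-records determination", "ToS reviewed"})
print(f"{len(remaining)} items outstanding")  # 17 total items, 2 done
```

Keeping the checklist as data makes it auditable: the same structure can drive a transparency portal or the data inventory that several state laws require.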
Key Takeaways
- 42% of districts using AI tools lack Data Processing Agreements (CDT, 2024). Without a DPA, sharing student data with an AI vendor is a FERPA violation — regardless of how good the vendor's security practices are. The DPA is the legal foundation of compliant AI use. See AI for School Leaders — A Strategic Guide to Transforming Education Administration for strategic context.
- Teacher-facing AI tools carry lower legal risk than student-facing tools. When teachers use AI for content generation and lesson planning without inputting student data, FERPA and COPPA are largely not triggered. Platforms like EduGenius that focus on teacher-facing content creation minimize the privacy compliance burden. See Building a Culture of Innovation — Leading AI Adoption in Schools for adoption culture.
- The most common FERPA violation is teachers pasting student data into general-purpose AI tools (ChatGPT, Claude, Gemini without enterprise agreements). Clear policy and training are the prevention. See Comparing AI Deployment Models — Cloud, On-Premise, and Hybrid for deployment considerations.
- COPPA applies when students under 13 interact with AI tools directly. Schools can consent on behalf of parents for educational use only — not for commercial purposes. Student-facing AI in elementary schools requires COPPA compliance by the vendor. See How District Technology Directors Should Evaluate AI Vendors for evaluation.
- Check your state's privacy law. ~40 states have student data privacy laws beyond FERPA, with requirements ranging from DPA mandates to biometric data restrictions. Use SDPC templates for state-specific compliance. See AI for School Communication — Newsletters, Announcements, and Parent Outreach for parent communication.
- AI should not make autonomous decisions about students. Automated grading, placement, and discipline decisions raise due process and discrimination concerns. All AI-assisted decisions require human review, especially for students with disabilities. See Best AI Content Generation Tools for Educators — Head-to-Head Comparison for tool evaluation.
Frequently Asked Questions
Do I need a DPA if teachers use ChatGPT only for their own lesson planning (no student data)?
If teachers genuinely never input student names, grades, work samples, or any personally identifiable information, FERPA is not triggered for that specific use. However, "never" is hard to enforce, and the line between "no student data" and "let me just ask about my struggling reader in 3rd period" is easily crossed. The safest approach is to either (1) execute a DPA with OpenAI/Anthropic/Google for enterprise educational use, or (2) issue a clear policy prohibiting any student identifiers in general-purpose AI prompts. Option 1 protects against accidental slips. Option 2 relies on teacher discipline. Most districts benefit from pursuing both.
Can parents opt their child out of AI completely?
FERPA gives parents the right to control disclosure of education records. If an AI tool accesses their child's records, parents can object to that specific disclosure — and the school generally must honor the objection. For AI tools used by teachers that don't access student records (teacher-facing content generation), there's no student record disclosure to opt out of, and the generated materials are functionally identical to any other instructional resource. Schools should be prepared to offer reasonable accommodations while explaining which AI uses involve student data and which don't — the distinction usually resolves most parent concerns.
What liability does the school have if an AI vendor has a data breach?
The school retains responsibility for the student data it shares, even when the breach occurs at the vendor. FERPA holds the school (the educational agency) accountable for protecting student records. This is why DPA provisions matter enormously: breach notification requirements, security standards, and liability allocation in the DPA determine how much of the practical burden falls on the vendor vs. the school. Cyber insurance for educational institutions increasingly covers third-party vendor breach costs — check whether your district's insurance policy covers this scenario. The best protection is choosing vendors with strong security practices, executing comprehensive DPAs, and minimizing the data shared to what's strictly necessary.
Is de-identified data safe to use with any AI tool?
FERPA does not cover data that has been properly de-identified — meaning that there is "no reasonable basis" to identify individual students from the data. But de-identification is harder than it sounds. Removing names and ID numbers is not sufficient if the combination of remaining data points (grade, gender, race, specific course, specific scores) could identify a student — which it often can in small schools or small classes. Note that the often-cited "safe harbor" method of removing 18 specific identifiers comes from HIPAA, not FERPA; FERPA's de-identification standard (34 CFR § 99.31(b)) requires removing all personally identifiable information, taking into account other information that is reasonably available. Aggregate data (school-level averages, grade-level distributions) is generally safe. Individual-level data with names removed may not be.
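The small-class problem can be made concrete: count how many students share each combination of remaining fields, and any combination held by exactly one student singles that student out. A minimal sketch with fabricated roster data (the field choices are illustrative):

```python
# Illustrative sketch of why removing names is not de-identification: a
# combination of remaining fields held by exactly one student identifies
# that student. The roster below is fabricated for the demo.
from collections import Counter

roster = [  # (grade, gender, course) — names already removed
    ("7", "F", "Algebra I"), ("7", "F", "Algebra I"),
    ("7", "M", "Algebra I"), ("7", "M", "Algebra I"),
    ("8", "F", "AP Latin"),  # the only 8th-grade girl taking AP Latin
]

group_sizes = Counter(roster)
re_identifiable = [combo for combo, n in group_sizes.items() if n == 1]
print(re_identifiable)  # → [('8', 'F', 'AP Latin')]
```

Anyone who knows there is one eighth-grade girl in AP Latin can attach her name to that "anonymous" row — which is exactly the "reasonable basis to identify" that FERPA's standard is worried about.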