AI Risks for K-12 Students: Lessons from the U.S. to Protect Children Globally

The rapid integration of AI into education systems around the world offers exciting opportunities but also raises critical concerns about student privacy and data security. Across different nations, existing legislation often lags behind the capabilities of these advanced tools, leaving children vulnerable to new risks. While this piece focuses on U.S. legislation, particularly the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA), the issues and deficiencies highlighted resonate with challenges faced by education systems globally.

Billy Carrie, an Information Security Engineer with extensive experience in IT and cybersecurity, has worked on improving the technical processes of K-12 school districts, enhancing learning experiences for multinational organisations, and securing systems across various industries. With a strong focus on security and Identity Access Management, Billy offers valuable insights into the gaps in existing laws and actionable steps educators and policymakers can take to protect students in the AI era.

Below, Billy introduces the challenges of protecting K-12 (approximately 5-17 years old) students’ data, even with frameworks like FERPA and COPPA in place, and discusses how these lessons can guide global improvements in safeguarding children from AI-related risks.

Protecting K-12 Students in the AI Era: FERPA, COPPA, and the Missing Pieces

By Billy Carrie

If student data is at risk from the implementation of AI tools, then why don’t existing frameworks such as FERPA and COPPA protect students?


During my time as a technical support specialist in the State of Georgia Public School District, I observed how important FERPA and COPPA were in protecting student grades, photos, and other personally identifiable information. While these laws have been a staple of student safety and privacy, new technologies pose a threat to students’ data.


Which new technology in particular poses a threat to students? Artificial intelligence. AI tools used in school systems can be very beneficial. For example, chatbots automate daily K-12 processes such as parent communications, attendance tracking, and report generation, saving students, teachers, and staff time and effort.

That sounds amazing, right? So what is the downside? AI tools are only as good as the data they are given. For these tools to be beneficial, student, teacher, and parent data must all be entrusted to the AI company that owns the tool.


FERPA and COPPA in Action


With FERPA and COPPA in place, shouldn’t student data be safe even with the evolution of technology?


FERPA


The Family Educational Rights and Privacy Act (FERPA) safeguards the privacy of traditional student education records, such as grades and attendance. However, its protections do not extend to AI-generated data like predictive performance insights or behavioral analytics. For instance, if a school shares a student’s test scores with an educational app, parental consent is required, but the same does not apply to AI-derived analytics. This gap raises concerns about sensitive data being shared with third-party companies without parental knowledge or consent.


Without safeguards, such as strict data governance and transparent consent protocols, third parties can exploit this data. One significant risk is profiling, where algorithms label students as 'low-performing,' potentially limiting their access to advanced learning opportunities or scholarships. Addressing FERPA’s limitations is essential to protect students from these long-term consequences.


COPPA

The Children’s Online Privacy Protection Act (COPPA) is designed to protect the online privacy of children under 13. It requires websites, apps, and online services that collect data from children to obtain verifiable parental consent. However, COPPA’s protections are limited when it comes to sophisticated data collection methods used by AI-driven platforms in education. For example, while COPPA restricts the collection of basic personal information like names and addresses, it does not specifically address the use of AI to generate complex insights, such as behavioral analytics or learning patterns, from children's interactions with digital platforms.


This lack of clarity leaves significant gaps in protecting children’s data. Without explicit safeguards, companies can collect and analyze vast amounts of sensitive information without adequately informing parents. One potential risk is the use of predictive analytics to influence decisions about a child’s educational journey, such as tailoring content based on perceived skill levels. This could inadvertently lead to "algorithmic gatekeeping," where children are steered toward or away from certain educational opportunities based on AI-generated assumptions, potentially limiting their growth. Strengthening COPPA’s scope to cover AI-driven insights is critical to ensuring comprehensive protections for children in the digital age.


Actionable Checklist for K-12 Student Safety


What are some actionable and practical ways that teachers, administrators, and parents can protect students from the risks of AI tools?


Evaluate Tools

- Validate that AI-based apps and vendor platforms hold privacy certifications and adhere to FERPA/COPPA guidelines.
- Engage your IT Helpdesk and HR departments to verify that a third-party risk assessment has been completed before implementation.

Demand Transparency

- Ask vendors how data is collected, processed, and stored. Look for clear data usage terms.

Limit Permissions

- Work with your IT Helpdesk team to restrict data sharing to only the necessary parties (see the sketch after this checklist).

Monitor AI Data Collected

- Conduct quarterly audits of AI-based apps for fairness and performance accuracy. Partner with experts if needed.

Educate Teachers and Parents

- Offer workshops for teachers and parents on navigating the AI landscape safely.
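For the "Limit Permissions" step, field-level whitelisting is one concrete way an IT team can enforce least-privilege sharing in any script that sends student records to a vendor. Below is a minimal Python sketch; the field names and the vendor hand-off are hypothetical illustrations, not any real vendor's API.

```python
# Minimal data-minimization sketch: share only an approved subset of a
# student record with a third-party AI vendor. All field names and the
# vendor call are hypothetical examples.

APPROVED_FIELDS = {"student_id", "grade_level", "attendance_rate"}

def minimize_record(record: dict) -> dict:
    """Keep only the fields the vendor actually needs."""
    return {k: v for k, v in record.items() if k in APPROVED_FIELDS}

def share_with_vendor(record: dict) -> dict:
    """Send the minimized record and log which fields were withheld."""
    shared = minimize_record(record)
    withheld = sorted(set(record) - set(shared))
    print(f"Sharing {sorted(shared)}; withholding {withheld}")
    # vendor_api.upload(shared)  # hypothetical vendor call
    return shared

if __name__ == "__main__":
    student = {
        "student_id": "S-1024",          # pseudonymous ID, not a name
        "grade_level": 7,
        "attendance_rate": 0.96,
        "home_address": "123 Main St",   # never leaves the district
        "disciplinary_notes": "...",     # never leaves the district
    }
    share_with_vendor(student)
```

A whitelist like this also doubles as documentation for parents and auditors: anything not on the approved list simply cannot leave the district through this path.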

TLDR


AI tools in schools offer benefits like automating processes, but they also raise concerns about student data privacy, as FERPA and COPPA do not fully cover AI-generated data (e.g., behavioral analytics, predictive insights).


Key Actions to Protect Students:

Evaluate Tools: Ensure vendors follow privacy laws and complete risk assessments.

Demand Transparency: Ask how data is collected, used, and stored.

Limit Permissions: Restrict unnecessary data sharing.

Monitor AI: Audit AI tools regularly for fairness and accuracy.

Educate Stakeholders: Provide training for teachers and parents on AI safety.
