As artificial intelligence becomes a bigger part of students’ learning environments, both in visible ways and behind the scenes, families are asking important questions about what it means for learning, child development, and student safety.
To help answer those questions, Texas PTA’s Straight Talk ’26 panel brought together experts in public education, technology, and mental health to share practical insights about how AI is being used in schools, how it works, and where support and safeguards matter most.
How does AI show up in classrooms at the elementary, middle, and high school levels?
Kyle Berger, chief technology officer for Arlington ISD, said many families are feeling uncertainty about AI and that schools are responding in different ways. He described the introduction of AI tools in classrooms as a gradual, age-appropriate approach that helps students build familiarity with the technology and use it more responsibly over time.
“I think it is pretty obvious to everybody there’s a lot of AI anxiety,” he said. “It’s showing up differently across schools because local responses vary. Some schools are saying they’ll block it all. Some schools are saying, ‘let’s be really careful and thoughtful.’ Some schools are saying, ‘let’s go all in.’”

According to Berger, AI use also looks different by grade level.
“In elementary school, primarily what you’ll find is kind of behind the scenes. It’s just kind of invisible to the learner.”
At that level, AI may be built into tools that help teachers adjust instruction based on student needs. Berger said it can also help teachers save time and focus more on the human side of teaching.
“It’s really that tool that empowers [teachers to save time] and give that human teaching aspect back to our students,” he said. “And that’s very important at the elementary level.”
In middle school, Berger said, “this is kind of where it gets tricky,” as students begin interacting with AI more directly and seeing it more clearly in the tools they use.
“Yes, [some students are] attempting to get it to write papers and do some different assignments as well. But teachers are also leveraging it to adjust the [classroom] environment to better support learning.”
By high school, Berger said students are “really digging into deeper research, building out more content and discovery, coding help, and exploring ideas.”
Across all grade levels, that gradual shift prepares students for what comes next by helping them build confidence and responsibility as AI becomes more common in school and beyond.
What do parents most often misunderstand about AI?
Clay Smith, a solutions engineer at Google, said one of the biggest misconceptions is that AI tools know what is true.
“This AI, the generative AI that we experience, it’s math. It’s just math,” Smith said. “One thing I think that people misunderstand is that they think these tools know truth. They don’t know truth. They are predicting based on patterns and context. And so when we use them, we have to make sure that we’re grounding what we’re asking for and what we’re receiving.”
He also emphasized the importance of verification and critical thinking.
“Without that grounding, we have something called a hallucination. That’s where it makes things up. And so we want to make sure we are double-checking,” Smith said. “We want students to understand that the tool is there to support thinking, not replace thinking.”
How can AI affect children’s well-being and relationships?
Roy Rios, a licensed clinical therapist and director of prevention at the Texas Council on Family Violence, said AI can be useful, but children need to learn how to properly interact with it.
“It can also create confusion if there’s not enough guidance,” Rios said.
He warned that tools designed to feel conversational or relational can affect how children understand connection.
“There can be a tendency to blur the line between real human connection and simulated connection,” he said. Unlike humans, he noted, AI bots are available 24/7 and are often built to affirm and agree with young people.

“What we see is that if young people are turning to these types of tools regularly, it could skew the development around emotional and social skills.”
Rios said that does not mean the technology itself is bad, but it does mean adults need to stay involved. Because AI can shape perceptions and decision-making, young people may not yet be equipped to recognize that the information it provides needs to be validated.
“Deepfakes and other AI-generated content also create some risk for young people who don’t know how to use it in healthy ways,” Rios said.
He also offered an important reminder for families: “Overall, what we see is that when greater awareness and conversations are being had around these types of tools, protective factors are greatly increased.”
What should families know about privacy and guardrails?
Berger said schools can build protections into the AI tools they choose, but those protections only go so far.
“There is a big difference between AI in a school setting and … consumer-based AI,” he said.
In schools, many tools operate within what Berger described as a “walled garden,” or closed environment, where districts can better manage access and protections. Outside of school, however, families are often dealing with tools that collect or use information in ways schools cannot control. Awareness matters just as much as school policy.
“It’s all of our jobs to be talking with our students, with each other about AI because we are the best firewall of prevention. It isn’t software. It’s your human mind and how you’re evaluating using the system.”
He encouraged parents to ask practical questions about the tools their children are using.
“What’s the intent? And how is my child expected to use that?” Berger said. “So we can kind of see those bounds and then support our children at home, too.”
The panel also made clear that some of the most important guardrails happen at home. Charley Ayres, director of university relations at Texas A&M University-Central Texas, reminded parents that children are watching how adults use screens and technology.
“Look in the mirror,” Ayres said. “There is such a thing as modeling. They’re being taught at home how to handle the technology. So that’s a critical point.”
As AI becomes more present in school and everyday life, children need guidance in how to question, verify, and use these tools responsibly. AI literacy does not happen overnight. What matters most is helping children build strong judgment, balance, and confidence with any technology they encounter.
Watch the Straight Talk ’26 discussion to learn more about how AI is shaping learning for children and families.