Children Getting Affected By AI – What AI Means For Kids


With electronic devices within easy reach, children are exposed to AI constantly, often without realizing it. What is needed is a new curriculum that familiarizes them with AI and builds awareness of how it works, so that they can play an informed role in using and shaping future technology.

A 10-year-old student defines artificial intelligence this way: “It’s kind of like a baby or a human brain because it has to learn, and it stores […] and uses that information to figure out things.” Many adults would struggle to phrase a fitting definition of such a complicated field. The student managed it neatly, despite being just 10 years old.

Artificial intelligence surrounds the children of our era. A better understanding of algorithms would help them grasp what information they see and how those systems influence society. With that grounding, children could become more thoughtful users of AI, motivated to shape a better future.

Payne mentions, “It’s essential for them to understand how these technologies work so they can best navigate and consume them.” He continues, “We want them to feel empowered.”

Kids Would Be The Focal Point

There are several reasons to focus AI education on children.

  • The first argument is technical: several studies have shown that exposing children to computational concepts builds better problem-solving and critical-thinking skills, and young minds pick up computational skills faster than they would later in life.
  • Another argument is that the middle-school years are critical for a child’s cognitive and analytical development. Jennifer Jipson, a professor of psychology and child development, notes that teaching students of this age about technology would likely motivate them to pursue it as a career, supplying the AI industry with more diverse talent. Learning the technology early would also expose children to its ethics and social impacts, so that they grow into conscientious developers and better-informed citizens.
  • The third argument comes from Rose Luckin, a professor of learner-centered design at University College London: young people are especially vulnerable to the ethical risks of systems that track their behavior, and may become addicted to technology. Children are at risk of becoming passive consumers, which could harm their agency, privacy, and long-term development.

Payne, the developer of the curriculum, says, “Ten to 12 years old is the average age when a child receives his or her first cell phone, or his or her first social-media account. We want to have them really understand that technology has opinions and has goals that might not necessarily align with their own before they become even bigger consumers of technology.”

Opinion Based Algorithms

Payne’s curriculum covers a series of exercises that get students thinking about what algorithms really are. Students start by learning to see algorithms as recipes, with inputs, a set of instructions, and outputs. Then the kids are asked to “build,” or write down the instructions for, an algorithm that produces a desired output.
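The recipe framing can be sketched in a few lines of code. This is a toy illustration, not part of the actual curriculum; the function and ingredient names are made up. Note how an opinion sneaks into what looks like a neutral set of instructions:

```python
# A toy "sandwich algorithm": inputs, instructions, output.

def make_sandwich(ingredients):
    """Input: a list of available ingredients.
    Instructions: stack bread, a spread, and more bread.
    Output: a description of the sandwich."""
    layers = ["bread"]
    # An opinion disguised as an instruction: the author
    # prefers peanut butter, so jelly is only a fallback.
    if "peanut butter" in ingredients:
        layers.append("peanut butter")
    elif "jelly" in ingredients:
        layers.append("jelly")
    layers.append("bread")
    return " + ".join(layers)

print(make_sandwich(["bread", "jelly", "peanut butter"]))
# jelly is available, but the algorithm's built-in preference wins
```

The point of the exercise is exactly this: the `if`/`elif` ordering is a preference the author baked in, not a fact about sandwiches.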

The kids in the summer pilot immediately started to grasp the lesson. “A student pulled me aside and asked, ‘Is this supposed to be opinion or fact?’” she says. That question kicked off a process of discovery in which the students observed how they had unintentionally written their own preferences into their algorithms.

The activity then introduces what Payne calls an “ethical matrix,” which students use to evaluate how different stakeholders and their values can affect the design of a sandwich algorithm. During the same pilot, Payne connected the exercises to current events. Together, the students read an abridged Wall Street Journal article [paywall] on how YouTube executives were considering an exclusive kids-version of the app with a remodeled recommendation algorithm. The students saw for themselves how parameters like investor demands, parental pressure, or children’s preferences could push a company to abandon a conventional algorithm and opt for a customized redesign.
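An ethical matrix is simple enough to mock up in code. The sketch below is purely illustrative, with assumed stakeholders and values rather than the ones Payne's students used; each cell marks whether a stakeholder cares about a value, which surfaces the competing interests a redesign must balance:

```python
# A minimal "ethical matrix": rows are stakeholders, columns are
# values; a True cell means that stakeholder cares about that value.
# Stakeholders and values here are illustrative assumptions.

matrix = {
    "children":  {"fun": True,  "safety": False, "revenue": False},
    "parents":   {"fun": False, "safety": True,  "revenue": False},
    "investors": {"fun": False, "safety": False, "revenue": True},
}

def values_in_play(matrix):
    """Collect every value that matters to at least one stakeholder.
    If the list has more than one entry, the design involves trade-offs."""
    return sorted({value
                   for cares in matrix.values()
                   for value, flag in cares.items() if flag})

print(values_in_play(matrix))
```

Filling in such a grid makes it hard to pretend an algorithm serves everyone equally; any design choice visibly favors some rows over others.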

Another set of activities introduces students to the concept of AI bias. Using Google’s Teachable Machine tool, a code-free interactive platform for training basic machine-learning models, the students built a cat-dog classifier. But, unbeknownst to them, they were given a biased data set to train on. Through experimentation and discussion, the students discovered how the data set led the classifier to be more accurate for cats. They were then given an opportunity to correct the problem.

Payne again tied the exercise to a real-world example during the pilot by showing the children footage of a fellow Media Lab researcher, Joy Buolamwini, testifying to Congress about biases in face-recognition systems. “They were able to see how the kind of thought process they had gone through could change the way these systems are built in the world,” says Payne.

The Future Of Education

Payne plans to keep piloting programs like this, gathering public feedback, and exploring new venues to expand their reach. Her goal is to release a public version of the curriculum to make it easily accessible to everyone.

Beyond that, she believes it offers an exemplary way for children to learn about technology, society, and ethics. Luckin and Jipson agree that it provides a powerful template, and a useful intellectual exercise, for how education could evolve to meet the demands of an emerging technology-driven era.

“AI, as we see it in society right now, is not a great equalizer,” says Payne. “Education is, or at least we hope it to be. So this is a foundational step toward a fairer and more just society.”
