More than robots

Artificial intelligence, machine learning and algorithmic decision-making are rapidly changing the web, our culture – and how we learn. Technology has a long, and often divisive, history in education. It can open new opportunities, but it can also create new challenges – either way, it is children who are impacted the most.

AI in education is predicted to be worth over US$20 billion by 2027. The promise of AIEd[1] is often built around its capacity to personalise content and provide instant feedback. Intelligent Tutoring Systems can – so the theory goes – collate and analyse complex data sets to understand an individual child's needs, tap into the world's greatest knowledge and deliver a bespoke course of study.

While this techno-transactional framing of learning has many flaws, one can see the appeal of the perfect, always-available personal tutor – particularly in a pandemic. However, as with so much about technology, the reality is usually more complex and mundane – less sci-fi slaughter-bots and more human rights and literacy – but the consequences are often no less profound.

During the lockdowns necessitated by the Covid-19 pandemic, pupils in England were unable to sit traditional exams. Instead, they received grades that had been determined by an algorithm. Almost 40% of pupils received a lower grade than they anticipated. This led to a backlash from pupils, parents, and teachers that forced the government to retract the grades and issue teacher-led grades instead. The incident affected so many pupils, and was so public in nature, that – perhaps for the first time – it shone a light onto the reality of machine-led decisions in education and provided a glimpse of the more likely future of AI for most families.

18% of English secondary schools already use AI in teaching. But nearly all schools at primary and secondary level use technology to assess pupils and manage data about them. These 'educator-facing' and system-level technologies may not widely use machine learning yet, but the market is steadily maturing, and their influence can be significant. For example, the UK schools inspectorate, Ofsted[2], has already used machine learning to identify schools requiring inspection.

As more data becomes available, as technologies become more sophisticated and as teacher workloads continue to increase, the use of machines to identify patterns and tailor responses will become more widespread. Technological tools could help identify pupils who are in need and ensure they get the right support at the right time to improve their outcomes - but the same technology could also create greater inequality.

61% of parents anticipate that AI will be important to the classroom of the near future. However, many parents (77%) are also concerned about the consequences of the decisions being made, and about how accountable and secure these systems are. The potential complexity of the algorithms, and the multitude of data used to drive them, can make understanding how machine-led decisions are made extremely difficult. The way systems are designed also often embeds and reinforces existing power dynamics that may disadvantage those who traditionally have less power. As Dr Daan Kolkman notes, opaque AI systems can mean that "Authoritative models may come to dictate what types of policies are considered feasible".

A lot has already been written about how AI systems can 'automate inequality' by amplifying existing biases. For AI in education, the impacts of those biased, 'authoritative models' and the machine-led decisions that result are experienced by children – children who often have the least critical understanding, the least say and the least influence.

If we acknowledge that data bias and poor design can lead to potentially harmful AI and that children are most likely to be affected, then the question arises: how can children’s rights, views, and needs be integrated into AI data, design, and accountability? How can we ensure equity for children in AIEd?

Regulation can play an important role. The EU General Data Protection Regulation and its UK equivalent include provisions on automated decision-making. The UN Convention on the Rights of the Child's General Comment 25 helps bring artificial intelligence and automated decision-making into scope, and UNICEF has developed guidance for the design of rights-respecting AI. All of these help to bring attention to the issue and provide avenues for recourse. But how much do children and their families really know about the systems being used right now, what the potential impacts may be now and in the future, and what rights they may have over these systems? And how feasible is it to expect families to drive these issues and demand change?

As the UK grading incident shows, "critical audiences" can form when there is a significant and high-profile event – but what about the everyday, the mundane?

AI systems can create opportunities for increasing education equity. But, if we are careless or unprepared, then we will create conditions that may reduce people to automatons, and we will miss the chance to create conditions in which everyone can thrive.

If, on the other hand, we create AI systems that are wise enough to respect children's rights, if we ensure the data we use is not biased against children, and if we enable young people to participate in the design and governance of these systems, then AI offers many positive opportunities and a future that is much more than robots.

[1] AIEd is a shorthand for Artificial Intelligence in Education; it is not precisely defined and is used to cover a range of related technologies and research.

[2] Office for Standards in Education, Children's Services and Skills.

Cliff Manning / Twitter: @cliffmanning

Cliff is Research and Development Director at Parent Zone and founded More Than Robots - a platform to share research, ideas and good practice around digital participation and youth engagement.

Parent Zone sits at the heart of digital family life – supporting parents with the challenges and opportunities of raising children in a digital age. As a social enterprise our mission is to improve outcomes for children in a connected world. Find out more about how we do this at
