Education Secretary Addresses Global AI Safety Summit


In her speech, the Education Secretary announced stronger safety measures to be introduced around AI in education.

Hello everyone and thank you so much for coming.

And thank you to those of you who have worked so hard to make today happen.

I am delighted to welcome our friends from around the world here to London.

And we meet today on Chancery Lane, an historic street that goes back nine centuries.

It was originally built by the Knights Templar to connect their old temple in Holborn to their new one just south of here.

A street with a rich history, built to connect the old to the new…

A perfect place for this summit, because that’s what I want to talk about today.

How we take what’s always been best about education, going back centuries – great teachers, collaboration, curiosity,

and combine it with this exciting new wave of AI and edtech.

Connecting the old to the new.

The past to the future.

And what a bright future it can be.

AI can deliver a whole new era.

It can be the biggest boost for education in the last 500 years.

The most radical force for progress since the invention of the printing press in the 15th Century.

That invention changed everything.

The wealth of knowledge that had been the privilege of the elite quickly became accessible to all.

And the world saw an explosion in literacy.

The rapid spread of ideas.

A new wave of education.

And that’s the level of transformation we’re talking about here.

The potential is huge, but so is our responsibility.

And this government takes that responsibility incredibly seriously.

It’s a responsibility to get this right.

To work with you and with teachers, leaders and parents to harness the power of AI in ways that are both safe and effective.

But we are ambitious for what AI can mean for our children’s education.

I will join forces with my colleague the Secretary of State for Science, Innovation and Technology, combining the innovation of the Prime Minister’s AI incubator team and the new digital centre of government with my department’s expertise in education.

We see the opportunities of AI to superpower the learning of every child – especially children from disadvantaged backgrounds and with special educational needs and disabilities.

With the help of AI, teachers can tailor every lesson they deliver to respond to the needs of every child.

To better support the needs of every child.

And that excites me like nothing else.

The idea that finally, for the first time in history, we can maximise learning for all children.

We can make education work for every child.

That’s what drives me.

What sits at the heart of my vision for our children.

But be in no doubt – there will still be a teacher at the front of every classroom.

A living, breathing human – a beating heart at the centre of the learning process.

The strongest connection between past and future.

Because no matter how transformational technology becomes, learning will remain a deeply human act.

AI changes everything, but not this.

That spark of connection, the bond between one generation and the next, that must always come from one human to another.

And the true power of AI is to support that special connection.

To improve it, to strengthen it, but never to replace it.

That’s what students in our schools tell us.

We recently spoke to more than a thousand young people across the country to put their views at the centre of how we use generative AI in the classroom.

The full findings will be presented in the next session, so I won’t give it all away now.

But the message was clear: students strongly value personal attention from their teachers and social interaction with their friends.

They want AI to support those relationships, to help teachers provide personalised feedback.

But they don’t want AI to eliminate the human connection.

And they’re absolutely spot on.

So under this government, AI will back our teachers, but never remove them.

AI will empower our teaching assistants, never make them obsolete.

I want that simple truth to be our guide today.

We’ll cover educational quality, agency, equity, and how we check our progress and make improvements to better serve learners.

That’s all vital, because this part of London also comes with a warning from one of my favourite British authors of all time.

Charles Dickens.

Chancery Lane and the Court of Chancery provided the inspiration for his novel Bleak House.

Dickens described a legal system that had become bogged down in its own complexity.

He criticised a system that had lost sight of its purpose, that was harming the people it was meant to serve.

And there’s a lesson for us when it comes to AI in education.

Not to get lost in the fog of complexity.

But to keep sight of our true purpose – to lift the learning of children, young people and adults across the world.

To support teachers to teach and learners to learn. To protect and promote that precious covenant between generations.

That’s it. That must be our focus.

Not chasing profit or rolling out flashy tools for the sake of it.

The purpose of AI in education is to expand opportunity for every child and young person – to help them achieve and thrive.

But, first and foremost, AI in education must be safe.

I know there are debates going on in all our countries about the role of technology in childhood,

but here and now I want to focus on how we make sure the EdTech used by our teachers and students is safe.

And, colleagues, you know how much that matters.

Because it isn’t helpful unless it’s safe. It isn’t effective unless it’s safe.

The public are watching us, and their first demand is that we make AI safe in our schools.

That’s why last year my department worked with top tech firms like Google and Microsoft to get on the front foot, with a framework of safety expectations in place for schools.

It’s why my department has developed resources and guidance to help teachers use AI safely and effectively.

And now, this government is going further on safety.

I’m pleased to announce today that we’re updating our safety standards for schools,

building on our robust set of forward-leaning measures to get ahead of emerging harms.

Our updated safety standards for schools safeguard mental health.

High profile cases have alerted the world to the risk of a link between unregulated conversational AI and self-harm.

So our standards make sure pupils are directed to human support when that’s what’s needed.

They also prevent AI acting as a substitute for cognitive development.

We know that learning happens when pupils make a genuine effort to answer a question before receiving help.

AI must support that effort, not replace it.

It must encourage, not spoon feed.

Offer assistance, not shortcuts.

Help to tease out the answer.

The standards also manage risks to emotional and social development, especially for young pupils and children with special educational needs and disabilities.

We’ve got to make sure AI products don’t replace vital human interactions and relationships.

Experts tell us and research confirms that when AI tries to look like us, mimicking our social cues, a machine in human’s clothing, it can foster in our children unhealthy levels of trust and disclosure.

This shadow of human experience does their development no good.

Finally, the updated safety standards restrict persuasive or exploitative design too.

We don’t want our children kept on apps or on screens longer than necessary for their education.

We’re going to do this right.

And I know that the edtech industry – both in the UK and across the world – is a powerhouse of creativity and innovation.

So I have no doubt that companies have the drive and expertise to

https://www.gov.uk/government/speeches/education-secretary-speech-at-global-ai-safety-summit
