UK’s Generative AI: Product Safety Expectations

The UK’s Department for Education publishes its outcomes-oriented safety recommendations for GenAI products, addressed to edtech companies, schools and colleges.


St. Gallen, February 20, 2025 – On 22 January 2025, the UK’s Department for Education (DfE) published its Generative AI: Product Safety Expectations. The publication is part of the broader strategy to establish the country as a global leader in AI, as outlined in the Government’s AI Opportunities Action Plan.

As a leading edtech company with over 20 years of experience, Avallain was invited to participate in consultations on the Safety Expectations. Avallain Intelligence’s focus on clear ethical guidelines for safe AI development, demonstrated through TeacherMatic and other AI-driven solutions across our product portfolio, positioned us well to contribute expert advice to these consultations.

Product Expectations for the EdTech Industry

The Generative AI: Product Safety Expectations define the ‘capabilities and features that GenAI products and systems should meet to be considered safe for use in educational settings.’ The guidelines, aimed primarily at edtech developers, suppliers, schools and colleges, come at a crucial time. Educational institutions need clear frameworks to assess the trustworthiness of the AI tools they are adopting. The independent report commissioned by Avallain, Teaching with GenAI: Insights on Productivity, Creativity, Quality and Safety, provides valuable insights to help inform these decisions and guide best practices.

Legal Alignment, Accountability and Practical Implementation

The guidelines are specifically intended for edtech companies operating in England. While not legally binding, the text links the product expectations to existing UK laws and policies, such as the UK GDPR, Online Safety Act and Keeping Children Safe in Education, among others. This alignment helps suppliers, developers and educators navigate the complex legal landscape. 

From an accountability point of view, the DfE states that ‘some expectations will need to be met further up the supply chain, but responsibility for assuring this will lie with the systems and tools working directly with schools and colleges.’ Furthermore, the guidelines emphasise that the expectations are focused on outcomes, rather than prescribing specific approaches or solutions that companies should implement.

Comparing Frameworks and an Overview of Key Categories

In line with other frameworks for safe AI, such as the EU’s Ethics Guidelines for Trustworthy AI, the Generative AI: Product Safety Expectations are designed to be applied by developers and considered by educators. However, unlike the EU’s guidelines, which are field-agnostic and principles-based, the DfE’s text is education-centred and structured around precise safety outcomes. This makes it more concrete and focused, though it is less holistic than the EU framework, leaving critical areas such as societal and environmental well-being out of its scope.

The guidance includes a comprehensive list of expectations organised under seven categories, summarised in the table below. The first two categories (Filtering, and Monitoring and Reporting) are specifically relevant to child-facing products and stand out as the most distinctive parts of the document, as they tackle particular risk situations that are not yet widely covered.

The remaining categories (Security, Privacy and Data Protection, Intellectual Property, Design and Testing, and Governance) apply to both child- and teacher-facing products. They are equally critical, as they address these more common concerns while considering the specific educational context in which they are implemented.

Collaboration and Future Implications

By setting clear safety expectations for GenAI products in educational settings, the DfE provides valuable guidance to help edtech companies and educational institutions collaborate more effectively during this period of change. As safe GenAI measures become market standards, it is important to point out that the educational community also needs frameworks that explore how this technology can foster meaningful content and practices across a diverse range of educational contexts.


Generative AI: Product Safety Expectations — Summary

  • Filtering
    1. Users are effectively and reliably prevented from generating or accessing harmful and inappropriate content.
    2. Filtering standards are maintained effectively throughout the duration of a conversation or interaction with a user.
    3. Filtering will be adjusted based on different levels of risk, age, appropriateness and the user’s needs (e.g., users with SEND).
    4. Multimodal content is effectively moderated, including detecting and filtering prohibited content across multiple languages, images, common misspellings and abbreviations.
    5. Full content moderation capabilities are maintained regardless of the device used, including BYOD and smartphones when accessing products via an educational institutional account.
    6. Content is moderated based on an appropriate contextual understanding of the conversation, ensuring that generated content is sensitive to the context.
    7. Filtering should be updated in response to new or emerging types of harmful content.
  • Monitoring and Reporting
    1. Identify and alert local supervisors to harmful or inappropriate content being searched for or accessed.
    2. Alert and signpost the user to appropriate guidance and support resources when access to prohibited content is attempted (or succeeds).
    3. Generate a real-time user notification in age-appropriate language when harmful or inappropriate content has been blocked, explaining why this has happened.
    4. Identify and alert local supervisors of potential safeguarding disclosures made by users.
    5. Generate reports and trends on access and attempted access of prohibited content, in a format that non-expert staff can understand and which does not place an undue burden on local supervisors.
  • Security
    1. Offer robust protection against ‘jailbreaking’ by users trying to access prohibited material.
    2. Offer robust measures to prevent unauthorised modifications to the product that could reprogram the product’s functionalities.
    3. Allow administrators to set different permission levels for different users.
    4. Ensure regular bug fixes and updates are promptly implemented.
    5. Sufficiently test new versions or models of the product to ensure safety compliance before release.
    6. Have robust password protection or authentication methods.
    7. Be compatible with the Cyber Security Standards for Schools and Colleges.
  • Privacy and Data Protection
    1. Provide a clear and comprehensive privacy notice, presented at regular intervals in age-appropriate formats and language, with information on:
      • The type of data: why and how this is collected, processed, stored and shared by the generative AI system.
      • Where data will be processed, and whether there are appropriate safeguards in place if this is outside the UK or EU.
      • The relevant legislative framework that authorises the collection and use of data.
    2. Conduct a Data Protection Impact Assessment (DPIA) during the generative AI tool’s development and throughout its life cycle.
    3. Allow all parties to fulfil their data controller and processor responsibilities proportionate to the volume, variety and usage of the data they process and without overburdening others.
    4. Comply with all relevant data protection legislation and ICO codes and standards, including the ICO’s age-appropriate design code if they process personal data.
    5. Not collect, store, share, or use personal data for any commercial purposes, including further model training and fine-tuning, without confirmation of appropriate lawful basis.
  • Intellectual Property
    1. Unless there is permission from the copyright owner, inputs and outputs should not be:
      • Collected
      • Stored
      • Shared for any commercial purposes, including (but not limited to) further model training (including fine-tuning), product improvement and product development.
    2. In the case of children under the age of 18, it is best practice to obtain permission from the parent or guardian. In the case of teachers, this is likely to be their employer—assuming they created the work in the course of their employment.
  • Design and Testing
    1. Sufficient testing with a diverse and realistic range of potential users and use cases is completed.
    2. Sufficient testing of new versions or models of the product to ensure safety compliance before release is completed.
    3. The product should consistently perform as intended.
  • Governance
    1. A clear risk assessment will be conducted for the product to assure safety for educational use.
    2. A formal complaints mechanism will be in place, addressing how safety issues with the software can be escalated and resolved in a timely fashion.
    3. Policies and processes governing AI safety decisions are made available.

About Avallain

At Avallain, we are on a mission to reshape the future of education through technology. We create customisable digital education solutions that empower educators and engage learners around the world. With a focus on accessibility and user-centred design, powered by AI and cutting-edge technology, we strive to make education engaging, effective and inclusive.

Find out more at avallain.com

About TeacherMatic

TeacherMatic, a part of the Avallain Group since 2024, is a ready-to-go AI toolkit for teachers that saves hours of lesson preparation by using scores of AI generators to create flexible lesson plans, worksheets, quizzes and more.

Find out more at teachermatic.com

Contact:

Daniel Seuling

VP Client Relations & Marketing

dseuling@avallain.com

Avallain Reinforces its Commitment to Research-Driven Solutions with a Newly Commissioned GenAI Report

How is GenAI being integrated into schools to enhance teaching and learning? ‘Teaching with GenAI: Insights on Productivity, Creativity, Quality and Safety’ delves into this critical question by exploring the opportunities GenAI offers, the challenges it poses and how it’s shaping the future of education.


St. Gallen, January 30, 2025 – Education technology pioneer Avallain introduces Teaching with GenAI: Insights on Productivity, Creativity, Quality and Safety. This independent report, commissioned by the Avallain Group and produced by Oriel Square Ltd, is a key piece of research that provides valuable insights for educators and policymakers alike.

This timely and comprehensive report explores how generative AI is being integrated into schools to enhance teaching and learning outcomes and the critical opportunities and challenges it presents.

Professor Rose Luckin, of University College London and Founder of Educate Ventures Research, says the report is ‘an essential read for any education leader navigating the AI landscape.’

Navigating the Opportunities, Challenges and Risks of GenAI

The ‘Teaching with GenAI: Insights on Productivity, Creativity, Quality and Safety’ report provides detailed insights into how GenAI saves time and boosts efficiency, allowing educators to streamline workflows and dedicate more time to impactful teaching. It delves into the tools and training needed to create meaningful learning materials, providing practical advice for designing engaging and effective content. The report examines how GenAI fosters creativity and innovation in teaching practices, encouraging educators to reimagine their instructional approaches.

Beyond this, the report also stresses the importance of quality control in GenAI applications, identifying areas where oversight is essential to ensure high standards in AI-generated content. Critical advice is offered around data security and tackling inbuilt bias, helping educators and institutions confidently address these key concerns. More importantly, the report provides actionable recommendations on how schools and organisations can effectively integrate and apply GenAI to maximise its potential while ensuring ethical and responsible use.

As Professor John Traxler, Academic Director of the Avallain Lab, explains, ‘While schools and educators acknowledge the potential of GenAI tools to assist in key pedagogical tasks, they also express concerns about content accuracy, the risk of perpetuating biases and the impact of these tools on their evolving role in the classroom. This underscores the need to provide educators with GenAI solutions tailored to educational contexts and the critical analysis skills required to engage with these technologies safely and effectively.’

A Commitment to Research-Driven Solutions

The rapid rise of GenAI has introduced both unprecedented possibilities and complex challenges in the educational landscape. With a long history of developing educator-led technology, Avallain has always believed that research-driven approaches are essential to ensuring technology supports learning outcomes.

‘This report reflects our commitment to research-driven solutions that empower educators. By exploring the benefits, potential and challenges of GenAI through the experiences of teachers and specialists, we aim to provide valuable insights and actionable recommendations to the educational community. Together, we are navigating this transformative field to deliver technology that ethically and safely supports teachers and students,’ highlights Ignatz Heinz, President and Co-Founder of Avallain.

Over 50% of teachers in England use AI tools to reduce workload, and 40% use them to personalise learning content.

Avallain’s Approach to Ethical and Safe GenAI Integration

As GenAI enters classrooms, Avallain is doubling down on this commitment with this report and its broader AI strategy, Avallain Intelligence, which aims to responsibly integrate AI across the entire edtech value chain. This initiative is built on the principle that ethical AI is essential—not only for achieving better outcomes, enhanced productivity and safe, innovative learner interactions but, more importantly, as a foundation for the reliable adoption of these tools in our societies, particularly in our educational systems.

Carles Vidal, Avallain Lab Business Director, explains further, ‘Avallain’s unwavering commitment to Ethical AI is reflected in a range of AI solutions designed in alignment with the Ethical Key Requirements, outlined in the EU’s Ethics Guidelines for Trustworthy AI. These guidelines uphold the principles of respect for human autonomy, prevention of harm, fairness and explicability.’

The newly commissioned report aligns with this AI strategy by exploring critical considerations for ethical, safe and effective implementation. It provides actionable recommendations that help schools and educators adopt these technologies confidently while ensuring responsible use.

Leveraging Insights to Drive GenAI in Education

Avallain strives to remain at the forefront of educational innovation, actively monitoring and analysing the difficulties educators face as they integrate generative AI into their teaching practices. With a particular focus on ethics and pedagogy, these insights shape the ongoing development of the next generation of GenAI features across Avallain’s solutions. Explore the full report and gain a deeper understanding of how GenAI can enhance teaching and learning.

Download your free copy here.

Register Now for Upcoming Live Report Briefings

As part of our commitment to supporting educators and institutions, look out for upcoming report briefings to explore key insights from the report, including practical and ethical steps for integrating GenAI effectively. This is an opportunity to engage in discussions about the future of AI in education.

Secure my place

About Avallain

At Avallain, we are on a mission to reshape the future of education through technology. We create customisable digital education solutions that empower educators and engage learners around the world. With a focus on accessibility and user-centred design, powered by AI and cutting-edge technology, we strive to make education engaging, effective and inclusive.

Find out more at avallain.com

About TeacherMatic

TeacherMatic, a part of the Avallain Group since 2024, is a ready-to-go AI toolkit for teachers that saves hours of lesson preparation by using scores of AI generators to create flexible lesson plans, worksheets, quizzes and more.

Find out more at teachermatic.com

Contact:

Daniel Seuling

VP Client Relations & Marketing

dseuling@avallain.com

International House World Organisation Partners with Avallain to Empower the Next Generation of Expert Educators

The partnership confirms Avallain’s traction in the international, accredited ELT market via its customisable digital education solutions that enable educators to create, deliver and manage innovative learning experiences.


Lustmühle, September 12, 2024 – Digital education supplier Avallain today announces that it has signed International House World Organisation (IH World) as a client. IH World can now leverage Avallain Magnet, the powerful and customisable learning management system, in combination with Avallain Author’s AI-powered content creation and management features.

IH World will use these solutions to develop, enhance, deliver and analyse world-class digital products, courses, and programmes for teacher training purposes. Avallain Magnet and Avallain Author, both leveraging AI-integrated technology, will equip the new generation of expert educators and learners with the skills they need to succeed. 

IH World is one of the world’s most respected language teaching organisations, with more than 130 language schools operating in over 40 countries. The addition of IH World to Avallain’s client base is a seal of trust from another well-respected education organisation. The partnership sets a new standard for essential digital teacher training in the wider ELT market.

“We’re proud to now be working closely alongside International House World Organisation, integrating our cutting-edge, AI-backed suite of content creation and management tools, and our end-to-end learning management system for the benefit of excellent educators”, says Ian Johnstone, Vice President Partnerships at Avallain. “I am confident that we will see great outcomes from this collaboration,” he adds.

Avallain Magnet is an end-to-end LMS, featuring a fully responsive and customisable interface. Its numerous LMS functionalities range from user and subscription management and virtual classrooms to built-in messaging, grade books, reporting and analytics. The out-of-the-box LMS is seamlessly integrated with Avallain Author, a flexible AI-powered content creation and management tool, with editorial workflows and interactive activity types honed for publishers, institutions, and schools. These capabilities contribute to a more accessible and impactful ELT space, leading to unmatched teacher training outcomes.

Avallain consistently releases new product features, ensuring its clients remain at the forefront of innovation in digital education. A prime example is the re-envisioned Discussion Forums feature, which addresses the specific needs of IH World and goes beyond the isolated boards of the past. The feature enables teachers and students to engage in discussions on both new and familiar topics, incorporate links to relevant courses and facilitate activities related to the conversations, keeping discussions informed by content, direction and purpose. An AI-powered conversation aide is also in the works.

Avallain is committed to exceeding client expectations. Its Customer Success and Product teams work in tandem to always provide a comprehensive and consistent client experience. 

“We’ve been impressed with Avallain’s personal support every step of the way”, Emma Hoyle, Managing Director at IH World, notes. “Avallain Author and Avallain Magnet accelerate our creative processes and authoring of learning content. The learning management system streamlines learning, delivering quality content and experiences for teachers. We’re excited to see how we will continue to collaborate with Avallain in the future.”

Avallain will now continue to work with IH World to develop innovative solutions that enable teachers, learners, and IH affiliates to grow from the partnership.

Learn more about Avallain Magnet here.

About Avallain

Avallain is on a mission to reshape the future of education through technology. The company creates customisable digital education solutions that empower educators and engage learners around the world. With a focus on accessibility and user-centred design, powered by AI and cutting-edge technology, Avallain strives to make education engaging, effective and inclusive. Find out more at www.avallain.com

Contact:

Daniel Seuling

VP Client Relations & Marketing

dseuling@avallain.com