The technology and digital regulatory environment in the EU and the UK is evolving rapidly in 2025 and beyond. These legal developments present both substantial opportunities and complex compliance challenges, requiring businesses to maintain a comprehensive view across the tech regulation landscape.
This article highlights the key milestones for the next 12 months, focusing on four main areas: online safety, AI, data governance, and cybersecurity and resilience.
ONLINE SAFETY ACT: ESTABLISHING NEW DUTIES FOR ONLINE SAFETY
The UK Online Safety Act (OSA) establishes an extensive regulatory framework designed to protect children and adults online by phasing in duties of care on covered service providers,1 including duties to prevent the spread of online content and activity that is illegal or harmful to children. In April 2025, the Office of Communications (Ofcom) published its Children’s Risk Assessment Guidance and its Protection of Children Codes of Practice (Codes). The publication of the Codes introduced summer deadlines for providers of services likely to be accessed by children, which must complete children’s risk assessments and implement measures to protect children from online harms before 25 July 2025.
The children’s safety duties follow on from March and April 2025 deadlines for all service providers to complete illegal content risk assessments and children’s access assessments, and comply with duties of care related to illegal content. For more information, see UK Online Safety Act — Summer 2025 Deadlines, UK Online Safety Act — Spring 2025 Deadlines, and UK Online Safety Act 2023.
Children’s safety online continues to be a regulatory priority in the UK, and Ofcom is expected to maintain its OSA enforcement focus by actively policing the children’s safety duties once they take effect on 25 July 2025. When conducting OSA risk assessments and implementing mitigation measures (particularly content moderation), providers should consider similar risk assessment and moderation requirements they may be subject to under the EU Digital Services Act, EU Digital Copyright Directive, EU AI Act, and the EU/UK GDPR. Whilst these requirements vary in scope and approach, identifying core risks that can be assessed and mitigated consistently across a provider’s services and technologies may facilitate a robust, streamlined compliance strategy and efficient operational implementation.
AI ACT: NAVIGATING NEW REGULATORY CHALLENGES
AI is another area of intense regulatory activity this year and beyond. The EU AI Act’s ban on prohibited AI uses came into force in February 2025. AI literacy obligations also came into effect, and while literacy requirements may not be a primary focus for enforcement, supervisory authorities might consider literacy compliance during broader AI Act investigations triggered by higher-risk AI activities.
The obligations for General Purpose AI (GPAI) under the AI Act are set to take effect in August 2025, with the European Commission’s GPAI guidelines and GPAI Code of Practice (the final version of which was published on 10 July) expected to be endorsed in late July, although ongoing negotiations around certain mechanics of the Code of Practice may affect the timeline for implementation and enforcement. The Code of Practice will shape the practical implications of the AI Act for GPAI model providers and downstream deployers, focusing on EU copyright guardrails, transparency around training data, and systemic risk assessment and mitigation.
A significant milestone for the AI Act will occur in August 2026 when requirements for high-risk AI systems come into force. High-risk use cases may arise in contexts such as credit checks, personal insurance, biometric identification, and recruitment, presenting practical compliance implications for both AI providers/developers and deployers. Although the obligations will not take effect until 2026, the lead time is short given the extensive requirements, potential operational and strategic implications, and the complex interplay of the AI Act with other data and tech regulations like the EU GDPR, the EU Digital Services Act, and the EU Data Act.
EU data protection authorities are already actively scrutinising and enforcing against AI-related data uses from an EU GDPR angle, focusing on large generative AI providers. Businesses have the opportunity to leverage existing transparency efforts, data guardrails, cybersecurity, vendor management processes, and broader innovation and information governance as part of their strategies to effectively address AI legal risks and support innovation.
For more information, see EU AI Act: Obligations for Deployers of High-Risk AI Systems and EU AI Act Published: A New Era for AI Regulation Begins.
In the UK, regulators continue to be active in producing guidance around AI, in particular the data protection regulator (the Information Commissioner’s Office), which is expected to release a statutory code of practice on AI and automated decision-making later this year as part of its AI and biometrics strategy.
DATA ACT: SHAPING THE FUTURE OF DATA GOVERNANCE
The majority of obligations under the EU Data Act will come into force on 12 September 2025. The Data Act introduces requirements for connected products regarding data access, data sharing, transparency, data portability, and data-sharing contracting, and for cloud and edge computing providers regarding service switching, functional equivalence, interoperability, and preventing unlawful government and international data access.
The new regime for connected products and related digital services could significantly expand the consumer and industrial/enterprise data markets and create new data-driven opportunities for businesses. In parallel, the complex interplay of the Data Act with the GDPR, EU Cyber Resilience Act, and EU Digital Markets Act will require considered navigation to ensure existing compliance mechanisms are leveraged and new compliance approaches are consistent.
For cloud services, addressing cybersecurity, resiliency, and data governance standards will be crucial when implementing the Data Act’s interoperability protocols, highlighting another key area of intersection between tech and data regulation. These requirements are also likely to impact the wider cloud supply chain over time.
DORA: STRENGTHENING CYBER RESILIENCE
The EU Digital Operational Resilience Act (DORA) became applicable on 17 January 2025 as part of the broader EU cyber and resilience regulatory framework. DORA imposes a range of resiliency requirements on EU-regulated financial services entities and designated “critical” technology providers. These critical providers are expected to be designated in October or November 2025 and will need to implement bespoke DORA compliance plans under the supervision of the European Supervisory Authorities.
DORA’s impact extends beyond financial services entities and designated critical providers, indirectly affecting almost all providers of technology and data-related services to the financial services sector, including those established outside the EU. Enhanced resiliency requirements will be integrated into tech and data supply chains to ensure DORA compliance. A key area of focus this year for relevant businesses will be DORA’s intersection with NIS 2.
Check out this Latham podcast and blog post for more information on DORA.
NIS 2: EXPANDING CYBERSECURITY STANDARDS
NIS 2 significantly broadens the scope of the Network and Information Systems Directive (NIS) cybersecurity regime, enhancing mandatory cyber standards, incident notification, and response requirements, while increasing fines. NIS 2 is implemented at a national level across the EU, with national registration requirements for in-scope entities coming into effect in Q1 2025 and prompting businesses to assess whether and where they may be regulated.
NIS 2 — and the UK NIS Regulations, expected to be updated in 2025 or 2026 to align more closely with NIS 2 — intersect with DORA in core areas such as systems, network and data security, supply chain security, and incident response. Although the regimes are not fully aligned, both require risk-based and proportionate cybersecurity and resilience measures. By addressing NIS 2 / UK NIS Regulations and DORA requirements in a single, risk-based cyber risk management programme, an organisation can streamline compliance across both frameworks and simplify security collaboration across the data and technology supply chain.
NAVIGATING THE YEAR AHEAD
Amid a flurry of regulatory changes in a short period, businesses should maintain a comprehensive view of the tech regulation landscape to ensure their compliance strategies are consistent and to capture innovation, data, and technology opportunities efficiently. Across this landscape, certain core, common requirements are emerging:
- Transparency: Robust transparency obligations across technologies, online services, and data flows, owed to end users, to partners along the supply chain, and to regulators
- Active and adaptable governance: Proactive, structured, resourced governance of technology and data risks; actively managed risk assessments; flexible internal policies, reviewed regularly and integrated into business operations; comprehensive data and technology mapping; audits and testing
- Accountability and documentation: Clear and high expectations of accountability; extensive documentation and recordkeeping requirements; increasing need to evidence governance and compliance
As the regulatory framework continues to develop and enforcement trends emerge, new prospects and challenges will arise, requiring careful navigation of this complex legal landscape.
ENDNOTES
1. The OSA applies to providers of online user-to-user services and search services (both UK providers and non-UK providers with links to the UK), catching a large number of digital platforms and services.