
Microsoft flex routing — your data from Frankfurt to Texas

Microsoft changed Copilot defaults in the EU and most admins missed it. What it means for GDPR, NIS2, DORA — and how to take back control of AI data.

On 3 April 2026 Microsoft published Message Center post MC1269223. It announced that from 17 April, flex routing would be enabled for tenants in the EU and EFTA. In plain English: when European data centres are under load, Copilot queries may be processed in the US, Canada or Australia. Enabled by default. Two weeks to react.

The internet exploded. Compliance specialists started counting how many of their clients were affected. Law firms discovered that six months earlier they’d put a “data processed exclusively in the EU” promise into client contracts. IT departments hurried to check their settings.

Facing a wave of criticism, Microsoft partially backed off. Some tenants received a new message, MC1269219, in which flex routing is disabled by default. Others still see the original version. Different admins see different settings. The mess is big enough that industry specialists are writing about it openly.

And that's the real problem. Not flex routing itself, but what the incident revealed about relying on global AI tools in a European company.

“Stored in the EU” is not the same as “processed in the EU”

For years, European companies operated on a simplification. “Microsoft keeps our data in the EU” meant it was safe from a GDPR perspective. An auditor asked where the data was — the admin replied “in the EU Data Boundary” — and that was enough.

Flex routing demolished that simplification.

Here’s what a Copilot query looks like from the inside:

  1. A user types: “Find the latest contracts with customer X and summarise the key terms”.
  2. Copilot gathers context in the background — emails with the customer, files from SharePoint, calendar meetings, metadata.
  3. The entire bundle — user prompt plus sensitive company context — is packaged and sent to the LLM on a GPU cluster.
  4. The model generates the answer.

Flex routing concerns step 3. The files themselves stay in the EU. But the context extracted from those files and sent for processing can leave.
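To make step 3 concrete, here is a sketch in Python of what such a bundle can look like just before it leaves for the inference cluster. The structure and field names are hypothetical (Microsoft's actual wire format is not public); the point is the shape: the user prompt and the extracted company context travel together.

    # Illustrative, hypothetical request bundle -- NOT Microsoft's actual
    # wire format. The prompt and the retrieved context are packaged and
    # shipped to wherever the routing layer decides inference will run.
    inference_request = {
        "prompt": "Find the latest contracts with customer X and summarise the key terms",
        "context": [
            # Gathered in the background from tenant data (RAG context):
            {"source": "sharepoint", "title": "MSA_customer_X_v4.docx", "excerpt": "..."},
            {"source": "exchange",   "title": "RE: contract renewal",   "excerpt": "..."},
            {"source": "calendar",   "title": "Negotiation call, March", "excerpt": "..."},
        ],
        "tenant_region": "EU",  # where the source data is stored at rest
        # Under flex routing, everything above may be processed outside
        # the EU when European capacity is saturated.
    }

Whatever the exact format, the compliance-relevant fact is the same: the excerpts in the context are copies of company data, and they go wherever the GPU is.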

From a GDPR Art. 44 perspective, that's a transfer. From a NIS2 perspective, a potential issue for essential service operators. Under DORA, it requires analysis for financial sector entities. It doesn't matter that the data “at rest” never moved from Amsterdam. What matters is that a copy, bundled with other company data, ends up in another hemisphere.

Why this incident happened now

Flex routing is not a bug. It's a business response to growing demand. Copilot adoption is growing faster than Microsoft can build GPU capacity in Europe. Something has to give: either peak-hour performance or data locality. Microsoft chose pragmatism, sending load wherever compute is available.

This isn't the first such situation, and it won't be the last. It's only the first to make headlines, with fanfare and a two-week notice window.

Every company dependent on a global AI provider will face similar incidents every few months. Sometimes it'll be a change to which model has access to your data. Sometimes a retention policy change. Sometimes a new subprocessor in the chain that wasn't there before. Each time, a decision made outside your control, which you learn about from a note in the admin panel.

What your admin should do today

Whatever the state of flex routing in your tenant, check it. Now. Here's a short procedure:

How to disable flex routing

  1. Sign in to the Microsoft 365 Admin Center with an AI Administrator role
  2. Navigate to Copilot → Settings → Flexible inferencing during peak load periods
  3. Select Do not allow flex routing
  4. Save
  5. Repeat in Power Platform Admin Center for Dynamics 365, Power Platform and Copilot Studio
  6. Setting propagation takes up to a week, so schedule a follow-up audit after 7 days (see the verification sketch below)

If you’re on Multi-Geo, this setting won’t be visible — Multi-Geo has separate data-locality controls.

This applies to: Microsoft 365 Copilot, Copilot Chat, Dynamics 365 Copilot, Power Platform Copilot, Copilot Studio.
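For step 6, it helps to automate the follow-up check. Below is a sketch of such a check in Python, assuming a hypothetical admin endpoint; Microsoft does not, to my knowledge, document a public API for this particular setting, so treat the URL and field names as placeholders to adapt to whatever your tenant actually exposes.

    # Post-propagation verification sketch. The endpoint and field names
    # are PLACEHOLDERS (hypothetical), not a documented Microsoft API.
    import datetime
    import requests

    SETTINGS_URL = "https://example.invalid/admin/copilot/settings"  # placeholder
    EXPECTED = {"flexRouting": "disabled"}                           # placeholder

    def verify_flex_routing(token: str) -> bool:
        """Return True if the tenant reports flex routing as disabled."""
        resp = requests.get(
            SETTINGS_URL,
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        settings = resp.json()
        ok = all(settings.get(key) == value for key, value in EXPECTED.items())
        print(f"{datetime.date.today()}: flex routing disabled = {ok}")
        return ok

Run it on day 7 and keep the output. A dated record that the setting actually propagated is worth more at audit time than a screenshot of the panel.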

If you have a Data Protection Officer or a compliance team, document the decision. Even if you leave flex routing on for performance reasons, note the rationale in the Data Processing Agreement. When an audit comes you'll be asked, and questions about AI are only getting more frequent.

The deeper problem: the defaults aren’t yours

Let's pause for a moment on the mechanism. Between publication of the announcement (3 April) and activation (17 April) there were exactly two weeks. In that time every European admin using Copilot had to:

  • notice the Message Center post (where hundreds of entries land every month),
  • read it with understanding,
  • analyse the legal implications with the DPO and compliance team,
  • make a decision,
  • apply it in two different admin panels,
  • factor in a week of propagation.

If your admin was on leave or busy with another project, your data likely flew to Texas during peak hours. And you'll only learn about it when someone writes a LinkedIn post.

This isn't a technical issue. It's a question of trust structure. If the default settings of the platform you use to process sensitive company data can change without your consent, you're not an administrator. You're a user. And your company is hostage to decisions made in Redmond.

Four levels of AI sovereignty — pick yours

Not every company needs the highest level of sovereignty. Not every company can afford the lowest cost either. Here’s what the spectrum looks like:

Level 1 — on-device. AI runs locally on the user's device (a laptop or office server). Nothing leaves the building. Highest level of control, lowest model power. Rarely used outside the military and the most sensitive healthcare scenarios.

Level 2 — on-premise. AI runs on your infrastructure (your own servers, a private cloud, your Kubernetes). Full control over where data lands, which models are used, who has access. Requires a DevOps team. For regulated companies, consultancies and large organisations with in-house IT, this is the standard.

Level 3 — regional EU cloud. A SaaS platform whose entire infrastructure physically sits in the EU, with a policy that rules out cross-border routing regardless of load. This is precisely what Microsoft Copilot does not have, because flex routing breaks the model. It is what, for example, Ragen Cloud offers (hosted in the EU, with no “flexible” overflow across borders), alongside Proton and European providers such as Scaleway and OVH.

Level 4 — global cloud. The standard model of Microsoft Copilot, Google Workspace AI, ChatGPT Enterprise. Fast start, global infrastructure, global model providers. Great for most companies. But data-locality assurances are conditional — as we just saw with flex routing.

Where does Ragen AI sit? We offer both Ragen Cloud (level 3: full EU, no cross-border defaults) and on-premise (level 2: hosted by you, under your security policy). We use European open-source models hosted by Scaleway and OVH, and for scenarios that need frontier models we let you choose the provider explicitly, per organisation, with a full audit log.

The philosophical difference is simple: with us there’s no default traffic spill outside the EU, because we never built that functionality in the first place.
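What “never built it” means in practice can be as simple as a locality allowlist enforced in code. The sketch below is illustrative, not Ragen's actual implementation, and the endpoint names are hypothetical. The pattern is the point: locality enforced by the client, not promised in a policy document.

    # Minimal client-side locality guard (illustrative sketch; endpoint
    # names are hypothetical). Requests to non-EU endpoints fail closed.
    EU_ENDPOINTS = {
        "https://inference.eu-west.example.invalid",     # e.g. Paris
        "https://inference.eu-central.example.invalid",  # e.g. Frankfurt
    }

    def send_for_inference(endpoint: str, payload: dict) -> dict:
        """Dispatch an inference request, refusing non-EU endpoints."""
        if endpoint not in EU_ENDPOINTS:
            # Fail closed: a slower answer beats a cross-border transfer.
            raise PermissionError(f"refusing non-EU endpoint: {endpoint}")
        # ... the actual HTTP call is omitted in this sketch
        return {"endpoint": endpoint, "status": "accepted"}

There is no “unless we're busy” branch, because the branch doesn't exist. That's the difference between a default and an architecture.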

5 audit questions: is your AI compliant?

If you have any AI deployment in your company — Copilot, ChatGPT Enterprise, your own chatbot, a sales assistant — ask yourself (or ask the vendor) these five questions:

  1. Where does model inference physically happen? Not “where is the data stored?”, but: in which data centre, in which country, does the GPU that processes our prompt actually run?

  2. Does the vendor have a “cross-border during peak load” clause? Check the contract. Check the product documentation. Check the fair-use policy. If it says anywhere that “in case of load we may…”, you have a problem.

  3. Can default security settings be changed by the vendor without your consent? If yes: what's the notification procedure? How much response time do you get? How do you find out?

  4. What exactly goes into “RAG context”? Files, emails, calendar, contacts, chat messages. Each of those categories may have its own regulatory requirements (GDPR for HR data, legal privilege for contracts, financial regulation for financial data).

  5. Do you have a log of where your data went in the last 30 days? Not “where it should have gone according to policy”, but a log showing where it actually went (a minimal sketch follows below). Without that, you can't prove compliance at audit time.
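Such a log doesn't need to be elaborate. Here is a sketch of the idea in Python, one append-only record per inference request; the field names are illustrative, not a standard or vendor schema.

    # Append-only locality log (JSON Lines), one record per request.
    # Field names are illustrative, not a standard or vendor schema.
    import datetime
    import json

    def log_inference_locality(path: str, request_id: str,
                               model: str, region: str, datacenter: str) -> None:
        """Append one record of where a request was actually processed."""
        record = {
            "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "request_id": request_id,
            "model": model,
            "region": region,          # e.g. "eu-west" -- where it actually ran
            "datacenter": datacenter,  # e.g. "AMS-03"
        }
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

Thirty days of these lines answer the auditor's question directly; a policy PDF does not.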

If the answer to any of these is “I don't know” or “I need to check with the vendor”, you have a compliance gap. Not critical, but real.

What this leads to in the coming months

I expect three things:

First, there will be more flex-routing-style incidents in other forms. Not just from Microsoft — from Google, AWS, OpenAI. Each of them has similar capacity dilemmas and a similar temptation to solve them at the cost of a customer’s default configuration.

Second, European supervisory authorities (national DPAs such as CNIL, DPC and BfDI) will start asking companies about AI processing locality more often in GDPR audits. The standard question “where is the data?” will split in two: where is the data stored, and where is it processed?

Third, European AI providers (Scaleway, OVH, Mistral, our own Ragen) will grow faster. Not because they're cheaper or technically better, but because for a growing number of companies “data does not leave the EU” stops being an abstract ideal and becomes a contractual requirement from their own customers.

Check your settings. Then check your strategy.

The first step is operational: check the Copilot panel, disable flex routing if it matters for your company, document the decision.

The second step is strategic: consider whether your company is ready for incidents like this to keep happening. Because they will. See who Ragen fits; if your profile matches, you'll find a readiness checklist there too.

If you need help running an AI data sovereignty audit, we offer one free of charge: 30 minutes. We map which AI tools your company uses, where data physically lands, and where the GDPR, NIS2 and DORA risk points are. No sales, no product pitch, just concrete analysis and recommendations.

Book the audit — 30 min, free

If you already know you need an AI deployment with full control over where your data lands — estimate costs with our AI cost calculator and talk to us about the details.
