
Most people's AI context is a mess. Highlights in one tool. Notes in another. Conversations spread across three different chat interfaces that don't talk to each other. Every time you switch models or start a new session, you start over or spend time digging around.
Plurality built AI Context Flow to fix exactly this: the AI portability problem. All context captured in one place and carried into any tool, any agent, any website. Your accumulated knowledge travels with you instead of staying locked in whatever platform generated it.
That was phase one of what Plurality has been building. Phase two is where it gets really interesting.
The Infrastructure Problem
Portable context is useful. But portable context that flows through someone else's servers is a different kind of problem.
Model Context Protocol (MCP) is how agents access tools, data, memory, and APIs. It's powerful infrastructure, and more AI products are being built on top of it every month. But the operator running an MCP server sees everything flowing through it. Most people thinking about AI privacy focus on the model. The infrastructure layer underneath it gets less attention, and it's just as exposed.
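To make the exposure concrete: MCP messages are JSON-RPC 2.0, so a tool invocation arrives at the server as a readable payload. The sketch below builds one such message (the tool name and arguments are hypothetical, and this omits the transport and handshake details of the real protocol) to show what an untrusted operator could simply log.

```python
import json

# MCP is built on JSON-RPC 2.0. A tool call like this reaches the
# MCP server as plaintext from the server operator's point of view.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_notes",  # hypothetical tool name
        "arguments": {"query": "Q3 acquisition strategy draft"},
    },
}

wire_payload = json.dumps(request)

# Whoever runs the server process can inspect or log this payload:
print(wire_payload)
```

Nothing in the protocol itself hides `params` from the machine that processes it; that is exactly the gap the next section addresses.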
Plurality ran into this directly while building AI Context Flow. Context carries intent, history, and sensitive data. Routing it through untrusted infrastructure and hoping for the best wasn't really a viable option.
Why This Needs Oasis
Oasis solves the infrastructure exposure problem through Runtime Offchain Logic (ROFL), which runs compute inside trusted execution environments (TEEs).
When the MCP server runs inside a TEE, the operator cannot read context in memory. The host OS cannot tamper with execution. Remote attestation proves exactly what code is running. Plurality runs its MCP server on ROFL, which means the infrastructure layer is now verifiable, not just trusted by policy.
Context is stored and processed without being visible in plaintext outside the enclave. The model still sees what it needs to do its job. The infrastructure operator does not.
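The core of remote attestation can be illustrated with a small sketch. This is not the Oasis or ROFL API; it is a hypothetical verifier that pins the measurement (a hash) of an audited server build and accepts only an enclave reporting that exact measurement. Real attestation adds hardware-signed quotes and certificate chains on top of this check.

```python
import hashlib

def measure(code: bytes) -> str:
    # Stand-in for the hardware-computed enclave measurement of the
    # code loaded into the TEE.
    return hashlib.sha256(code).hexdigest()

# The verifier pins the measurement of the build it has reviewed.
# (The build bytes here are placeholders for illustration.)
EXPECTED_MEASUREMENT = measure(b"audited-mcp-server-build-v1")

def verify_attestation(reported_measurement: str) -> bool:
    # Real schemes also validate a hardware-signed quote; the core
    # decision is still "does the running code match what I expect?"
    return reported_measurement == EXPECTED_MEASUREMENT

print(verify_attestation(measure(b"audited-mcp-server-build-v1")))  # True
print(verify_attestation(measure(b"tampered-build")))               # False
```

The point is that trust shifts from the operator's promises to a check any client can perform against the code itself.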
What's Next
Plurality is building a context marketplace on top of this foundation.
Domain experts spend years building knowledge: curating sources, connecting information, forming conclusions. Right now that knowledge sits in folders or gets scattered across tools. The marketplace lets people package it into context packs and list them for others to inject into any AI tool.
The privacy constraint matters here more than anywhere else. A marketplace brokering knowledge between parties creates an obvious exposure problem: the broker sees everything. Building on Oasis means context can be shared and monetized without being visible to the parties facilitating the transaction.
More details coming soon. Learn about AI Context Flow here.
