
The barrier to entry for coding has dropped essentially to zero. You no longer need to know Solidity or be a developer to build onchain applications.
To make this new reality as accessible as possible, we've introduced an llms.txt file, a Context7 MCP integration, and this guide to putting it all together. Here's how to start vibecoding with Oasis.
Why context matters
When you prompt an LLM to build something, it works from whatever it picked up during training. That knowledge is often patchy, outdated, or wrong, and the result can be code that looks reasonable but breaks in practice.
The solution is to give your tool direct access to Oasis docs so it can consult the right material rather than guess. There are two ways to do this: llms.txt and MCP.
What is llms.txt?
llms.txt is a standardized file format (llmstxt.org) that gives AI a structured index of a project's documentation. Like a sitemap, but designed specifically for AI.
It provides brief descriptions and links to detailed markdown files so that a model can quickly find and read the right documentation for any given task. There are two files:
- https://docs.oasis.io/llms.txt — a curated index with page titles, descriptions, and URLs
- https://docs.oasis.io/llms-full.txt — the complete documentation content inlined in one file
You can paste either URL directly into any LLM chat, or add them as project context in tools that support it (e.g. Cursor's @docs feature, or a CLAUDE.md file).
Use llms.txt for a quick overview or when working within tight context limits; use llms-full.txt when you want the agent to have access to everything.
What is MCP?
Model Context Protocol (MCP) is an open standard for giving AIs structured access to external context: documentation, codebases, tools, and runtime information.
Without it, a model only sees the prompt you type. With it, it can look up documentation on demand and query external tools when needed. It turns a model from a semi-blind generator into a collaborator that's aware of its environment.
Oasis documentation is indexed on Context7, an MCP server that serves docs to AI coding assistants. The library ID is llmstxt/oasis_io_llms_txt.
Getting set up
Option 1: Cursor
Add the following to your .cursor/mcp.json:
```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```
Cursor will now connect to Context7 and your model will have access to the full Oasis documentation when generating code.
Tip: Add a rule to your Cursor settings so the model always consults the Oasis docs: "Always use Context7 MCP with library ID llmstxt/oasis_io_llms_txt when you need Oasis documentation."
Option 2: Claude Code
Run:
```bash
claude mcp add context7 -- npx -y @upstash/context7-mcp
```
Verify it's configured:
```bash
claude mcp list
```
You should see context7 listed. You're good.
Tip: Add the following to your project's CLAUDE.md so the model always pulls from the right source: "Always use Context7 MCP with library ID llmstxt/oasis_io_llms_txt when you need Oasis documentation."
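One way to add that rule, appending it to CLAUDE.md from the shell:

```bash
# Append the Context7 rule to the project's CLAUDE.md
# (creates the file if it doesn't exist yet)
cat >> CLAUDE.md <<'EOF'
Always use Context7 MCP with library ID llmstxt/oasis_io_llms_txt when you need Oasis documentation.
EOF
```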
Option 3: Other AI tools
Context7 supports 40+ clients including VS Code, JetBrains, Windsurf, and Zed. See the full client list for setup instructions specific to your tool.
Before you start
You'll need a few things in place before you prompt anything.
- Node.js: required for Hardhat. Check whether it's installed by running node -v in your terminal; if you see a version number, you're set.
If not, download the LTS version from nodejs.org, install it, then reopen your terminal and check again.
- A wallet: you'll need an account whose private key you can export.
The safest approach is to create a fresh account in MetaMask just for this: click your account icon, select "Add a new account", and name it something like "dev wallet".
- Testnet tokens: go to faucet.testnet.oasis.io, select Sapphire from the network dropdown, paste in your dev wallet address, and request tokens.
They're free and will arrive within a minute or two.
Example: deploy a confidential smart contract
Open a fresh project in your IDE with the Oasis MCP connected and type:
"Create a confidential smart contract on Sapphire with Hardhat. It should store a secret message that only the owner can set. Anyone can submit a guess, but the actual secret should never be visible onchain."
With the MCP, the model pulls from the docs, finds the right Hardhat config, understands that contract state on Sapphire is private by default, and generates a working contract with a full project structure including deploy script, interaction examples, and tests.
If you're using Claude Code, it will ask for permission to run commands during this process; select "Don't ask again" to let it work uninterrupted.
It will also generate a .env.example file; copy it to .env and add your dev wallet private key:
To get your private key from MetaMask: click the three dots next to your dev account → account details → show private key → enter your password. Add it to the .env file as PRIVATE_KEY=0x...
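Once filled in, your .env ends up looking something like this (the key name matches the instructions above; the value is a placeholder for your own key):

```bash
# .env — keep this out of version control (add it to .gitignore)
PRIVATE_KEY=0x...
```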
Then deploy:
```bash
# Script path and network name depend on what the model generated;
# adjust if your project differs.
npx hardhat run scripts/deploy.js --network sapphire-testnet
```
You'll get a contract address back. And voilà: a confidential smart contract live on testnet, built entirely from a prompt.
To confirm it's confidential, try calling eth_getStorageAt on your contract at explorer.oasis.io/testnet/sapphire — you'll get back zeros. The secret is stored onchain but invisible to anyone outside the contract.
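That check can also be sketched from the command line. The RPC endpoint below is an assumption (confirm the current Sapphire testnet endpoint in the Oasis docs), and the contract address is a placeholder you'd replace with your own:

```bash
# Build a JSON-RPC request for storage slot 0 of your contract
# (0xYourContractAddress is a placeholder — substitute your deployed address).
cat > storage_query.json <<'EOF'
{"jsonrpc":"2.0","id":1,"method":"eth_getStorageAt","params":["0xYourContractAddress","0x0","latest"]}
EOF

# Send it to the Sapphire testnet RPC; on a confidential contract the result
# comes back as all zeros even though the secret is stored onchain:
#   curl -s -H 'Content-Type: application/json' \
#        -d @storage_query.json https://testnet.sapphire.oasis.io
cat storage_query.json
```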
Due to its confidentiality properties, Oasis unlocks use cases that aren't possible on standard EVM chains. With your AI coding tool connected to the docs, describe what you want to build, and let the model do the heavy lifting.
Docs: https://docs.oasis.io/build/tools/llms/
Have questions or want to share what you've built? Join the Oasis Discord and let us know.

