The language your AI agent uses to talk to InfiniteOcean. Plain sentences that save data, build websites, send emails, and run code. You don't need to learn it — your AI already knows it. But here's what it looks like.
```
-- Save a user
save users @alice name: "Alice"

-- Get them back
users @alice

-- Find all admins
find users where role = "admin" sort name limit 10

-- Pick specific fields
find users where role = "admin" then pick name

-- Run some code
save fn/greet @main "def main(ocean, request): return {'hello': request.get('name', 'World')}"
save fn/greet @_config execute: "public"

-- Schedule it to run every morning
every day at 9am daily-greet: run fn/greet

-- Share access
share users with @collaborator-key
```
This is real. Every command runs live. No signup needed.
Send your commands as text. Each line runs in order. Results come back instantly.
```shell
# curl
curl -X POST https://api.infiniteocean.io/exec \
  -H "Content-Type: application/json" \
  -H "X-Ocean-Key: YOUR_KEY" \
  -d '{"query": "save users @alice name: \"Alice\"\nusers @alice"}'

# Response
[
  { "ok": true, "drop_id": "uuid", "key": "alice" },
  { "ok": true, "data": { "name": "Alice" } }
]
```
Multiple commands run top to bottom. If one fails, the rest still continue.
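The same call can be made from any language. Below is a minimal Python sketch using only the standard library, based on the /exec endpoint and headers shown in the curl example; `build_exec_payload` and `run_drops` are hypothetical helper names, and `YOUR_KEY` is a placeholder:

```python
import json
import urllib.request

API_URL = "https://api.infiniteocean.io/exec"  # endpoint from the curl example

def build_exec_payload(*commands: str) -> bytes:
    """Join Drops commands with newlines into the {"query": ...} body."""
    return json.dumps({"query": "\n".join(commands)}).encode()

def run_drops(key: str, *commands: str):
    """POST a batch of Drops commands; returns the list of per-command results."""
    req = urllib.request.Request(
        API_URL,
        data=build_exec_payload(*commands),
        headers={"Content-Type": "application/json", "X-Ocean-Key": key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires a real key):
# results = run_drops("YOUR_KEY", 'save users @alice name: "Alice"', "users @alice")
```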
Save, get, update, and remove records.
save users @alice name: "Alice"
users @alice
update users @alice age: 30
remove users @alice
users
Filter, sort, full-text search, semantic search, aggregate.
find users where age > 10 sort name
find users sort created_at desc
search users "alice" mode text
find orders group category sum amount count
count users
Deploy Python functions, run them, fetch URLs.
save fn/x @main "def main(o,r): ..."
run fn/x input: "data" async
status "job-id"
fetch "https://..."
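Function bodies follow the `def main(ocean, request)` shape shown in the examples above. Here is the greet function from the intro, runnable locally with a stand-in request dict; the real `ocean` handle is supplied by the platform and is unused here:

```python
def main(ocean, request):
    # `request` carries the input passed via run fn/greet input: ...
    return {"hello": request.get("name", "World")}

# Local smoke test with a dummy request; `ocean` is not needed for this function.
print(main(None, {"name": "Alice"}))  # → {'hello': 'Alice'}
```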
Run things on a timer. React when data changes.
every day at 9am digest: save ...
every 5 minutes check: run fn/x
when orders changes call "https://..."
schedules
webhooks
Control who can read and write your data.
share myapp with @collaborator-key
unshare myapp from @collaborator-key
Modes: protected (reads public), shared (reads locked), writeonly (writes public)
Atomic add/subtract on any field.
add 1 to stats @views count
add 10 to stats @views count
subtract 2 from stats @views count
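Why atomic increments matter: a client-side read-then-write loop can lose updates when two writers race. This purely local Python sketch (not platform code) contrasts the naive pattern with a single serialized step, which is what a server-side atomic add gives you:

```python
import threading

class NaiveCounter:
    """Read-modify-write: on a shared store this is a GET then a SAVE, so racy."""
    def __init__(self):
        self.value = 0
    def add(self, n):
        self.value += n  # two writers can both read the same old value

class SerializedCounter:
    """One serialized step per update, analogous to: add 1 to stats @views count"""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()
    def add(self, n):
        with self._lock:
            self.value += n

c = SerializedCounter()
threads = [threading.Thread(target=lambda: [c.add(1) for _ in range(1000)])
           for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(c.value)  # always 4000; the naive version can drop increments under contention
```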
Send emails, manage custom domains.
send email to "[email protected]" subject "Hi" body "..."
mail claim "username"
mail domains
mail domain add "yourdomain.com"
Events, RSVP, ICS feeds.
event cal @standup title "Daily" start "..."
events cal
rsvp cal @standup accepted
calendar feed "my-calendar"
Rotate keys, manage sub-keys.
key rotate
key migrate "old-key" "new-key"
subkey read "sub-key-id"
subkey update "id" name: "new"
Initiate uploads, list and delete files.
upload "media/photos" "sunset.jpg"
blob list "media/photos"
blob delete "media/photos" @"sunset.jpg"
Platform stats, user management, io grants.
admin stats
admin users
admin logins
admin grant "KEY" 1000
Send push notifications to browsers.
push "Title" to ("KEY") body "text"
push subscribe "route/*"
push subscriptions
push unsubscribe "id"
Create your own currency. Mint, transfer, burn.
value create "gold" name "Gold" symbol "GLD"
value mint "gold" 1000
value transfer "gold" 100 to "KEY"
value burn "gold" 50
value balance "gold"
value supply "gold"
value currencies
value holders "gold"
value ledger "gold"
Define patterns, validate your data.
define "todo" version "1" fields title: "string"
patterns
validate "todo" title: "Buy milk"
Search across all your data by date, location, identifier, amount.
primitives dates "$gte": "2026-01-01"
... identifiers "$has": type: "email"
Save many at once. Chain transforms.
bulk (save a @x v: 1, save b @y v: 2)
find users then pick name then first 10
Full SQL when you need it.
sql "SELECT * FROM users WHERE age > 25"
sql "INSERT INTO users (name) VALUES ('Alice')"
Check who you are, your balance, your routes.
whoami
balance
transfer 100 to "KEY"
routes
context
Stream data out. Browse entity versions.
export users
export orders latest
ledger
list users
2FA, multi-sig, GDPR compliance.
totp setup
totp verify "123456"
multisig propose "route" action: "grant"
gdpr erase "route"
gdpr redact "route" @key fields: "email"
Custom domains, handles, purchasing.
domain add "myapp.com"
handle set "myname"
resolve "myname"
purchase "route" @item
Verify data integrity. SHA-256 hash chain.
chain status
chain verify
Node info, sync status, documentation.
nodes
node status
sync status
docs
docs "topic"
Chain commands with then. Results flow from one step to the next. (The pipe | works as an alternative to then.)
```
-- Get the 3 most recent error messages
find logs where level = "error" sort created_at desc then pick msg then first 3

-- Count users
count users

-- Get the first order
find orders sort created_at then first

-- Extract names and emails
users where status = "active" then pick name, email
```
Data transforms: pick, first, last, flat, sort (add asc or desc, default asc), skip, limit, count. (pluck and .field dot syntax still work for backwards compatibility.)
Find something, then do something with it:
```
-- Email the latest customer
find customers sort created_at desc then first then send email to email subject "Welcome!" body "Thanks for joining."

-- Back up the first 10 users
find users then first 10 then save backups/users @snapshot

-- Delete old logs
find old/logs then remove
```
Action transforms: save route, send email to field, remove. The send email reads the recipient from a field in the piped data. If no body is given, the data itself becomes the email body.
```
-- 1. Deploy a news fetcher function
save fn/news @main "import urllib.request, json
def main(ocean, request):
    data = json.loads(urllib.request.urlopen('https://api.example.com/news').read())
    for a in data[:10]:
        ocean.write('news/articles', a['id'], a)
    return {'fetched': len(data[:10])}"
save fn/news @_config execute: "public"

-- 2. Run it every hour
every hour news-fetch: run fn/news

-- 3. Get notified when articles arrive
when news/articles changes call "https://hooks.slack.com/..."

-- 4. Publish a dashboard
save _site @news "<html><body><h1>News</h1></body></html>"

-- 5. Lock it down
share fn/news with @your-key mode protected
share news with @your-key mode protected
```
| Task | Drops | The Old Way |
|---|---|---|
| Store data | save users @alice name: "Alice" | Multiple tools, custom code, database setup |
| Find things | find users where role = "admin" | Multiple tools, custom code, database setup |
| Deploy function | save fn/x @main "def main(o,r): ..." | Complex deployment pipeline |
| Schedule | every day at 9am daily: run fn/x | Multiple services wired together |
| Send email | send email to "[email protected]" subject "Hi" body "Hey" | Email service with complex setup |
| Push notification | push "New!" to ("key") body "Update" | Pages of security configuration |
| Full-text search | search docs "auth flow" | Separate search service |
| Access control | share route with @collaborator | Layers of complexity |
| React to changes | when orders changes call "url" | Layers of complexity |
| Aggregate | find orders group category sum amount | Multiple tools, custom code, database setup |
```
route                       -- letters, digits, _ / -
route @key                  -- entity key
route @"[email protected]"   -- quoted key for special chars
```
```
name: "Alice" age: 30 active: true          -- key:value pairs (preferred)
{ name: "Alice", age: 30, active: true }    -- {} also works
("a", "b", "c")                             -- arrays (parens preferred)
["a", "b", "c"]                             -- [] also works
"string" 42 3.14 true false null            -- literals
```
```
name = "Alice"
age > 10
tags contains "urgent"
status in ("active", "pending")
email exists
age > 10 and role = "admin"
```
Field names are bare (no dot prefix needed). Single = for equality.
```
users @alice                  -- get (route + @key)
users @alice name: "Alice"    -- save (route + @key + payload)
users where age > 10          -- find (route + where)
users                         -- list (bare route)
```
```
find orders group category sum amount count
find users group role count
find users then pick name, email
count users
```
```
every day at 9am digest: save reports @daily type: "digest"
every hour check: run fn/health
every monday at 8am weekly: send email to "[email protected]" subject "Week start"
every 5 minutes tick: add 1 to stats @ticks count
every midnight cleanup: remove temp @old

-- Options: minute, hour, day, weekday, midnight, noon, monday-sunday, N minutes, N hours, at 9am, at 5:30pm
```
```
-- Newlines or semicolons separate commands
save a @x v: 1
a @x

-- Same thing on one line
save a @x v: 1; a @x

-- Comments start with --
```
Works with AI tools. Connect Claude, Cursor, or any AI assistant — it can build complete apps using Drops:
```json
{
  "mcpServers": {
    "ocean": {
      "url": "https://api.infiniteocean.io/mcp/default",
      "headers": { "X-Ocean-Key": "YOUR_KEY" }
    }
  }
}
```
The agent gets a single exec tool and can write multi-line Drops commands to save data, deploy functions, set up schedules, send emails, and control access — all in one tool call. It also gets a built-in docs command to read platform documentation.