Database JIT · powered by Cloudanix Agentic JIT

Prod database access.
Without a jumpbox, without shared passwords, without mystery queries.

Your engineers need to query prod. Today that means a bastion, a shared db_reader password, and audit logs nobody can trust. Database JIT replaces all of it: request in one click, approve in Slack, run cdx db connect, and open the database in the IDE you already love — with every query stamped to a real human.

✓ Postgres · MySQL · Aurora · SQL Server · Mongo · Redshift · Snowflake
✓ Works with DBeaver, DataGrip, TablePlus, pgAdmin, Workbench
✓ Data never leaves your VPC
🗄 Request database access · Production
Database: ew1-cloudanix-db-1
DB user: ew1-cloudanix-db-1_writer
Duration: 30 min · 1 hr · 2 hr
Reason: Jira CX-4821
~/ — cdx db connect — 120×28
$ cdx db connect -i 061e2fc9 -o af10a481 -w 14414595
CDX CLI Version: 0.2.15
Authentication verified for AWS
Environment configured
Connected to proxy server on port 10715
Database access token received
Database connection established
→ listening for IDE on localhost:40342
DataGrip — ew1-cloudanix-db-1 · localhost:40342
SELECT address FROM users
WHERE id = 10;
✓ 1 row · 23 ms · 221B Baker Street, London
signed as sujay@cloudanix.com, not db_writer
The problem

Every team that owns a prod database owns four compromises.

You can have security, or you can have engineers who can do their job — pick one. That's the deal most companies have made, and it stopped working the day your data moved to the cloud.

🏰

The jumpbox tax

The database sits in a private subnet — as it should. So now you run a bastion host. Another thing to patch, another set of SSH keys to rotate, another “is the tunnel up?” standup question. Every support ticket starts with five minutes of network plumbing before anyone touches a query.

🔑

The shared-password problem

Your DB doesn't have 40 users. It has db_reader and db_writer, and the password is in a 1Password vault 40 people can open. Rotating it breaks a Lambda nobody remembers owning, so you don't. That password is one laptop away from being on GitHub.

👻

Audit logs that audit nothing

Something went wrong at 3am and ew1-cloudanix-db-1_writer dropped a row. Which human was that? CloudTrail says “the writer account.” DB logs say “the writer account.” Slack says everyone was asleep. Your audit trail ends at the shared user — and so does your forensic story.

📉

No behavior signal to act on

Every query looks the same — all signed by db_writer. There is no baseline for “what Sujay usually queries on a Tuesday,” so there's nothing anomalous to alert on. UEBA on database activity is only as good as the identity attached to the activity. Today you have none.

How it works

One request. One approval. One command. Your own IDE.

No new SQL client. No browser-based “DB console.” No breaking the way your engineers already work.

  1. Ask, don't guess

    Pick a database and user, set a duration, drop in a Jira ticket. One click. No SSH keys, no bastion config, no “@devops can you help me get in.”

  2. Approve where the team lives

    Slack or Teams. Context on who, what, why, and for how long. Read intents can auto-approve — writes on prod escalate. One click, and the clock starts.

  3. Tunnel opens outbound

    cdx db connect on the laptop dials outbound to a proxy running in your VPC. The database stays in its private subnet. No bastion. No inbound firewall hole. Nothing new exposed to the internet.

  4. Short-lived creds. Familiar IDE.

    Temporary credentials are minted and handed to localhost. Point DBeaver, DataGrip, TablePlus, pgAdmin, or Workbench at it — the IDE sees a normal database. The engineer works the way they always have.

  5. Every query, stamped to a human

    Even though the DB sees the shared db_writer account, Cloudanix tags each query with the real human's identity. UEBA runs on that stream. Full audit lands in your S3 bucket. Session expires, creds revoked, tunnel closed.
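The identity stamping in step 5 is the key idea: the database's own log only ever shows the shared account, but the session layer records the real human behind it. A minimal sketch of that idea, not Cloudanix's actual implementation (the `JITSession` shape and field names here are illustrative assumptions, and forwarding to the real database is elided):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative sketch only, not Cloudanix's implementation. The database
# sees the shared user (db_writer); the session records every statement
# against the real human who opened the approved request.

@dataclass
class JITSession:
    human: str           # identity from the approved request
    db_user: str         # shared account the database actually sees
    audit: list = field(default_factory=list)

    def run(self, sql: str, rows: int) -> None:
        # Forwarding the query to the real database is elided here;
        # we only record the identity stamp the audit trail needs.
        self.audit.append({
            "human": self.human,       # who really ran it
            "db_user": self.db_user,   # what the DB's own log will show
            "sql": sql,
            "rows": rows,
            "at": datetime.now(timezone.utc).isoformat(),
        })

session = JITSession(human="sujay@cloudanix.com", db_user="db_writer")
session.run("SELECT address FROM users WHERE id = 10;", rows=1)
print(session.audit[0]["human"], "ran:", session.audit[0]["sql"])
```

The DB log and the JIT audit trail disagree on purpose: the first says `db_writer`, the second says exactly which human, which query, and when.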

Real-world scenarios

Same JIT. Whoever needs to get in.

Support engineers, on-call DBAs, contractors, auditors — one flow for everyone who touches your data, with the guardrails tuned per case.

scenario · CX ticket · prod-db · read replica
sujay (support) Customer CX-4821 says their address is wrong. Need to see row for id = 10. Nothing more.
🎧 Sujay · DataGrip → tunnel → Cloudanix JIT → signed creds → prod-db · replica 🗄
scope granted SELECT · public.users · 30m · replica only · auto-approved
psql> SELECT address FROM users WHERE id = 10;
 address
-------------------------------------
 221B Baker Street, London
(1 row)
✓ 1 row · stamped to sujay@cloudanix.com ⛔ Session closed · 02:14
scenario · aurora-prod · DBA · write
priya (on-call DBA) Transaction on aurora-prod has been blocking for 18 minutes. Need to kill PID 74521 and release the lock.
⚠ write · prod · irreversible ✓ Approved by oncall-lead · 11s
🛠 Priya · DBeaver → tunnel → Cloudanix JIT → signed creds → aurora-prod 🗄
scope granted writer · 60m · reason: incident INC-7741 · ticket linked
aurora> SELECT pg_cancel_backend(74521);
 pg_cancel_backend
-------------------
 t
(1 row)
aurora> -- lock released. incident resolved.
✓ Every statement stamped · priya@cloudanix.com ⛔ Session closed · audit → INC-7741
scenario · migration vendor · 3rd party laptop
alex (contractor) Running the migration batch on billing-db. From my laptop, through the same cdx CLI your team uses. No shared password on my machine.
🤝 Alex · TablePlus → tunnel → Cloudanix JIT → signed creds → billing-db 🗄
scope granted migration_writer · 4h · SOW-204 · auto-expire today
billing-db> \i migrate_v42.sql
→ 38 statements executed
→ 14,207 rows migrated
→ indexes rebuilt
✓ Every statement stamped · alex@partner-co.com ⛔ Access ends 17:00 · no access tomorrow
scenario · SOC 2 audit · evidence collection
nadia (auditor) Need to run three compliance queries on users and audit_events and grab the output as evidence. Read-only, with a recording.
📋 Nadia · pgAdmin → tunnel → Cloudanix JIT → signed creds → prod-db · replica 🗄
scope granted SELECT · users · audit_events · 1h · recorded
psql> SELECT count(*) FROM audit_events
       WHERE event_time > now() - interval '90 days';
 count
-------
 812,430
(1 row)
✓ Evidence bundle exported · signed ⛔ Session recording → s3://audit-evidence/
Works the same with Postgres · MySQL · Aurora · SQL Server · Oracle · MongoDB · Redshift · Snowflake · BigQuery · …any JDBC / ODBC database
The model

Your engineer sees a database. You keep the keys.

What the engineer sees
  • A one-click request in the Cloudanix console or Slack
  • Their favorite IDE connected to localhost
  • A normal database, with the tables they expect
  • No passwords on their laptop. No bastion to SSH. Nothing to set up again tomorrow.
cdx db connect
What security keeps
  • The database in its private subnet, no new ingress, no jumpbox
  • The real DB credentials — they never leave your vault
  • A session tied to a real human, even when the DB user is shared
  • Every query, every row count, every rollback — in your S3 bucket, with UEBA running on it
Governance & data stay in your cloud.
IDE-native

Use the tool you already know.

Database JIT exposes a normal database endpoint on localhost. If it speaks JDBC, ODBC, or a native driver — it just works. No browser-based SQL console to learn, no context-switch for your team.
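Concretely, "point your IDE at localhost" just means using the local tunnel port in an ordinary connection string. A sketch of what that looks like, assuming the port from the mockup above (40342); the database name and user here are hypothetical placeholders:

```python
# Sketch: the IDE-facing side of the tunnel is a plain database endpoint
# on localhost. Port 40342 matches the example session above; "appdb" and
# "db_writer" are hypothetical stand-ins for your database and granted user.

def jdbc_url(port: int, dbname: str) -> str:
    # DataGrip / DBeaver / any JDBC client: a standard Postgres URL,
    # except the host is the local tunnel endpoint.
    return f"jdbc:postgresql://localhost:{port}/{dbname}"

def psql_cmd(port: int, dbname: str, user: str) -> str:
    # psql CLI: the same idea with psql's standard flags.
    return f"psql -h localhost -p {port} -d {dbname} -U {user}"

print(jdbc_url(40342, "appdb"))
print(psql_cmd(40342, "appdb", "db_writer"))
```

Nothing database-specific changes on the client side, which is why any JDBC/ODBC tool works unmodified.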

DataGrip
DBeaver
TablePlus
pgAdmin
MySQL Workbench
psql / mysql CLI
Any JDBC / ODBC client
What you get

Everything a compliant, auditable database flow requires — none of the plumbing your team hates.

🏰

Zero jumpbox

Outbound tunnel to a proxy in your own VPC. The database stays private. No bastion to patch, no SSH keys to rotate.

🪪

Identity-stamped queries

Shared DB users stay shared. Activity is still tied to a real human — the one forensics and compliance actually need.

Short-lived credentials

Minted per session, expire on a clock, revoked the moment the work ends. No long-lived DB password on a laptop.

Tiered approvals

Auto-approve read intents. Escalate writes on prod to Slack or Teams. Tune the thresholds per team, per database.
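The tiered routing described here boils down to a small decision: intent plus environment picks an approval path. A deliberately simplified sketch of that policy logic, not Cloudanix's actual policy engine (the `Request` shape and environment names are assumptions):

```python
from dataclasses import dataclass

# Illustrative policy sketch, not the real Cloudanix policy engine.
# Intent and environment names are assumed for the example.

@dataclass
class Request:
    intent: str       # "read" or "write"
    environment: str  # e.g. "prod", "staging"

def route(req: Request) -> str:
    if req.intent == "read":
        return "auto-approve"        # read intents can auto-approve
    if req.environment == "prod":
        return "escalate-to-slack"   # writes on prod need a human click
    return "auto-approve"            # non-prod writes pass through

print(route(Request("read", "prod")))    # auto-approve
print(route(Request("write", "prod")))   # escalate-to-slack
```

In practice the thresholds would be per-team and per-database configuration rather than hard-coded branches.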

📜

Audit in your bucket

Full query log with identity, timestamp, rows returned, session recording — written to your S3. Data never leaves your account unless you ask.

🧠

UEBA on query activity

Real identities make real baselines. Alert on the SELECT that Sujay has never run before, on a Sunday, from a new country.
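The reason identity matters for UEBA is that a baseline needs a subject. A crude sketch of the idea, assuming a per-user set of normalized "query shapes"; real behavioral analytics would be far richer than this:

```python
import re
from collections import defaultdict

# Sketch of the baseline idea only. Once queries are stamped to a human,
# you can track per-user query shapes and flag anything new. The literal
# normalization here is deliberately crude.

def shape(sql: str) -> str:
    # Normalize numeric literals so "id = 10" and "id = 99" match.
    return re.sub(r"\b\d+\b", "?", sql.strip().lower())

baselines: dict[str, set] = defaultdict(set)

def observe(user: str, sql: str) -> bool:
    """Record the query; return True if its shape is novel for this user."""
    s = shape(sql)
    novel = s not in baselines[user]
    baselines[user].add(s)
    return novel

# Sujay's usual query, then one he has never run before.
observe("sujay", "SELECT address FROM users WHERE id = 10")
print(observe("sujay", "SELECT address FROM users WHERE id = 99"))  # False: same shape
print(observe("sujay", "SELECT * FROM payment_methods"))            # True: novel
```

With shared-account logs there is no `user` key to group by, which is why the baseline (and the alert) is impossible without identity stamping.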

🧩

Works with every database

Postgres, MySQL, Aurora, SQL Server, Oracle, Mongo, Redshift, Snowflake, BigQuery. Managed or self-hosted.

🧑‍💻

Any IDE, no lock-in

DataGrip, DBeaver, TablePlus, pgAdmin, Workbench, psql — engineers keep the tools they love.

🌐

One JIT for humans & agents

Same policy engine, same approval flow, same audit — whether the requester is a person or an AI coding agent over MCP.

Ready to see it in action?

Connect a cloud account in under 30 minutes. Give your engineers one-click, time-boxed database access, with every query stamped to a real human.

Book a Demo