Introduction
Hard-coding credentials (DB passwords, API tokens, SAS keys, hosts) in notebooks or jobs is risky. In Databricks, you store them as secrets inside a secret scope and read them at runtime without ever exposing them in plain text. Databricks supports two scope types: Azure Key Vault-backed and Databricks-backed. (Microsoft Learn)
What is a Secret Scope in Databricks?
Secret scope = a named container for secrets.
Types:
- Azure Key Vault–backed: the scope references an AKV; secrets live in AKV and are read-only from Databricks (you manage them in Azure).
- Databricks-backed: secrets are stored in an encrypted store managed by Databricks.
You create scopes, set permissions (ACLs), add secrets, and reference them in notebooks/jobs. (Microsoft Learn)
Azure Key Vault–backed Secret Scope (AKV)
A) Prepare Azure Key Vault (one-time)
- Create an AKV in your subscription (any resource group/region reachable by your workspace).
- In Access configuration, set Permission model = Vault access policy (RBAC isn't supported for AKV-backed secret scopes). (Microsoft Learn)
- In Networking, allow access from your required VNets/IPs and check Allow trusted Microsoft services to bypass this firewall if appropriate. (Microsoft Learn)
- Ensure your user has Key Vault Contributor / Owner (as documented) to create the scope binding. (Microsoft Learn)
 
Tip: creating an AKV-backed scope grants the workspace's Azure Databricks service application Get/List permissions on the vault so it can read secrets; you still manage the secrets themselves in AKV. (Microsoft Learn)
B) Add a secret to AKV (example)
Azure Portal → your Key Vault → Objects > Secrets → Generate/Import
- Name: db-password
- Value: S0m3Str0ngP@ss
- Enabled: Yes
 
(You can also use Azure’s SetSecret REST/CLI.) (Microsoft Learn)
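If you prefer to script this step, the azure-keyvault-secrets Python package does the same thing. A minimal sketch, assuming you are signed in (e.g., via az login) and that the vault name mykv is yours:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up az login, environment variables, or managed identity
client = SecretClient(vault_url="https://mykv.vault.azure.net/",
                      credential=DefaultAzureCredential())
client.set_secret("db-password", "S0m3Str0ngP@ss")  # name and value as in the portal example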
C) Create the AKV-backed scope (Databricks UI)
- Open: https://<your-workspace-url>#secrets/createScope (case-sensitive: createScope with a capital S). (Microsoft Learn)
- Scope name: akv (any name; case-insensitive).
- Manage principal: choose Creator (Premium plan) or All workspace users. (Microsoft Learn)
- Paste DNS Name (the Vault URI, e.g., https://mykv.vault.azure.net/) and Resource ID (from KV Properties). (Microsoft Learn)
- Create.
Verify:
databricks secrets list-scopes   # expect to see: akv
(Microsoft Learn)
Use dbutils.secrets (help, list, get)
In a Databricks notebook:
# What can I do?
dbutils.secrets.help()
# List all scopes
dbutils.secrets.listScopes()
# List secrets (metadata only) in a scope
dbutils.secrets.list("akv")
# Read a secret (value is redacted in outputs)
password = dbutils.secrets.get(scope="akv", key="db-password")
Databricks redacts secret values printed to outputs as [REDACTED]. Use them directly in code instead of printing. (Databricks Documentation, Microsoft Learn)
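For instance, running this in a notebook cell illustrates the behavior (password comes from the get call above):

# Only the rendered output is redacted; the value itself is intact in memory
print(password)        # notebook output shows: [REDACTED]
print(len(password))   # the real length prints, so the value is fully usable in code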
Databricks-backed Secret Scopes
Use these when you don’t need AKV, want quick setup, or are multi-cloud. Scopes & secrets are managed in the workspace via CLI / API, and you can control access via secret scope ACLs. (Microsoft Learn)
Set up the Databricks CLI
Install the current CLI (0.205+). Verify:
databricks --version
Docs: CLI commands overview. (Microsoft Learn)
Authenticate the CLI with your workspace
# Creates/updates a profile interactively (opens browser):
databricks auth login
# Or set env creds / profiles as needed.
(You can also auth with a service principal for automation.) (Microsoft Learn)
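If you script against the workspace in Python instead of the shell, the databricks-sdk package uses the same unified auth. A minimal sketch (the profile name is an assumption; any profile created by databricks auth login works):

from databricks.sdk import WorkspaceClient

# Picks up auth from a CLI profile, or DATABRICKS_HOST / DATABRICKS_TOKEN env vars
w = WorkspaceClient(profile="DEFAULT")  # "DEFAULT" is an example profile name
print(w.current_user.me().user_name)    # sanity check: prints the authenticated user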
Manage Secret Scopes using the CLI
1) Create a Databricks-backed scope
databricks secrets create-scope my-db-scope
List/verify:
databricks secrets list-scopes
2) Put a secret into the scope
CLI 0.205+ (‘put-secret’):
# Option A: JSON (string_value)
databricks secrets put-secret --json '{
  "scope": "my-db-scope",
  "key": "db-host",
  "string_value": "xyz.example.com"
}'
# Option B: via stdin (multi-line)
( cat << 'EOF'
very
secret
value
EOF
) | databricks secrets put-secret my-db-scope db-password
List secrets (metadata only):
databricks secrets list-secrets my-db-scope
Read (CLI returns base64; decode if you must read via CLI):
databricks secrets get-secret my-db-scope db-password | jq -r .value | base64 --decode
(Usually you read secrets in notebooks with dbutils.secrets.get.) (Microsoft Learn)
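The same lifecycle is scriptable from Python with the databricks-sdk package. A sketch mirroring the CLI steps above (scope and key names reuse the examples; the base64 decode matches the jq pipeline):

import base64
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.secrets.create_scope(scope="my-db-scope")   # fails if the scope already exists
w.secrets.put_secret(scope="my-db-scope", key="db-host",
                     string_value="xyz.example.com")
for s in w.secrets.list_secrets(scope="my-db-scope"):
    print(s.key)                              # metadata only, never values
resp = w.secrets.get_secret(scope="my-db-scope", key="db-host")
host = base64.b64decode(resp.value).decode("utf-8")  # the API returns base64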
Compatibility: the legacy CLI used databricks secrets put --scope ... --key ... --string-value .... Prefer put-secret on the new CLI. (Databricks Documentation)
Use secrets in notebooks & jobs (examples)
A) JDBC with Python
user = dbutils.secrets.get("my-db-scope", "db-user")
pwd  = dbutils.secrets.get("my-db-scope", "db-password")
jdbc_url = dbutils.secrets.get("my-db-scope", "jdbc-url")
df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "public.orders")
      .option("user", user)
      .option("password", pwd)
      .load())
End-to-end JDBC + secrets example: official tutorial. (Microsoft Learn)
B) Reference secrets in Spark config (preview) or environment variables
Cluster config supports {{secrets/<scope>/<key>}} placeholders:
- Spark conf: spark.password {{secrets/my-db-scope/db-password}}; later read it with spark.conf.get("spark.password").
- Env var: MY_DB_PWD={{secrets/my-db-scope/db-password}}; then use $MY_DB_PWD in an init script or code.
(Mind the security considerations in that doc.) (Microsoft Learn) 
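At runtime the placeholders resolve before your code runs. A sketch of reading them back in a notebook (the conf key and env-var name must match what you set in the cluster config above):

import os

pwd_from_conf = spark.conf.get("spark.password")  # resolved from {{secrets/...}}
pwd_from_env = os.environ.get("MY_DB_PWD")        # injected when the cluster started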
Redaction: Databricks replaces printed secret literals with [REDACTED] in notebook output (and attempts to redact in SQL too), but follow permission best practices; redaction isn't a substitute for access control. (Microsoft Learn)
Permissions (ACLs) for secret scopes
Secret permissions are at scope level (READ / WRITE / MANAGE). Use the CLI to manage ACLs:
# Grant
databricks secrets put-acl my-db-scope alice@example.com READ
databricks secrets put-acl my-db-scope data-engineers MANAGE
# View
databricks secrets list-acls my-db-scope
databricks secrets get-acl  my-db-scope alice@example.com
# Revoke
databricks secrets delete-acl my-db-scope alice@example.com
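The same ACL operations are available in the Python SDK (databricks-sdk); a sketch using the example principals from above:

from databricks.sdk import WorkspaceClient
from databricks.sdk.service.workspace import AclPermission

w = WorkspaceClient()
w.secrets.put_acl(scope="my-db-scope", principal="alice@example.com",
                  permission=AclPermission.READ)
for acl in w.secrets.list_acls(scope="my-db-scope"):
    print(acl.principal, acl.permission)      # e.g., alice@example.com READ
w.secrets.delete_acl(scope="my-db-scope", principal="alice@example.com")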
Notes:
- The scope creator has MANAGE by default.
- With AKV-backed scopes, anyone with access to the scope can access all secrets in that vault; use separate vaults to isolate. (Microsoft Learn)
 
Use-cases
- Database credentials (JDBC/ODBC/SQL Warehouse apps).
- API tokens (Salesforce, Stripe, Slack, ServiceNow, etc.).
- Cloud keys (ADLS SAS, service principal client secrets).
- Webhook endpoints / SMTP creds for alerting jobs.
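For example, the cloud-key use-case typically looks like this in a notebook (the container, account, and secret key names here are illustrative):

# Wire an ADLS/Blob SAS token from a secret into Spark config (WASB driver)
sas = dbutils.secrets.get("my-db-scope", "adls-sas")
spark.conf.set("fs.azure.sas.mycontainer.myaccount.blob.core.windows.net", sas)
df = spark.read.csv("wasbs://mycontainer@myaccount.blob.core.windows.net/data/orders.csv")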
 
Best practices & gotchas
- Prefer AKV-backed scopes when your org standardizes on Azure-native secret lifecycle/rotation; manage secrets in AKV. (Microsoft Learn)
- Use Databricks-backed scopes for quick starts, PoCs, or multi-cloud teams without AKV.
- Do not print secrets; rely on redaction only as a last line of defense. Lock down who can run notebooks on clusters that access secrets. (Microsoft Learn, Stack Overflow)
- For Spark conf/env-var secret refs, review the security considerations (who can read configs/env vars, log visibility). (Microsoft Learn)
- Name scopes by app/team (e.g., etl-prod, bi-prod) rather than individuals; rotate secrets regularly.
Quick “cheat sheet”
AKV-backed scope (UI):
- KV Permission model = Vault access policy; set Networking/Firewall as needed. (Microsoft Learn)
- https://<workspace>#secrets/createScope → name scope → DNS Name + Resource ID → Create. (Microsoft Learn)
Databricks-backed scope (CLI):
databricks auth login
databricks secrets create-scope my-db-scope
databricks secrets put-secret --json '{"scope":"my-db-scope","key":"db-user","string_value":"demo"}'
databricks secrets list-scopes
databricks secrets list-secrets my-db-scope
Use in code:
pwd = dbutils.secrets.get("my-db-scope","db-password")
References (key docs)
- Secret management overview (types, AKV setup, create scopes, ACLs, redaction, CLI commands). (Microsoft Learn)
- CLI secrets command group (current syntax). (Databricks Documentation, Microsoft Learn)
- dbutils.secrets reference (get/list helpers). (Databricks Documentation)
- Secrets in Spark conf / env vars (placeholders & security considerations). (Microsoft Learn)