About Databricks
Databricks is a unified analytics platform that combines data engineering, data science, and machine learning workloads. It provides a collaborative environment for working with big data and AI, built on Apache Spark. Connecting Databricks to Serval enables automated data workflows, cluster management, and job orchestration directly from your service desk.
What the Databricks integration enables
| Capability | Description |
|---|---|
| Account Management | List and manage workspaces, users, groups, and service principals at the account level |
| Cluster Operations | Create, start, stop, and manage compute clusters |
| Job Management | Create, run, and monitor data processing jobs |
| Workspace Operations | Manage notebooks, files, and workspace resources |
| Unity Catalog | Access and manage data governance resources |
Databricks Configuration
Prerequisites
- You must have admin access to your Databricks account console
- You must know your Databricks account ID
- You must have a workspace URL for workspace-level operations
1. Create a Service Principal
- Log in to the Databricks Account Console
- Click User management in the left sidebar
- Select the Service principals tab
- Click Add service principal
- Enter a name for the service principal (e.g., “Serval Integration”)
- Click Add
2. Generate an OAuth Secret
- Click on the service principal you just created to open its details
- Select the Secrets tab
- Under OAuth secrets, click Generate secret
- Set the secret’s lifetime (maximum 730 days)
- Important: Copy both the Secret and Client ID immediately
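Behind the scenes, the Client ID and OAuth secret are exchanged for a short-lived access token using Databricks' machine-to-machine (client credentials) OAuth flow. The sketch below builds the pieces of that token request without sending it; the AWS account console host is assumed, and the helper name is illustrative:

```python
import base64
import urllib.parse

def build_token_request(account_id: str, client_id: str, client_secret: str):
    """Build the URL, headers, and form body for a Databricks OAuth
    client-credentials token request (AWS account console host assumed)."""
    url = f"https://accounts.cloud.databricks.com/oidc/accounts/{account_id}/v1/token"
    # The client ID and secret are sent as HTTP Basic credentials.
    credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({"grant_type": "client_credentials", "scope": "all-apis"})
    return url, headers, body

url, headers, body = build_token_request(
    "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "my-client-id", "my-secret"
)
```

Serval performs this exchange for you; the sketch is only to show why both the Client ID and the secret must be captured in the step above.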
3. Assign the Service Principal to Workspaces
For the service principal to access workspace resources:
- In the Account Console, go to Workspaces
- Click on the workspace you want to connect
- Go to the Permissions tab
- Click Add permissions
- Search for and select your service principal
- Assign the appropriate role (e.g., “User” or “Admin”)
- Click Save
The service principal must be explicitly assigned to each workspace it needs
to access. Account-level API access is automatic for account admins.
4. Grant Additional Permissions (Optional)
Depending on your use case, you may need to grant additional permissions for your service principal.
Serval Configuration
- In Serval, navigate to Apps → Available → Databricks → Connect
- Enter the following information:

| Field | Description |
|---|---|
| Account ID | Your Databricks account ID (see below) |
| Workspace URL | Your Databricks workspace URL (e.g., my-workspace.cloud.databricks.com) |
| Client ID | The Client ID (Application ID) from your service principal |
| Client Secret | The OAuth secret generated for your service principal |
| Instance Name | (Optional) A friendly name to identify this connection |

- Click Save
Your Databricks account ID is a UUID of the form xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx; you can find it in the Databricks Account Console.
You should now be able to build workflows that leverage Databricks APIs, such
as managing clusters, running jobs, or querying workspace resources.
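As an example of the kind of workspace-level call such a workflow makes, the sketch below builds (but does not send) a Jobs API `run-now` request to trigger an existing job; the job ID and helper name are illustrative:

```python
import json

def build_run_job_request(workspace_url: str, access_token: str, job_id: int):
    """Build the URL, headers, and JSON body for a Jobs API 2.1 run-now
    request against a Databricks workspace."""
    url = f"https://{workspace_url}/api/2.1/jobs/run-now"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"job_id": job_id})
    return url, headers, body

url, headers, body = build_run_job_request(
    "my-workspace.cloud.databricks.com", "example-token", 123
)
```

The bearer token here is the OAuth access token obtained with the service principal's credentials, which is why the workspace assignment from step 3 is required.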
Cloud Provider Support
Databricks operates on multiple cloud providers with different URL patterns:
| Cloud Provider | Workspace URL Pattern | Account Console |
|---|---|---|
| AWS | *.cloud.databricks.com | accounts.cloud.databricks.com |
| Azure | *.azuredatabricks.net | accounts.azuredatabricks.net |
| GCP | *.gcp.databricks.com | accounts.gcp.databricks.com |
The Serval integration uses account-level OAuth tokens, which can access both
account-level APIs and workspace-level APIs for workspaces the service
principal has access to.
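The mapping in the table above can be expressed directly in code. This small sketch derives the matching account console host from a workspace URL; the function name is illustrative:

```python
# Workspace URL suffix -> account console host, per the table above.
ACCOUNT_CONSOLES = {
    ".cloud.databricks.com": "accounts.cloud.databricks.com",   # AWS
    ".azuredatabricks.net": "accounts.azuredatabricks.net",     # Azure
    ".gcp.databricks.com": "accounts.gcp.databricks.com",       # GCP
}

def account_console_for(workspace_url: str) -> str:
    """Return the account console host for a given workspace URL."""
    host = workspace_url.removeprefix("https://").split("/")[0]
    for suffix, console in ACCOUNT_CONSOLES.items():
        if host.endswith(suffix):
            return console
    raise ValueError(f"Unrecognized Databricks workspace URL: {workspace_url}")

account_console_for("my-workspace.cloud.databricks.com")  # -> "accounts.cloud.databricks.com"
```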
Troubleshooting
“Unable to list workspaces” Error
This typically means the service principal doesn’t have account-level permissions:
- Verify the service principal exists in the Account Console
- Check that the Client ID and Secret are correct
- Ensure the service principal has been granted account admin or appropriate account-level roles
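One way to check account-level access outside Serval is to call the account Workspaces API directly with a token obtained from the service principal's credentials. A hedged sketch of that request, assuming the AWS account console host (the helper name is illustrative):

```python
def build_list_workspaces_request(account_id: str, access_token: str):
    """Build the URL and headers for the account-level Workspaces API
    (AWS account console host assumed)."""
    url = f"https://accounts.cloud.databricks.com/api/2.0/accounts/{account_id}/workspaces"
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = build_list_workspaces_request(
    "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "example-token"
)
```

If this request succeeds with your credentials but the Serval connection still fails, recheck the values entered in the Serval configuration form.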
“Unable to list clusters” Error
This indicates workspace-level access issues:
- Verify the service principal is assigned to the workspace
- Check that the workspace URL is correct
- Ensure the service principal has permissions to list clusters in that workspace
Need help? Contact [email protected] for assistance with your Databricks integration.

