Databricks Admin Center

Be in control of your Databricks spending

Databricks and Spark can help solve many tough analytical and data engineering problems. However, operations and cost monitoring are problems you still need to solve yourself.

If you are working with Databricks workspaces, operating Spark clusters, data pipelines, or platforms, or if you are simply interested in the cost and performance of your clusters, you will soon realize that there is a lot you have to figure out on your own.

Databricks Admin Center (DAC) collects data from your Databricks workspaces and presents actionable insights on dashboards. DAC is designed to help in the following areas.


A single place to control all your operations and spending.

Operations

DAC collects DBU usage data for workspaces, users, and clusters. It preserves this data so you can see trends and spot issues in cluster creation and usage. Upcoming versions of DAC will also collect logs and metrics from clusters, Spark drivers, and executors, enabling operations teams to review, debug, and analyse logs on one central page.
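As a rough sketch of the kind of aggregation this involves — the record fields below are invented for illustration and are not DAC's actual schema:

```python
from collections import defaultdict

# Hypothetical usage records, as a collector might store them per cluster.
# The field names (workspace, cluster, dbu) are illustrative assumptions.
records = [
    {"workspace": "prod", "cluster": "etl-1", "dbu": 12.5},
    {"workspace": "prod", "cluster": "etl-2", "dbu": 7.0},
    {"workspace": "dev", "cluster": "sandbox", "dbu": 3.25},
]

def dbu_by_workspace(records):
    """Sum DBU consumption per workspace."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["workspace"]] += rec["dbu"]
    return dict(totals)

print(dbu_by_workspace(records))  # {'prod': 19.5, 'dev': 3.25}
```

The same grouping can be keyed by user or cluster to produce the per-user and per-cluster views mentioned above.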

Cost Control

Cost is a key aspect of operating your data platform. DAC highlights users who overuse the platform, as well as underutilized or idle clusters. It can shut down any cluster that breaches a configured threshold, and its alerts can notify you about workspaces, clusters, or users consuming more money than budgeted. The next version of DAC will let you set budgets per workspace or per user.
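A minimal sketch of such a threshold check — the cluster fields and budget value are assumptions for illustration, and the actual shutdown would go through the Databricks REST API rather than a print statement:

```python
# Which running clusters have breached a DBU budget? (illustrative data)
BUDGET_DBU = 10.0

clusters = [
    {"name": "etl-1", "dbu_used": 12.5, "state": "RUNNING"},
    {"name": "sandbox", "dbu_used": 3.25, "state": "RUNNING"},
]

def clusters_over_budget(clusters, budget):
    """Return names of running clusters whose DBU usage exceeds the budget."""
    return [c["name"] for c in clusters
            if c["state"] == "RUNNING" and c["dbu_used"] > budget]

for name in clusters_over_budget(clusters, BUDGET_DBU):
    # In a real tool this is where a termination request or an alert
    # would be issued; here we only report the decision.
    print(f"over budget: {name}")
```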

Debugging & Tuning

While DAC can already help you spot misconfigured clusters, upcoming versions will offer actionable recommendations on performance tuning, usage, and more. Much of the knowledge about debugging Spark is scattered across the minds of consultants, Spark contributors, and users. DAC gathers these insights and, by analysing logs, turns them into concrete recommendations.
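One way to picture log-driven recommendations is a simple rule table mapping known symptoms to tuning hints — the rules and log lines below are invented examples, not DAC's actual rule set:

```python
# Illustrative symptom -> hint rules; real rules would be far richer.
RULES = [
    ("java.lang.OutOfMemoryError",
     "Increase executor memory or reduce partition size."),
    ("Spilling sort data to disk",
     "Shuffle partitions may be too large; consider tuning spark.sql.shuffle.partitions."),
]

def recommend(log_lines):
    """Match known symptoms in driver/executor logs to tuning hints."""
    hints = []
    for line in log_lines:
        for symptom, hint in RULES:
            if symptom in line:
                hints.append(hint)
    return hints

logs = ["20/01/01 12:00:00 INFO UnsafeExternalSorter: Spilling sort data to disk"]
print(recommend(logs))
```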

Open Source

Best of all: DAC is open source! We started DAC because, while working with multiple customers, we found that analysing their cost profiles was hard, error-prone, manual work. By open-sourcing DAC, we want every team to benefit from easier monitoring and cost control for their Databricks workspaces.