Databricks Workflows is the fully managed orchestrator for data, analytics, and AI. Today, we're excited to announce several enhancements that make it easier to bring the most demanding data and ML/AI workloads to the cloud.
Workflows offers high reliability across the major cloud providers: GCP, AWS, and Azure. Until today, this meant limiting the number of jobs that can be managed in a Databricks workspace to 1,000 (the exact number varied based on tier). Customers running more data and ML/AI workloads had to partition jobs across workspaces in order to avoid running into platform limits. Today, we're happy to announce that we are significantly increasing this limit to 10,000. The new platform limit is automatically available in all customer workspaces (except single-tenant).
Thousands of customers rely on the Jobs API to create and manage jobs from their applications, including CI/CD systems. In addition to the increased job limit, we have introduced a faster, paginated version of the jobs/list API and added pagination to the jobs page.
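For illustration, here is a minimal sketch of how an application might consume a paginated list endpoint by following page tokens until none remain. It assumes the Jobs API 2.1 `jobs/list` endpoint with `page_token`/`next_page_token` fields and hypothetical environment variables for the workspace URL and access token; consult the Jobs API reference for the exact request and response contract.

```python
# Sketch: iterate over all jobs in a workspace using token-based pagination.
# Endpoint and field names are assumptions based on the Jobs API 2.1 docs.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. "https://<workspace>.cloud.databricks.com"
TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token


def list_all_jobs(limit=25):
    """Yield every job in the workspace, fetching one page at a time."""
    page_token = None
    while True:
        params = {"limit": limit}
        if page_token:
            params["page_token"] = page_token
        resp = requests.get(
            f"{HOST}/api/2.1/jobs/list",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params=params,
        )
        resp.raise_for_status()
        body = resp.json()
        for job in body.get("jobs", []):
            yield job
        # Stop when the response no longer includes a token for the next page.
        page_token = body.get("next_page_token")
        if not page_token:
            break


for job in list_all_jobs():
    print(job["job_id"], job["settings"]["name"])
```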

The higher workspace limit also comes with a streamlined search experience that allows searching by name, tags, and job ID.

Put collectively, the brand new options permit scaling workspaces to a lot of jobs. For uncommon circumstances the place the modifications in habits above will not be desired, it’s potential to revert to the outdated habits by way of the Admin Console (solely potential for workspaces with as much as 3000 jobs). We strongly suggest that every one prospects swap to the brand new paginated API to checklist jobs, particularly for workspaces with 1000’s of saved jobs.
To get began with Databricks Workflows, see the quickstart information. We’d additionally like to hear from you about your expertise and another options you’d prefer to see.
Be taught extra about: