In some cases you may want to prevent a repository from running multiple builds at the same time, or you may even want to prevent builds for certain branches or events from executing concurrently.
You can set concurrency limits (below) to restrict how many instances of a named pipeline can execute at the same time. In the example we name the pipeline "deploy" and limit concurrency to 1, which instructs Drone to execute only one pipeline named "deploy" at a time.
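For reference, here is a minimal sketch of that configuration; the step name, image, and command are placeholders:

```yaml
kind: pipeline
name: deploy

# only one pipeline named "deploy" may run at a time
concurrency:
  limit: 1

steps:
- name: deploy
  image: alpine
  commands:
  - ./deploy.sh
```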
In some cases you may want to limit concurrency based on event or branch. We can do this by defining multiple pipelines in our yaml. In the example below, the first pipeline limits tag events to a single concurrent build, while the second pipeline executes push and pull_request events with no concurrency limits (unbounded).
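A rough sketch of what the two pipelines might look like; the pipeline names, step names, images, and commands are illustrative:

```yaml
kind: pipeline
name: build-tag

# limit tag builds to a single concurrent execution
concurrency:
  limit: 1

trigger:
  event:
  - tag

steps:
- name: build
  image: alpine
  commands:
  - ./build.sh

---
kind: pipeline
name: build

# push and pull_request builds run with no concurrency limit
trigger:
  event:
  - push
  - pull_request

steps:
- name: build
  image: alpine
  commands:
  - ./build.sh
```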
Please note the internal scheduler makes a reasonable effort to enforce pipeline concurrency; however, the system does not make formal guarantees. If you require formal guarantees you should consider integrating a locking system into your pipeline (such as redislock or zookeeper).
This feature is only available to the Docker runtime. It is not available to the (experimental) native Kubernetes runtime because the Kubernetes scheduler does not provide any native primitives to limit job scheduling in this manner.
Okay, gotcha. In that case, will setting DRONE_RUNNER_CAPACITY=1 do the trick?
If I can’t set concurrency per pipeline, I don’t mind setting it for the whole Drone setup.
DRONE_RUNNER_CAPACITY configures the number of concurrent pipelines a single agent can process. Kubernetes is agentless, so this setting does not apply. The only way to use this feature is with the standard Docker runtime and agents.
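For context, this is roughly how that variable is set on a Docker agent; a sketch only, and the server address, secret, and image tag are placeholders:

```yaml
# docker-compose sketch for a single Drone agent
version: '3'
services:
  drone-agent:
    image: drone/agent:1
    environment:
      - DRONE_RPC_SERVER=https://drone.example.com
      - DRONE_RPC_SECRET=super-secret-value
      # process at most one pipeline at a time on this agent
      - DRONE_RUNNER_CAPACITY=1
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```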
Also please note the Kubernetes runtime is experimental and not recommended for production use at this time, per the documentation.
When a merged pull request triggers my deployment and I have just merged 3 pull requests, this option prevents concurrent deployments but still executes 3 deployments, right?
From my perspective a "take latest" option would be nice, where a new pipeline always cancels all running ones.
This feature focuses on limiting the number of pipelines that can execute concurrently. You are asking to automatically cancel older pending pipelines when a newer commit is received. These are two separate features. There is an existing issue tracking this feature that you can subscribe to for updates: [Feature] Builds auto cancellation · Issue #1980 · harness/gitness · GitHub.
Is this feature intended to support concurrency limits between different builds of the same repository? I'm running into a situation where, when a GitHub PR is opened, both the PR and the push event builds run and step on each other's state, causing both builds to fail. The pipeline looks something like this:
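(Reconstructed sketch: only the pipeline name, the concurrency block, and the push/pull_request triggers are drawn from this thread; the step, image, and command are placeholders.)

```yaml
kind: pipeline
name: apply us-west-2

# concurrency limit that was expected to serialize these builds
concurrency:
  limit: 1

# both push and pull_request builds were running at the same time
trigger:
  event:
  - push
  - pull_request

steps:
- name: apply
  image: alpine
  commands:
  - ./apply.sh us-west-2
```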
It is intended to support concurrency limits between different builds of the same named pipeline within a repository. Using your example, the unique key is a combination of your repository name and the pipeline name, apply us-west-2.
Sorry, I wasn’t specific enough: The issue I am encountering seems to be that the concurrency setting isn’t being detected between different builds of the same pipeline. I’ve worked around it for now by adjusting my workflow to avoid parallel builds, so this isn’t a breaking issue for me.
However, since it appears the syntax should work, I'd like to continue debugging this locally and see if I can figure anything out. If I do, I'll report it here.