We have an enterprise quay.io install and some madness in a Jenkins job that triggers on a webhook from quay. Are there any plugins for Drone to make it “do the right thing” if it gets a Jenkins webhook? I won’t be able to get any custom plugin added to quay, so it’s easier for me if Drone pretends to be Jenkins.
Just wondering.
You would need to create a small micro-service that sits between Quay and Drone: it receives the webhook from Quay and uses the payload to make a Drone API call.
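As a minimal sketch of that shim in Python: this builds the Drone API request from a Quay push payload. Both the payload field name (`repository`) and the Drone endpoint path are assumptions here, so verify them against your Quay and Drone versions before relying on this.

```python
import urllib.request


def drone_request_for_quay_push(payload, server, token, build="latest"):
    """Translate a Quay push-webhook payload into a Drone API request.

    Assumes the Quay payload carries a "repository" field of the form
    "namespace/name", and that Drone exposes a build endpoint under
    /api/repos/... -- both assumptions to check against your versions.
    """
    owner, name = payload["repository"].split("/", 1)
    url = f"{server}/api/repos/{owner}/{name}/builds/{build}"
    req = urllib.request.Request(url, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    return req


# The webhook handler itself would parse the POST body as JSON and then:
#   urllib.request.urlopen(drone_request_for_quay_push(payload, server, token))
```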
Can you provide some more information about the use case? Perhaps there is a different approach that could be used to solve this business problem. For example, if the goal was to trigger a deployment when a new image is published to Quay, you could probably architect this differently when using Drone.
So that’s basically what it is: when a new image is pushed to quay, a Jenkins job pulls it, pushes it to Dockerhub (long story), and runs a few hundred lines of Bash to deploy it.
Have you considered combining these into a single pipeline, like this?
```yaml
pipeline:
  publish:
    image: plugins/docker
    repo: quay.io/foo/bar
    tags: ${DRONE_BUILD_NUMBER}
  republish:
    # some custom step that downloads the image
    # and re-publishes it to Dockerhub:
    #
    #   docker pull quay.io/foo/bar:${DRONE_BUILD_NUMBER}
    #   docker tag quay.io/foo/bar:${DRONE_BUILD_NUMBER} foo/bar:${DRONE_BUILD_NUMBER}
    #   docker push foo/bar:${DRONE_BUILD_NUMBER}
  bash:
    image: alpine
    commands:
      - ./some_bash_script.sh
```
Or alternatively, you could have a custom Drone plugin that publishes to multiple targets:
```yaml
pipeline:
  publish:
    image: my-custom-plugin/docker-multi-publish
    tags: [ quay.io/foo/bar, docker.io/foo/bar ]
  bash:
    image: alpine
    commands:
      - ./some_bash_script.sh
```
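The plugin named above (`my-custom-plugin/docker-multi-publish`) doesn’t exist; it stands in for a step that tags the locally built image once per target registry and pushes each one. A sketch of the commands such a plugin would run, assuming the image is already built on the agent:

```python
def multi_publish_commands(source_image, targets):
    """Commands a hypothetical docker-multi-publish plugin would run:
    tag the locally built image for each target registry, then push it.
    The plugin name and its settings are assumptions from the example
    pipeline above, not a real published plugin."""
    cmds = []
    for target in targets:
        cmds.append(f"docker tag {source_image} {target}")
        cmds.append(f"docker push {target}")
    return cmds
```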
I might not have the full picture, but it seems like if publishing to quay always triggers a new pipeline, then these pipelines could actually be combined into a single execution.
It is pretty common to see a .drone.yml file where someone publishes an image to dockerhub, and then immediately deploys that image in the subsequent pipeline step.
So this is a funny case: we’re running Kubernetes as a service for tech savvy business users. They publish their image whenever they want, and it deploys. Broken or not, out it goes. It was a mixed bag of rackspace, heroku, et al before. Users build their images on their mac, or PC (or a Pi for all I know) and docker push it, that kicks off the pipeline.
In this case, your best bet is to set up a small micro-service (or even a function-as-a-service if you use AWS or GCP) that accepts the Docker webhook and triggers a pipeline execution in Drone.
We have a special type of pipeline event called a deployment event (aka promotion event). If it were me, I would trigger a deployment event.
http://docs.drone.io/promoting-builds/
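To trigger that deployment event from the webhook shim, the micro-service would re-run an existing build with the event switched to `deployment`. A sketch of building that API call; the query parameters shown are assumptions drawn from the 0.8-era promotion API, so check them against your Drone version and the docs linked above:

```python
import urllib.parse
import urllib.request


def drone_promote_request(server, token, owner, name, build, target):
    """Build a request that re-runs build `build` of owner/name as a
    deployment (promotion) event targeting `target`. The endpoint path
    and the event/deploy_to parameters are assumptions to verify."""
    query = urllib.parse.urlencode({"event": "deployment", "deploy_to": target})
    url = f"{server}/api/repos/{owner}/{name}/builds/{build}?{query}"
    req = urllib.request.Request(url, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    return req
```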
In the future we could investigate accepting dockerhub webhooks in order to trigger deployment events. This would require a bit more research and would likely be a post 1.0 feature.