I am trying to run the pipeline on every push in a concurrent manner, so if multiple pushes happen on a specific branch, multiple pipelines are triggered and run concurrently.
Note: my system requires a login that allows only one user/device at a given moment.
The problem is that when the pipelines run in parallel, my E2E tests fail: one user logging in causes the other to be logged out mid-test, because they appear to be using the same service/detached step.
kind: pipeline
name: Development

concurrency:
  limit: 10

services:
  # start a new postgres database server for the e2e test
  - name: projectdatabase
    image: postgres
    ports:
      - 5432
    environment:
      POSTGRES_USER: padmin

steps:
.
.
.
  # run the application
  - name: deploy-to-staging
    image: ubuntu
    # run this step's container in the background so it is removed automatically when the pipeline execution finishes
    detach: true
    commands:
      - cd /drone/src/project/server
      # run the Go executable, i.e. run the project
      - ./server
    depends_on:
      # wait until these steps finish
      - build-frontend
      - build-backend-and-migrate-database

  # run the Puppeteer end-to-end test
  - name: e2e-test
    image: my-puppeteer-image
    # required because this is a private image that does not exist on Docker Hub
    pull: if-not-exists
    commands:
      # go to the Puppeteer e2e test directory
      - cd /drone/src/project/e2e_test
      # install npm modules
      - npm install
      # change the endpoint from the default/dev one to the server started in the deploy-to-staging step (see the example after the pipeline)
      - find /drone/src/project -type f -name "*.js" -print0 | xargs -0 sed -i 's/localhost:3535/deploy-to-staging:8080/g'
      # run the tests
      - node test.js

# trigger for the dev branch only
trigger:
  branch:
    - dev
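To spell out what that endpoint rewrite does (the file name below is just a placeholder, not an actual file from my project): the find/sed pair replaces every occurrence of the dev endpoint in the test sources with the hostname of the detached step, roughly like this:

# a hypothetical test helper contains the dev endpoint before the rewrite
grep -n "localhost:3535" /drone/src/project/e2e_test/helpers.js
# rewrite every .js file under the project to point at the detached step
find /drone/src/project -type f -name "*.js" -print0 | xargs -0 sed -i 's/localhost:3535/deploy-to-staging:8080/g'
# afterwards the same file references deploy-to-staging:8080 instead
grep -n "deploy-to-staging:8080" /drone/src/project/e2e_test/helpers.js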
I suspect the reason is that I point Puppeteer at the deploy-to-staging hostname, which represents the deploy-to-staging pipeline step.
Since it is the same string in every pipeline's script, I think all the concurrent runs end up pointing to the same Docker container, which causes my tests to fail.
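To check that suspicion, something like the following could be added to the commands of the e2e-test step so two concurrent builds can be compared (this assumes getent and curl are available in my-puppeteer-image, which I have not verified):

# print which build this is, and what the step hostname resolves to from inside this build
echo "build $DRONE_BUILD_NUMBER"
getent hosts deploy-to-staging
# hit the server once so its logs show which build reached which container
curl -s -o /dev/null -w "%{http_code}\n" http://deploy-to-staging:8080

If both concurrent builds print the same address, or one build's request shows up in the other build's deploy-to-staging step logs, that would confirm they are sharing a container.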