Cannot connect to Docker daemon in Kubernetes-native mode

On Drone 1.0.0-rc.3 in Kubernetes-native mode, I can set up Drone, log in, sync repos, and trigger builds, but they keep failing with the following error:

Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?

I’ve tried several different builds, including the docker test at https://github.com/drone/hello-world/blob/test-docker-plugin/.drone.yml. I’ve also tried several that don’t include the docker plugin, just executing scripts or echoing.

Any ideas?

Eric

A Kubernetes-native configuration does not directly connect to the Docker daemon, so the fact that you are getting Docker errors may indicate the Drone server is not properly configured. If I had to guess, it sounds like you have the Drone server configured to run in single-machine mode as opposed to Kubernetes mode, but it is tough to say for certain without more details.
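
A rough illustration of why that particular error appears: in single-machine mode the server drives builds through the local Docker socket, which only works if the socket is mounted into the server pod, roughly like this (a sketch, not taken from the configuration in this thread):

# single-machine mode only - mount the host Docker socket into the server container;
# Kubernetes-native mode does not need this, since it schedules builds as pods instead
containers:
- name: drone
  volumeMounts:
  - name: docker-sock
    mountPath: /var/run/docker.sock
volumes:
- name: docker-sock
  hostPath:
    path: /var/run/docker.sock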

Very possible - I mostly updated my previous Kubernetes template for the deployment:

apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: drone
  namespace: default
  labels:
    app: drone
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: drone
    spec:
      nodeSelector:
        kubernetes.io/hostname: drone-node
      containers:
      - name: drone
        image: "docker.io/drone/drone:1.0.0-rc.3"
        imagePullPolicy: Always
        envFrom:
        - secretRef:
            name: drone
        ports:
        - name: http
          containerPort: 80
          protocol: TCP
        - name: grpc
          containerPort: 9000
          protocol: TCP
        volumeMounts:
        - mountPath: /data
          name: drone
      dnsPolicy: ClusterFirst
      restartPolicy: Always
---

I’m using the following environment variables to configure:

DRONE_DATABASE_DATASOURCE: "postgres://drone:password@postgres:5432/drone?sslmode=disable"
DRONE_DATABASE_SECRET: "secret"
POSTGRES_PASSWORD: "password"
POSTGRES_USER: "drone"
DRONE_CRON_INTERVAL: "1h"
DRONE_CRON_DISABLED: "false"
DRONE_DATABASE_DRIVER: "postgres"
DRONE_GITEA_SERVER: "http://gitea-http:3000"
DRONE_GITEA_SKIP_VERIFY: "true"
DRONE_GIT_ALWAYS_AUTH: "false"
DRONE_GIT_USERNAME: "dronebot"
DRONE_GIT_PASSWORD: "password"
DRONE_LOGS_COLOR: "true"
DRONE_LOGS_DEBUG: "false"
DRONE_LOGS_PRETTY: "false"
DRONE_RPC_SECRET: "rpcsecret"
DRONE_RPC_SERVER: "http://drone"
DRONE_SERVER_HOST: "drone"
DRONE_SERVER_PROTO: "http"
DRONE_USER_CREATE: "username:dronebot,machine:false,admin:true"
DRONE_RUNNER_CAPACITY: "8"
DRONE_KUBERNETES_ENABLE: "true"
DRONE_KUBERNETES_NAMESPACE: "default"
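
These values presumably live in the drone Secret that the Deployment above pulls in via envFrom. A minimal sketch of what that Secret might look like (the use of stringData and the exact layout are assumptions, not something from the original post):

apiVersion: v1
kind: Secret
metadata:
  name: drone        # matches the secretRef in the Deployment above
  namespace: default
type: Opaque
stringData:
  DRONE_DATABASE_DRIVER: "postgres"
  DRONE_GITEA_SERVER: "http://gitea-http:3000"
  # ...plus the remaining DRONE_* / POSTGRES_* variables from the list above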

Am I using the wrong image or maybe passing the wrong environment variables?

Thank you!
Eric

Also, I see that it’s running drone-server in the container; should it be running a different binary?

The DRONE_RUNNER_CAPACITY setting should be removed. The runner capacity instructs the Drone server to run in single-machine mode, in this case with 8 local threads.

I wasn’t able to remove that configuration element, as I got the following error when I tried:

time="2019-01-03T16:21:34Z" level=fatal msg="main: invalid configuration" error="envconfig.Process: aITY to Capacity: converting '' to type int. details: strconv.ParseInt: parsing \"\": invalid syntax"

However, I found the problem - it was a simple typo:

DRONE_KUBERNETES_ENABLE: "true"

should have been:

DRONE_KUBERNETES_ENABLED: "true"

Thank you!
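
Putting the two changes discussed in this thread together, the relevant environment settings for Kubernetes mode look roughly like this (a sketch based only on the variables that appear above):

# drop this key entirely rather than setting it to "" - an empty value is what
# triggers the strconv.ParseInt error shown earlier:
# DRONE_RUNNER_CAPACITY: "8"

# note the trailing D - this was the typo:
DRONE_KUBERNETES_ENABLED: "true"
DRONE_KUBERNETES_NAMESPACE: "default"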

Great, glad you figured it out. I also recommend force-pulling the latest rc.3 image; I just patched it to fix an issue with service connectivity on Kubernetes.
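
Since the Deployment above already sets imagePullPolicy: Always, recreating the pod is enough to trigger a re-pull; a minimal sketch, assuming the app=drone label and default namespace from the manifest above:

# delete the server pod; the Deployment recreates it and re-pulls drone:1.0.0-rc.3
kubectl -n default delete pod -l app=drone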