Connect with custom k8s cluster

Hi everyone,

Recently, I've been trying to use the Okteto CLI to connect to my private k8s cluster.

At first, I couldn't connect to my k8s cluster, so I opened an issue on GitHub: connect with custome k8s cluster · Issue #3025 · okteto/okteto · GitHub

The Okteto community was very friendly and helped me resolve that connection issue.

But when I use okteto up to deploy the example code (GitHub - okteto/movies-with-vanilla-k8s: A movies application to show how Okteto CLI works with vanilla Kubernetes clusters) to the cluster, the frontend pod is always pending (I selected front).

I know it's because the PVC is stuck in Pending.

Here is the PVC info:

What should I do?
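
In case it helps, this is how I'm checking the PVC status (the names are placeholders):

kubectl get pvc -n <namespace>
kubectl describe pvc <pvc-name> -n <namespace>

The Events section of the describe output shows why it can't bind.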

Hi @rainbowechoes,

Okteto provisions a PVC by default as a cache for your dev tooling (npm/yarn, go mod, Maven, etc.) and for the code sync feature. It can be disabled by adding the following snippet to your okteto.yml:

persistentVolume:
  enabled: false

If it's disabled, whenever your dev container restarts or gets recreated, your code will need to be synced again and your dependencies redownloaded.

Hi @provecho, thanks for your reply.

Sure, the approach you suggested will work, but as you also said, I'll need to re-download dependencies whenever the app reloads.

So actually I need to get this PVC created successfully.
My k8s cluster has no PV or storage class. Does that mean I need to create a PV and a storage class? And are there any special requirements for the PV and storage class?

Sorry, I misremembered. In my k8s cluster, I created a PV, like this:

But the PVC is still pending.

Can you try the following steps?

  • Remove the PV and PVC you created.
  • Create a new storage class. You can check how to create it here (there's also a sketch below).
  • Add the name of the storage class you just created to your okteto manifest in the persistentVolume section.

That should do the trick!!
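
In case a concrete example helps, here is a minimal sketch. The class name okteto-standard and the no-provisioner setup are only illustrations; if your cluster has a CSI driver, use its provisioner instead (with kubernetes.io/no-provisioner you still have to create matching PVs by hand):

apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: okteto-standard
provisioner: kubernetes.io/no-provisioner
volumeBindingMode: WaitForFirstConsumer

Then reference it from the persistentVolume section of your okteto manifest:

persistentVolume:
  enabled: true
  storageClass: okteto-standard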

Hi all,

I am also trying to use Okteto on a custom K8s cluster, which can only pull images from a private registry. Due to security policies, containers must run as non-root.

I created a simple okteto.yaml file as shown below:

name: test-okteto-dev
context: my-custom-k8s-context-name

dev:
  python-app-01:
    image: my-private-registry/psc-python:0.1
    command: bash
    sync:
      - .:/okteto
    autocreate: true
    securityContext:
      runAsUser: 999

When I execute okteto up from my terminal, my environment fails to start. After checking the status of the pods via kubectl describe pod <pod-name>, it seems there is an image named okteto/bin:1.3.6 that is always pulled from https://registry-1.docker.io/v2/.

I didn't find any configuration for custom K8s clusters, so I didn't expect anything to be required there.

Am I missing something?

Here are the events of the failing pod:

Events:
  Type     Reason                  Age                    From                     Message
  ----     ------                  ----                   ----                     -------
  Normal   Scheduled               4m37s                  default-scheduler        Successfully assigned <my-namespace-on-my-custom-k8s-cluster>/python-app-01-okteto-56bbc95845-hs8tw to da757860-6ca7-4846-a5c5-78dcfa1b9d58
  Normal   SuccessfulAttachVolume  4m36s                  attachdetach-controller  AttachVolume.Attach succeeded for volume "pvc-41fea322-da07-4702-b758-bdf349242dba"
  Warning  Failed                  3m40s                  kubelet                  Failed to pull image "okteto/bin:1.3.6": rpc error: code = Unknown desc = Error response from daemon: Get "https://registry-1.docker.io/v2/": context deadline exceeded
  Warning  Failed                  2m19s (x3 over 3m23s)  kubelet                  Error: ImagePullBackOff
  Normal   BackOff                 2m19s (x3 over 3m23s)  kubelet                  Back-off pulling image "okteto/bin:1.3.6"
  Normal   Pulling                 2m3s (x4 over 4m26s)   kubelet                  Pulling image "okteto/bin:1.3.6"
  Warning  Failed                  108s (x3 over 4m11s)   kubelet                  Failed to pull image "okteto/bin:1.3.6": rpc error: code = Unknown desc = Error response from daemon: Get "https://registry-1.docker.io/v2/": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
  Warning  Failed                  108s (x4 over 4m11s)   kubelet                  Error: ErrImagePull
  Normal   SandboxChanged          108s (x7 over 4m10s)   kubelet                  Pod sandbox changed, it will be killed and re-created.

My okteto version is 2.3.3.

It seems okteto needs that image, so I downloaded it and pushed it to the private registry.
Then I changed the default initContainer config to pull the image from there.
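
For reference, this is essentially what I did to mirror the image (the registry host is a placeholder):

docker pull okteto/bin:1.3.6
docker tag okteto/bin:1.3.6 my-private-registry/okteto/bin:1.3.6
docker push my-private-registry/okteto/bin:1.3.6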

This is how my okteto.yml looks now; for now I don't need a persistent volume, so I disabled it:

name: test-okteto-dev
context: my-custom-k8s-context-name

image: my-private-registry/psc-python:0.1

securityContext:
  runAsUser: 999
autocreate: true

persistentVolume:
  enabled: false
  storageClass: standard
  size: 30Gi

forward:
  - 8080:8080
reverse:
  - 3500:3500

initContainer:
  image: my-private-registry/okteto/bin:1.3.6
  resources:
    requests:
      cpu: 30m
      memory: 30Mi
    limits:
      cpu: 30m
      memory: 30Mi

According to the k8s events, it looks like all the images are pulled successfully:

Events:
  Type     Reason     Age                    From               Message
  ----     ------     ----                   ----               -------
  Normal   Scheduled  4m53s                  default-scheduler  Successfully assigned <my-namespace-on-my-custom-k8s-cluster>/test-okteto-dev-okteto-77777b679c-8n7nt to 7b1ddb41-9647-4f4f-ab4b-a2b42b2ab863
  Normal   Pulling    4m48s                  kubelet            Pulling image "my-private-registry/okteto/bin:1.3.6"
  Normal   Pulled     4m46s                  kubelet            Successfully pulled image "my-private-registry/okteto/bin:1.3.6" in 1.661839206s
  Normal   Created    4m46s                  kubelet            Created container okteto-bin
  Normal   Started    4m35s                  kubelet            Started container okteto-bin
  Normal   Pulled     4m33s                  kubelet            Successfully pulled image "my-private-registry/psc-python:0.1" in 147.90517ms
  Normal   Pulled     4m32s                  kubelet            Successfully pulled image "my-private-registry/psc-python:0.1" in 280.891512ms
  Normal   Pulled     4m17s                  kubelet            Successfully pulled image "my-private-registry/psc-python:0.1" in 122.996261ms
  Normal   Pulling    3m49s (x4 over 4m33s)  kubelet            Pulling image "my-private-registry/psc-python:0.1"
  Normal   Created    3m49s (x4 over 4m33s)  kubelet            Created container dev
  Normal   Started    3m49s (x4 over 4m32s)  kubelet            Started container dev
  Normal   Pulled     3m49s                  kubelet            Successfully pulled image "my-private-registry/psc-python:0.1" in 109.439647ms
  Warning  BackOff    3m48s (x5 over 4m31s)  kubelet            Back-off restarting failed container

But from the console where I am running okteto up I get this error:

Couldn't connect to your development container: failed to start SSH port-forward: error upgrading connection: Upgrade request required
    Find additional logs at: C:\Users\XXX\.okteto\XXX\test-okteto-dev/okteto.log

This is the content of the okteto.log file:

time="2022-09-15T08:44:56+02:00" level=info msg="'.stignore' exists in folder 'C:\\Users\\XXXX\\PycharmProjects\\pysparkK8sAzure'" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:56+02:00" level=info msg="failed to get repository url in IsOktetoRepo: failed to analyze git repo: repository does not exist" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:56+02:00" level=info msg="activating development container retry=false" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:56+02:00" level=debug msg="calling registry my-private-registry" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:57+02:00" level=debug msg="image with digest: my-private-registry/psc-python@sha256:37747165dfebb4904b6a5d08f98ada82f552447f933556295834f468d8173996" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:57+02:00" level=info msg="local syncthing initialized: gui -> 64995, sync -> 64996" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:57+02:00" level=info msg="remote syncthing initialized: gui -> 64994, sync -> 64993" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:59+02:00" level=info msg="create deployment secrets" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:59+02:00" level=info msg="created okteto secret 'okteto-test-okteto-dev'" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:44:59+02:00" level=info msg="creating service 'test-okteto-dev'" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:00+02:00" level=info msg="created service 'test-okteto-dev'" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="dev pod test-okteto-dev-okteto-77777b679c-8n7nt is now Pending" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="dev pod test-okteto-dev-okteto-77777b679c-8n7nt is now Pending" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="dev pod test-okteto-dev-okteto-77777b679c-8n7nt is now Pending" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="dev pod test-okteto-dev-okteto-77777b679c-8n7nt is now Running" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="starting SSH port forwards" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="starting SSH forward manager" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:37+02:00" level=info msg="SSH forward manager retry 1" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="k8s forwarding to dev pod finished with errors: error upgrading connection: Upgrade request required" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="starting shutdown sequence" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="sent cancellation signal" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="stopping syncthing" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="stopping forwarders" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="stopped k8s forwarder" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="stopped SSH forward manager" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="completed shutdown sequence" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="activate failed with: couldn't connect to your development container: failed to start SSH port-forward: error upgrading connection: Upgrade request required" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3
time="2022-09-15T08:45:38+02:00" level=info msg="exit signal received due to error: couldn't connect to your development container: failed to start SSH port-forward: error upgrading connection: Upgrade request required" action=fce03893-9572-454f-b545-30749e92dde5 version=2.3.3

My guess is that, since my containers have to run as non-root, maybe okteto is not able to initialize the SSH sync service properly.
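
If it helps to narrow it down, one thing I plan to try is adding a group and fsGroup to the securityContext, in case the non-root user can't write where the sync service needs to. This is just a guess on my part, mirroring the runAsUser value:

securityContext:
  runAsUser: 999
  runAsGroup: 999
  fsGroup: 999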

Any ideas?