Unable to connect to Kubernetes cluster

While attempting to use the Okteto CLI to deploy a “hello world” app on my own Kubernetes cluster, I encounter an error like this:

$ okteto deploy
 i  Using default @ my-cluster as context
 i  Images were already built. To rebuild your images run 'okteto build' or 'okteto deploy --build'
 i  Running 'helm upgrade --install okteto-app chart --create-namespace --namespace okteto-test ...
2022/12/08 12:51:04 http: proxy error: dial tcp: lookup rancher.example.com/k8s/clusters/c-x9xrd: no such host
Error: Kubernetes cluster unreachable: an error on the server ("") has prevented the request from succeeding

My kubeconfig actually references the Rancher endpoint rancher.example.com/k8s/clusters/c-x9xrd because Rancher acts as a proxy to the target Kubernetes cluster. The kubeconfig looks like:

apiVersion: v1
kind: Config
clusters:
- name: "my-cluster"
  cluster:
    server: "https://rancher.example.com/k8s/clusters/c-x9xrd"
users:
- name: "my-cluster"
  user:
    token: "kubeconfig-u-1234:*****"
contexts:
- name: "my-cluster"
  context:
    user: "my-cluster"
    cluster: "my-cluster"
current-context: "my-cluster"

kubectl and helm commands work fine in the same shell session.

Any help would be appreciated.

I worked around the issue by enabling Rancher's authorized cluster endpoints, and the Okteto CLI now works, but I would still like to know whether there is native support for proxied connections like this.
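For anyone hitting the same thing: the authorized-cluster-endpoint workaround matters because the kubeconfig context it generates points straight at the downstream cluster's API server, so the server URL has no path component for a proxy to mishandle. A sketch of what that cluster entry looks like (hostname, port, and names here are placeholders, not values from my setup):

```yaml
apiVersion: v1
kind: Config
clusters:
- name: "my-cluster-direct"
  cluster:
    # Direct endpoint from Rancher's authorized cluster endpoint feature.
    # Note: host and port only, no /k8s/clusters/... path.
    server: "https://my-cluster-endpoint.example.com:6443"
```

With a context using this cluster entry, okteto deploy no longer fails DNS resolution.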

@manninga, did you ever find out what the underlying problem was here? It is a bit of an issue for us.

No, I did not. My workaround was good enough. It definitely seems like something that should have a solution, though.