
Comments (15)

Coming-2022 avatar Coming-2022 commented on September 7, 2024 3

Pods on AWS EKS hit the same issue:

Normal Scheduled 28m default-scheduler Successfully assigned default/csi-s3-test-nginx to ip-10-10-3-139.cn-north-1.compute.internal
Warning FailedMount 28m (x7 over 28m) kubelet MountVolume.MountDevice failed for volume "pvc-0b24dc16-83d9-4f3b-9721-6707349d3242" : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name ru.yandex.s3.csi not found in the list of registered CSI drivers
Warning FailedMount 6m28s (x10 over 26m) kubelet Unable to attach or mount volumes: unmounted volumes=[webroot], unattached volumes=[], failed to process volumes=[]: timed out waiting for the condition
Warning FailedMount 2m24s (x14 over 27m) kubelet MountVolume.MountDevice failed for volume "pvc-0b24dc16-83d9-4f3b-9721-6707349d3242" : rpc error: code = Unknown desc = Error starting systemd unit geesefs-poc_2draas_2ds3csi_2dsource_2fpvc_2d0b24dc16_2d83d9_2d4f3b_2d9721_2d6707349d3242.service on host: Cannot set property ExecStopPost, or unknown property.

from k8s-csi-s3.

shawn-gogh avatar shawn-gogh commented on September 7, 2024 2

@joedborg
Hi, I had the same problem as you, but in the end I found the reason. I hope my experience can help you.
As you can see, your driver-registrar is stuck here:

I0202 16:35:11.018178       1 node_register.go:58] Starting Registration Server at: /registration/ru.yandex.s3.csi-reg.sock
I0202 16:35:11.018375       1 node_register.go:67] Registration Server started at: /registration/ru.yandex.s3.csi-reg.sock

If registration succeeds, it should log something like:

I0721 12:34:24.129772       1 node_register.go:58] Starting Registration Server at: /registration/smb.csi.k8s.io-reg.sock
I0721 12:34:24.130069       1 node_register.go:67] Registration Server started at: /registration/smb.csi.k8s.io-reg.sock
I0721 12:34:25.864398       1 main.go:77] Received GetInfo call: &InfoRequest{}
I0721 12:34:25.903679       1 main.go:87] Received NotifyRegistrationStatus call: &RegistrationStatus{PluginRegistered:true,Error:,}

So the problem is in the registrar.
In my case, because I'm using k0s, the kubelet's default directory is /var/lib/k0s/kubelet. However, the registrar is bound to the hostPath /var/lib/kubelet by default. So when my kubelet tries to register the CSI driver, it cannot find the registered ru.yandex.s3.csi-reg.sock, and registration gets stuck.
I hope my experience helps you.
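
A minimal sketch of that fix, assuming you deployed the stock manifests from deploy/kubernetes (the file layout and DaemonSet name are taken from the defaults in this thread; adjust them to your setup):

# Sketch only: rewrite the kubelet hostPath in the manifests to k0s's kubelet
# directory, re-apply, and restart the node plugin so it re-registers.
sed -i 's#/var/lib/kubelet#/var/lib/k0s/kubelet#g' deploy/kubernetes/*.yaml
kubectl apply -f deploy/kubernetes/
kubectl -n kube-system rollout restart daemonset csi-s3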

from k8s-csi-s3.

levin-kelevra avatar levin-kelevra commented on September 7, 2024

Anyone? )

from k8s-csi-s3.

vitalif avatar vitalif commented on September 7, 2024

Hi, no idea. Did you install the CSI driver itself? )
I just rechecked with K8s 1.25.3 and everything works just fine.

from k8s-csi-s3.

levin-kelevra avatar levin-kelevra commented on September 7, 2024

it was installed from deploy/kubernetes

from k8s-csi-s3.

vitalif avatar vitalif commented on September 7, 2024

What exactly did you do?
Check the output of kubectl get pods -A; it should contain csi-attacher-s3-*, csi-provisioner-s3-*, and csi-s3-* pods.
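
If all three show up, it can also help to confirm that kubelet actually registered the driver on the node where the pod is scheduled (a minimal sketch; the DaemonSet and container names follow the default manifests, <node-name> is a placeholder):

# kubelet records the CSI drivers registered on each node in its CSINode object:
kubectl get csinode <node-name> -o jsonpath='{.spec.drivers[*].name}'
# the registrar sidecar should log "Registration Server started" followed by a
# NotifyRegistrationStatus with PluginRegistered:true:
kubectl -n kube-system logs daemonset/csi-s3 -c driver-registrar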

from k8s-csi-s3.

levin-kelevra avatar levin-kelevra commented on September 7, 2024
NAME            STATUS   ROLES    AGE    VERSION
k0s-master      Ready    master   476d   v1.25.3+k0s
k0s-worker-01   Ready    worker   476d   v1.25.3+k0s
k0s-worker-02   Ready    worker   476d   v1.25.3+k0s
k0s-worker-03   Ready    worker   318d   v1.25.3+k0s

kube-system       csi-attacher-s3-0                                              1/1     Running                  2 (25h ago)       9d
kube-system       csi-provisioner-s3-0                                           2/2     Running                  5 (25h ago)       9d
kube-system       csi-s3-px2ct                                                   2/2     Running                  4 (25h ago)       9d
kube-system       csi-s3-q7867                                                   2/2     Running                  4 (25h ago)       9d
kube-system       csi-s3-rqv6p                                                   2/2     Running                  4 (25h ago)       9d
kube-system       csi-s3-s8m2k                                                   2/2     Running                  2 (25h ago)       8d

from k8s-csi-s3.

vitalif avatar vitalif commented on September 7, 2024

Looks fine. Then check logs of all these containers and look for anything suspicious like errors :)

from k8s-csi-s3.

levin-kelevra avatar levin-kelevra commented on September 7, 2024

there are no error logs at all...
only info, and it all looks fine...
The PVC is successfully created, and the directory inside the bucket is created too, but...
Warning FailedMount 30s (x2 over 2m33s) kubelet MountVolume.MountDevice failed for volume "pvc-9ae16f5b-d425-478c-b1d0-880504dc32aa" : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name ru.yandex.s3.csi not found in the list of registered CSI drivers
Warning FailedMount 11s (x3 over 7m2s) kubelet Unable to attach or mount volumes: unmounted volumes=[grafana-data], unattached volumes=[grafana-data kube-api-access-wtnff]: timed out waiting for the condition

from k8s-csi-s3.

levin-kelevra avatar levin-kelevra commented on September 7, 2024
I1027 23:27:14.583559       1 connection.go:153] Connecting to unix:///var/lib/kubelet/plugins/ru.yandex.s3.csi/csi.sock
I1027 23:27:15.584950       1 common.go:111] Probing CSI driver for readiness
I1027 23:27:15.589021       1 main.go:136] CSI driver name: "ru.yandex.s3.csi"
W1027 23:27:15.589061       1 metrics.go:333] metrics endpoint will not be started because `metrics-address` was not specified.
I1027 23:27:15.590916       1 main.go:165] CSI driver does not support ControllerPublishUnpublish, using trivial handler
I1027 23:27:15.591932       1 controller.go:121] Starting CSI attacher
I1027 23:27:15.592708       1 reflector.go:207] Starting reflector *v1.PersistentVolume (10m0s) from k8s.io/client-go/informers/factory.go:134
I1027 23:27:15.592801       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
I1027 23:27:15.592952       1 reflector.go:207] Starting reflector *v1.VolumeAttachment (10m0s) from k8s.io/client-go/informers/factory.go:134
I1027 23:27:15.592985       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
I1027 23:27:15.692359       1 shared_informer.go:270] caches populated
I1027 23:27:26.793402       1 controller.go:198] Started VA processing "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:27:26.793457       1 trivial_handler.go:53] Trivial sync[csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8] started
I1027 23:27:26.793464       1 util.go:37] Marking as attached "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:27:26.804419       1 util.go:51] Marked as attached "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:27:26.804459       1 trivial_handler.go:61] Marked VolumeAttachment csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8 as attached
I1027 23:27:26.804987       1 controller.go:198] Started VA processing "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:27:26.805024       1 trivial_handler.go:53] Trivial sync[csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8] started
I1027 23:35:16.619186       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.VolumeAttachment total 11 items received
I1027 23:36:57.617209       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolume total 12 items received
I1027 23:37:15.614772       1 reflector.go:369] k8s.io/client-go/informers/factory.go:134: forcing resync
I1027 23:37:15.615054       1 reflector.go:369] k8s.io/client-go/informers/factory.go:134: forcing resync
I1027 23:37:15.616113       1 controller.go:198] Started VA processing "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:37:15.616215       1 trivial_handler.go:53] Trivial sync[csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8] started
I1027 23:43:35.627695       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.VolumeAttachment total 0 items received
I1027 23:44:08.624477       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolume total 5 items received
I1027 23:47:15.616424       1 reflector.go:369] k8s.io/client-go/informers/factory.go:134: forcing resync
I1027 23:47:15.616424       1 reflector.go:369] k8s.io/client-go/informers/factory.go:134: forcing resync
I1027 23:47:15.617819       1 controller.go:198] Started VA processing "csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8"
I1027 23:47:15.617991       1 trivial_handler.go:53] Trivial sync[csi-aa2d75a06990e05a4a0a27cadab6fa21c0ef2609b12a9ceb2f71672f64697bd8] started
I1027 23:50:42.647171       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.VolumeAttachment total 6 items received
I1027 23:52:34.629946       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolume total 3 items received
I1027 23:56:01.656958       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.VolumeAttachment total 0 items received

from k8s-csi-s3.

joedborg avatar joedborg commented on September 7, 2024

I'm seeing the same issue on K8s 1.24 and 1.25

Warning  FailedMount             67s (x9 over 3m14s)  kubelet                  MountVolume.MountDevice failed for volume "pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117" : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name ru.yandex.s3.csi not found in the list of registered CSI drivers

The 3 CSI pods are running as expected and the bucket is created automatically.

$ kubectl logs pod/csi-s3-vqf8t -n kube-system
Defaulted container "driver-registrar" out of: driver-registrar, csi-s3
I0202 16:35:01.176054       1 main.go:110] Version: v1.2.0-0-g6ef000ae
I0202 16:35:01.176119       1 main.go:120] Attempting to open a gRPC connection with: "/csi/csi.sock"
I0202 16:35:01.176869       1 connection.go:151] Connecting to unix:///csi/csi.sock
I0202 16:35:11.013770       1 main.go:127] Calling CSI driver to discover driver name
I0202 16:35:11.017770       1 main.go:137] CSI driver name: "ru.yandex.s3.csi"
I0202 16:35:11.018178       1 node_register.go:58] Starting Registration Server at: /registration/ru.yandex.s3.csi-reg.sock
I0202 16:35:11.018375       1 node_register.go:67] Registration Server started at: /registration/ru.yandex.s3.csi-reg.sock

$ kubectl logs pod/csi-attacher-s3-0 -n kube-system
I0202 16:35:09.311779       1 main.go:91] Version: v3.0.1-0-g4074360a
I0202 16:35:09.314018       1 connection.go:153] Connecting to unix:///var/lib/kubelet/plugins/ru.yandex.s3.csi/csi.sock
I0202 16:35:11.356363       1 common.go:111] Probing CSI driver for readiness
I0202 16:35:11.358187       1 main.go:136] CSI driver name: "ru.yandex.s3.csi"
W0202 16:35:11.358204       1 metrics.go:333] metrics endpoint will not be started because `metrics-address` was not specified.
I0202 16:35:11.359093       1 main.go:165] CSI driver does not support ControllerPublishUnpublish, using trivial handler
I0202 16:35:11.359526       1 controller.go:121] Starting CSI attacher
I0202 16:35:11.359744       1 reflector.go:207] Starting reflector *v1.VolumeAttachment (10m0s) from k8s.io/client-go/informers/factory.go:134
I0202 16:35:11.359759       1 reflector.go:207] Starting reflector *v1.PersistentVolume (10m0s) from k8s.io/client-go/informers/factory.go:134
I0202 16:35:11.359771       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
I0202 16:35:11.359771       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
I0202 16:35:11.459816       1 shared_informer.go:270] caches populated
I0202 16:37:34.826107       1 controller.go:198] Started VA processing "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:37:34.826157       1 trivial_handler.go:53] Trivial sync[csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739] started
I0202 16:37:34.826168       1 util.go:37] Marking as attached "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:37:34.836389       1 util.go:51] Marked as attached "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:37:34.836430       1 trivial_handler.go:61] Marked VolumeAttachment csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739 as attached
I0202 16:37:34.836677       1 controller.go:198] Started VA processing "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:37:34.836695       1 trivial_handler.go:53] Trivial sync[csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739] started
I0202 16:40:46.363459       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.363459       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.363538       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.VolumeAttachment total 2 items received
I0202 16:40:46.363511       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolume total 2 items received
W0202 16:40:46.486975       1 reflector.go:424] k8s.io/client-go/informers/factory.go:134: watch of *v1.VolumeAttachment ended with: very short watch: k8s.io/client-go/informers/factory.go:134: Unexpected watch close - watch lasted less than a second and no items received
W0202 16:40:46.487293       1 reflector.go:424] k8s.io/client-go/informers/factory.go:134: watch of *v1.PersistentVolume ended with: very short watch: k8s.io/client-go/informers/factory.go:134: Unexpected watch close - watch lasted less than a second and no items received
I0202 16:40:47.626815       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:40:47.627239       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:47.836891       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
E0202 16:40:47.837459       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:49.332393       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:40:49.332992       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:49.688047       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
E0202 16:40:49.688782       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:52.843424       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:40:52.843862       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:53.851864       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
E0202 16:40:53.852444       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:02.541375       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:41:02.542156       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:05.459874       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
E0202 16:41:05.465167       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:18.084907       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:41:18.085816       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:23.137770       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
E0202 16:41:23.138265       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:51.828267       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
E0202 16:41:51.828818       1 reflector.go:127] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.VolumeAttachment: failed to list *v1.VolumeAttachment: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/volumeattachments?resourceVersion=75228937": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:42:00.741958       1 reflector.go:243] Listing and watching *v1.PersistentVolume from k8s.io/client-go/informers/factory.go:134
I0202 16:42:30.319966       1 reflector.go:243] Listing and watching *v1.VolumeAttachment from k8s.io/client-go/informers/factory.go:134
I0202 16:42:30.322375       1 controller.go:198] Started VA processing "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:42:30.322405       1 trivial_handler.go:53] Trivial sync[csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739] started
I0202 16:44:23.019093       1 controller.go:232] Started PV processing "pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117"
I0202 16:44:31.353923       1 controller.go:198] Started VA processing "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:44:31.353950       1 trivial_handler.go:53] Trivial sync[csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739] started
I0202 16:44:31.353959       1 util.go:37] Marking as attached "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:44:31.369565       1 util.go:51] Marked as attached "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:44:31.369591       1 trivial_handler.go:61] Marked VolumeAttachment csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739 as attached
I0202 16:44:31.369920       1 controller.go:198] Started VA processing "csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739"
I0202 16:44:31.369937       1 trivial_handler.go:53] Trivial sync[csi-100f94172063c31e5a6a1b4ef09548588e8274747853966250443d2bf9a00739] started
I0202 16:48:27.769614       1 reflector.go:515] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolume total 0 items received

$ kubectl logs pod/csi-provisioner-s3-0 -n kube-system
Defaulted container "csi-provisioner" out of: csi-provisioner, csi-s3
I0202 16:35:00.798580       1 feature_gate.go:243] feature gates: &{map[]}
I0202 16:35:00.798986       1 csi-provisioner.go:132] Version: v2.1.0-0-gb7a8fe9fe
I0202 16:35:00.799032       1 csi-provisioner.go:155] Building kube configs for running in cluster...
I0202 16:35:00.861187       1 connection.go:153] Connecting to unix:///var/lib/kubelet/plugins/ru.yandex.s3.csi/csi.sock
I0202 16:35:07.023098       1 common.go:111] Probing CSI driver for readiness
I0202 16:35:07.028255       1 csi-provisioner.go:202] Detected CSI driver ru.yandex.s3.csi
I0202 16:35:07.031785       1 csi-provisioner.go:244] CSI driver does not support PUBLISH_UNPUBLISH_VOLUME, not watching VolumeAttachments
I0202 16:35:07.036738       1 controller.go:753] Using saving PVs to API server in background
I0202 16:35:07.039105       1 reflector.go:219] Starting reflector *v1.StorageClass (1h0m0s) from k8s.io/client-go/informers/factory.go:134
I0202 16:35:07.039122       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
I0202 16:35:07.039305       1 reflector.go:219] Starting reflector *v1.PersistentVolumeClaim (15m0s) from k8s.io/client-go/informers/factory.go:134
I0202 16:35:07.039319       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
I0202 16:35:07.138079       1 shared_informer.go:270] caches populated
I0202 16:35:07.138130       1 shared_informer.go:270] caches populated
I0202 16:35:07.138145       1 controller.go:838] Starting provisioner controller ru.yandex.s3.csi_csi-provisioner-s3-0_badf9697-3fa5-4791-9451-c9bd26b93390!
I0202 16:35:07.138199       1 volume_store.go:97] Starting save volume queue
I0202 16:35:07.138406       1 reflector.go:219] Starting reflector *v1.StorageClass (15m0s) from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
I0202 16:35:07.138422       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
I0202 16:35:07.138508       1 reflector.go:219] Starting reflector *v1.PersistentVolume (15m0s) from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
I0202 16:35:07.138522       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
I0202 16:35:07.238352       1 shared_informer.go:270] caches populated
I0202 16:35:07.239277       1 controller.go:887] Started provisioner controller ru.yandex.s3.csi_csi-provisioner-s3-0_badf9697-3fa5-4791-9451-c9bd26b93390!
I0202 16:36:42.483489       1 controller.go:1335] provision "default/csi-s3-pvc" class "csi-s3": started
I0202 16:36:42.484625       1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"csi-s3-pvc", UID:"1f0ba904-3899-4b16-bc7b-be51a2ca9117", APIVersion:"v1", ResourceVersion:"75228750", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "default/csi-s3-pvc"
I0202 16:36:42.715262       1 controller.go:762] create volume rep: {CapacityBytes:5368709120 VolumeId:pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117 VolumeContext:map[capacity:5368709120 mounter:geesefs options:--memory-limit 1000 --dir-mode 0777 --file-mode 0666] ContentSource:<nil> AccessibleTopology:[] XXX_NoUnkeyedLiteral:{} XXX_unrecognized:[] XXX_sizecache:0}
I0202 16:36:42.715361       1 controller.go:838] successfully created PV pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117 for PVC csi-s3-pvc and csi volume name pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117
I0202 16:36:42.715401       1 controller.go:1442] provision "default/csi-s3-pvc" class "csi-s3": volume "pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117" provisioned
I0202 16:36:42.715449       1 controller.go:1459] provision "default/csi-s3-pvc" class "csi-s3": succeeded
I0202 16:36:42.737031       1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"csi-s3-pvc", UID:"1f0ba904-3899-4b16-bc7b-be51a2ca9117", APIVersion:"v1", ResourceVersion:"75228750", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-1f0ba904-3899-4b16-bc7b-be51a2ca9117
I0202 16:40:46.380807       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.380858       1 reflector.go:530] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Watch close - *v1.PersistentVolume total 2 items received
I0202 16:40:46.383734       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.383785       1 reflector.go:530] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolumeClaim total 4 items received
I0202 16:40:46.394870       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.394915       1 reflector.go:530] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Watch close - *v1.StorageClass total 6 items received
I0202 16:40:46.401911       1 streamwatcher.go:114] Unexpected EOF during watch stream event decoding: unexpected EOF
I0202 16:40:46.402352       1 reflector.go:530] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.StorageClass total 6 items received
W0202 16:40:46.492857       1 reflector.go:436] k8s.io/client-go/informers/factory.go:134: watch of *v1.StorageClass ended with: very short watch: k8s.io/client-go/informers/factory.go:134: Unexpected watch close - watch lasted less than a second and no items received
W0202 16:40:46.492992       1 reflector.go:436] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: watch of *v1.PersistentVolume ended with: very short watch: sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Unexpected watch close - watch lasted less than a second and no items received
W0202 16:40:46.493082       1 reflector.go:436] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: watch of *v1.StorageClass ended with: very short watch: sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Unexpected watch close - watch lasted less than a second and no items received
W0202 16:40:46.493174       1 reflector.go:436] k8s.io/client-go/informers/factory.go:134: watch of *v1.PersistentVolumeClaim ended with: very short watch: k8s.io/client-go/informers/factory.go:134: Unexpected watch close - watch lasted less than a second and no items received
I0202 16:40:47.464729       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:40:47.465116       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:47.533754       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
E0202 16:40:47.534164       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:47.705315       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
E0202 16:40:47.705760       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:47.944110       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
E0202 16:40:47.944664       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:49.643219       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
E0202 16:40:49.643938       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:49.674324       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:40:49.675011       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:49.997657       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
E0202 16:40:49.998418       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:50.056158       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
E0202 16:40:50.056910       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:53.782000       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
E0202 16:40:53.782458       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:53.897931       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
E0202 16:40:53.898424       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:53.907237       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
E0202 16:40:53.907688       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:40:55.048217       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:40:55.048649       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:02.492211       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
E0202 16:41:02.492977       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:03.324726       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:41:03.326195       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:03.950872       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
E0202 16:41:03.951478       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:05.827768       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
E0202 16:41:05.828270       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:19.095777       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
E0202 16:41:19.096304       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:19.395854       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
E0202 16:41:19.396309       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://10.152.183.1:443/apis/storage.k8s.io/v1/storageclasses?resourceVersion=75228598": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:25.759262       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:41:25.759741       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:29.704736       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
E0202 16:41:29.705155       1 reflector.go:138] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://10.152.183.1:443/api/v1/persistentvolumes?resourceVersion=75228755": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:52.084426       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
E0202 16:41:52.084948       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://10.152.183.1:443/api/v1/persistentvolumeclaims?resourceVersion=75228758": dial tcp 10.152.183.1:443: connect: connection refused
I0202 16:41:58.406807       1 reflector.go:255] Listing and watching *v1.StorageClass from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:875
I0202 16:41:59.358441       1 reflector.go:255] Listing and watching *v1.PersistentVolume from sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872
I0202 16:42:02.532441       1 reflector.go:255] Listing and watching *v1.StorageClass from k8s.io/client-go/informers/factory.go:134
I0202 16:42:40.302707       1 reflector.go:255] Listing and watching *v1.PersistentVolumeClaim from k8s.io/client-go/informers/factory.go:134
I0202 16:47:22.418890       1 reflector.go:530] sigs.k8s.io/sig-storage-lib-external-provisioner/v6/controller/controller.go:872: Watch close - *v1.PersistentVolume total 0 items received
I0202 16:47:57.305723       1 reflector.go:530] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.PersistentVolumeClaim total 0 items received
I0202 16:50:00.536099       1 reflector.go:530] k8s.io/client-go/informers/factory.go:134: Watch close - *v1.StorageClass total 0 items received

from k8s-csi-s3.

axce1 avatar axce1 commented on September 7, 2024

Got the same error:
Warning FailedMount 33s (x3 over 4m37s) kubelet MountVolume.MountDevice failed for volume "pvc-2e2e7250-c8bc-4577-b21e-1713f2ea781f" : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name ru.yandex.s3.csi not found in the list of registered CSI drivers

and no error messages at all in the csi-attacher-s3-*, csi-provisioner-s3-*, or csi-s3-* pods.

from k8s-csi-s3.

Kashemir001 avatar Kashemir001 commented on September 7, 2024

Probably unrelated to the cases above, but this might help other people who land here searching for the error message.
In my case, a newly created node had a custom NoSchedule taint, and by default the csi-s3 DaemonSet only has a NoSchedule toleration for the CriticalAddonsOnly taint. If you are using the Helm chart, you can set the tolerations.all value to true; otherwise, modify the manifest manually.
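
For reference, a hedged sketch of both options (the release/chart names are placeholders, and the JSON patch simply appends a blanket toleration to the default csi-s3 DaemonSet in kube-system; tighten it to your specific taint if you prefer):

# Helm: tolerate all taints on the node plugin.
helm upgrade csi-s3 <chart> --reuse-values --set tolerations.all=true
# Plain manifests: append an "Exists" toleration to the csi-s3 DaemonSet.
kubectl -n kube-system patch daemonset csi-s3 --type=json \
  -p='[{"op":"add","path":"/spec/template/spec/tolerations/-","value":{"operator":"Exists"}}]'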

from k8s-csi-s3.

zaro avatar zaro commented on September 7, 2024

@shawn-gogh I had the same problem with k0s, and the provided fix worked perfectly.

I simply did:

   cd /var/lib
   ln -s k0s/kubelet .

on the host, and the error was gone. Of course, it would be better to have support for this in the Helm chart.

from k8s-csi-s3.

schlichtanders avatar schlichtanders commented on September 7, 2024

I am using the cluster autoscaler and am running into the very same error as reported in this issue.
When the pod is first created on the newly added (autoscaled) node, it fails with the error below. After a restart on failure, the second attempt actually works.

MountVolume.MountDevice failed for volume "myvolume" : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name ru.yandex.s3.csi not found in the list of registered CSI drivers

I checked that the PVC and PV are already created, even before the autoscaler expands the cluster. So maybe some rebinding to the new node fails, but that seems odd, since a simple restart of the pod succeeds.

Or maybe the CSI driver is not yet registered on the new node, and it would be necessary to wait for it before creating the pod. Can someone explain how to achieve this?
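
One rough way to check whether this is just a registration race on the freshly added node (a hedged sketch; <new-node> is a placeholder for the autoscaled node's name): verify that the csi-s3 DaemonSet pod is running there, and wait until kubelet has recorded the driver for that node before (re)creating the workload pod.

# Is the node plugin pod up on the new node?
kubectl -n kube-system get pods -o wide --field-selector spec.nodeName=<new-node> | grep csi-s3
# Wait until the node's CSINode object lists the driver.
until kubectl get csinode <new-node> -o jsonpath='{.spec.drivers[*].name}' | grep -q ru.yandex.s3.csi; do
  echo "ru.yandex.s3.csi not registered on <new-node> yet"; sleep 5
done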

from k8s-csi-s3.
