Issue Accessing Pod Logs in SigNoz UI on AKS

TLDR: prashant is facing an issue accessing their application's pod logs in the SigNoz UI on AKS. nitya-signoz and Prashant provide suggestions related to log file paths and potential issues, but the problem remains unresolved.

prashant
Mon, 22 May 2023 10:16:15 UTC

Hi, we have installed SigNoz in Kubernetes recently. We are facing an issue where we are not getting pod logs of our application in the SigNoz UI, but we are getting the logs of the SigNoz components. Can you please help me with this?

prashant
Mon, 22 May 2023 10:17:19 UTC

As per the documentation, when you deploy SigNoz to your Kubernetes cluster, it will automatically start collecting all the pod logs.
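
For context on how that collection works: the SigNoz k8s-infra chart runs an otel-agent DaemonSet whose filelog receiver tails the container log files that kubelet writes under `/var/log/pods` on each node (the `filelog/k8s` receiver and paths that show up later in this thread). A minimal sketch for confirming those files exist on a node, using a node debug pod; `<node-name>` and the path components are placeholders:

```sh
# Start a debug pod on any node; the node's filesystem is mounted at /host inside it.
kubectl get nodes
kubectl debug node/<node-name> -it --image=busybox -- sh

# Inside the debug shell, kubelet's per-pod log directory should be visible:
ls /host/var/log/pods/
ls /host/var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container-name>/
```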

vishal-signoz
Mon, 22 May 2023 10:17:26 UTC

nitya-signoz Can you look into this?

prashant
Mon, 22 May 2023 10:31:39 UTC

vishal-signoz is there anyone who can help me with this?

nitya-signoz
Mon, 22 May 2023 11:37:16 UTC

Are you running EKS or something else?

prashant
Mon, 22 May 2023 12:14:52 UTC

AKS

prashant
Mon, 22 May 2023 12:15:09 UTC

nitya-signoz

nitya-signoz
Mon, 22 May 2023 12:18:10 UTC

Don't think we have tested properly on AKS. But the only reason this might not be working is that the file reader cannot access files on the node. Can you manually check if you can access files present in `/var/log/pods/*/*/*.log` through the collector pod? cc Prashant
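
A quick way to run that check from the agent (DaemonSet) pod, assuming the collector image ships basic shell utilities; `<signoz-namespace>` and `<otel-agent-pod>` are placeholders for your actual namespace and pod name:

```sh
# Find the k8s-infra otel-agent pods (one per node).
kubectl get pods -A | grep -i otel-agent

# Check whether the node's container log files are visible from inside one of them.
# This only works if the image ships `ls`/`sh`; if not, use a node debug pod instead.
kubectl exec -n <signoz-namespace> <otel-agent-pod> -- ls /var/log/pods/
kubectl exec -n <signoz-namespace> <otel-agent-pod> -- sh -c 'ls /var/log/pods/*/*/*.log | head'
```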

prashant
Mon, 22 May 2023 12:24:39 UTC

There is no file at this path /var/log/pods/*/*/*.log; I have checked inside the pods.

nitya-signoz
Mon, 22 May 2023 12:25:52 UTC

It means that AKS stores logs in a different path. I will have to check on this, or maybe more permissions are required.

prashant
Mon, 22 May 2023 12:26:03 UTC

But we are getting some logs of the SigNoz components. I have checked on those pods as well; there is no file at this path /var/log/pods/*/*/*.log.

prashant
Mon, 22 May 2023 12:26:42 UTC

Although we are getting logs of the SigNoz components.

nitya-signoz
Mon, 22 May 2023 12:27:27 UTC

Interesting. Prashant, any idea on the above ^?

Prashant
Mon, 22 May 2023 12:27:58 UTC

prashant do you have SigNoz and your application deployed in the same cluster?

Prashant
Mon, 22 May 2023 12:28:04 UTC

which namespace?

prashant
Mon, 22 May 2023 12:28:06 UTC

yes

prashant
Mon, 22 May 2023 12:28:57 UTC

There are 3 namespaces: 1. zp-rara-backend, 2. zp-rara-frontend, 3. zp-devops-tools

Prashant
Mon, 22 May 2023 12:29:33 UTC

okay, can you please share logs of the `otel-agent` pods?
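
For reference, one way to collect those logs; `<signoz-namespace>` and `<otel-agent-pod>` are placeholders, and the label selector is an assumption to verify against your install:

```sh
# List the agent pods so you can copy an exact name.
kubectl get pods -n <signoz-namespace> | grep -i otel-agent

# Recent logs from a single agent pod:
kubectl logs -n <signoz-namespace> <otel-agent-pod> --tail=200

# Or from all agent pods at once, prefixed with the pod name
# (the label below is an assumption; check the pod labels first):
kubectl logs -n <signoz-namespace> -l app.kubernetes.io/component=otel-agent --tail=200 --prefix
```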

prashant
Mon, 22 May 2023 12:31:13 UTC

sharing

prashant
Mon, 22 May 2023 12:32:50 UTC

I am just sending you the JSON which we are getting for a SigNoz component log:

{
  "timestamp": 1684758022447026700,
  "id": "2PyEEZGV7J0yMixQCHwdjkijkSk",
  "trace_id": "",
  "span_id": "",
  "trace_flags": 0,
  "severity_text": "",
  "severity_number": 0,
  "body": "10.20.193.90 - - [22/May/2023:12:20:22 +0000] \"GET /api/5QVY36-0856-7c2867 HTTP/1.1\" 400 30 \"-\" \"okhttp/2.7.5\" 147 0.003 [zp-rara-backend-zp-rara-ms-oms-brs-80] [] 10.20.192.192:8181 30 0.003 400 089ecf4890d6aebe1caec59720a10cf9",
  "resources_string": {
    "host_name": "aks-ondemand2c-23288688-vmss000007",
    "k8s_cluster_name": "",
    "k8s_container_name": "controller",
    "k8s_container_restart_count": "2",
    "k8s_namespace_name": "zp-devops-ingress",
    "k8s_node_name": "aks-ondemand2c-23288688-vmss000007",
    "k8s_pod_ip": "k8s_pod_name": "main-public-controller-74d9bc46-hhrcw",
    "k8s_pod_start_time": "2023-04-25 03:13:09 +0000 UTC",
    "k8s_pod_uid": "1f5b75c1-eddb-4fe6-a121-f2293f23fcdd",
    "os_type": "linux",
    "signoz_component": "otel-agent"
  },
  "attributes_string": {
    "log_file_path": "/var/log/pods/zp-devops-ingress_main-public-controller-74d9bc46-hhrcw_a80102e1-ff36-496d-9308-7ac18f6edd1b/controller/2.log",
    "log_iostream": "stdout",
    "logtag": "F",
    "time": "2023-05-22T12:20:22.447026621Z"
  },
  "attributes_int": {},
  "attributes_float": {}
}

prashant
Mon, 22 May 2023 12:40:25 UTC

```
2023-05-18T14:41:05.902Z info exporterhelper/queued_retry.go:426 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "26.412791317s"}
2023-05-18T14:41:05.906Z info exporterhelper/queued_retry.go:426 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "23.285570238s"}
2023-05-18T14:41:06.808Z error exporterhelper/queued_retry.go:310 Dropping data because sending_queue is full. Try increasing queue_size. {"kind": "exporter", "data_type": "logs", "name": "otlp", "dropped_items": 395}
2023-05-18T14:41:06.813Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.024Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.225Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.429Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.630Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.830Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.913Z info fileconsumer/file.go:171 Started watching file {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-cert-manager-webhook-7fd5b5c95b-zhdc7_e9eada80-182a-4d77-8a76-d90afb6fb822/cert-manager-webhook/14.log"}
2023-05-18T14:41:08.033Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.235Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.434Z error exporterhelper/queued_retry.go:175 Exporting failed. No more retries left. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "max elapsed time expired rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp 100.64.246.181:4317: connect: connection refused\"", "dropped_items": 855}
2023-05-18T14:41:08.443Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.644Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.849Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.053Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.256Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.459Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.661Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.862Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.065Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.269Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.472Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.673Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.874Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.076Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.276Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.477Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.680Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.881Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-22T04:38:24.713Z info fileconsumer/file.go:171 Started watching file {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-signoz-k8s-infra-otel-deployment-54cc957dd7-b26fj_42123d85-2ef3-4677-9fdd-73fa45712f69/zp-devops-signoz-k8s-infra-otel-deployment/0.log"}
```

prashant
Tue, 23 May 2023 03:30:45 UTC

nitya-signoz Prashant any update?

Prashant
Tue, 23 May 2023 05:33:41 UTC

Looks like there was something wrong with the SigNoz otel-collector or the related ClickHouse writer before. To know more about this, logs of the signoz-otel-collector or ClickHouse would be helpful. But it seems to have been resolved around `2023-05-22T04:38`, though only a few log files are being watched. It should work fine as long as you do not blacklist any pods or namespaces.
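
If it recurs, a hedged sketch of how those logs could be pulled for a closer look; the pod names and namespace are placeholders to replace with your own:

```sh
# List the pods in your SigNoz namespace first to get the exact names.
kubectl get pods -n <signoz-namespace>

# SigNoz otel-collector (the component that writes logs into ClickHouse):
kubectl logs -n <signoz-namespace> <signoz-otel-collector-pod> --since=48h | grep -iE "error|clickhouse"

# ClickHouse itself:
kubectl logs -n <signoz-namespace> <clickhouse-pod> --since=48h | grep -i error
```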