Issue Accessing Pod Logs in SigNoz UI on AKS
TLDR: prashant is facing an issue accessing application pod logs in the SigNoz UI on AKS. nitya-signoz and Prashant offer suggestions related to log file paths and potential collector issues, but the problem remains unresolved.
May 22, 2023
prashant
10:16 AM
We installed SigNoz on k8s recently. We are facing an issue where we are not getting application pod logs in the SigNoz UI, but we are getting the logs of the SigNoz components. Can you please help me with this?
prashant
10:17 AM
Vishal
10:17 AM
prashant
10:31 AM
nitya-signoz
11:37 AM
prashant
12:14 PM
prashant
12:15 PM
nitya-signoz
12:18 PM
Can you manually check whether you can access the files present in /var/log/pods/*/*/*.log (see https://github.com/SigNoz/charts/blob/5d322fbf515a3af07b4ea8102be6ca4b8f16654b/charts/k8s-infra/values.yaml#L86) from inside the collector pod?
cc Prashant
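A quick way to run that check (a sketch only; the namespace, label selector, and pod name below are assumptions, so adjust them to your release):

# list the k8s-infra otel-agent pods and note which node each one runs on
kubectl -n platform get pods -l app.kubernetes.io/component=otel-agent -o wide

# try listing the mounted container log files from inside the agent pod on the affected node
# (assumes the collector image ships a shell; if exec fails, inspect the DaemonSet's hostPath mounts instead)
kubectl -n platform exec <otel-agent-pod> -- sh -c 'ls /var/log/pods/*/*/*.log | head'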
prashant
12:24 PM
nitya-signoz
12:25 PM
Or maybe more permissions are required.
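If it were a permissions problem, a manual listing would typically fail with "permission denied" and the agent would log read errors from the filelog receiver. A rough way to check (pod and DaemonSet names are placeholders):

# look for permission errors reported by the agent
kubectl -n platform logs <otel-agent-pod> | grep -i 'permission denied'

# see how /var/log is mounted and which user the agent runs as
kubectl -n platform get daemonset <otel-agent-daemonset> -o yaml | grep -E -iA3 'hostPath|securityContext'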
prashant
12:26 PM
prashant
12:26 PM
nitya-signoz
12:27 PM
Prashant
12:27 PM
Prashant
12:28 PM
prashant
12:28 PM
prashant
12:28 PM
1. zp-rara-backend
2. zp-rara-frontend
3. zp-devops-tools
Prashant
12:29 PM
otel-agent pods?
prashant
12:31 PM
prashant
12:32 PM
{
"timestamp": 1684758022447026700,
"id": "2PyEEZGV7J0yMixQCHwdjkijkSk",
"trace_id": "",
"span_id": "",
"trace_flags": 0,
"severity_text": "",
"severity_number": 0,
"body": "10.20.193.90 - - [22/May/2023:12:20:22 +0000] \"GET /api/5QVY36-0856-7c2867 HTTP/1.1\" 400 30 \"-\" \"okhttp/2.7.5\" 147 0.003 [zp-rara-backend-zp-rara-ms-oms-brs-80] [] 10.20.192.192:8181 30 0.003 400 089ecf4890d6aebe1caec59720a10cf9",
"resources_string": {
"host_name": "aks-ondemand2c-23288688-vmss000007",
"k8s_cluster_name": "",
"k8s_container_name": "controller",
"k8s_container_restart_count": "2",
"k8s_namespace_name": "zp-devops-ingress",
"k8s_node_name": "aks-ondemand2c-23288688-vmss000007",
"k8s_pod_ip":
"k8s_pod_name": "main-public-controller-74d9bc46-hhrcw",
"k8s_pod_start_time": "2023-04-25 03:13:09 +0000 UTC",
"k8s_pod_uid": "1f5b75c1-eddb-4fe6-a121-f2293f23fcdd",
"os_type": "linux",
"signoz_component": "otel-agent"
},
"attributes_string": {
"log_file_path": "/var/log/pods/zp-devops-ingress_main-public-controller-74d9bc46-hhrcw_a80102e1-ff36-496d-9308-7ac18f6edd1b/controller/2.log",
"log_iostream": "stdout",
"logtag": "F",
"time": "2023-05-22T12:20:22.447026621Z"
},
"attributes_int": {},
"attributes_float": {}
}
prashant
12:37 PM
prashant
12:40 PM
2023-05-18T14:41:05.902Z info exporterhelper/queued_retry.go:426 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "logs", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "26.412791317s"}
2023-05-18T14:41:05.906Z info exporterhelper/queued_retry.go:426 Exporting failed. Will retry the request after interval. {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "23.285570238s"}
2023-05-18T14:41:06.808Z error exporterhelper/queued_retry.go:310 Dropping data because sending_queue is full. Try increasing queue_size. {"kind": "exporter", "data_type": "logs", "name": "otlp", "dropped_items": 395}
2023-05-18T14:41:06.813Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.024Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.225Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.429Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.630Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.830Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.913Z info fileconsumer/file.go:171 Started watching file {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-cert-manager-webhook-7fd5b5c95b-zhdc7_e9eada80-182a-4d77-8a76-d90afb6fb822/cert-manager-webhook/14.log"}
2023-05-18T14:41:08.033Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.235Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.434Z error exporterhelper/queued_retry.go:175 Exporting failed. No more retries left. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "max elapsed time expired rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp 100.64.246.181:4317: connect: connection refused\"", "dropped_items": 855}
2023-05-18T14:41:08.443Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.644Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.849Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.053Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.256Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.459Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.661Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.862Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.065Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.269Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.472Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.673Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.874Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.076Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.276Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.477Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.680Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.881Z warn [email protected]/batch_processor.go:177 Sender failed {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-22T04:38:24.713Z info fileconsumer/file.go:171 Started watching file {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-signoz-k8s-infra-otel-deployment-54cc957dd7-b26fj_42123d85-2ef3-4677-9fdd-73fa45712f69/zp-devops-signoz-k8s-infra-otel-deployment/0.log"}
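The "connection refused ... 4317" and "sending_queue is full" lines indicate the agent could not reach the SigNoz otel-collector's OTLP gRPC endpoint for a while, so log batches backed up and were eventually dropped. A rough connectivity check from inside the cluster (the namespace and service name are assumptions; substitute the actual otel-collector service of your install):

# confirm the collector service exists and note its exact name
kubectl -n platform get svc | grep otel-collector

# test the OTLP gRPC port from a throwaway pod
kubectl run otlp-test --rm -it --restart=Never --image=busybox -- nc -vz -w 3 <signoz-otel-collector-service>.platform.svc.cluster.local 4317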
May 23, 2023
prashant
03:30 AM
Prashant
05:33 AM
To know more about this, the logs of signoz-otel-collector or ClickHouse would be helpful.
But it seems to have been resolved as of 2023-05-22T04:38, with a few log files being watched. It should work fine as long as you do not blacklist any pods or namespaces.
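To confirm that nothing is excluded, one can check the values passed to the k8s-infra chart for blacklist entries and the rendered agent config for exclude patterns (release name, namespace, and label selector are assumptions; the authoritative key layout is in the chart's values.yaml linked above):

# user-supplied chart values
helm -n platform get values <k8s-infra-release> | grep -iA10 blacklist

# rendered filelog receiver config used by the agent
kubectl -n platform get configmap -l app.kubernetes.io/component=otel-agent -o yaml | grep -iA5 exclude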