#support

Issue Accessing Pod Logs in SigNoz UI on AKS

TL;DR: prashant is facing an issue accessing their application's pod logs in the SigNoz UI on AKS. nitya-signoz and Prashant offer suggestions about log file paths and possible causes, but the problem remains unresolved.

May 22, 2023
prashant
10:16 AM
Hi, we have installed SigNoz in Kubernetes recently. We are facing an issue where we are not getting the pod logs of our application in the SigNoz UI, but we are getting the logs of the SigNoz components. Can you please help me with this?
prashant
10:17 AM
As per the documentation, when you deploy SigNoz to your Kubernetes cluster it will automatically start collecting all the pod logs.
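For context, the chart collects pod logs with an OpenTelemetry Collector filelog receiver that tails the container log files on each node. A minimal sketch of such a receiver configuration, assuming the default /var/log/pods layout (the exact keys used by the chart should be checked in its values.yaml):

receivers:
  filelog/k8s:
    include:
      - /var/log/pods/*/*/*.log     # one file per container; /var/log/containers symlinks point here
    start_at: beginning
    include_file_path: true         # records the source path as log_file_path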
Vishal
10:17 AM
nitya-signoz Can you look into this?
prashant
10:31 AM
Vishal, is there anyone who can help me with this?
nitya-signoz
11:37 AM
Are you running EKS or something else?
prashant
12:14 PM
AKS
prashant
12:15 PM
nitya-signoz
nitya-signoz
12:18 PM
I don't think we have tested this properly on AKS. But the only reason this might not be working is that the file reader cannot access files on the node.

Can you manually check whether you can access the files present in /var/log/pods/*/*/*.log (https://github.com/SigNoz/charts/blob/5d322fbf515a3af07b4ea8102be6ca4b8f16654b/charts/k8s-infra/values.yaml#L86) through the collector pod?

cc Prashant
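One way to run that check is to exec into a log-collection agent pod and list the path directly. A sketch, with the namespace and pod name as placeholders to adjust to the actual release:

# find the agent pods (names depend on the Helm release)
kubectl -n <signoz-namespace> get pods | grep otel-agent

# check that the host log directory is mounted and readable inside the pod
kubectl -n <signoz-namespace> exec -it <otel-agent-pod-name> -- ls /var/log/pods
kubectl -n <signoz-namespace> exec -it <otel-agent-pod-name> -- sh -c 'ls /var/log/pods/*/*/*.log | head'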
prashant
12:24 PM
There is no file in this path /var/log/pods/*/*/*.log, I have checked inside the pods.
nitya-signoz
12:25 PM
That means AKS stores logs in a different path; I will have to check on this.

Or maybe additional permissions are required.
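One way to check where AKS actually writes container logs, independent of the collector pod, is an ephemeral node debug pod, which mounts the node's filesystem under /host. A sketch (node name is a placeholder; requires a kubectl version with the debug subcommand):

# open a shell on one worker node; the node root is mounted at /host
kubectl debug node/<aks-node-name> -it --image=busybox

# inside the debug pod, look for the container log locations
ls /host/var/log/pods
ls /host/var/log/containers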
prashant
12:26 PM
But we are getting some logs of the SigNoz components. I have checked on those pods as well; there is no file in this path /var/log/pods/*/*/*.log.
prashant
12:26 PM
Although we are getting logs of the SigNoz components.
nitya-signoz
12:27 PM
Interesting. Prashant, any idea on the above ^?
Prashant
12:27 PM
prashant, do you have SigNoz and your application deployed in the same cluster?
Prashant
12:28 PM
Which namespace?
prashant
12:28 PM
yes
prashant
12:28 PM
There are 3 namespaces:
1. zp-rara-backend
2. zp-rara-frontend
3. zp-devops-tools
Prashant
12:29 PM
Okay, can you please share the logs of the otel-agent pods?
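For reference, the agent logs can be pulled with kubectl. A sketch, assuming the k8s-infra release runs in zp-devops-tools and the agent pods carry "otel-agent" in their names (both assumptions):

# list the log-collection agent pods of the k8s-infra release
kubectl -n zp-devops-tools get pods | grep otel-agent

# tail the last few hundred lines of one agent pod
kubectl -n zp-devops-tools logs <otel-agent-pod-name> --tail=200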
prashant
12:31 PM
sharing
prashant
12:32 PM
I am just sending you the JSON which we are getting for a SigNoz component log:

{
"timestamp": 1684758022447026700,
"id": "2PyEEZGV7J0yMixQCHwdjkijkSk",
"trace_id": "",
"span_id": "",
"trace_flags": 0,
"severity_text": "",
"severity_number": 0,
"body": "10.20.193.90 - - [22/May/2023:12:20:22 +0000] \"GET /api/5QVY36-0856-7c2867 HTTP/1.1\" 400 30 \"-\" \"okhttp/2.7.5\" 147 0.003 [zp-rara-backend-zp-rara-ms-oms-brs-80] [] 10.20.192.192:8181 30 0.003 400 089ecf4890d6aebe1caec59720a10cf9",
"resources_string": {
"host_name": "aks-ondemand2c-23288688-vmss000007",
"k8s_cluster_name": "",
"k8s_container_name": "controller",
"k8s_container_restart_count": "2",
"k8s_namespace_name": "zp-devops-ingress",
"k8s_node_name": "aks-ondemand2c-23288688-vmss000007",
"k8s_pod_ip":
"k8s_pod_name": "main-public-controller-74d9bc46-hhrcw",
"k8s_pod_start_time": "2023-04-25 03:13:09 +0000 UTC",
"k8s_pod_uid": "1f5b75c1-eddb-4fe6-a121-f2293f23fcdd",
"os_type": "linux",
"signoz_component": "otel-agent"
},
"attributes_string": {
"log_file_path": "/var/log/pods/zp-devops-ingress_main-public-controller-74d9bc46-hhrcw_a80102e1-ff36-496d-9308-7ac18f6edd1b/controller/2.log",
"log_iostream": "stdout",
"logtag": "F",
"time": "2023-05-22T12:20:22.447026621Z"
},
"attributes_int": {},
"attributes_float": {}
}
prashant
12:40 PM
2023-05-18T14:41:05.902Z    info    exporterhelper/queued_retry.go:426    Exporting failed. Will retry the request after interval.    {"kind": "exporter", "data_type": "logs", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "26.412791317s"}
2023-05-18T14:41:05.906Z    info    exporterhelper/queued_retry.go:426    Exporting failed. Will retry the request after interval.    {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "rpc error: code = DeadlineExceeded desc = context deadline exceeded", "interval": "23.285570238s"}
2023-05-18T14:41:06.808Z    error    exporterhelper/queued_retry.go:310    Dropping data because sending_queue is full. Try increasing queue_size.    {"kind": "exporter", "data_type": "logs", "name": "otlp", "dropped_items": 395}
2023-05-18T14:41:06.813Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.024Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.225Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.429Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.630Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.830Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:07.913Z    info    fileconsumer/file.go:171    Started watching file    {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-cert-manager-webhook-7fd5b5c95b-zhdc7_e9eada80-182a-4d77-8a76-d90afb6fb822/cert-manager-webhook/14.log"}
2023-05-18T14:41:08.033Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.235Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.434Z    error    exporterhelper/queued_retry.go:175    Exporting failed. No more retries left. Dropping data.    {"kind": "exporter", "data_type": "metrics", "name": "otlp", "error": "max elapsed time expired rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp 100.64.246.181:4317: connect: connection refused\"", "dropped_items": 855}
2023-05-18T14:41:08.443Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.644Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:08.849Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.053Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.256Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.459Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.661Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:09.862Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.065Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.269Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.472Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.673Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:10.874Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.076Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.276Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.477Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.680Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-18T14:41:11.881Z    warn    [email protected]/batch_processor.go:177    Sender failed    {"kind": "processor", "name": "batch", "pipeline": "logs", "error": "sending_queue is full"}
2023-05-22T04:38:24.713Z    info    fileconsumer/file.go:171    Started watching file    {"kind": "receiver", "name": "filelog/k8s", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/pods/zp-devops-tools_zp-devops-signoz-k8s-infra-otel-deployment-54cc957dd7-b26fj_42123d85-2ef3-4677-9fdd-73fa45712f69/zp-devops-signoz-k8s-infra-otel-deployment/0.log"}
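The excerpt above shows two distinct symptoms: the agent could not reach its OTLP endpoint on port 4317 (deadline exceeded / connection refused), and the resulting backlog filled the exporter's sending_queue until log and metric batches were dropped. If the queue keeps filling even after the endpoint is reachable, the standard OpenTelemetry Collector exporter options can be tuned. A minimal sketch, assuming an otlp exporter pointed at the SigNoz collector (endpoint and numbers are placeholders):

exporters:
  otlp:
    endpoint: <signoz-otel-collector-address>:4317   # must be reachable from every agent pod
    sending_queue:
      enabled: true
      num_consumers: 10
      queue_size: 5000        # raise this if "sending_queue is full" persists
    retry_on_failure:
      enabled: true
      max_elapsed_time: 300s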
May 23, 2023
prashant
03:30 AM
nitya-signoz Prashant, any update?
Prashant
05:33 AM
It looks like there was something wrong with the SigNoz otel-collector or the related ClickHouse writer earlier.
To know more about this, the logs of signoz-otel-collector or ClickHouse would be helpful.

But it seems to have been resolved around 2023-05-22T04:38, with a few log files being watched again.
It should work fine as long as you do not blacklist any pods or namespaces.
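For completeness, the pod and namespace exclusions for log collection live in the k8s-infra chart's values; a rough sketch of what that section may look like is below (treat the exact key names as an assumption and verify against the chart version in use):

presets:
  logsCollection:
    enabled: true
    blacklist:
      enabled: true
      # pods and namespaces listed here are skipped by the filelog receiver
      namespaces:
        - kube-system
      pods:
        - hotrod
        - locust

If the application namespaces (zp-rara-backend, zp-rara-frontend) ever end up in such a list, their logs would not be collected.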