TLDR Ulisses is experiencing an issue where a specific application stops sending logs, metrics, and traces when the otel collector container is restarted. Srikanth suggested increasing the maximum received message size on the OTLP receiver.
> ```Server responded with gRPC status code 2. Error message: FRAME_SIZE_ERROR: 4740180```
>
You can try increasing the receive message size (`max_recv_msg_size_mib`) for the OTLP receiver.
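For reference, a minimal sketch of what that change could look like in the collector's `otlp` receiver block (for the SigNoz docker compose setup this would typically live in the collector config file); the `16` MiB value is an assumption, chosen to comfortably exceed the roughly 4.5 MiB size reported in the error above:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
        # Assumption: raise the gRPC receive limit above the default 4 MiB
        # (the failing message here is ~4.5 MiB); 16 MiB is an example value.
        max_recv_msg_size_mib: 16
      http:
        endpoint: 0.0.0.0:4318
```

After editing the config, the collector container needs to be restarted for the new limit to take effect.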
Ulisses
Thu, 09 Nov 2023 18:10:01 UTC
I'm using the otel java agent (auto instrumentation) in 5 applications running on Tomcat. SigNoz is running with the docker compose config... If I restart the otel collector container, a specific application stops sending logs, metrics, and trace data until I restart Tomcat. I really don't know where to start to understand and solve the problem. otel-java-agent: 1.29.0
log:
```
[otel.javaagent 2023-09-11 12:43:49:596 -0400] [OkHttp] WARN io.opentelemetry.exporter.internal.grpc.GrpcExporter - Failed to export logs. Server responded with gRPC status code 2. Error message: FRAME_SIZE_ERROR: 4740180
[otel.javaagent 2023-09-11 12:43:50:705 -0400] [OkHttp ] WARN io.opentelemetry.exporter.internal.grpc.GrpcExporter - Failed to export spans. Server responded with gRPC status code 2. Error message: FRAME_SIZE_ERROR: 4740180
[otel.javaagent 2023-09-11 12:43:53:404 -0400] [OkHttp ] WARN io.opentelemetry.exporter.internal.grpc.GrpcExporter - Failed to export metrics. Server responded with gRPC status code 2. Error message: java.io.IOException: FRAME_SIZE_ERROR: 4740180
[otel.javaagent 2023-09-11 12:44:05:717 -0400] [OkHttp ] WARN io.opentelemetry.exporter.internal.grpc.GrpcExporter - Failed to export spans. Server responded with gRPC status code 2. Error message: Broken pipe
```