CHANGELOG.md (2 additions, 0 deletions)

@@ -12,6 +12,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## Unreleased

- OTLP exporters now log partial success responses.
([#4805](https://github.com/open-telemetry/opentelemetry-python/pull/4805))
- docs: Added sqlcommenter example
([#4734](https://github.com/open-telemetry/opentelemetry-python/pull/4734))
- build: bump ruff to 0.14.1
@@ -11,6 +11,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.


Contributor: nit: can revert this

from os import environ
from typing import Dict, Literal, Optional, Sequence, Tuple, Union
from typing import Sequence as TypingSequence
Contributor (@DylanRussell):

We add a DuplicateFilter to the logger on line 116 (as an aside, I don't like the GitHub code review tool, which doesn't let me add a comment to a line that wasn't modified...)

I'm thinking we should update that to reference record.lineno instead of record.msg, which could be unique... That is sufficient to suppress noisy logging, and it also avoids the endless-logging issues.

If you want to take a stab at that change here, that'd be great; if not, I think we can proceed without it and I can make it separately.

I don't think we should special case the logging exporter because we have this duplicate filter thing.
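The record.lineno-based key described above could be sketched roughly like this (a hypothetical illustration, not the PR's actual code; the 20-second bucketing mirrors the existing filter):

```python
import logging
import time


class LineBasedDuplicateFilter(logging.Filter):
    """Sketch: key duplicates by source location instead of record.msg,
    so a noisy log call is suppressed even when each message is unique."""

    def filter(self, record):
        # Bucket by ~20-second windows so a recurring line is let
        # through again once per window rather than silenced forever.
        key = (record.pathname, record.lineno, time.time() // 20)
        if key == getattr(self, "_last_key", None):
            return False  # same call site in the same window: drop it
        self._last_key = key
        return True
```

Because the key ignores the message text, two records from the same file and line are treated as duplicates even if their formatted output differs.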

Author:

Thanks, @DylanRussell ! Updated accordingly. Please take another look.

Also, any idea what's going on with the failed tests? They look to be unrelated to this change:
[Screenshot of the failing tests, captured 2025-11-24]

Contributor:

That's some sort of flake we get all the time (can't find a commit hash); IDK why we get that. It's OK to ignore.

@@ -374,6 +374,10 @@ def _translate_data(
) -> ExportServiceRequestT:
pass

def _process_response(self, response):
Contributor: nit: can we inline this code instead? IMO it's more readable

if response.HasField("partial_success"):
logger.debug("Partial success:\n%s", response.partial_success)

def _export(
self,
data: SDKDataT,
@@ -388,11 +392,12 @@ def _export(
deadline_sec = time() + self._timeout
for retry_num in range(_MAX_RETRYS):
try:
self._client.Export(
response = self._client.Export(
request=self._translate_data(data),
metadata=self._headers,
timeout=deadline_sec - time(),
)
self._process_response(response)
return self._result.SUCCESS # type: ignore [reportReturnType]
except RpcError as error:
retry_info_bin = dict(error.trailing_metadata()).get( # type: ignore [reportAttributeAccessIssue]
@@ -40,6 +40,7 @@
)
from opentelemetry.exporter.otlp.proto.grpc.version import __version__
from opentelemetry.proto.collector.trace.v1.trace_service_pb2 import (
ExportTracePartialSuccess,
ExportTraceServiceRequest,
ExportTraceServiceResponse,
)
@@ -534,3 +535,22 @@ def test_permanent_failure(self):
warning.records[-1].message,
"Failed to export traces to localhost:4317, error code: StatusCode.ALREADY_EXISTS",
)

@patch("logging.Logger.debug")
def test_records_partial_success(self, mock_logger_debug):
exporter = OTLPSpanExporterForTesting(insecure=True)
# pylint: disable=protected-access
exporter._client = Mock()
partial_success = ExportTracePartialSuccess(
rejected_spans=1,
error_message="Span dropped",
)
exporter._client.Export.return_value = ExportTraceServiceResponse(
partial_success=partial_success
)
exporter.export([self.span])

mock_logger_debug.assert_called_once_with(
"Partial success:\n%s", partial_success
)
mock_logger_debug.reset_mock()
@@ -46,17 +46,36 @@ class DuplicateFilter(logging.Filter):
recursion depth exceeded issue in cases where logging itself results in an exception."""

def filter(self, record):
Contributor:

IMO I would simplify all this and just have the logic as:

        current_log_by_line = (
            record.module,
            record.pathname,
            record.lineno,
            time_boundary,
        )
        previous_log_by_line = getattr(self, "last_source", None)
        keep_processing = current_log_by_line != previous_log_by_line
        self.last_source = current_log_by_line 
        return keep_processing

Although I'm realizing now that we should store the previous 10 previous_log_by_line entries in something like a deque or OrderedDict instead... Right now, if duplicate log messages are interleaved, we will not filter anything out.

Also we have tests for this thing here:

class TestCommonFuncs(unittest.TestCase):
    def test_duplicate_logs_filter_works(self):
        test_logger = logging.getLogger("testLogger")
        test_logger.addFilter(DuplicateFilter())
        with self.assertLogs("testLogger") as cm:
            test_logger.info("message")
            test_logger.info("message")
        self.assertEqual(len(cm.output), 1)

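The deque idea mentioned above might look something like this (a hypothetical sketch; the key follows the suggested current_log_by_line tuple, and the window size of 10 is the reviewer's number):

```python
import logging
import time
from collections import deque


class RecentDuplicateFilter(logging.Filter):
    """Sketch: remember the last 10 source locations so duplicates
    are still filtered when different log calls are interleaved."""

    def __init__(self, maxlen=10):
        super().__init__()
        self._recent = deque(maxlen=maxlen)

    def filter(self, record):
        time_boundary = time.time() // 20
        key = (record.module, record.pathname, record.lineno, time_boundary)
        if key in self._recent:
            # Seen within the last 10 distinct call sites: drop it.
            return False
        self._recent.append(key)
        return True
```

A single last-seen tuple fails as soon as two noisy call sites alternate; the bounded deque keeps memory constant while tolerating that interleaving.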
current_log = (
# We need to pick a time longer than the OTLP LogExporter timeout
# which defaults to 10 seconds, but not pick something so long that
# it filters out useful logs.
time_boundary = time.time() // 20
current_log_by_line = (
record.module,
record.pathname,
record.lineno,
time_boundary,
)
previous_log_by_line = getattr(self, "last_source", None)
previous_log_by_line_count = getattr(self, "last_source_count", 1)
if current_log_by_line == previous_log_by_line:
recurring_count = previous_log_by_line_count + 1
self.last_source_count = recurring_count # pylint: disable=attribute-defined-outside-init
if recurring_count > 3:
# Don't allow further processing, since the same line is repeatedly emitting logs.
return False
else:
self.last_source = current_log_by_line # pylint: disable=attribute-defined-outside-init
self.last_source_count = 1 # pylint: disable=attribute-defined-outside-init

current_log_by_content = (
record.module,
record.levelno,
record.msg,
# We need to pick a time longer than the OTLP LogExporter timeout
# which defaults to 10 seconds, but not pick something so long that
# it filters out useful logs.
time.time() // 20,
time_boundary,
)
if current_log != getattr(self, "last_log", None):
self.last_log = current_log # pylint: disable=attribute-defined-outside-init
if current_log_by_content != getattr(self, "last_log", None):
self.last_log = current_log_by_content # pylint: disable=attribute-defined-outside-init
return True
# False means python's `logging` module will no longer process this log.
return False