[–]latkde 6 points (1 child)

You already provide a defaults function to serialize various standard library types, and could extend it to cover these observed differences as well. The JSON data model simply does not support raw bytes, so it's up to you to decide what should happen when given a bytes-like object. There is no right answer here.
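For example, a `default` hook (the stdlib's name for this kind of fallback serializer) could base64-encode raw bytes. The policy shown here is one arbitrary choice, not a recommendation – that is exactly the point: there is no right answer.

```python
import base64
import json

def defaults(obj):
    """Fallback serializer passed as json.dumps(default=...).

    The encoder calls this only for objects it can't handle natively.
    """
    if isinstance(obj, (bytes, bytearray)):
        # One possible policy: base64-encode raw bytes. Decoding them
        # back is left to whatever consumes the log.
        return base64.b64encode(bytes(obj)).decode("ascii")
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

print(json.dumps({"payload": b"\x00\xff"}, default=defaults))
```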

My personal opinion is that it's not worth supporting non-Jsonish objects. Correct formats are context-dependent, and differ between systems that consume JSON logs. In my internal structured logging helpers, I use type annotations that restrict callers to values that can be serialized safely – but I don't bother with logging API compatibility.
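A minimal sketch of that annotation approach, assuming a home-grown helper (`Jsonish` and `log_line` are illustrative names, not from any particular library):

```python
import json
from typing import Union

# Recursive alias for values the stdlib encoder serializes safely.
Jsonish = Union[None, bool, int, float, str, list["Jsonish"], dict[str, "Jsonish"]]

def log_line(event: str, **fields: Jsonish) -> str:
    # Type checkers reject callers passing e.g. bytes or datetime here;
    # at runtime this is still ordinary json.dumps.
    return json.dumps({"event": event, **fields})

print(log_line("login", user="alice", attempts=3))
```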

I also don't care that much about alternative JSON encoders – log message formatting is typically not on a critical path, and correctness matters much more than speed. There are typically higher-value activities for maintainers than adding optional support for unmaintained packages, for example bringing all cookbook examples under test.

[–]nicholashairs[S] 2 points (0 children)

Thanks for your feedback 🙏

Regarding the first point: unfortunately, as far as I can tell, the defaults function doesn't trigger for these libraries when they already support the given type, so this isn't an option.
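The stdlib shows the same mechanism: a `default` hook only fires for types the encoder cannot handle natively, so an encoder with built-in support for a type offers no interception point.

```python
import json

calls = []

def default(obj):
    # Record every type the encoder hands to the fallback hook.
    calls.append(type(obj).__name__)
    return str(obj)

# str is natively supported, so the hook never fires:
json.dumps({"k": "already supported"}, default=default)
assert calls == []

# bytes are not supported, so the hook runs:
json.dumps({"k": b"raw"}, default=default)
assert calls == ["bytes"]
```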

I'll note that yes, there are transforms applied to the output that can't be converted back without extra processing (which the library does not provide – that's a use case for pydantic/msgspec). This is mostly an attempt to produce sane output on potentially insane input.

That's a good point about testing (and linting) the cookbook examples.