Commit 8beb0cc

TimPansino, lrafeei, hmstepanek, and umaannamalai committed
Kafka-Python Serialization Metrics (#628)
* Add more metrics, consumer, and producer tests
* Remove breakpoint
* Add DT accepted validator
* Fix issue with holdover transaction
* [Mega-Linter] Apply linters fixes
* Fix broken producer error test
* [Mega-Linter] Apply linters fixes
* Working on serializer instrumentation
* Working on testing
* Squashed commit of the following:
  - 0b7b56b (Tim Pansino, Tue Sep 20 2022): Assert records counts for consumer
  - c0d32bb (Tim Pansino, Tue Sep 20 2022): Add producer requirement to consumers
  - 9e94920 (Tim Pansino, Tue Sep 20 2022): Remove commented out code
  - b2f1257 (Tim Pansino, Tue Sep 20 2022): Fix exception tracebacks for py27
  - 686c9ae (Tim Pansino, Tue Sep 20 2022): Fix errors in test and tox matrix
  - 7f92c6d (Hannah Stepanek, Tue Sep 20 2022): Fix Py2.7 kafka consumer first access issue
  - 4245201 (Hannah Stepanek, Tue Sep 20 2022): Add sudo to tz info
  - 2251a0a (Hannah Stepanek, Tue Sep 20 2022): Use ubuntu-latest
  - 1ca1175 (Hannah Stepanek, Tue Sep 20 2022): Grab librdkafka from confluent
  - bb0a192 (Tim Pansino, Tue Sep 20 2022): Fixed cutting release from lsb
  - 3cf3852 (Hannah Stepanek, Tue Sep 20 2022): Fixup: librdkafka installed from universe
  - bf20359 (Tim Pansino, Mon Sep 19 2022): Use lsb to install librdkafka-dev
  - a85e3fd (Timothy Pansino, Mon Sep 19 2022): Merge branch 'develop-kafka' into feature-confluent-kafka
  - 7fc2b89 (Tim Pansino, Mon Sep 19 2022): Fix package name
  - f25c59d (Hannah Stepanek, Mon Sep 19 2022): Specify later version of librdkafka
  - 6be9b43 (Hannah Stepanek, Mon Sep 19 2022): Fix removing client_id from incorrect kafka
  - d658ef1 (Hannah Stepanek, Fri Sep 16 2022): Add install of librdkafka-dev for kafka
  - 940f9f4 (Tim Pansino, Fri Sep 9 2022): Clean up names
  - 8bbed46 (Tim Pansino, Fri Sep 9 2022): Serialization timing
  - 761c753 (Hannah Stepanek, Fri Sep 9 2022): Merge branch 'feature-confluent-kafka' of github.com:newrelic/newrelic-python-agent into feature-confluent-kafka
  - bccb321 (Hannah Stepanek, Fri Sep 9 2022): Fix test_consumer_errors tests
  - 7897a99 (hmstepanek, Fri Sep 9 2022): [Mega-Linter] Apply linters fixes
  - 34c084c (Hannah Stepanek, Fri Sep 9 2022): Fix consumer tests & reorg fixtures
  - ff77d90 (Tim Pansino, Thu Sep 8 2022): Consumer testing setup
  - 9f1451e (Tim Pansino, Thu Sep 8 2022): Confluent kafka test setup
  - 74c443c (Tim Pansino, Thu Sep 8 2022): Starting work on confluent kafka
* Clean up tests to refactor out fixtures
* Refactor and test serialization tracing
* Finish fixing testing for serialization
* Add serializer object test
* Remove unused test files
* Starting merge of confluent kafka to kafka python
* Make message trace terminal optional
* Clean up confluent_kafka tests
* Update kafkapython tests
* Fix failing tests
* Finish kafkapython serialization
* Clean up tests
* Remove kafkapython deserialization metrics
* Fix py2 testing differences
* Add multiple transaction test
* Rename fixtures
* Fix multiple transaction consumer failure

Co-authored-by: Timothy Pansino <[email protected]>
Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>
Co-authored-by: Uma Annamalai <[email protected]>
1 parent 75a8c6e commit 8beb0cc

File tree

8 files changed: +456 / -156 lines changed


newrelic/api/message_trace.py

Lines changed: 10 additions & 7 deletions
```diff
@@ -28,14 +28,16 @@ class MessageTrace(CatHeaderMixin, TimeTrace):
     cat_appdata_key = "NewRelicAppData"
     cat_synthetics_key = "NewRelicSynthetics"
 
-    def __init__(self, library, operation, destination_type, destination_name, params=None, **kwargs):
+    def __init__(self, library, operation, destination_type, destination_name, params=None, terminal=True, **kwargs):
         parent = kwargs.pop("parent", None)
         source = kwargs.pop("source", None)
         if kwargs:
             raise TypeError("Invalid keyword arguments:", kwargs)
 
         super(MessageTrace, self).__init__(parent=parent, source=source)
 
+        self.terminal = terminal
+
         self.library = library
         self.operation = operation
@@ -69,7 +71,7 @@ def __repr__(self):
         )
 
     def terminal_node(self):
-        return True
+        return self.terminal
 
     def create_node(self):
         return MessageNode(
@@ -89,7 +91,7 @@ def create_node(self):
         )
 
 
-def MessageTraceWrapper(wrapped, library, operation, destination_type, destination_name, params={}):
+def MessageTraceWrapper(wrapped, library, operation, destination_type, destination_name, params={}, terminal=True):
     def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
         wrapper = async_wrapper(wrapped)
         if not wrapper:
@@ -131,7 +133,7 @@ def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
         else:
             _destination_name = destination_name
 
-        trace = MessageTrace(_library, _operation, _destination_type, _destination_name, params={}, parent=parent, source=wrapped)
+        trace = MessageTrace(_library, _operation, _destination_type, _destination_name, params={}, terminal=terminal, parent=parent, source=wrapped)
 
         if wrapper:  # pylint: disable=W0125,W0126
             return wrapper(wrapped, trace)(*args, **kwargs)
@@ -142,18 +144,19 @@ def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
     return FunctionWrapper(wrapped, _nr_message_trace_wrapper_)
 
 
-def message_trace(library, operation, destination_type, destination_name, params={}):
+def message_trace(library, operation, destination_type, destination_name, params={}, terminal=True):
     return functools.partial(
         MessageTraceWrapper,
         library=library,
         operation=operation,
         destination_type=destination_type,
         destination_name=destination_name,
         params=params,
+        terminal=terminal,
     )
 
 
-def wrap_message_trace(module, object_path, library, operation, destination_type, destination_name, params={}):
+def wrap_message_trace(module, object_path, library, operation, destination_type, destination_name, params={}, terminal=True):
     wrap_object(
-        module, object_path, MessageTraceWrapper, (library, operation, destination_type, destination_name, params)
+        module, object_path, MessageTraceWrapper, (library, operation, destination_type, destination_name, params, terminal)
     )
```
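The `terminal` flag threaded through the diff above decides whether a trace node reports metrics for its children. A minimal toy sketch of the idea, using a hypothetical `TraceNode` class rather than the agent's real `TimeTrace`:

```python
class TraceNode:
    """Toy trace node (hypothetical, for illustration only).

    A terminal node reports only its own metric; a non-terminal node also
    reports its children's metrics, which is what terminal=False enables
    for producer sends that have serializer child traces.
    """

    def __init__(self, name, duration, terminal=True):
        self.name = name
        self.duration = duration
        self.terminal = terminal
        self.children = []

    def terminal_node(self):
        return self.terminal

    def time_metrics(self):
        yield (self.name, self.duration)
        if not self.terminal_node():
            for child in self.children:
                for metric in child.time_metrics():
                    yield metric


# With terminal=False, the serializer child metric survives.
send = TraceNode("MessageBroker/Kafka/Topic/Produce/Named/events", 5.0, terminal=False)
send.children.append(TraceNode("Serialization/Key", 1.0))
assert len(list(send.time_metrics())) == 2
```

With the default `terminal=True` the same child would be invisible in the metric stream, which is why the producer instrumentation below opts out.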

newrelic/core/message_node.py

Lines changed: 6 additions & 0 deletions
```diff
@@ -51,6 +51,12 @@ def time_metrics(self, stats, root, parent):
         yield TimeMetric(name=name, scope=root.path,
                 duration=self.duration, exclusive=self.exclusive)
 
+        # Now for the children, if the trace is not terminal.
+
+        for child in self.children:
+            for metric in child.time_metrics(stats, root, self):
+                yield metric
+
     def trace_node(self, stats, root, connections):
         name = root.string_table.cache(self.name)
```

newrelic/hooks/messagebroker_kafkapython.py

Lines changed: 132 additions & 47 deletions
```diff
@@ -13,12 +13,19 @@
 # limitations under the License.
 import sys
 
+from kafka.serializer import Serializer
+
 from newrelic.api.application import application_instance
+from newrelic.api.function_trace import FunctionTraceWrapper
 from newrelic.api.message_trace import MessageTrace
 from newrelic.api.message_transaction import MessageTransaction
-from newrelic.api.time_trace import notice_error
+from newrelic.api.time_trace import current_trace, notice_error
 from newrelic.api.transaction import current_transaction
-from newrelic.common.object_wrapper import wrap_function_wrapper
+from newrelic.common.object_wrapper import (
+    ObjectProxy,
+    function_wrapper,
+    wrap_function_wrapper,
+)
 
 HEARTBEAT_POLL = "MessageBroker/Kafka/Heartbeat/Poll"
 HEARTBEAT_SENT = "MessageBroker/Kafka/Heartbeat/Sent"
@@ -47,6 +54,7 @@ def wrap_KafkaProducer_send(wrapped, instance, args, kwargs):
         destination_type="Topic",
         destination_name=topic or "Default",
         source=wrapped,
+        terminal=False,
     ) as trace:
         dt_headers = [(k, v.encode("utf-8")) for k, v in trace.generate_request_headers(transaction)]
         headers.extend(dt_headers)
@@ -57,49 +65,6 @@ def wrap_KafkaProducer_send(wrapped, instance, args, kwargs):
         raise
 
 
-def metric_wrapper(metric_name, check_result=False):
-    def _metric_wrapper(wrapped, instance, args, kwargs):
-        result = wrapped(*args, **kwargs)
-
-        application = application_instance(activate=False)
-        if application:
-            if not check_result or check_result and result:
-                # If the result does not need validated, send metric.
-                # If the result does need validated, ensure it is True.
-                application.record_custom_metric(metric_name, 1)
-
-        return result
-
-    return _metric_wrapper
-
-
-def instrument_kafka_heartbeat(module):
-    if hasattr(module, "Heartbeat"):
-        if hasattr(module.Heartbeat, "poll"):
-            wrap_function_wrapper(module, "Heartbeat.poll", metric_wrapper(HEARTBEAT_POLL))
-
-        if hasattr(module.Heartbeat, "fail_heartbeat"):
-            wrap_function_wrapper(module, "Heartbeat.fail_heartbeat", metric_wrapper(HEARTBEAT_FAIL))
-
-        if hasattr(module.Heartbeat, "sent_heartbeat"):
-            wrap_function_wrapper(module, "Heartbeat.sent_heartbeat", metric_wrapper(HEARTBEAT_SENT))
-
-        if hasattr(module.Heartbeat, "received_heartbeat"):
-            wrap_function_wrapper(module, "Heartbeat.received_heartbeat", metric_wrapper(HEARTBEAT_RECEIVE))
-
-        if hasattr(module.Heartbeat, "session_timeout_expired"):
-            wrap_function_wrapper(
-                module,
-                "Heartbeat.session_timeout_expired",
-                metric_wrapper(HEARTBEAT_SESSION_TIMEOUT, check_result=True),
-            )
-
-        if hasattr(module.Heartbeat, "poll_timeout_expired"):
-            wrap_function_wrapper(
-                module, "Heartbeat.poll_timeout_expired", metric_wrapper(HEARTBEAT_POLL_TIMEOUT, check_result=True)
-            )
-
-
 def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
     if hasattr(instance, "_nr_transaction") and not instance._nr_transaction.stopped:
         instance._nr_transaction.__exit__(*sys.exc_info())
```
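The `metric_wrapper` factory (relocated further down the file in this commit) follows a simple pattern: call the wrapped function, then record a custom metric, optionally only when the call returns a truthy result. A standalone sketch of that pattern, with a hypothetical in-memory dict standing in for `application.record_custom_metric`:

```python
import functools

metrics = {}  # hypothetical stand-in for the agent's custom metric store


def metric_wrapper(metric_name, check_result=False):
    def decorator(wrapped):
        @functools.wraps(wrapped)
        def inner(*args, **kwargs):
            result = wrapped(*args, **kwargs)
            # Record unconditionally, or only when the result is truthy.
            if not check_result or result:
                metrics[metric_name] = metrics.get(metric_name, 0) + 1
            return result
        return inner
    return decorator


@metric_wrapper("Heartbeat/Poll")
def poll():
    return None  # return value irrelevant: always counted


@metric_wrapper("Heartbeat/SessionTimeout", check_result=True)
def session_timeout_expired():
    return False  # falsy result: not counted


poll()
session_timeout_expired()
assert metrics == {"Heartbeat/Poll": 1}
```

The `check_result=True` variant is used for predicates like `session_timeout_expired`, where the metric should count occurrences of the condition, not calls to the method.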
```diff
@@ -110,7 +75,12 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
         # StopIteration is an expected error, indicating the end of an iterable,
         # that should not be captured.
         if not isinstance(e, StopIteration):
-            notice_error()
+            if current_transaction():
+                # Report error on existing transaction if there is one
+                notice_error()
+            else:
+                # Report error on application
+                notice_error(application=application_instance(activate=False))
         raise
 
     if record:
```
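The consumer's `__next__` wrapper now routes errors to the current transaction when one is active and falls back to reporting on the application otherwise, since the consumer can raise between message transactions. A toy sketch of that routing, with hypothetical list sinks in place of the agent's error APIs:

```python
transaction_errors, application_errors = [], []
current_transaction = None  # hypothetical: would be the active transaction object


def consume_next(iterator):
    """Fetch the next record, routing errors like the instrumented __next__."""
    try:
        return next(iterator)
    except Exception as e:
        # StopIteration just signals exhaustion; never record it as an error.
        if not isinstance(e, StopIteration):
            if current_transaction:
                transaction_errors.append(e)  # report on the transaction
            else:
                application_errors.append(e)  # fall back to the application
        raise


def broken():
    raise RuntimeError("poll failed")
    yield  # makes this function a generator


try:
    consume_next(broken())
except RuntimeError:
    pass

assert len(application_errors) == 1  # recorded on the app, no transaction active
```

The `StopIteration` carve-out matters: without it, every normally exhausted consumer loop would be reported as an error.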
```diff
@@ -177,11 +147,126 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
     return record
 
 
+def wrap_KafkaProducer_init(wrapped, instance, args, kwargs):
+    get_config_key = lambda key: kwargs.get(key, instance.DEFAULT_CONFIG[key])  # noqa: E731
+
+    kwargs["key_serializer"] = wrap_serializer(
+        instance, "Serialization/Key", "MessageBroker", get_config_key("key_serializer")
+    )
+    kwargs["value_serializer"] = wrap_serializer(
+        instance, "Serialization/Value", "MessageBroker", get_config_key("value_serializer")
+    )
+
+    return wrapped(*args, **kwargs)
+
+
+class NewRelicSerializerWrapper(ObjectProxy):
+    def __init__(self, wrapped, serializer_name, group_prefix):
+        ObjectProxy.__init__.__get__(self)(wrapped)
+
+        self._nr_serializer_name = serializer_name
+        self._nr_group_prefix = group_prefix
+
+    def serialize(self, topic, object):
+        wrapped = self.__wrapped__.serialize
+        args = (topic, object)
+        kwargs = {}
+
+        if not current_transaction():
+            return wrapped(*args, **kwargs)
+
+        group = "%s/Kafka/Topic" % self._nr_group_prefix
+        name = "Named/%s/%s" % (topic, self._nr_serializer_name)
+
+        return FunctionTraceWrapper(wrapped, name=name, group=group)(*args, **kwargs)
+
+
+def wrap_serializer(client, serializer_name, group_prefix, serializer):
+    @function_wrapper
+    def _wrap_serializer(wrapped, instance, args, kwargs):
+        transaction = current_transaction()
+        if not transaction:
+            return wrapped(*args, **kwargs)
+
+        topic = "Unknown"
+        if isinstance(transaction, MessageTransaction):
+            topic = transaction.destination_name
+        else:
+            # Find parent message trace to retrieve topic
+            message_trace = current_trace()
+            while message_trace is not None and not isinstance(message_trace, MessageTrace):
+                message_trace = message_trace.parent
+            if message_trace:
+                topic = message_trace.destination_name
+
+        group = "%s/Kafka/Topic" % group_prefix
+        name = "Named/%s/%s" % (topic, serializer_name)
+
+        return FunctionTraceWrapper(wrapped, name=name, group=group)(*args, **kwargs)
+
+    try:
+        # Apply wrapper to serializer
+        if serializer is None:
+            # Do nothing
+            return serializer
+        elif isinstance(serializer, Serializer):
+            return NewRelicSerializerWrapper(serializer, group_prefix=group_prefix, serializer_name=serializer_name)
+        else:
+            # Wrap callable in wrapper
+            return _wrap_serializer(serializer)
+    except Exception:
+        return serializer  # Avoid crashes from immutable serializers
+
+
+def metric_wrapper(metric_name, check_result=False):
+    def _metric_wrapper(wrapped, instance, args, kwargs):
+        result = wrapped(*args, **kwargs)
+
+        application = application_instance(activate=False)
+        if application:
+            if not check_result or check_result and result:
+                # If the result does not need validated, send metric.
+                # If the result does need validated, ensure it is True.
+                application.record_custom_metric(metric_name, 1)
+
+        return result
+
+    return _metric_wrapper
+
+
 def instrument_kafka_producer(module):
     if hasattr(module, "KafkaProducer"):
+        wrap_function_wrapper(module, "KafkaProducer.__init__", wrap_KafkaProducer_init)
         wrap_function_wrapper(module, "KafkaProducer.send", wrap_KafkaProducer_send)
 
 
 def instrument_kafka_consumer_group(module):
     if hasattr(module, "KafkaConsumer"):
-        wrap_function_wrapper(module.KafkaConsumer, "__next__", wrap_kafkaconsumer_next)
+        wrap_function_wrapper(module, "KafkaConsumer.__next__", wrap_kafkaconsumer_next)
+
+
+def instrument_kafka_heartbeat(module):
+    if hasattr(module, "Heartbeat"):
+        if hasattr(module.Heartbeat, "poll"):
+            wrap_function_wrapper(module, "Heartbeat.poll", metric_wrapper(HEARTBEAT_POLL))
+
+        if hasattr(module.Heartbeat, "fail_heartbeat"):
+            wrap_function_wrapper(module, "Heartbeat.fail_heartbeat", metric_wrapper(HEARTBEAT_FAIL))
+
+        if hasattr(module.Heartbeat, "sent_heartbeat"):
+            wrap_function_wrapper(module, "Heartbeat.sent_heartbeat", metric_wrapper(HEARTBEAT_SENT))
+
+        if hasattr(module.Heartbeat, "received_heartbeat"):
+            wrap_function_wrapper(module, "Heartbeat.received_heartbeat", metric_wrapper(HEARTBEAT_RECEIVE))
+
+        if hasattr(module.Heartbeat, "session_timeout_expired"):
+            wrap_function_wrapper(
+                module,
+                "Heartbeat.session_timeout_expired",
+                metric_wrapper(HEARTBEAT_SESSION_TIMEOUT, check_result=True),
+            )
+
+        if hasattr(module.Heartbeat, "poll_timeout_expired"):
+            wrap_function_wrapper(
+                module, "Heartbeat.poll_timeout_expired", metric_wrapper(HEARTBEAT_POLL_TIMEOUT, check_result=True)
+            )
```