I am using the latest .NET 9 for Azure Functions and want to use distributed tracing across different Kafka messages that are created by Azure Functions and read by other Azure Functions.
So far I am able to create tracing via
services.AddOpenTelemetry()
    .ConfigureResource(r => r.AddService("MyApplicationName", serviceVersion: "1.0"))
    .WithTracing(tracing =>
    {
        tracing.SetSampler(new AlwaysOnSampler());
        tracing.AddSource("MyApplicationName");
        tracing.AddAspNetCoreInstrumentation(o => { o.RecordException = true; });
        tracing.AddHttpClientInstrumentation(o => { o.RecordException = true; });
        tracing.AddEntityFrameworkCoreInstrumentation();
        tracing.AddNpgsql();
        tracing.AddOtlpExporter(name: "OtlpTracing", configure: null);
    });
services.AddSingleton(TracerProvider.Default.GetTracer(tracerSourceName));
The traces are written to Seq. This works so far.
In order to create a new trace, I create a new RootActivity via a helper class:
public static class Tracing
{
    public static readonly ActivitySource Source = new(
        Constants.ApplicationName.ToTraceName(),
        Constants.ApplicationVersion);

    public static RootActivity StartRootActivity(this ActivitySource source,
        string name,
        ActivityKind kind = ActivityKind.Internal,
        IEnumerable<KeyValuePair<string, object?>>? tags = null)
    {
        var parent = Activity.Current;
        // Detach from the ambient activity so the next activity starts a new root.
        Activity.Current = null;
        var next = source.StartActivity(name, kind,
            parentContext: default,
            tags: tags,
            // Guard against a missing ambient activity to avoid a null dereference.
            links: parent is null ? null : new[] { new ActivityLink(parent.Context) });
        return new RootActivity(next, parent);
    }
}
public class RootActivity : IDisposable
{
    public Activity Activity { get; }
    public Activity ParentActivity { get; }

    public RootActivity(Activity activity, Activity parentActivity)
    {
        Activity = activity;
        ParentActivity = parentActivity;
    }

    private bool disposedValue;

    protected virtual void Dispose(bool disposing)
    {
        if (!disposedValue)
        {
            if (disposing)
            {
                Activity?.Dispose();
                // Restore the previous ambient activity.
                Activity.Current = ParentActivity;
            }
            disposedValue = true;
        }
    }

    public void Dispose()
    {
        Dispose(disposing: true);
        GC.SuppressFinalize(this);
    }
}
In my function I then create a trace via
using var parentTrace = Tracing.Source.StartRootActivity("Linked Trace");
and the output of the function is written to a Kafka topic following the Microsoft Azure Functions Kafka documentation (link).
[KafkaOutput("%KafkaBrokerList%",
    "%KafkaTopic_UploadDocument%",
    Username = "%KafkaUsername%",
    Password = "%KafkaPassword%",
    AuthenticationMode = BrokerAuthenticationMode.Plain)]
public UploadDocumentModel[]? KafkaEventUploadDocument { get; set; }
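Since the KafkaOutput binding shown above does not expose message headers, one workaround is to carry the W3C traceparent inside the payload itself. This is only a sketch under the assumption that you control the payload type; the TraceParent and DocumentId properties here are hypothetical, added for illustration:

```csharp
using System.Diagnostics;

public class UploadDocumentModel
{
    public string? DocumentId { get; set; }   // hypothetical payload field

    // Hypothetical property: carries the W3C traceparent string
    // ("00-<trace-id>-<span-id>-<flags>") alongside the business data.
    public string? TraceParent { get; set; }
}

// Inside the producing function, before assigning KafkaEventUploadDocument:
var model = new UploadDocumentModel
{
    DocumentId = "example-id",
    // Activity.Id is the W3C traceparent string when the default
    // ActivityIdFormat.W3C is in effect.
    TraceParent = Activity.Current?.Id
};
```

This trades header-based propagation for an extra field in the message schema, but it works with any binding that only lets you set the message value.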
I have trouble adding a propagator for distributed tracing. When reading a Kafka message, I would like my trace to be continued across different Azure Functions. I have read that for distributed tracing a propagator is used, but I don't know how to implement it in Azure Functions, as all the examples are Producer/Consumer examples, which are not exposed in Azure Functions; only KafkaOutput is. The examples I found basically say to use a propagator and an injection:
private static readonly TextMapPropagator _propagator = Propagators.DefaultTextMapPropagator;
...
var contextHeader = new Headers();
var contextToInject = Activity.Current.Context;
_propagator.Inject(new PropagationContext(contextToInject, Baggage.Current), contextHeader, InjectTraceContextIntoHeader);
...
private void InjectTraceContextIntoHeader(Headers headers, string key, string value)
{
    try
    {
        headers.Add(key, Encoding.UTF8.GetBytes(value)); // adding as byte array
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "Failed to inject trace context.");
    }
}
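On the consuming side, the counterpart of the injection above is an extraction. A minimal sketch, assuming the consuming function can materialize the incoming Kafka headers as a string dictionary (how you obtain them depends on the trigger binding), might look like:

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using OpenTelemetry;
using OpenTelemetry.Context.Propagation;

public static class TraceExtraction
{
    private static readonly TextMapPropagator Propagator = Propagators.DefaultTextMapPropagator;

    // Sketch only: "headers" is assumed to be the Kafka message headers,
    // already decoded from byte arrays into strings.
    public static Activity? StartConsumerActivity(
        ActivitySource source, IReadOnlyDictionary<string, string> headers)
    {
        // Pull traceparent/tracestate (and baggage) out of the carrier.
        var parentContext = Propagator.Extract(default, headers,
            (carrier, key) => carrier.TryGetValue(key, out var value)
                ? new[] { value }
                : Enumerable.Empty<string>());

        Baggage.Current = parentContext.Baggage;

        // The consumer span continues the producer's trace.
        return source.StartActivity("process Kafka message",
            ActivityKind.Consumer, parentContext.ActivityContext);
    }
}
```

The getter delegate mirrors the injection delegate: it returns all values for a given header key so the propagator can read traceparent, tracestate, and baggage.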
How would I do this with Azure Functions, where the first call of the Azure Function creates a parent trace, passes it on to the Kafka message, and then a different Azure Function triggered by the Kafka message reads it and continues the trace?
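For reference, if the trace context travels inside the payload rather than in Kafka headers, one hedged sketch for the consuming function, assuming a hypothetical TraceParent string field on the deserialized model, is to parse it back into an ActivityContext:

```csharp
using System.Diagnostics;

public static class TraceContinuation
{
    // Sketch only: "traceParent" is a hypothetical field carrying the W3C
    // traceparent string (e.g. written from Activity.Current?.Id by the producer).
    public static Activity? ContinueTrace(ActivitySource source, string? traceParent)
    {
        // Parse "00-<trace-id>-<span-id>-<flags>" back into an ActivityContext.
        if (traceParent is not null &&
            ActivityContext.TryParse(traceParent, traceState: null, out var parentContext))
        {
            return source.StartActivity("consume UploadDocument",
                ActivityKind.Consumer, parentContext);
        }

        // No valid context: start a new root so the work is still traced.
        return source.StartActivity("consume UploadDocument", ActivityKind.Consumer);
    }
}
```

The fallback branch keeps the consumer observable even when a message arrives without a usable trace context.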