---
title: Instrument AI Agents
sidebar_order: 500
description: "Learn how to instrument your code to use Sentry's AI Agents module with Microsoft.Extensions.AI."
---

With <Link to="/product/insights/ai/agents/dashboard/">Sentry AI Agent Monitoring</Link>, you can monitor and debug your AI systems with full-stack context. You'll be able to track key insights like token usage, latency, tool usage, and error rates. AI Agent Monitoring data is fully connected to your other Sentry data, such as logs, errors, and traces.

As a prerequisite to setting up AI Agent Monitoring with .NET, you'll need to first <PlatformLink to="/tracing/">set up tracing</PlatformLink>. Once this is done, you can use the `Sentry.Extensions.AI` package to automatically instrument AI agents created with `Microsoft.Extensions.AI`.
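Without tracing enabled, the instrumentation below won't produce any spans. For a plain console application (outside ASP.NET Core), a minimal initialization might look like this sketch; the DSN placeholder and sample rate are illustrative:

```csharp
using Sentry;

// Initialize the Sentry SDK with tracing enabled. Without a
// TracesSampleRate (or TracesSampler), no AI agent spans are sent.
SentrySdk.Init(options =>
{
    options.Dsn = "___PUBLIC_DSN___";
    options.TracesSampleRate = 1.0; // sample every transaction; lower this in production
});
```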
## Installation

Install the `Sentry.Extensions.AI` package:

```shell {tabTitle:.NET CLI}
dotnet add package Sentry.Extensions.AI
```

```shell {tabTitle:Package Manager}
Install-Package Sentry.Extensions.AI
```

The `Sentry.Extensions.AI` integration depends on the `Microsoft.Extensions.AI.Abstractions` package (version 9.7.0 or higher).
24+
25+
## Automatic Instrumentation
26+
27+
The `Sentry.Extensions.AI` package provides automatic instrumentation for AI agents built with [Microsoft.Extensions.AI](https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/). This works with any AI provider that implements the `IChatClient` interface, including:
28+
29+
- [Microsoft.Extensions.AI.OpenAI](https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI/)
30+
- [Microsoft.Extensions.AI.AzureAIInference](https://www.nuget.org/packages/Microsoft.Extensions.AI.AzureAIInference/https://www.nuget.org/packages/Microsoft.Extensions.AI.AzureAIInference/)
31+
- [Anthropic.SDK](https://www.nuget.org/packages/Anthropic.SDK)
32+
33+
### Basic Setup

<Alert level="warning" title="Important">
AI Agent Monitoring is marked as experimental.
</Alert>

To instrument your AI agent, wrap your `IChatClient` with the `AddSentry()` extension method. If your agent uses tools (function calling), also instrument them by calling the `AddSentryToolInstrumentation()` extension method on your `ChatOptions`:

<Alert level="warning" title="When using tools">
You must wrap your `IChatClient` with `AddSentry()` before creating a `ChatClientBuilder` with it. If you call `AddSentry()` on an `IChatClient` that already has function invocation enabled, spans will not show up correctly.
</Alert>

```csharp
using Microsoft.Extensions.AI;

// Wrap your IChatClient with Sentry instrumentation
var openAiClient = new OpenAI.Chat.ChatClient("gpt-4o-mini", apiKey)
    .AsIChatClient()
    .AddSentry(options =>
    {
        options.Experimental.RecordInputs = true;
        options.Experimental.RecordOutputs = true;
        options.Experimental.AgentName = "MyAgent";
    });

// Wrap your client with FunctionInvokingChatClient
var chatClient = new ChatClientBuilder(openAiClient)
    .UseFunctionInvocation()
    .Build();

// Create chat options with tools and add Sentry instrumentation
var options = new ChatOptions
{
    ModelId = "gpt-4o-mini",
    MaxOutputTokens = 1024,
    Tools =
    [
        AIFunctionFactory.Create(async (string location) =>
        {
            // Tool implementation
            await Task.Delay(500);
            return $"The weather in {location} is sunny";
        }, "GetWeather", "Gets the current weather for a location")
    ]
}.AddSentryToolInstrumentation();

var response = await chatClient.GetResponseAsync(
    "What's the weather in New York?",
    options);
```


## Configuration Options

The `AddSentry()` method accepts an optional configuration delegate to customize the instrumentation:

<SdkOption name="Experimental.RecordInputs" type="bool" defaultValue="true">

Whether to include request messages in spans. When enabled, the content of messages sent to the AI model will be recorded in the span data.

</SdkOption>

<SdkOption name="Experimental.RecordOutputs" type="bool" defaultValue="true">

Whether to include response content in spans. When enabled, the content of responses from the AI model will be recorded in the span data.

</SdkOption>

<SdkOption name="Experimental.AgentName" type="string" defaultValue="Agent">

The name of the AI agent. This name is used to identify the agent in the Sentry UI and helps differentiate between multiple agents in your application.

</SdkOption>
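For example, if prompts or completions may contain sensitive data, you can keep the agent spans but drop their content. This is a sketch; `innerChatClient` and the agent name are illustrative placeholders:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical example: keep AI agent spans but omit prompt/completion
// text, e.g. to avoid capturing PII in span data.
// innerChatClient is whatever provider client you created earlier.
IChatClient instrumentedClient = innerChatClient.AddSentry(options =>
{
    options.Experimental.RecordInputs = false;   // don't record request messages
    options.Experimental.RecordOutputs = false;  // don't record response content
    options.Experimental.AgentName = "SupportAgent";
});
```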
## ASP.NET Core Integration

For ASP.NET Core applications, you can integrate Sentry AI Agent Monitoring as follows:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Initialize Sentry for ASP.NET Core
builder.WebHost.UseSentry(options =>
{
    options.Dsn = "___PUBLIC_DSN___";
    options.TracesSampleRate = 1.0;
});

// Set up the AI client with Sentry instrumentation
var openAiClient = new OpenAI.Chat.ChatClient("gpt-4o-mini", apiKey)
    .AsIChatClient()
    .AddSentry(options =>
    {
        options.Experimental.RecordInputs = true;
        options.Experimental.RecordOutputs = true;
    });

var chatClient = new ChatClientBuilder(openAiClient)
    .UseFunctionInvocation()
    .Build();

// Register as a singleton
builder.Services.AddSingleton(chatClient);

var app = builder.Build();

// Use in endpoints
app.MapGet("/chat", async (IChatClient client, string message) =>
{
    var options = new ChatOptions
    {
        ModelId = "gpt-4o-mini",
        Tools = [ /* your tools */ ]
    }.AddSentryToolInstrumentation();

    var response = await client.GetResponseAsync(message, options);
    return Results.Ok(response.Text);
});

app.Run();
```