Commit 26f84d1

Clarify notebook walkthrough steps
1 parent b7fdd6a commit 26f84d1

3 files changed: +100 additions, -12 deletions

lab/5-Observability/1-OpenAIAgents/weekend_planner.ipynb

Lines changed: 37 additions & 5 deletions
@@ -6,7 +6,7 @@
 "source": [
 "# Weekend Planner with OpenAI Agents Telemetry\n",
 "\n",
-"Capture GenAI-compliant spans while orchestrating an OpenAI Agents workflow. This notebook mirrors the sample script but keeps everything inline so you can tweak the code and immediately inspect telemetry."
+"Capture GenAI-compliant spans while orchestrating an OpenAI Agents workflow. This notebook mirrors the sample script but keeps everything inline so you can tweak the code and immediately inspect telemetry."
 ]
 },
 {
@@ -23,7 +23,15 @@
 "pip install azure-monitor-opentelemetry-exporter\n",
 "```\n",
 "\n",
-"Set environment variables for your model provider (`API_HOST=github` by default) and, if needed, `APPLICATION_INSIGHTS_CONNECTION_STRING`."
+"Set environment variables for your model provider (`API_HOST=github` by default) and, if needed, `APPLICATION_INSIGHTS_CONNECTION_STRING`."
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 1: Import dependencies and configure logging\n",
+"Load the OpenAI Agents SDK, OpenTelemetry helpers, and supporting utilities. This cell also reads your `.env` file and enables rich logging so you can watch the agent run.\n"
 ]
 },
 {
@@ -67,6 +75,14 @@
 "LOGGER.setLevel(logging.INFO)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 2: Define helpers for capture configuration\n",
+"These utilities resolve the API host, set the GenAI capture environment variables, and prepare the tracer provider used throughout the notebook.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -160,6 +176,14 @@
 " trace.set_tracer_provider(provider)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 3: Initialise OpenTelemetry instrumentation\n",
+"Run this cell after updating your environment variables to wire up the tracer provider and the OpenAI Agents instrumentor.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -176,6 +200,14 @@
 "print(\"Instrumentation ready for provider:\", API_CONFIG.provider)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 4: Register agent tools and construct the agent\n",
+"With instrumentation active, declare the reusable tools and build the Weekend Planner agent that the notebook will invoke.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -225,8 +257,8 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Run the agent\n",
-"Execute the final cell to orchestrate a single planning session. Spans will be emitted to the configured exporter (console or Azure Monitor)."
+"### Step 5: Run the agent\n",
+"Execute the final cell to orchestrate a single planning session. Watch the console (or Application Insights) for `create_agent`, `invoke_agent`, and `execute_tool` spans emitted by the instrumentation.\n"
 ]
 },
 {
@@ -273,4 +305,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 5
-}
+}
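The Step 3 cell referenced above ends in `trace.set_tracer_provider(provider)`. As a rough sketch of what that wiring typically looks like, the snippet below builds a tracer provider that exports to Azure Monitor when `APPLICATION_INSIGHTS_CONNECTION_STRING` is set and falls back to the console otherwise; the exporter choice and variable names are assumptions, not the notebook's exact helpers.

```python
# Hedged sketch only: the notebook's own helper functions may differ.
import os

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

provider = TracerProvider()

connection_string = os.getenv("APPLICATION_INSIGHTS_CONNECTION_STRING")
if connection_string:
    # Send spans to Application Insights when a connection string is configured.
    from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

    exporter = AzureMonitorTraceExporter.from_connection_string(connection_string)
else:
    # Otherwise print spans so create_agent / invoke_agent / execute_tool
    # show up directly in the notebook output.
    exporter = ConsoleSpanExporter()

provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```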

lab/5-Observability/2-LangChain/weekend_planner.ipynb

Lines changed: 29 additions & 5 deletions
@@ -6,7 +6,7 @@
 "source": [
 "# LangChain Weekend Planner with Azure AI Telemetry\n",
 "\n",
-"Instrument a LangChain v1 agent so each invocation emits GenAI-compliant spans via `langchain-azure-ai`."
+"Instrument a LangChain v1 agent so each invocation emits GenAI-compliant spans via `langchain-azure-ai`."
 ]
 },
 {
@@ -21,7 +21,15 @@
 "pip install azure-identity # required when API_HOST=azure\n",
 "```\n",
 "\n",
-"Configure environment variables for your model provider (GitHub Models by default) and optionally `APPLICATION_INSIGHTS_CONNECTION_STRING` to export to Azure Monitor."
+"Configure environment variables for your model provider (GitHub Models by default) and optionally `APPLICATION_INSIGHTS_CONNECTION_STRING` to export to Azure Monitor."
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 1: Import dependencies and tracer helpers\n",
+"Load LangChain, the Azure tracer callback, and supporting utilities while configuring logging and environment handling.\n"
 ]
 },
 {
@@ -54,6 +62,14 @@
 "LOGGER.setLevel(logging.INFO)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 2: Configure the chat model and tracer\n",
+"This cell selects GitHub Models or Azure OpenAI based on environment variables and prepares the Azure OpenTelemetry tracer.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -95,6 +111,14 @@
 "print(\"Model configured:\", MODEL.model_name if hasattr(MODEL, \"model_name\") else \"custom\")"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 3: Define tools and initialise the agent\n",
+"Create the weather, activities, and date tools, then assemble the agent so each invocation can emit GenAI spans.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -147,8 +171,8 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Invoke the agent\n",
-"Use the LangChain tracer callback to record `invoke_agent` and tool spans for each request."
+"### Step 4: Invoke the agent\n",
+"Call the agent with the Azure tracer in the config to emit `invoke_agent` and tool spans. Inspect the notebook output and your telemetry backend before iterating further.\n"
 ]
 },
 {
@@ -179,4 +203,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 5
-}
+}
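Step 2 of this notebook selects GitHub Models or Azure OpenAI from environment variables. A minimal sketch of that selection, assuming GitHub Models is reached through its OpenAI-compatible endpoint (the endpoint URL, default model name, and extra environment variable names below are illustrative, not taken from the notebook):

```python
# Hedged sketch of provider selection; the notebook's own cell may differ.
import os

from langchain_openai import AzureChatOpenAI, ChatOpenAI

if os.getenv("API_HOST", "github") == "azure":
    # AzureChatOpenAI reads AZURE_OPENAI_ENDPOINT / AZURE_OPENAI_API_KEY from the environment.
    MODEL = AzureChatOpenAI(
        azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        api_version=os.environ.get("AZURE_OPENAI_API_VERSION", "2024-10-21"),
    )
else:
    # GitHub Models exposes an OpenAI-compatible API keyed by a GitHub token.
    MODEL = ChatOpenAI(
        base_url="https://models.inference.ai.azure.com",
        api_key=os.environ["GITHUB_TOKEN"],
        model=os.environ.get("GITHUB_MODEL", "gpt-4o-mini"),
    )

print("Model configured:", getattr(MODEL, "model_name", "custom"))
```

For Step 4, the Azure tracer callback then goes into the invocation config (for example `config={"callbacks": [tracer]}`), which is what produces the `invoke_agent` and tool spans.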

lab/5-Observability/3-LangGraph/music_router.ipynb

Lines changed: 34 additions & 2 deletions
@@ -24,6 +24,14 @@
 "Set `API_HOST` to `github` (default) or `azure`, provide the corresponding credentials, and optionally `APPLICATION_INSIGHTS_CONNECTION_STRING` for Azure Monitor export."
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 1: Import LangGraph, LangChain, and telemetry helpers\n",
+"Bring in the required libraries and load environment variables for your chosen provider.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -48,6 +56,14 @@
 "load_dotenv(override=True)"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 2: Configure the chat model and tracer\n",
+"Initialise the chat client (GitHub Models or Azure OpenAI) and the Azure AI tracer so spans are emitted for every model call.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -89,6 +105,14 @@
 "print(\"Model ready:\", MODEL.model_name if hasattr(MODEL, \"model_name\") else \"custom\")"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 3: Declare tools and bind them to the model\n",
+"Define the music playback tools and bind them to the chat model so LangGraph can leverage them during execution.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -114,6 +138,14 @@
 "MODEL_WITH_TOOLS = MODEL.bind_tools(TOOLS, parallel_tool_calls=False)\n"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"### Step 4: Build the LangGraph workflow\n",
+"Create the agent and tool nodes, wire up conditional edges, and compile the workflow with an in-memory checkpointer.\n"
+]
+},
 {
 "cell_type": "code",
 "execution_count": null,
@@ -148,8 +180,8 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Stream an interaction\n",
-"Runs the graph with telemetry enabled. Each step sends spans through the Azure tracer."
+"### Step 5: Stream an interaction\n",
+"Execute the graph with the tracer callback to observe emitted spans for the model and tool steps in real time.\n"
 ]
 },
 {
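Steps 3 and 4 of this notebook bind tools to the model and compile the graph with an in-memory checkpointer. The following is a rough, self-contained sketch of that shape, using a single made-up `play_song` tool in place of the notebook's music tools and assuming a chat model `MODEL` configured as in Step 2; node names and wiring are illustrative, not the notebook's exact code.

```python
# Hedged sketch only: tool definitions and node names are placeholders.
from langchain_core.tools import tool
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import START, MessagesState, StateGraph
from langgraph.prebuilt import ToolNode, tools_condition


@tool
def play_song(title: str) -> str:
    """Pretend to queue a song and report back."""
    return f"Now playing: {title}"


TOOLS = [play_song]
MODEL_WITH_TOOLS = MODEL.bind_tools(TOOLS, parallel_tool_calls=False)


def agent_node(state: MessagesState) -> dict:
    # Ask the tool-bound model for the next action given the message history.
    return {"messages": [MODEL_WITH_TOOLS.invoke(state["messages"])]}


builder = StateGraph(MessagesState)
builder.add_node("agent", agent_node)
builder.add_node("tools", ToolNode(TOOLS))
builder.add_edge(START, "agent")
builder.add_conditional_edges("agent", tools_condition)  # tool call -> "tools", else END
builder.add_edge("tools", "agent")

GRAPH = builder.compile(checkpointer=MemorySaver())
```

Streaming (Step 5) then passes a thread id plus the tracer callback, for example `GRAPH.stream({"messages": [("user", "Play something upbeat")]}, config={"configurable": {"thread_id": "demo"}, "callbacks": [tracer]})`, so each model and tool step emits a span.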
