feat(openai): expose observation and trace IDs from observeOpenAI #739
guidev wants to merge 3 commits into langfuse:main
Conversation
After an observeOpenAI-wrapped call completes, the observation ID and
trace ID are attached to the response object as non-enumerable
properties:
```ts
const response = await openai.chat.completions.create({ ... });
response.langfuseObservationId; // the generation's observation ID
response.langfuseTraceId;      // the parent trace ID
```
A WithLangfuseIds<T> utility type is exported for TypeScript consumers.
This enables attaching scores to specific generations via the Langfuse
API (e.g. user feedback) without having to replace observeOpenAI with
manual startActiveObservation calls.
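As an illustration of the feedback use case, the attached IDs could feed a score request. The sketch below uses a hypothetical `buildScorePayload` helper (not part of this PR); the field names follow the shape of Langfuse's public scores API, but verify them against the current Langfuse API reference before relying on this.

```typescript
// Hypothetical helper (not part of the PR): builds the body for a
// user-feedback score request from a wrapped response's attached IDs.
interface ScorePayload {
  traceId: string;
  observationId: string;
  name: string;
  value: number;
}

function buildScorePayload(
  response: { langfuseObservationId?: string; langfuseTraceId?: string },
  name: string,
  value: number,
): ScorePayload | null {
  // The IDs are only present on observeOpenAI-wrapped responses.
  if (!response.langfuseObservationId || !response.langfuseTraceId) {
    return null;
  }
  return {
    traceId: response.langfuseTraceId,
    observationId: response.langfuseObservationId,
    name,
    value,
  };
}
```

The payload could then be POSTed to the Langfuse scores endpoint (e.g. `/api/public/scores`); treat the exact path and fields as assumptions to confirm against the API docs.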
Closes langfuse/langfuse#12389
Co-Authored-By: Claude Opus 4.6 <[email protected]>
@guidev is attempting to deploy a commit to the langfuse Team on Vercel. A member of the Team first needs to authorize it.
```ts
export function attachLangfuseIds<T>(
  result: T,
  generation: LangfuseGeneration,
): asserts result is T & WithLangfuseIds<T> {
  if (result && typeof result === "object") {
    Object.defineProperty(result, "langfuseObservationId", {
      value: generation.id,
      enumerable: false,
      writable: false,
      configurable: false,
    });
    Object.defineProperty(result, "langfuseTraceId", {
      value: generation.traceId,
      enumerable: false,
      writable: false,
      configurable: false,
    });
  }
}
```
**`Object.defineProperty` may throw on non-extensible or sealed objects**

`Object.defineProperty` throws a `TypeError` if the target object is non-extensible (e.g. after `Object.preventExtensions()`, `Object.seal()`, or `Object.freeze()`) or if either property already exists as non-configurable. Because `configurable: false` is set, any accidental double call on the same object will also throw. Neither case is caught here, so the error would propagate up through the `.then()` chain and reject the user's promise.

Consider wrapping the property definitions in a `try/catch` to make the attachment best-effort:
```ts
export function attachLangfuseIds<T>(
  result: T,
  generation: LangfuseGeneration,
): asserts result is T & WithLangfuseIds<T> {
  if (result && typeof result === "object") {
    try {
      Object.defineProperty(result, "langfuseObservationId", {
        value: generation.id,
        enumerable: false,
        writable: false,
        configurable: false,
      });
      Object.defineProperty(result, "langfuseTraceId", {
        value: generation.traceId,
        enumerable: false,
        writable: false,
        configurable: false,
      });
    } catch {
      // Silently ignore if properties cannot be defined (e.g. frozen object)
    }
  }
}
```
Good catch — added a try/catch in 8d02412. In practice this can't happen since OpenAI SDK responses are plain objects from JSON parsing (never frozen/sealed) and the function is only called once per response, but it's worth guarding defensively.
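The guarded vs. unguarded behavior can be reproduced in isolation. The sketch below is standalone demo code, not the SDK implementation:

```typescript
// Standalone demo of the failure mode: Object.defineProperty throws a
// TypeError on frozen (non-extensible) objects, so a guarded helper
// must swallow that error to stay best-effort.
function attachIdSafely(target: unknown, id: string): void {
  if (target && typeof target === "object") {
    try {
      Object.defineProperty(target, "langfuseObservationId", {
        value: id,
        enumerable: false,
        writable: false,
        configurable: false,
      });
    } catch {
      // Best-effort: leave frozen/sealed objects untouched.
    }
  }
}

const frozen = Object.freeze({ ok: true });

let threw = false;
try {
  // Unguarded call: throws because the object is non-extensible.
  Object.defineProperty(frozen, "langfuseObservationId", { value: "obs-1" });
} catch (e) {
  threw = e instanceof TypeError;
}

attachIdSafely(frozen, "obs-1"); // guarded call: silently a no-op
```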
Co-Authored-By: Claude Opus 4.6 <[email protected]>
Add module augmentation for ChatCompletion and Response interfaces so consumers can access langfuseObservationId/langfuseTraceId without type casts when using observeOpenAI. Co-Authored-By: Claude Opus 4.6 <[email protected]>
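The mechanism behind this commit can be illustrated with a self-contained analog: TypeScript declaration merging, which is what module augmentation uses under the hood. The real change augments the openai package's `ChatCompletion` and `Response` interfaces; the interface name below is a local stand-in:

```typescript
// Local stand-in for the SDK's response interface.
interface ChatCompletionLike {
  id: string;
}

// A second declaration with the same name merges into the first,
// adding the Langfuse ID fields without requiring a type cast.
interface ChatCompletionLike {
  // Optional, since plain (unwrapped) responses will not carry them.
  langfuseObservationId?: string;
  langfuseTraceId?: string;
}

const response: ChatCompletionLike = { id: "chatcmpl-123" };
// Compiles without a cast; undefined until a wrapper attaches the IDs.
const obsId: string | undefined = response.langfuseObservationId;
```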
Summary

- After an `observeOpenAI`-wrapped call completes, the observation ID and trace ID are now attached to the response object as non-enumerable properties
- A `WithLangfuseIds<T>` utility type is added for TypeScript consumers

Usage
For TypeScript, a `WithLangfuseIds<T>` utility type is exported:

Motivation
Currently there is no way to retrieve the observation/generation ID created by `observeOpenAI`. The integration internally uses `startObservation` and calls `.end()` before returning, so the observation object is never exposed. This makes it impossible to programmatically attach scores to specific generations (e.g., for user feedback loops).

The only workaround is replacing `observeOpenAI` with manual `startActiveObservation({ asType: 'generation' })` calls plus `generation.update()`, which duplicates all the input/output/usage parsing that `observeOpenAI` already handles.

Implementation
- The IDs are attached via `Object.defineProperty` with `enumerable: false`, so they don't appear in `JSON.stringify` output or `for-in` loops
- The `attachLangfuseIds` helper is called after the generation is updated and ended, in both the Promise and streaming paths

Closes langfuse/langfuse#12389
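The non-enumerable behavior can be checked with a few standalone lines (a sketch, not the package code):

```typescript
// A non-enumerable property is readable directly but invisible to
// JSON.stringify, for-in loops, and Object.keys.
const response: Record<string, unknown> = { id: "chatcmpl-123" };

Object.defineProperty(response, "langfuseObservationId", {
  value: "obs-abc",
  enumerable: false,
  writable: false,
  configurable: false,
});

const direct = response["langfuseObservationId"]; // "obs-abc"
const serialized = JSON.stringify(response);      // '{"id":"chatcmpl-123"}'
const keys = Object.keys(response);               // ["id"]
```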
Test plan

- `response.langfuseObservationId` is accessible after a non-streaming call
- `response.langfuseTraceId` is accessible after a non-streaming call
- The IDs do not appear in `JSON.stringify(response)`

🤖 Generated with Claude Code
Disclaimer: Experimental PR review
Greptile Summary
This PR exposes Langfuse observation and trace IDs directly on `observeOpenAI` responses, addressing a long-standing gap where users had no programmatic way to attach scores to specific generations. The implementation adds a small new module (`langfuseIds.ts`) with a `WithLangfuseIds<T>` utility type and an `attachLangfuseIds` helper that stamps non-enumerable properties onto response objects and stream generators using `Object.defineProperty`.

Key changes:

- `packages/openai/src/langfuseIds.ts`: new module exporting `WithLangfuseIds<T>` and `attachLangfuseIds`
- `packages/openai/src/traceMethod.ts`: calls `attachLangfuseIds` in both the non-streaming (`.then()`) and streaming (`wrapAsyncIterable`) paths
- `packages/openai/src/index.ts`: re-exports `WithLangfuseIds<T>` as a public type

Issues found:
- `Object.defineProperty` is called without a `try/catch`. If the target object is frozen, sealed, or non-extensible (or if this function is ever called twice on the same object, since `configurable: false`), a `TypeError` will propagate to the user, potentially rejecting their awaited promise for an unrelated-looking reason.
- The `asserts result is T & WithLangfuseIds<T>` assertion is unconditional at the type level, but the property definitions are conditional on `result` being a non-null object. When the guard fails, TypeScript still narrows the type as if the IDs were present, creating a soundness gap.
- `langfuseObservationId`/`langfuseTraceId` are readable immediately on the generator, but the underlying generation is only `.end()`-ed after the stream is fully iterated. This should be documented so consumers know not to immediately fire Langfuse API calls (e.g. `score.create`) right after starting the stream.

Confidence Score: 3/5
- The missing `try/catch` around `Object.defineProperty` is a latent runtime failure risk on non-extensible objects.
- `Object.defineProperty` with `configurable: false` can throw a `TypeError` that propagates to users in edge cases (frozen objects, double-call, etc.), and the `asserts` type soundness gap could cause hard-to-debug issues if the helper is reused beyond its current call-sites.
- Pay closest attention to `packages/openai/src/langfuseIds.ts`, specifically the `Object.defineProperty` error handling and the unconditional `asserts` type assertion.

Important Files Changed
- `packages/openai/src/langfuseIds.ts`: introduces the `WithLangfuseIds<T>` utility type and `attachLangfuseIds` function; has two concerns: `Object.defineProperty` can throw on non-extensible/frozen objects without a try-catch, and the `asserts` signature creates a type-soundness gap when the guard fails.
- `packages/openai/src/traceMethod.ts`: wires `attachLangfuseIds` into both the Promise (non-streaming) and async-iterable (streaming) paths; timing for the streaming case is correct since OTEL span IDs are determined at span creation, but worth documenting that the generation may not be flushed when IDs are first readable.
- `packages/openai/src/index.ts`: adds `WithLangfuseIds<T>` as a type-only export; minimal, correct change.

Sequence Diagram
```mermaid
sequenceDiagram
    participant Caller
    participant wrapMethod
    participant tracedMethod as OpenAI SDK Method
    participant wrapAsyncIterable
    participant attachLangfuseIds
    participant LangfuseGeneration

    Caller->>wrapMethod: call (e.g. chat.completions.create)
    wrapMethod->>LangfuseGeneration: startObservation() → generation (id, traceId set from OTEL span)
    wrapMethod->>tracedMethod: execute original method
    alt Non-streaming (Promise)
        tracedMethod-->>wrapMethod: Promise<ChatCompletion>
        wrapMethod->>wrapMethod: .then(result => ...)
        wrapMethod->>LangfuseGeneration: generation.update().end()
        wrapMethod->>attachLangfuseIds: attachLangfuseIds(result, generation)
        Note over attachLangfuseIds: Object.defineProperty(result, "langfuseObservationId", ...)<br/>Object.defineProperty(result, "langfuseTraceId", ...)
        wrapMethod-->>Caller: Promise<ChatCompletion & WithLangfuseIds>
    else Streaming (AsyncIterable)
        tracedMethod-->>wrapMethod: AsyncIterable (stream)
        wrapMethod->>wrapAsyncIterable: wrapAsyncIterable(stream, generation)
        wrapAsyncIterable->>wrapAsyncIterable: create tracedOutputGenerator()
        wrapAsyncIterable->>attachLangfuseIds: attachLangfuseIds(generator, generation)
        Note over attachLangfuseIds: IDs attached before stream is iterated
        wrapAsyncIterable-->>Caller: generator (with langfuseObservationId, langfuseTraceId)
        loop Iterate stream
            Caller->>wrapAsyncIterable: for await chunk
            wrapAsyncIterable-->>Caller: yield rawChunk
        end
        wrapAsyncIterable->>LangfuseGeneration: generation.update().end()
        Note over Caller: IDs readable immediately,<br/>but generation only flushed after full iteration
    end
```

Last reviewed commit: 41e6e12
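The `asserts` soundness gap flagged in the review can be seen in a minimal standalone sketch (simplified signature, not the actual module code):

```typescript
// The assertion narrows unconditionally, but the runtime guard skips
// primitives, so the type and the value can disagree.
type WithIds = { langfuseObservationId?: string };

function attachIds<T>(result: T, id: string): asserts result is T & WithIds {
  if (result && typeof result === "object") {
    Object.defineProperty(result, "langfuseObservationId", {
      value: id,
      enumerable: false,
    });
  }
}

const s = "not an object" as const;
attachIds(s, "obs-1");
// TypeScript now treats s as having langfuseObservationId, but the
// guard skipped the primitive, so at runtime it is undefined.
const observedId: string | undefined = s.langfuseObservationId;
```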