
feat(openai): expose observation and trace IDs from observeOpenAI#739

Open
guidev wants to merge 3 commits into langfuse:main from guidev:feat/expose-observation-id-from-observe-openai

Conversation


@guidev guidev commented Mar 4, 2026

Summary

  • After an observeOpenAI-wrapped call completes, the observation ID and trace ID are now attached to the response object as non-enumerable properties
  • Exports a WithLangfuseIds<T> utility type for TypeScript consumers
  • Works for both non-streaming (Promise) and streaming (AsyncIterable) responses

Usage

import { observeOpenAI } from '@langfuse/openai';

const openai = observeOpenAI(new OpenAI());
const response = await openai.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }],
});

// Access the IDs directly on the response
response.langfuseObservationId // the generation's observation ID
response.langfuseTraceId       // the parent trace ID

// Use them to attach scores via the Langfuse API
await langfuseClient.score.create({
  traceId: response.langfuseTraceId,
  observationId: response.langfuseObservationId,
  name: 'user-feedback',
  value: 'positive',
  dataType: 'CATEGORICAL',
});

For TypeScript, a WithLangfuseIds<T> utility type is exported:

import type { WithLangfuseIds } from '@langfuse/openai';
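For illustration, the utility type is presumably a simple intersection that adds the two readonly ID properties; the local `WithLangfuseIds` definition and the `ChatCompletionLike` interface below are sketches for this example, not the package's actual source:

```typescript
// Hypothetical shape of the exported utility type (the real definition
// lives in @langfuse/openai; this local sketch is for illustration only).
type WithLangfuseIds<T> = T & {
  readonly langfuseObservationId: string;
  readonly langfuseTraceId: string;
};

// A response typed this way exposes the IDs without casts:
interface ChatCompletionLike {
  id: string;
}

const response: WithLangfuseIds<ChatCompletionLike> = Object.assign(
  { id: "chatcmpl-123" },
  { langfuseObservationId: "obs-abc", langfuseTraceId: "trace-xyz" },
);

console.log(response.langfuseTraceId); // prints "trace-xyz"
```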

Motivation

Currently there is no way to retrieve the observation/generation ID created by observeOpenAI. The integration internally uses startObservation and calls .end() before returning, so the observation object is never exposed. This makes it impossible to programmatically attach scores to specific generations (e.g., for user feedback loops).

The only workaround is replacing observeOpenAI with manual startActiveObservation({ asType: 'generation' }) calls + generation.update(), which duplicates all the input/output/usage parsing that observeOpenAI already handles.

Implementation

  • IDs are attached via Object.defineProperty with enumerable: false, so they don't appear in JSON.stringify output or for-in loops
  • The attachLangfuseIds helper is called after the generation is updated and ended, in both the Promise and streaming paths
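The non-enumerable attachment can be demonstrated in isolation; the property names match the PR, while the sample values are made up for this sketch:

```typescript
// Minimal sketch of the non-enumerable attachment described above.
const response: Record<string, unknown> = { id: "chatcmpl-123" };

Object.defineProperty(response, "langfuseObservationId", {
  value: "obs-abc",
  enumerable: false, // hidden from JSON.stringify and for-in
  writable: false,
  configurable: false,
});

// The ID is readable directly...
console.log((response as { langfuseObservationId?: string }).langfuseObservationId); // prints "obs-abc"
// ...but does not leak into serialized output:
console.log(JSON.stringify(response)); // prints {"id":"chatcmpl-123"}
```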

Closes langfuse/langfuse#12389

Test plan

  • Existing E2E tests continue to pass
  • response.langfuseObservationId is accessible after a non-streaming call
  • response.langfuseTraceId is accessible after a non-streaming call
  • Streaming responses also carry the IDs on the generator object
  • IDs do not appear in JSON.stringify(response)

🤖 Generated with Claude Code

Disclaimer: Experimental PR review

Greptile Summary

This PR exposes Langfuse observation and trace IDs directly on observeOpenAI responses, addressing a long-standing gap where users had no programmatic way to attach scores to specific generations. The implementation adds a small new module (langfuseIds.ts) with a WithLangfuseIds<T> utility type and an attachLangfuseIds helper that stamps non-enumerable properties onto response objects and stream generators using Object.defineProperty.

Key changes:

  • packages/openai/src/langfuseIds.ts — new module exporting WithLangfuseIds<T> and attachLangfuseIds
  • packages/openai/src/traceMethod.ts — calls attachLangfuseIds in both the non-streaming (.then()) and streaming (wrapAsyncIterable) paths
  • packages/openai/src/index.ts — re-exports WithLangfuseIds<T> as a public type

Issues found:

  • Object.defineProperty is called without a try/catch. If the target object is frozen, sealed, or non-extensible (or if this function is ever called twice on the same object, since configurable: false), a TypeError will propagate to the user — potentially rejecting their awaited promise for an unrelated-looking reason.
  • The asserts result is T & WithLangfuseIds<T> assertion is unconditional at the type level, but the property definitions are conditional on result being a non-null object. When the guard fails, TypeScript still narrows the type as if the IDs were present, creating a soundness gap.
  • For streaming responses, langfuseObservationId / langfuseTraceId are readable immediately on the generator, but the underlying generation is only .end()-ed after the stream is fully iterated. This should be documented so consumers know not to immediately fire Langfuse API calls (e.g. score.create) right after starting the stream.
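The streaming caveat in the last point can be sketched with a toy async generator; `generationEnded` here merely stands in for `generation.update().end()`, and the ID value is made up:

```typescript
// Sketch of the streaming timing caveat: IDs are attached to the
// generator before iteration, but the generation is only ended after
// the stream is fully consumed.
let generationEnded = false;

async function* stream(): AsyncGenerator<string> {
  yield "chunk-1";
  yield "chunk-2";
  generationEnded = true; // stands in for generation.update().end()
}

const gen = stream() as AsyncGenerator<string> & {
  langfuseObservationId?: string;
};
Object.defineProperty(gen, "langfuseObservationId", {
  value: "obs-abc",
  enumerable: false,
});

// The ID is readable immediately, before any chunk is consumed...
console.log(gen.langfuseObservationId, generationEnded); // prints "obs-abc false"

// ...but the generation is only ended once iteration completes.
(async () => {
  for await (const _chunk of gen) {
    // consume chunks
  }
  console.log(generationEnded); // prints "true"
})();
```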

Confidence Score: 3/5

  • Safe to merge for the happy path, but the missing try/catch around Object.defineProperty is a latent runtime failure risk on non-extensible objects.
  • The core logic is sound and the IDs are correctly captured from the OTEL span context at creation time. However, the unguarded Object.defineProperty with configurable: false can throw a TypeError that propagates to users in edge cases (frozen objects, double-call, etc.), and the asserts type soundness gap could cause hard-to-debug issues if the helper is reused beyond its current call-sites.
  • Pay close attention to packages/openai/src/langfuseIds.ts — specifically the Object.defineProperty error handling and the unconditional asserts type assertion.

Important Files Changed

  • packages/openai/src/langfuseIds.ts — New helper module: defines the WithLangfuseIds<T> utility type and the attachLangfuseIds function. Two concerns: Object.defineProperty can throw on non-extensible/frozen objects without a try/catch, and the asserts signature creates a type-soundness gap when the guard fails.
  • packages/openai/src/traceMethod.ts — Integrates attachLangfuseIds into both the Promise (non-streaming) and async-iterable (streaming) paths. Timing for the streaming case is correct since OTEL span IDs are determined at span creation, but worth documenting that the generation may not be flushed when the IDs are first readable.
  • packages/openai/src/index.ts — Exports WithLangfuseIds<T> as a type-only export; minimal, correct change.

Sequence Diagram

sequenceDiagram
    participant Caller
    participant wrapMethod
    participant tracedMethod as OpenAI SDK Method
    participant wrapAsyncIterable
    participant attachLangfuseIds
    participant LangfuseGeneration

    Caller->>wrapMethod: call (e.g. chat.completions.create)
    wrapMethod->>LangfuseGeneration: startObservation() → generation (id, traceId set from OTEL span)
    wrapMethod->>tracedMethod: execute original method

    alt Non-streaming (Promise)
        tracedMethod-->>wrapMethod: Promise<ChatCompletion>
        wrapMethod->>wrapMethod: .then(result => ...)
        wrapMethod->>LangfuseGeneration: generation.update().end()
        wrapMethod->>attachLangfuseIds: attachLangfuseIds(result, generation)
        Note over attachLangfuseIds: Object.defineProperty(result, "langfuseObservationId", ...)<br/>Object.defineProperty(result, "langfuseTraceId", ...)
        wrapMethod-->>Caller: Promise<ChatCompletion & WithLangfuseIds>
    else Streaming (AsyncIterable)
        tracedMethod-->>wrapMethod: AsyncIterable (stream)
        wrapMethod->>wrapAsyncIterable: wrapAsyncIterable(stream, generation)
        wrapAsyncIterable->>wrapAsyncIterable: create tracedOutputGenerator()
        wrapAsyncIterable->>attachLangfuseIds: attachLangfuseIds(generator, generation)
        Note over attachLangfuseIds: IDs attached before stream is iterated
        wrapAsyncIterable-->>Caller: generator (with langfuseObservationId, langfuseTraceId)
        loop Iterate stream
            Caller->>wrapAsyncIterable: for await chunk
            wrapAsyncIterable-->>Caller: yield rawChunk
        end
        wrapAsyncIterable->>LangfuseGeneration: generation.update().end()
        Note over Caller: IDs readable immediately,<br/>but generation only flushed after full iteration
    end

Last reviewed commit: 41e6e12


After an observeOpenAI-wrapped call completes, the observation ID and
trace ID are attached to the response object as non-enumerable
properties:

  const response = await openai.chat.completions.create({ ... });
  response.langfuseObservationId // the generation's observation ID
  response.langfuseTraceId       // the parent trace ID

A WithLangfuseIds<T> utility type is exported for TypeScript consumers.

This enables attaching scores to specific generations via the Langfuse
API (e.g. user feedback) without having to replace observeOpenAI with
manual startActiveObservation calls.

Closes langfuse/langfuse#12389

Co-Authored-By: Claude Opus 4.6 <[email protected]>

vercel bot commented Mar 4, 2026

@guidev is attempting to deploy a commit to the langfuse Team on Vercel.

A member of the Team first needs to authorize it.

Comment on lines +23 to +41
export function attachLangfuseIds<T>(
  result: T,
  generation: LangfuseGeneration,
): asserts result is T & WithLangfuseIds<T> {
  if (result && typeof result === "object") {
    Object.defineProperty(result, "langfuseObservationId", {
      value: generation.id,
      enumerable: false,
      writable: false,
      configurable: false,
    });
    Object.defineProperty(result, "langfuseTraceId", {
      value: generation.traceId,
      enumerable: false,
      writable: false,
      configurable: false,
    });
  }
}
Contributor

Object.defineProperty may throw on non-extensible or sealed objects

Object.defineProperty throws a TypeError if the target object is non-extensible (e.g. Object.preventExtensions() / Object.seal() / Object.freeze()) or if either property already exists as non-configurable. Because configurable: false is set, any accidental double-call on the same object will also throw. Neither case is caught here, so the error would propagate up through the .then() chain and reject the user's promise.

Consider wrapping the property definitions in a try/catch to make the attachment best-effort:

export function attachLangfuseIds<T>(
  result: T,
  generation: LangfuseGeneration,
): asserts result is T & WithLangfuseIds<T> {
  if (result && typeof result === "object") {
    try {
      Object.defineProperty(result, "langfuseObservationId", {
        value: generation.id,
        enumerable: false,
        writable: false,
        configurable: false,
      });
      Object.defineProperty(result, "langfuseTraceId", {
        value: generation.traceId,
        enumerable: false,
        writable: false,
        configurable: false,
      });
    } catch {
      // Silently ignore if properties cannot be defined (e.g. frozen object)
    }
  }
}

Author

Good catch — added a try/catch in 8d02412. In practice this can't happen since OpenAI SDK responses are plain objects from JSON parsing (never frozen/sealed) and the function is only called once per response, but it's worth guarding defensively.
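The failure mode the reviewer flagged is easy to reproduce in isolation; this standalone sketch (sample values invented) shows that Object.defineProperty throws a TypeError on a frozen target, unlike plain assignment, which fails silently outside strict mode:

```typescript
// Demonstrates why the try/catch matters: defining a property on a
// frozen object throws, regardless of strict mode.
const frozen = Object.freeze({ id: "chatcmpl-123" });

let threw = false;
try {
  Object.defineProperty(frozen, "langfuseObservationId", { value: "obs-abc" });
} catch (e) {
  threw = e instanceof TypeError;
}
console.log(threw); // prints "true"
```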

guidev and others added 2 commits March 4, 2026 23:23
Add module augmentation for ChatCompletion and Response interfaces
so consumers can access langfuseObservationId/langfuseTraceId
without type casts when using observeOpenAI.

Co-Authored-By: Claude Opus 4.6 <[email protected]>


Development

Successfully merging this pull request may close these issues.

feat(js-openai): expose observation ID from observeOpenAI
