
feat: add include_raw_response option for accessing token usage #123

Open
yosalama wants to merge 1 commit into thmsmlr:main from yosalama:feature/include-raw-response

Conversation

@yosalama

Summary

Adds a new include_raw_response: true option to chat_completion/2 that returns the raw API response alongside the parsed model. This allows users to access token usage information and other API metadata.

When enabled, the function returns {:ok, model, %{raw_response: response}} instead of {:ok, model}.

Usage

{:ok, result, %{raw_response: raw_response}} =
  Instructor.chat_completion(
    model: "gpt-4o-mini",
    response_model: MySchema,
    include_raw_response: true,
    messages: [%{role: "user", content: "..."}]
  )

# Access token usage
raw_response.body["usage"]
# => %{"prompt_tokens" => 50, "completion_tokens" => 10, "total_tokens" => 60}

Changes

  • Added include_raw_response option to do_chat_completion/3
  • Added maybe_include_raw_response/3 helper function
  • Updated typespec to include new return type
  • Added documentation for the new option
  • Added tests for the new functionality
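The maybe_include_raw_response/3 helper named above could look roughly like the sketch below. This is an assumption, not the PR's actual implementation: the function name comes from the change list, but the argument order and body are guesses, and in the library it would be a private function.

```elixir
defmodule RawResponseSketch do
  # Hypothetical sketch: wrap a successful result with the raw API response
  # when the :include_raw_response option is set. In the real module this
  # would be a defp; it is public here so the sketch is self-contained.
  def maybe_include_raw_response({:ok, model}, response, params) do
    if Keyword.get(params, :include_raw_response, false) do
      {:ok, model, %{raw_response: response}}
    else
      {:ok, model}
    end
  end

  # Pass errors (or any other shape) through unchanged.
  def maybe_include_raw_response(other, _response, _params), do: other
end
```

Keeping the option check in one pass-through helper means the main do_chat_completion/3 pipeline does not need to branch on the option itself.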

Backwards Compatibility

The default behavior is unchanged (include_raw_response: false), so this is fully backwards compatible.
