# Migration Guide: v1.x to v2.x

## Import Changes

All SDK imports move from `mistralai` to `mistralai.client`:

```python
# v1
from mistralai import Mistral
from mistralai.models import UserMessage, AssistantMessage
from mistralai.types import BaseModel

# v2
from mistralai.client import Mistral
from mistralai.client.models import UserMessage, AssistantMessage
from mistralai.client.types import BaseModel
```

### Quick Reference

| v1 | v2 |
|---|---|
| `from mistralai import ...` | `from mistralai.client import ...` |
| `from mistralai.models import ...` | `from mistralai.client.models import ...` |
| `from mistralai.types import ...` | `from mistralai.client.types import ...` |
| `from mistralai.utils import ...` | `from mistralai.client.utils import ...` |

`mistralai.extra` is unchanged (`RunContext`, `MCPClientSTDIO`, `MCPClientSSE`, `response_format_from_pydantic_model`, etc. stay at `mistralai.extra`).
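
The import move is mechanical, so a small codemod can handle most of it. The sketch below is illustrative only (not an official migration tool): it handles only `from ... import` forms, deliberately leaves `mistralai.extra` alone, and does not touch the `mistralai_azure`/`mistralai_gcp` packages covered below. Review the resulting diff before committing.

```python
import re

# Most specific prefixes first, so `mistralai.models` is not
# clobbered by the bare `mistralai` rule. `mistralai.extra` is
# intentionally absent: it is unchanged in v2.
REWRITES = [
    (re.compile(r"\bfrom mistralai\.models import\b"), "from mistralai.client.models import"),
    (re.compile(r"\bfrom mistralai\.types import\b"), "from mistralai.client.types import"),
    (re.compile(r"\bfrom mistralai\.utils import\b"), "from mistralai.client.utils import"),
    (re.compile(r"\bfrom mistralai import\b"), "from mistralai.client import"),
]

def rewrite_imports(source: str) -> str:
    """Rewrite v1 import lines to their v2 equivalents."""
    for pattern, replacement in REWRITES:
        source = pattern.sub(replacement, source)
    return source
```

Apply it to your tree with something like `pathlib.Path(".").rglob("*.py")`, writing back only the files whose content actually changed.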

## Azure & GCP

Azure and GCP clients now live under the `mistralai` namespace (`mistralai.azure` and `mistralai.gcp`) instead of the top-level `mistralai_azure` and `mistralai_gcp` packages. As in v1, they are bundled with the main distribution, so `pip install mistralai` still provides them.
| v1 | v2 |
|---|---|
| `from mistralai_azure import MistralAzure` | `from mistralai.azure.client import MistralAzure` |
| `from mistralai_azure.models import ...` | `from mistralai.azure.client.models import ...` |
| `from mistralai_gcp import MistralGoogleCloud` | `from mistralai.gcp.client import MistralGCP` |
| `from mistralai_gcp.models import ...` | `from mistralai.gcp.client.models import ...` |
The GCP client class is renamed `MistralGoogleCloud` -> `MistralGCP`. For GCP authentication dependencies, install the extra: `pip install "mistralai[gcp]"`.

## Type Renames

42 request/response types are renamed to follow `{Verb}{Entity}Request` / `{Verb}{Entity}Response` / `{Entity}` conventions. Core types (`Mistral`, `UserMessage`, `AssistantMessage`, `File`, `FunctionTool`, `ResponseFormat`, etc.) keep the same names; only their import paths change.

Beyond those, there is one additional user-facing rename: `Tools` -> `ConversationRequestTool`.

<details>
<summary>Full rename table (42 schemas)</summary>

| v1 | v2 |
|---|---|
| `AgentCreationRequest` | `CreateAgentRequest` |
| `AgentUpdateRequest` | `UpdateAgentRequest` |
| `ArchiveFTModelOut` | `ArchiveModelResponse` |
| `BatchJobIn` | `CreateBatchJobRequest` |
| `BatchJobOut` | `BatchJob` |
| `BatchJobsOut` | `ListBatchJobsResponse` |
| `CheckpointOut` | `Checkpoint` |
| `ClassifierDetailedJobOut` | `ClassifierFineTuningJobDetails` |
| `ClassifierFTModelOut` | `ClassifierFineTunedModel` |
| `ClassifierJobOut` | `ClassifierFineTuningJob` |
| `ClassifierTargetIn` | `ClassifierTarget` |
| `ClassifierTargetOut` | `ClassifierTargetResult` |
| `ClassifierTrainingParametersIn` | `ClassifierTrainingParameters` |
| `CompletionDetailedJobOut` | `CompletionFineTuningJobDetails` |
| `CompletionFTModelOut` | `CompletionFineTunedModel` |
| `CompletionJobOut` | `CompletionFineTuningJob` |
| `CompletionTrainingParametersIn` | `CompletionTrainingParameters` |
| `ConversationAppendRequestBase` | `AppendConversationRequest` |
| `ConversationRestartRequestBase` | `RestartConversationRequest` |
| `DeleteFileOut` | `DeleteFileResponse` |
| `DocumentOut` | `Document` |
| `DocumentUpdateIn` | `UpdateDocumentRequest` |
| `EventOut` | `Event` |
| `FTModelCapabilitiesOut` | `FineTunedModelCapabilities` |
| `FileSignedURL` | `GetSignedUrlResponse` |
| `GithubRepositoryOut` | `GithubRepository` |
| `JobIn` | `CreateFineTuningJobRequest` |
| `JobMetadataOut` | `JobMetadata` |
| `JobsOut` | `ListFineTuningJobsResponse` |
| `LegacyJobMetadataOut` | `LegacyJobMetadata` |
| `LibraryIn` | `CreateLibraryRequest` |
| `LibraryInUpdate` | `UpdateLibraryRequest` |
| `LibraryOut` | `Library` |
| `ListDocumentOut` | `ListDocumentsResponse` |
| `ListFilesOut` | `ListFilesResponse` |
| `ListLibraryOut` | `ListLibrariesResponse` |
| `MetricOut` | `Metric` |
| `RetrieveFileOut` | `RetrieveFileResponse` |
| `UnarchiveFTModelOut` | `UnarchiveModelResponse` |
| `UpdateFTModelIn` | `UpdateModelRequest` |
| `UploadFileOut` | `UploadFileResponse` |
| `WandbIntegrationOut` | `WandbIntegrationResult` |

</details>
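
The schema renames above are likewise mechanical. The sketch below is illustrative only (not an official tool, and seeded with just a few entries from the table); plain text substitution can also touch strings and comments, so review the output.

```python
import re

# A few entries from the rename table above; extend with the rest as needed.
RENAMES = {
    "AgentCreationRequest": "CreateAgentRequest",
    "BatchJobIn": "CreateBatchJobRequest",
    "BatchJobOut": "BatchJob",
    "JobsOut": "ListFineTuningJobsResponse",
    "Tools": "ConversationRequestTool",
}

# Longest names first so a longer key is never shadowed by a shorter one,
# and \b keeps matches on identifier boundaries.
_PATTERN = re.compile(r"\b(" + "|".join(sorted(RENAMES, key=len, reverse=True)) + r")\b")

def rename_types(source: str) -> str:
    """Replace v1 type names with their v2 equivalents."""
    return _PATTERN.sub(lambda m: RENAMES[m.group(1)], source)
```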

## Other Changes

- `FunctionTool.type` changed from `Optional[FunctionToolType]` to `Literal["function"]` (functionally equivalent if you omit `type`)
- Enums now accept unknown values for forward compatibility with API changes
- Forward-compatible unions: discriminated unions get an `Unknown` variant
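
The forward-compatible enum behavior can be pictured with a plain `enum` sketch (an illustration of the pattern, not the SDK's actual implementation): unknown strings become ad-hoc members instead of raising `ValueError`.

```python
from enum import Enum

class FinishReason(str, Enum):
    STOP = "stop"
    LENGTH = "length"

    @classmethod
    def _missing_(cls, value):
        # Called when value lookup fails: build a pseudo-member on the fly,
        # so a value added by a newer API version does not crash old clients.
        member = str.__new__(cls, value)
        member._name_ = str(value)
        member._value_ = str(value)
        return member

print(FinishReason("stop") is FinishReason.STOP)  # known value -> real member
print(FinishReason("tool_use").value)             # unknown value survives
```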

## What Did NOT Change

- All method names (`chat.complete`, `chat.stream`, `embeddings.create`, `fim.complete`, `files.upload`, `models.list`, `fine_tuning.jobs.create`, etc.)
- No endpoints were added or removed, and no paths changed
- Minimum Python version is unchanged (`>=3.10`)
- Installation: `pip install mistralai`

---

## Migrating from v0.x to v1.x

Version 1.0 introduced significant changes to improve usability and consistency.


### Major Changes
> **Note:** The v1.x examples below use v1-style imports (e.g., `from mistralai import Mistral`). If you're on v2.x, combine these API changes with the [v1 to v2 import changes](#migration-guide-v1x-to-v2x) above.

1. **Unified Client Class**: `MistralClient` and `MistralAsyncClient` consolidated into a single `Mistral` class
2. **Method Structure**: Methods reorganized into resource-based groups (e.g., `client.chat.complete()`)
3. **Message Classes**: `ChatMessage` replaced with typed classes (`UserMessage`, `AssistantMessage`, etc.)
4. **Streaming Response**: Stream chunks now accessed via `chunk.data.choices[0].delta.content`

### Method Mapping

#### Sync Methods

| v0.x | v1.x |
|---|---|
| `MistralClient` | `Mistral` |
| `client.chat` | `client.chat.complete` |
| `client.chat_stream` | `client.chat.stream` |
| `client.completions` | `client.fim.complete` |
| `client.completions_stream` | `client.fim.stream` |
| `client.embeddings` | `client.embeddings.create` |
| `client.list_models` | `client.models.list` |
| `client.files.create` | `client.files.upload` |
| `client.jobs.create` | `client.fine_tuning.jobs.create` |
| `client.jobs.list` | `client.fine_tuning.jobs.list` |
| `client.jobs.retrieve` | `client.fine_tuning.jobs.get` |
| `client.jobs.cancel` | `client.fine_tuning.jobs.cancel` |

#### Async Methods

| v0.x | v1.x |
|---|---|
| `MistralAsyncClient` | `Mistral` |
| `async_client.chat` | `client.chat.complete_async` |
| `async_client.chat_stream` | `client.chat.stream_async` |
| `async_client.completions` | `client.fim.complete_async` |
| `async_client.completions_stream` | `client.fim.stream_async` |
| `async_client.embeddings` | `client.embeddings.create_async` |
| `async_client.list_models` | `client.models.list_async` |
| `async_client.files.create` | `client.files.upload_async` |
| `async_client.jobs.create` | `client.fine_tuning.jobs.create_async` |
| `async_client.jobs.list` | `client.fine_tuning.jobs.list_async` |
| `async_client.jobs.retrieve` | `client.fine_tuning.jobs.get_async` |
| `async_client.jobs.cancel` | `client.fine_tuning.jobs.cancel_async` |

### Example: Non-Streaming Chat

**v0.x:**
```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

client = MistralClient(api_key=api_key)

messages = [ChatMessage(role="user", content="What is the best French cheese?")]
response = client.chat(model="mistral-large-latest", messages=messages)

print(response.choices[0].message.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)

messages = [UserMessage(content="What is the best French cheese?")]
response = client.chat.complete(model="mistral-large-latest", messages=messages)

print(response.choices[0].message.content)
```

### Example: Streaming Chat

**v0.x:**
```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

client = MistralClient(api_key=api_key)
messages = [ChatMessage(role="user", content="What is the best French cheese?")]

for chunk in client.chat_stream(model="mistral-large-latest", messages=messages):
    print(chunk.choices[0].delta.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)
messages = [UserMessage(content="What is the best French cheese?")]

for chunk in client.chat.stream(model="mistral-large-latest", messages=messages):
    print(chunk.data.choices[0].delta.content)  # Note: chunk.data
```

### Example: Async Streaming

**v0.x:**
```python
from mistralai.async_client import MistralAsyncClient
from mistralai.models.chat_completion import ChatMessage

client = MistralAsyncClient(api_key=api_key)
messages = [ChatMessage(role="user", content="What is the best French cheese?")]

async for chunk in client.chat_stream(model="mistral-large-latest", messages=messages):
    print(chunk.choices[0].delta.content)
```

**v1.x:**
```python
from mistralai import Mistral, UserMessage

client = Mistral(api_key=api_key)
messages = [UserMessage(content="What is the best French cheese?")]

async for chunk in await client.chat.stream_async(model="mistral-large-latest", messages=messages):
    print(chunk.data.choices[0].delta.content)
```