Clarify batch availability in together-batch-inference skill#3

Merged
muhsinking merged 3 commits into main from mking/mle-5279-batch-availability-clarification
May 5, 2026

Conversation

@muhsinking
Contributor

Summary

  • Mirrors the docs fix in mintlify-docs#782.
  • Replaces the inaccurate "All serverless models support batch processing" claim in skills/together-batch-inference/references/api-reference.md with a more accurate version that flags the small set of currently-unavailable models.

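As a minimal sketch of the corrected behavior, a client could guard batch submissions against the models this PR calls out as currently batch-unavailable. The helper below is hypothetical (it is not part of the Together SDK), and the bare model names are taken verbatim from the PR description; real request payloads may use fuller identifiers.

```python
# Models the corrected docs flag as currently unavailable for batch
# processing (names as given in this PR's description).
BATCH_UNAVAILABLE = {
    "DeepSeek-R1-0528-tput",
    "DeepSeek-V3.1",
}


def supports_batch(model: str) -> bool:
    """Return False for models listed as batch-unavailable, True otherwise.

    A hypothetical pre-flight check: submitting a batch job with one of
    the unavailable models would fail server-side, so callers can reject
    the request early instead.
    """
    return model not in BATCH_UNAVAILABLE
```

A caller might use this to fail fast before uploading a batch file, rather than waiting for the API to reject the job.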
Linear: MLE-5279

Test plan

  • Read the rendered skill reference and confirm the batch-availability paragraph reads correctly.

🤖 Generated with Claude Code

Mirror the docs change in togethercomputer/mintlify-docs#782:
not all serverless models support batch, and a small set (currently
DeepSeek-R1-0528-tput and DeepSeek-V3.1) will fail if submitted.

Linear: MLE-5279

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@muhsinking muhsinking requested a review from zainhas May 5, 2026 16:58
muhsinking and others added 2 commits May 5, 2026 09:59
Mirrors the corrected list in togethercomputer/mintlify-docs#782
after cross-referencing against the live Together API (the previous list
was incomplete because it had been checked against a stale model snippet).

Linear: MLE-5279

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@muhsinking muhsinking merged commit 7dcd48e into main May 5, 2026
2 checks passed


3 participants