In / Out: $3.30 / $7.70 per 1M tokens
Context: 64K · Max output: 8K · Availability: — · Throughput (tps): —
DeepSeek's reasoning model trained via large-scale reinforcement learning, hosted on TogetherAI.
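For concreteness, the listed rates can be turned into a per-request cost estimate. The sketch below only restates the arithmetic implied by the rates above ($3.30 per 1M input tokens, $7.70 per 1M output tokens); the token counts in the example are hypothetical.

```python
# Rough cost estimate from the per-million-token rates listed above.
# The example token counts are hypothetical, chosen only to illustrate the arithmetic.

PRICE_IN_PER_M = 3.30   # USD per 1M input tokens
PRICE_OUT_PER_M = 7.70  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens * PRICE_IN_PER_M + output_tokens * PRICE_OUT_PER_M) / 1_000_000

# Example: a 4K-token prompt with a 1K-token completion.
print(f"${request_cost(4_000, 1_000):.4f}")  # -> $0.0209
```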
Common Name: DeepSeek R1 Distill Llama 70B
DeepSeek R1 reasoning model distilled to Llama 70B architecture, hosted on TogetherAI.
DeepSeek V3 MoE model with 671B total parameters and 37B active, hosted on TogetherAI.
Meta's largest Llama 3.1 model (405B parameters), optimized for fast inference on TogetherAI.
Alibaba's Qwen2.5 7B model optimized for fast inference on TogetherAI.
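All of these models are served through TogetherAI, which exposes an OpenAI-compatible chat completions interface. The sketch below is a minimal example of querying one of them; the base URL, environment variable name, and model slug (deepseek-ai/DeepSeek-R1-Distill-Llama-70B) are assumptions and should be checked against TogetherAI's current model listing.

```python
# Minimal sketch: querying a TogetherAI-hosted model through the
# OpenAI-compatible chat completions interface. The base URL, env var,
# and model slug below are assumptions; check TogetherAI's docs.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TOGETHER_API_KEY"],   # assumed env var name
    base_url="https://api.together.xyz/v1",   # assumed endpoint
)

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-R1-Distill-Llama-70B",  # assumed slug
    messages=[{"role": "user", "content": "Explain mixture-of-experts in two sentences."}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```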