Mixtral 8x22B

Category: Text Generation
Released: April 17, 2024

Overview

Mixtral 8x22B is a sparse mixture-of-experts language model that routes each token to a small subset of its experts, delivering high-quality reasoning at practical latency.
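The routing idea can be sketched as follows. This is a minimal illustration of top-k expert routing, not Mixtral's actual architecture: the gate, expert shapes, and k value here are arbitrary assumptions for demonstration.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Sparse MoE sketch: each token is sent only to its top-k experts."""
    logits = x @ gate_w                          # (tokens, n_experts) gate scores
    topk = np.argsort(logits, axis=-1)[:, -k:]   # indices of the k best experts per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                             # softmax over the selected experts only
        for weight, e in zip(w, topk[t]):
            out[t] += weight * experts[e](x[t])  # only k experts run per token
    return out

# Toy setup: 8 experts, each a random linear map (illustrative only).
rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 8, 4
gate_w = rng.normal(size=(d, n_experts))
experts = [(lambda W: (lambda v: v @ W))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
x = rng.normal(size=(tokens, d))
y = moe_forward(x, gate_w, experts, k=2)
```

Because only k of the n experts execute per token, compute per token stays close to a much smaller dense model while total parameter count remains large.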

Description

The MoE design gives Mixtral strong step-by-step analysis, broad multilingual coverage, and robust coding ability while keeping serving cost under control. It sustains long contexts, follows instructions closely, and can return structured JSON for automation. With function calling it can search, run tools, or query services in a loop, and it scales well in production since only a few experts activate per token.
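A function-calling request might be shaped like the payload below. This is a hedged sketch based on the common chat-completions pattern; the model identifier, the `get_weather` tool, and its schema are illustrative assumptions, not details taken from this page.

```python
import json

# Hypothetical tool definition; names and schema are examples only.
payload = {
    "model": "open-mixtral-8x22b",   # assumed model identifier
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",   # hypothetical tool
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    "tool_choice": "auto",           # let the model decide when to call the tool
}

body = json.dumps(payload)           # serialized request body
```

In the agent loop described above, the model's reply would name a tool and arguments; the caller executes the tool, appends the result as a message, and sends the conversation back until the model produces a final answer.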

About Mistral AI

Mistral AI is a Paris-based company that develops open-weight large language models and related machine learning tools.

Industry: Technology, Information and Internet
Company Size: 11-50
Location: Paris, FR
Website: mistral.ai

Last updated: November 17, 2025