Bot Info
Mixtral 8x7B
By Mistral AI
Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights from Mistral AI. It is a decoder-only model that picks from a set of 8 distinct groups of parameters. At every layer, for every token, a router network chooses two of these groups (the "experts") to process the token and combines their outputs.

This bot is powered by Mistral AI.
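As a rough illustration of the top-2 routing described above, here is a minimal PyTorch sketch. This is not Mixtral's actual implementation: the class name Top2MoELayer, the dimensions, and the expert MLP shape are all hypothetical placeholders; only the routing pattern (score 8 experts, keep the top 2, combine their outputs with renormalized softmax weights) follows the description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoELayer(nn.Module):
    """Hypothetical sketch of a top-2 mixture-of-experts feed-forward layer.

    For every token, a learned router scores 8 expert MLPs, the top 2
    are selected, and their outputs are combined, weighted by the
    router's renormalized softmax probabilities.
    """

    def __init__(self, d_model: int = 64, d_ff: int = 256, n_experts: int = 8):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)  # one score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model); the router picks 2 of 8 experts per token.
        logits = self.router(x)                          # (n_tokens, n_experts)
        weights, idx = torch.topk(logits, k=2, dim=-1)   # top-2 experts per token
        weights = F.softmax(weights, dim=-1)             # renormalize over the chosen 2
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            for slot in range(2):
                mask = idx[:, slot] == e                 # tokens routed to expert e via this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

if __name__ == "__main__":
    layer = Top2MoELayer()
    tokens = torch.randn(5, 64)    # 5 tokens, d_model = 64
    print(layer(tokens).shape)     # torch.Size([5, 64])
```

Because only two of the eight expert MLPs run for any given token, only a fraction of the model's total parameters is active per token, which is what makes the mixture "sparse".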
