Summary of Open-Source Models

1. DeepSeek

| Model | #Total Params | #Activated Params | Context Length |
|---|---|---|---|
| DeepSeek-V3-Base | 671B | 37B | 128K |
| DeepSeek-V3 | 671B | 37B | 128K |

DeepSeek-V3 uses an MoE (Mixture-of-Experts) architecture: the model has 671B parameters in total, but only 37B of them are activated for each token during inference.
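
A minimal sketch of the top-k routing idea behind that gap between total and activated parameters: each token is routed to only a few experts, so most expert weights sit idle on any given forward pass. The layer sizes and expert counts below are illustrative, not DeepSeek-V3's actual configuration.

```python
# Toy top-k MoE layer: each token uses only k of n_experts expert FFNs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=64, d_ff=128, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # gating network scores every expert
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x):                                    # x: [tokens, d_model]
        scores = self.router(x)                              # [tokens, n_experts]
        topk_scores, topk_idx = scores.topk(self.k, dim=-1)  # keep only the k best experts per token
        weights = F.softmax(topk_scores, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = topk_idx[:, slot] == e                # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

x = torch.randn(4, 64)
print(TopKMoE()(x).shape)  # torch.Size([4, 64]); only 2 of the 8 experts run per token
```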

DeepSeek-R1 Models

| Model | #Total Params | #Activated Params | Context Length | Download |
|---|---|---|---|---|
| DeepSeek-R1-Zero | 671B | 37B | 128K | 🤗 HuggingFace |
| DeepSeek-R1 | 671B | 37B | 128K | 🤗 HuggingFace |

DeepSeek-R1-Distill Models

| Model | Base Model | Download |
|---|---|---|
| DeepSeek-R1-Distill-Qwen-1.5B | Qwen2.5-Math-1.5B | 🤗 HuggingFace |
| DeepSeek-R1-Distill-Qwen-7B | Qwen2.5-Math-7B | 🤗 HuggingFace |
| DeepSeek-R1-Distill-Llama-8B | Llama-3.1-8B | 🤗 HuggingFace |
| DeepSeek-R1-Distill-Qwen-14B | Qwen2.5-14B | 🤗 HuggingFace |
| DeepSeek-R1-Distill-Qwen-32B | Qwen2.5-32B | 🤗 HuggingFace |
| DeepSeek-R1-Distill-Llama-70B | Llama-3.3-70B-Instruct | 🤗 HuggingFace |

Business-friendly license: free commercial use is permitted.
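
Because the distilled checkpoints keep their dense Qwen2.5 / Llama architectures, they can be loaded with stock Hugging Face transformers. A minimal sketch, assuming transformers, torch, and accelerate are installed; the repo id follows the table above, and the prompt and generation settings are illustrative.

```python
# Minimal sketch: run a distilled R1 checkpoint with Hugging Face transformers.
# Assumes `pip install transformers torch accelerate`; repo id follows the table above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "What is 17 * 23? Think step by step."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```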

2. Google Gemma

3. Mistral & Mixtral

4. Kimi-K2

| Field | Value |
|---|---|
| Architecture | Mixture-of-Experts (MoE) |
| Total Parameters | 1T |
| Activated Parameters | 32B |
| Number of Layers (Dense layer included) | 61 |
| Number of Dense Layers | 1 |
| Attention Hidden Dimension | 7168 |
| MoE Hidden Dimension (per Expert) | 2048 |
| Number of Attention Heads | 64 |
| Number of Experts | 384 |
| Selected Experts per Token | 8 |
| Number of Shared Experts | 1 |
| Vocabulary Size | 160K |
| Context Length | 128K |
| Attention Mechanism | MLA |
| Activation Function | SwiGLU |
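
A rough back-of-the-envelope check on how these numbers yield roughly 1T total but only about 32B activated parameters per token. The sketch counts only the SwiGLU expert FFNs; attention, embeddings, and the single dense layer are omitted, so the activated figure comes out somewhat below the official 32B.

```python
# Rough parameter arithmetic from the spec table above (expert FFNs only; an approximation).
d_model, d_expert = 7168, 2048       # Attention hidden dim, MoE hidden dim per expert
n_layers, n_dense = 61, 1            # 61 layers total, of which 1 is dense (no experts)
n_experts, k, n_shared = 384, 8, 1   # routed experts, experts selected per token, shared experts

params_per_expert = 3 * d_model * d_expert   # SwiGLU FFN: gate, up, and down projections
moe_layers = n_layers - n_dense

total_expert_params = moe_layers * n_experts * params_per_expert
active_expert_params = moe_layers * (k + n_shared) * params_per_expert

print(f"expert params, total:     {total_expert_params / 1e12:.2f}T")  # ~1.01T
print(f"expert params, per token: {active_expert_params / 1e9:.1f}B")  # ~23.8B; attention etc. bring it near 32B
```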
Released under the permissive MIT license, with the following modification:
  • Trigger conditions (meeting either one is enough):
    1. Your commercial product or service exceeds 100 million monthly active users (MAU), or
    2. Your commercial product or service exceeds USD 20 million (or the equivalent in another currency) in monthly revenue.
  • Obligation: once you reach that scale, you must prominently display "Kimi K2" in the user interface (UI) of the product or service.

5. Qwen

| Model | Release Date | Max Length | # of Pretrained Tokens | Min GPU Memory for Finetuning (Q-LoRA) | Min GPU Memory for Generating 2048 Tokens (Int4) |
|---|---|---|---|---|---|
| Qwen-1.8B | 23.11.30 | 32K | 2.2T | 5.8GB | 2.9GB |
| Qwen-7B | 23.08.03 | 32K | 2.4T | 11.5GB | 8.2GB |
| Qwen-14B | 23.09.25 | 8K | 3.0T | 18.7GB | 13.0GB |
| Qwen-72B | 23.11.30 | 32K | 3.0T | 61.4GB | 48.9GB |
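
The Int4 column refers to running the model with 4-bit quantized weights. A minimal sketch of 4-bit loading via transformers and bitsandbytes, assuming those packages are installed; the repo id and prompt are illustrative, first-generation Qwen repos need trust_remote_code=True, and Qwen also publishes ready-made GPTQ Int4 checkpoints as an alternative.

```python
# Minimal sketch: load a Qwen checkpoint with 4-bit weights (transformers + bitsandbytes).
# Assumes `pip install transformers accelerate bitsandbytes`; repo id is illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Qwen/Qwen-7B-Chat"
bnb_config = BitsAndBytesConfig(load_in_4bit=True)   # quantize weights to 4 bits at load time

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",           # place layers on the available GPU(s)
    trust_remote_code=True,      # first-generation Qwen repos ship custom modeling code
)

prompt = "Give me a short introduction to large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```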

Business-friendly license; commercial use is permitted.

6. Meta Llama

| Model | Launch date | Model sizes | Context Length | Tokenizer | Acceptable use policy | License | Model Card |
|---|---|---|---|---|---|---|---|
| Llama 2 | 7/18/2023 | 7B, 13B, 70B | 4K | Sentencepiece | Use Policy | License | Model Card |
| Llama 3 | 4/18/2024 | 8B, 70B | 8K | TikToken-based | Use Policy | License | Model Card |
| Llama 3.1 | 7/23/2024 | 8B, 70B, 405B | 128K | TikToken-based | Use Policy | License | Model Card |
| Llama 3.2 | 9/25/2024 | 1B, 3B | 128K | TikToken-based | Use Policy | License | Model Card |
| Llama 3.2-Vision | 9/25/2024 | 11B, 90B | 128K | TikToken-based | Use Policy | License | Model Card |
| Llama 3.3 | 12/04/2024 | 70B | 128K | TikToken-based | Use Policy | License | Model Card |
| Llama 4 | 4/5/2025 | Scout-17B-16E, Maverick-17B-128E | 10M, 1M | TikToken-based | Use Policy | License | Model Card |

The license is restricted: services with more than 700 million monthly active users may not use the models without a separate grant from Meta. From the Additional Commercial Terms: "If, on the Llama 4 version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee's affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from Meta, which Meta may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until Meta otherwise expressly grants you such rights."