Xiaohongshu: 2025 dots.llm1 Technical Report (English version) (21 pages).pdf


dots.llm1 Technical Report
rednote-hilab
https://huggingface.co/rednote-hilab
2025-06-06

Abstract

Mixture of Experts (MoE) models have emerged as a promising paradigm for scaling language models efficiently by activating only a subset of parameters for each input token. In this report, we present dots.llm1, a large-scale MoE model that activates 14 billion parameters out of a total of 142 billion parameters, delivering performance on par with state-of-the-art models while reducing training and inference costs. Leveraging our meticulously crafted and efficient data processing pipeline, dots.llm1 achieves performance comparable to Qwen2.5-72B after pretraining on 11.2T high-quality tokens and post-training to fully unlock its capabilities. Notably, no synthetic data is used during pretraining. To foster further research, we open-source intermediate training checkpoints at every one trillion tokens, providing valuable insights into the learning dynamics of large language models.

[Figure 1: Performance (% MMLU-Pro) versus cost (billion active parameters) of open MoE and dense language models. Circles denote dense models, while diamonds denote MoE models; compared models include Qwen2.5-7B/14B/32B/72B, Llama3-70B, DeepSeek-V2, DeepSeek-V3, Qwen3-235B-A22B, and Qwen2-57B-A14B. We benchmark model capabilities using MMLU-Pro, showing that dots.llm1 achieves comparable accuracy to leading models at the best performance/cost ratio.]

1 Introduction

Large Language Models (LLMs) have undergone rapid advancements in recent years, moving closer to the goal of Artificial General Intelligence (AGI), as evidenced by substantial progress (OpenAI, 2025a;b; Anthropic, 2025; xAI, 2025). Parallel to these proprietary developments, the open-source community is also achieving remarkable breakthroughs (Qwen, 2024a; DeepSeek-AI et al., 2024; Mist…
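The abstract's core mechanism, activating only a subset of experts per input token so that compute scales with the active parameters (14B) rather than the total (142B), can be illustrated with a minimal top-k routing sketch. This is an assumption-laden toy, not dots.llm1's actual architecture: the expert count, k, dimensions, and the plain linear "experts" are all illustrative placeholders.

```python
# Minimal sketch of Mixture-of-Experts routing: a gating network scores
# all experts for each token, but only the top-k experts actually run.
# All sizes here (D, N_EXPERTS, TOP_K) are illustrative, not the
# report's real configuration.
import numpy as np

rng = np.random.default_rng(0)

D, N_EXPERTS, TOP_K = 8, 4, 2

# Each "expert" is a toy linear layer (a weight matrix).
experts = [rng.standard_normal((D, D)) * 0.1 for _ in range(N_EXPERTS)]
gate_w = rng.standard_normal((D, N_EXPERTS)) * 0.1  # gating network

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(token):
    """Route one token through its top-k experts and mix their outputs."""
    scores = softmax(token @ gate_w)           # score every expert
    top = np.argsort(scores)[-TOP_K:]          # indices of the top-k experts
    weights = scores[top] / scores[top].sum()  # renormalize over chosen experts
    # Only TOP_K expert matmuls execute; the rest are skipped entirely.
    out = sum(w * (token @ experts[i]) for i, w in zip(top, weights))
    return out, sorted(top.tolist())

token = rng.standard_normal(D)
out, active = moe_forward(token)
print(active)      # which TOP_K of the N_EXPERTS experts ran
print(out.shape)   # (8,)
```

Scaling the toy numbers up to the report's figures (14B active of 142B total) gives the same effect: per-token compute tracks the activated subset, which is why the paper plots cost as billion *active* parameters in Figure 1.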
