Pull request #9: example
by zym11111 - opened

Files changed:
- README.md +1 -3
- README_ZH.md +1 -1
README.md
CHANGED

@@ -45,8 +45,6 @@ default_config_name: UltraData-Math-L3-Conversation-Synthetic
 
 ***UltraData-Math*** is a large-scale, high-quality mathematical pre-training dataset totaling **290B+ tokens** across three progressive tiers—**L1** (170.5B tokens web corpus), **L2** (33.7B tokens quality-selected), and **L3** (88B tokens multi-format refined)—designed to systematically enhance mathematical reasoning in LLMs. It has been applied to the mathematical pre-training of the [MiniCPM Series](https://huggingface.co/collections/openbmb/minicpm4) models.
 
-It was introduced in the paper [Data Science and Technology Towards AGI Part I: Tiered Data Management](https://huggingface.co/papers/2602.09003).
-
 ## 🆕 What's New
 
 - **[2026.02.09]**: **UltraData-Math**, a large-scale high-quality mathematical pre-training dataset with 290B+ tokens across three progressive tiers (L1/L2-preview/L3), is now available on Hugging Face. Released as part of the [UltraData](https://ultradata.openbmb.cn/) ecosystem. 🔥🔥🔥
@@ -208,7 +206,7 @@ If you find **UltraData-Math** useful in your research, please consider citing:
 ```bibtex
 @misc{ultradata-math,
 title={UltraData-Math},
-author={
+author={UltraData Team},
 year={2026},
 url={https://huggingface.co/datasets/openbmb/UltraData-Math},
 publisher={Hugging Face}
README_ZH.md
CHANGED

@@ -166,7 +166,7 @@ ds = load_dataset("openbmb/UltraData-Math", "UltraData-Math-L3-Conversation-Synthetic")
 ```bibtex
 @misc{ultradata-math,
 title={UltraData-Math},
-author={
+author={UltraData Team},
 year={2026},
 url={https://huggingface.co/datasets/openbmb/UltraData-Math},
 publisher={Hugging Face}