Commit cfe4a3c

docs: fix minor typo in README

1 parent c45a4b8
1 file changed, +4 -4 lines changed

README.md

Lines changed: 4 additions & 4 deletions
@@ -95,7 +95,7 @@ python scripts/train.py configs/tiny/OLMo-20M.yaml --save_overwrite
 
 ### Stage 1
 
-Stage 1 is the biggest stage, where we train on 4T or 5T tokens on largely web-based data.
+Stage 1 is the biggest stage, where we train on 4T or 5T tokens on largely web-based data.
 
 | | OLMo2 1B | OLMo2 7B | OLMo2 13B |
 |-----------------|-----------------|-------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------|
@@ -111,7 +111,7 @@ For the 1B model, we have trained three times with different data order on 50B h
 
 | | Checkpoint | Training config | WandB |
 |------------------------|-------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------|-------------|
-| random seed 42049 | [stage2-ingredient1-step23852-tokens50B](https://huggingface.co/allenai/OLMo-2-0425-1B/tree/stage2-ingredient1-step23852-tokens51B) | [OLMo2-1B-stage2-seed42069.yaml](configs/official-0425/OLMo2-1B-stage2-seed42069.yaml) | [wandb.ai/OLMo2-1B](https://api.wandb.ai/links/ai2-llm/izdtrtu0) |
+| random seed 42069 | [stage2-ingredient1-step23852-tokens50B](https://huggingface.co/allenai/OLMo-2-0425-1B/tree/stage2-ingredient1-step23852-tokens51B) | [OLMo2-1B-stage2-seed42069.yaml](configs/official-0425/OLMo2-1B-stage2-seed42069.yaml) | [wandb.ai/OLMo2-1B](https://api.wandb.ai/links/ai2-llm/izdtrtu0) |
 | random seed 666 | [stage2-ingredient2-step23852-tokens50B](https://huggingface.co/allenai/OLMo-2-0425-1B/tree/stage2-ingredient2-step23852-tokens51B) | [OLMo2-1B-stage2-seed666.yaml](configs/official-0425/OLMo2-1B-stage2-seed666.yaml) | [wandb.ai/OLMo2-1B](https://api.wandb.ai/links/ai2-llm/izdtrtu0) |
 | random seed 42 | [stage2-ingredient3-step23852-tokens50B](https://huggingface.co/allenai/OLMo-2-0425-1B/tree/stage2-ingredient3-step23852-tokens51B) | [OLMo2-1B-stage2-seed42.yaml](configs/official-0425/OLMo2-1B-stage2-seed42.yaml) | [wandb.ai/OLMo2-1B](https://api.wandb.ai/links/ai2-llm/izdtrtu0) |
 | **final souped model** | [main](https://huggingface.co/allenai/OLMo-2-0425-1B/tree/main) | no config, we just averaged the weights in Python |
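
The "final souped model" row in this hunk says the release checkpoint was produced by simply averaging the weights of the three seed runs in Python. The actual souping script is not part of this commit; the following is only a rough sketch of what such averaging could look like, with the revision names taken from the checkpoint links above and every other detail (loading via `transformers`, output path) assumed for illustration.

```python
# Hypothetical sketch of "averaging the weights in Python" (model souping).
# Assumes the three stage-2 seed checkpoints share the same architecture,
# so their state dicts have identical keys and tensor shapes.
import torch
from transformers import AutoModelForCausalLM

revisions = [
    "stage2-ingredient1-step23852-tokens51B",
    "stage2-ingredient2-step23852-tokens51B",
    "stage2-ingredient3-step23852-tokens51B",
]

# Load the three seed runs from the Hugging Face repo referenced in the table.
models = [
    AutoModelForCausalLM.from_pretrained("allenai/OLMo-2-0425-1B", revision=rev)
    for rev in revisions
]

# Average every parameter/buffer tensor across the three runs.
souped_state = {
    key: torch.stack([m.state_dict()[key].float() for m in models]).mean(dim=0)
    for key in models[0].state_dict()
}

# Write the averaged weights back into one model and save the "soup".
soup = models[0]
soup.load_state_dict(souped_state)
soup.save_pretrained("olmo2-1b-souped")  # assumed output directory
```
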
@@ -235,12 +235,12 @@ where `body.json` is of the form:
 
 ```bibtex
 @misc{olmo20242olmo2furious,
-title={2 OLMo 2 Furious},
+title={2 OLMo 2 Furious},
 author={Team OLMo and Pete Walsh and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Shane Arora and Akshita Bhagia and Yuling Gu and Shengyi Huang and Matt Jordan and Nathan Lambert and Dustin Schwenk and Oyvind Tafjord and Taira Anderson and David Atkinson and Faeze Brahman and Christopher Clark and Pradeep Dasigi and Nouha Dziri and Michal Guerquin and Hamish Ivison and Pang Wei Koh and Jiacheng Liu and Saumya Malik and William Merrill and Lester James V. Miranda and Jacob Morrison and Tyler Murray and Crystal Nam and Valentina Pyatkin and Aman Rangapur and Michael Schmitz and Sam Skjonsberg and David Wadden and Christopher Wilhelm and Michael Wilson and Luke Zettlemoyer and Ali Farhadi and Noah A. Smith and Hannaneh Hajishirzi},
 year={2024},
 eprint={2501.00656},
 archivePrefix={arXiv},
 primaryClass={cs.CL},
-url={https://arxiv.org/abs/2501.00656},
+url={https://arxiv.org/abs/2501.00656},
 }
 ```
