Commit 9a8f29c

fix config
1 parent 0c92f36 commit 9a8f29c

File tree

2 files changed, +12 −14 lines changed

Lines changed: 9 additions & 12 deletions

@@ -1,13 +1,10 @@
-{
-  "architectures": [
-    "LlamaForCausalLM"
-  ],
-  "model_type": "llama",
-  "hidden_size": 1024,
-  "intermediate_size": 2688,
-  "num_attention_heads": 16,
-  "num_hidden_layers": 12,
-  "use_cache": false,
-  "rms_norm_eps": 1e-05
-}
 
+{
+  "name": "llama150m",
+  "n_embd": 1024,
+  "intermediate_size": 4096,
+  "n_head": 16,
+  "n_layer": 12,
+  "vocab_size": 32000,
+  "block_size": 1024
+}

open_diloco/configs/config_1b.json

Lines changed: 3 additions & 2 deletions

@@ -1,9 +1,10 @@
 {
-  "name": "llama",
+  "name": "llama1b",
   "n_embd": 2048,
   "intermediate_size": 5632,
   "n_head": 32,
   "n_layer": 22,
   "n_query_groups": 4,
-  "vocab_size": 1024
+  "vocab_size": 32000,
+  "block_size": 1024
 }
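For illustration, the updated 1B config can be sanity-checked in Python. The dict below mirrors the post-commit contents of `open_diloco/configs/config_1b.json`; the rough parameter estimate is an assumption on my part (a standard Llama-style block with grouped-query attention, implied by `n_query_groups`, and a three-projection SwiGLU MLP), not something stated in the commit:

```python
import json

# Post-commit contents of open_diloco/configs/config_1b.json
config_1b = {
    "name": "llama1b",
    "n_embd": 2048,
    "intermediate_size": 5632,
    "n_head": 32,
    "n_layer": 22,
    "n_query_groups": 4,
    "vocab_size": 32000,
    "block_size": 1024,
}

def approx_params(cfg):
    """Rough Llama-style parameter count (assumed architecture: GQA
    attention plus a SwiGLU MLP; norms and biases ignored)."""
    d = cfg["n_embd"]
    head_dim = d // cfg["n_head"]
    kv_dim = cfg["n_query_groups"] * head_dim  # shared k/v heads under GQA
    attn = d * d + 2 * d * kv_dim + d * d      # q, k, v, o projections
    mlp = 3 * d * cfg["intermediate_size"]     # gate, up, down projections
    embed = cfg["vocab_size"] * d              # token embedding table
    return cfg["n_layer"] * (attn + mlp) + embed

print(json.dumps(config_1b, indent=2))
print(f"approx params: {approx_params(config_1b) / 1e9:.2f}B")
```

Under these assumptions the config lands at roughly 1.0B parameters, consistent with the new `"llama1b"` name, and the corrected `"vocab_size": 32000` matches the standard Llama tokenizer rather than the previous (likely erroneous) 1024.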
