Commit 78b5bb3

Update generative-proof-of-concept-CPU-preprocessing-in-memory.py
Remove the duplicative batch_size argument when training on an already-batched dataset.
1 parent 2e4e716 commit 78b5bb3

File tree

1 file changed: +3 −4 lines changed

generative-proof-of-concept-CPU-preprocessing-in-memory.py

Lines changed: 3 additions & 4 deletions
@@ -1405,14 +1405,13 @@ def create_dataset(raw_text_samples, tokenizer, sample_expansion_batch_size=10)
 phase_i_b_dataset = create_dataset(raw_text_samples=phase_i_b_samples, tokenizer=tokenizer, sample_expansion_batch_size=10)
 
 
-
 phase_i_b_history =\
     generator.model.fit(
         # best_model_found.fit(
         x=phase_i_b_dataset,
-        epochs=phase_i_b_epochs,
-        batch_size=batch_size) # ,
-        # validation_split=0.2)
+        epochs=phase_i_b_epochs)
+        # batch_size=batch_size)
+
 
 phase_i_b_history =\
     pd.DataFrame(phase_i_b_history.history)
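The change matters because create_dataset already returns a batched dataset; in Keras, model.fit does not accept an explicit batch_size for dataset inputs (recent versions raise a ValueError for it), since the dataset defines its own batching. A minimal pure-Python sketch of why batching twice is wrong (no TensorFlow here; batch is a hypothetical stand-in for dataset batching):

```python
def batch(items, batch_size):
    """Split a sequence into fixed-size batches (last batch may be short)."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

samples = list(range(8))

# Batching once, as create_dataset already does, yields batches of samples.
dataset = batch(samples, 4)
print(dataset)          # [[0, 1, 2, 3], [4, 5, 6, 7]]

# Passing a batch size again batches the batches: the model would then see
# one "sample" per batch whose shape is a whole batch of batches.
double_batched = batch(dataset, 4)
print(double_batched)   # [[[0, 1, 2, 3], [4, 5, 6, 7]]]
```

Dropping batch_size from the fit call, as this commit does, leaves the dataset's own batching as the single source of truth.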
