Multiple beams translate & evaluation with BLEU #6

Open
trynusnick13 wants to merge 9 commits into lang-uk:main from trynusnick13:main

Conversation

@trynusnick13

No description provided.

with open('flores-eng-devtest.csv', 'w') as csvfile:
    writer = csv.DictWriter(csvfile, fieldnames=["eng_Latn-ukr_Cyrl"])
    writer.writeheader()
    for domain in list_of_emails:

Collaborator

list_of_emails?

eng = devtest["sentence_eng_Latn"]

def write_to_csv(list_of_emails):
    with open('flores-eng-devtest.csv', 'w') as csvfile:
        writer = csv.DictWriter(csvfile, fieldnames=["eng_Latn-ukr_Cyrl"])

Collaborator

fieldnames looks wrong.
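
A sketch that would address both comments above (the two-column layout and the row shape are assumptions, since the full function isn't in the diff):

import csv

def write_to_csv(translations):
    # Assumed: `translations` is a list of dicts keyed by the two columns,
    # e.g. {"eng_Latn": "...", "ukr_Cyrl": "..."}.
    with open('flores-eng-devtest.csv', 'w', newline='') as csvfile:
        writer = csv.DictWriter(csvfile, fieldnames=["eng_Latn", "ukr_Cyrl"])
        writer.writeheader()
        for row in translations:
            writer.writerow(row)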


@app.command()
def eval_model_multpl_beams_ready_prep(
    source_file_path: Annotated[str, typer.Option()],

Collaborator

Gosh, some docstrings are needed.


Author

ahahhaah, agree
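
For instance, a docstring on the command could look like this (a sketch; only the first parameter appears in the diff, and the wording is guessed from the names):

from typing import Annotated

import typer

app = typer.Typer()

@app.command()
def eval_model_multpl_beams_ready_prep(
    source_file_path: Annotated[str, typer.Option()],
):
    """Translate the sentences in source_file_path with multiple beams
    and evaluate the hypotheses with BLEU.

    Args:
        source_file_path: Path to a file with one source sentence per line.
    """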

)
source_sentences = []
with open(preprocessed_file_path) as f:
    source_sentences = f.readlines()

Collaborator

In [1]: with open("/tmp/foo", "r") as fp_on:
   ...:     lines = fp_on.readlines()
   ...:

In [2]: lines
Out[2]: ['1\n', '2\n', '3\n', '4\n']

you might want to strip newlines.
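
For example (a sketch): read() followed by splitlines() gives the same lines without the trailing "\n":

with open(preprocessed_file_path) as f:
    source_sentences = f.read().splitlines()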

    all_prompts.append(translation_prompt)
print(f"Max tokens = {max(all_token_counts)}")
inputs = tokenizer(all_prompts, return_tensors="pt", padding=True)
model.to("cuda")

Collaborator
@proger Feb 23, 2024

Loading the model to cuda later is slower than loading it directly to cuda. Check out this patch: 33b3774
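
A minimal sketch of the idea, assuming the model is loaded with transformers' from_pretrained (the model name is a placeholder, and the linked patch may do it differently):

import torch
from transformers import AutoModelForCausalLM

# Put the weights on the GPU at load time instead of loading on CPU
# first and then moving them with model.to("cuda"):
model = AutoModelForCausalLM.from_pretrained(
    "model-name",  # placeholder
    torch_dtype=torch.float16,
    device_map="cuda",  # needs the accelerate package installed
)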
