
Commit a9ff599

fix and clean sem_eval_2014 template (#697)
* fix and clean sem_eval_2014 template

* some fixes

Co-authored-by: Victor Sanh <[email protected]>
1 parent: 3943265

1 file changed


promptsource/templates/sem_eval_2014_task_1/templates.yaml

Lines changed: 44 additions & 34 deletions
@@ -6,83 +6,93 @@ templates:
     jinja: 'Does the premise: "{{premise}}" agree with the hypothesis: "{{hypothesis}}"
       ? ||| {{answer_choices[entailment_judgment]}}'
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: false
+      metrics:
+      - Accuracy
       original_task: true
-    name: entailment_basic_3
+    name: premise_agree_hypothesis
     reference: ''
   2aa091cb-02ff-4c8c-964c-4c5e53df8c1b: !Template
     answer_choices: null
     id: 2aa091cb-02ff-4c8c-964c-4c5e53df8c1b
-    jinja: "How related are the two sentences : \"{{hypothesis}}\" and \"{{premise}}\"\
-      \ ? Rate it from 1-5. \n||| {{(((10*relatedness_score)|round)/10)}}"
+    jinja: 'How related are the two sentences : "{{hypothesis}}" and "{{premise}}"
+      ? Rate it from 1-5, where 1 is completely unrelated and 5 is very related.
+
+      ||| {{(((10*relatedness_score)|round)/10)}}'
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: false
+      metrics:
+      - Pearson correlation
+      - Spearman correlation
+      - Mean Squared Error
       original_task: true
-    name: relatedness_basic_2
+    name: related_rate
     reference: ''
   75203dd2-5ec3-4e91-b95f-228ad9bd2010: !Template
     answer_choices: neither ||| entailing ||| contradicting
     id: 75203dd2-5ec3-4e91-b95f-228ad9bd2010
     jinja: "Sentence 1: \"{{hypothesis}}\" \nSentence 2: \"{{premise}}\"\nAre the\
-      \ two sentences {{\"entailing\"}} or {{\"contradicting\"}} each other?\n|||\
-      \ {{answer_choices[entailment_judgment]}}"
+      \ two sentences {{answer_choices[1]}} or {{answer_choices[2]}} each other? If\
+      \ none of these options are valid, answer \"{{answer_choices[0]}}\".\n||| {{answer_choices[entailment_judgment]}}"
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: true
+      metrics:
+      - Accuracy
       original_task: true
-    name: entailment_basic_2
+    name: entailing_or_contradicting
     reference: ''
   892c58fd-64f5-4059-8fb8-c74bc025ff40: !Template
     answer_choices: Neutral ||| Entailment ||| Contradiction
     id: 892c58fd-64f5-4059-8fb8-c74bc025ff40
     jinja: "Given the following hypothesis: {{hypothesis}}.\nAs well as the premise:\
       \ {{premise}}, \nPredict the Entailment relation between the premise and hypothesis\
-      \ from the labels {{\"Neutral\"}}, {{\"Entailment\"}}, {{ \"Contradiction\"\
-      }} |||\n {{answer_choices[entailment_judgment]}}\n"
+      \ from the labels {{answer_choices[0]}}, {{answer_choices[1]}}, {{answer_choices[2]}}\
+      \ |||\n {{answer_choices[entailment_judgment]}}\n"
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: true
+      metrics:
+      - Accuracy
       original_task: true
-    name: entailment_basic_1
+    name: entailment_relation
     reference: ''
   91a6b1db-be59-41bd-9eea-73bb7a4e7350: !Template
-    answer_choices: neither entails nor contradicts ||| entails ||| contradicts
+    answer_choices: Neutral ||| Entailment ||| Contradiction
     id: 91a6b1db-be59-41bd-9eea-73bb7a4e7350
     jinja: 'Given the hypothesis: {{hypothesis}} and the premise: {{premise}}. Out
-      of the options, {{"neither entails nor contradicts"}}, {{"entails"}} and {{
-      "contradicts"}}, the hypothesis ||| {{answer_choices[entailment_judgment]}}
-      the premise.
-
-      '
+      of the options, {{answer_choices[0]}}, {{answer_choices[1]}} and {{answer_choices[2]}}
+      what is the entailment label? ||| {{answer_choices[entailment_judgment]}}'
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: true
+      metrics:
+      - Accuracy
       original_task: true
-    name: entailment_localization_1
+    name: entailment_label
     reference: ''
   a58fe8b4-f185-46a9-8fca-6dc66d0812be: !Template
     answer_choices: null
     id: a58fe8b4-f185-46a9-8fca-6dc66d0812be
     jinja: "Given the following hypothesis: {{hypothesis}}.\nAs well as the premise:\
-      \ {{premise}}, \nGive a score on how related the hypothesis and premise was,\
+      \ {{premise}}, \nGive a score on how related the hypothesis and premise were,\
       \ from the scale 1 to 5, where 1 is completely unrelated and 5 is very related:\
       \ ||| {{(((10*relatedness_score)|round)/10)}}\n\n"
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: false
+      metrics:
+      - Pearson correlation
+      - Spearman correlation
+      - Mean Squared Error
       original_task: true
-    name: relatedness_basic_1
+    name: related_score
     reference: ''
   d9380ec0-18b3-48b2-99eb-9f9cb47ab7c7: !Template
     answer_choices: unclear ||| yes ||| no
     id: d9380ec0-18b3-48b2-99eb-9f9cb47ab7c7
     jinja: Does {{premise}} imply that {{hypothesis}}? Please answer yes, no, or
       unclear. ||| {{answer_choices[entailment_judgment]}}
     metadata: !TemplateMetadata
-      choices_in_prompt: null
-      metrics: []
+      choices_in_prompt: true
+      metrics:
+      - Accuracy
       original_task: true
-    name: entailment_basic_4
+    name: premise_imply_hypothesis
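
Note: the recurring change in the entailment templates is that option strings are now pulled from the template's answer_choices field via {{answer_choices[i]}} instead of being hard-coded, so the options named in the prompt can never drift from the target. A minimal sketch of how one updated template renders, using plain jinja2 rather than promptsource's own Template class; the sentence pair is made up, and the 0/1/2 label convention is an assumption inferred from the answer_choices ordering (neutral, entailment, contradiction):

from jinja2 import Template  # pip install jinja2

# "premise_imply_hypothesis" as it reads after this commit. promptsource stores
# answer_choices as a single "|||"-separated string; split it the same way here.
answer_choices = [c.strip() for c in "unclear ||| yes ||| no".split("|||")]

jinja_str = (
    "Does {{premise}} imply that {{hypothesis}}? Please answer yes, no, or "
    "unclear. ||| {{answer_choices[entailment_judgment]}}"
)

rendered = Template(jinja_str).render(
    premise="A man is eating.",        # hypothetical SICK-style sentence pair
    hypothesis="Somebody is eating.",
    entailment_judgment=1,             # assumed label order: 0 neutral, 1 entailment, 2 contradiction
    answer_choices=answer_choices,
)
prompt, target = [part.strip() for part in rendered.split("|||")]
print(prompt)  # the input shown to the model
print(target)  # -> "yes"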

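The two relatedness templates keep the target expression {{(((10*relatedness_score)|round)/10)}}, which rounds the gold 1-5 score to one decimal place. A quick check of just that expression with plain jinja2 (the score values are invented):

from jinja2 import Template  # pip install jinja2

# Target expression shared by "related_rate" and "related_score": multiply by 10,
# round to the nearest integer, divide by 10 -> one decimal place.
target = Template("{{(((10*relatedness_score)|round)/10)}}")

for score in (4.666, 3.04, 1.0):  # made-up relatedness scores on the 1-5 scale
    print(score, "->", target.render(relatedness_score=score))
# 4.666 -> 4.7
# 3.04 -> 3.0
# 1.0 -> 1.0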