nohup.out
4087 lines (3728 loc) · 160 KB
###DEVICE: cuda:0
###multilingual-for-en-masking
### Your result would be saved to: /disk/data/models/results/framenet/enModel-with-exemplar/en_with_exem_for_en_with_masking_result.txt
# of instances in trn: 211812
# of instances in dev: 2272
# of instances in tst: 6714
data example: [['Greece', 'wildfires', 'force', 'thousands', 'to', '<tgt>', 'evacuate', '</tgt>'], ['_', '_', '_', '_', '_', '_', 'evacuate.v', '_'], ['_', '_', '_', '_', '_', '_', 'Escaping', '_'], ['O', 'O', 'O', 'B-Escapee', 'O', 'X', 'O', 'X']]
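The data example above is one instance as four parallel rows over the same token positions: tokens (with `<tgt>`/`</tgt>` delimiting the target), lexical units, frames, and BIO argument tags. A minimal sketch of unpacking that layout, using the English instance shown (the helper names are my own, not from the codebase):

```python
# Four parallel rows of one instance, copied from the data example above.
tokens = ['Greece', 'wildfires', 'force', 'thousands', 'to', '<tgt>', 'evacuate', '</tgt>']
lus    = ['_', '_', '_', '_', '_', '_', 'evacuate.v', '_']
frames = ['_', '_', '_', '_', '_', '_', 'Escaping', '_']
args   = ['O', 'O', 'O', 'B-Escapee', 'O', 'X', 'O', 'X']

# The LU and frame rows are non-'_' only at the target token; the <tgt>
# markers themselves carry the 'X' tag in the argument row.
target_idx = [i for i, lu in enumerate(lus) if lu != '_']
assert target_idx == [6]
assert frames[6] == 'Escaping'

# Collect argument spans from the BIO row: here a single one-token span.
spans = [(tag[2:], tokens[i]) for i, tag in enumerate(args) if tag.startswith('B-')]
assert spans == [('Escapee', 'thousands')]
```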
### EVALUATION
MODE: framenet
target LANGUAGE: en
trained LANGUAGE: en_with_exem
Viterbi: False
masking: True
using TGT token: True
model dir: /disk/data/models/framenet/enModel-with-exemplar/3/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/3/
/disk/data/models/framenet/enModel-with-exemplar/3/
...model is loaded
../kaiser/src/utils.py:269: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
pred_logits = sm(masked_logit).view(1,-1)
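The warning above points at the `masking: True` path: frame logits are restricted to the frames licensed for the target LU (the role of `mul_lufrmap.json`) before a softmax. A minimal sketch of that idea, with assumed shapes and hypothetical values; passing `dim=` explicitly is also the fix for the deprecation warning in the log:

```python
import torch

logits = torch.tensor([2.0, 1.0, 0.5, 3.0])   # scores over 4 hypothetical frames
licensed = [0, 2]                              # frames the LU-to-frame map allows

# Push unlicensed frames to -inf so they get zero probability mass.
masked_logit = torch.full_like(logits, float('-inf'))
masked_logit[licensed] = logits[licensed]

sm = torch.nn.Softmax(dim=0)                   # explicit dim: no deprecation warning
pred_logits = sm(masked_logit).view(1, -1)

assert pred_logits.argmax().item() == 0        # only licensed frames compete
assert abs(pred_logits.sum().item() - 1.0) < 1e-6
```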
# EPOCH: 3
SenseId Accuracy: 0.8941018766756033
ArgId Precision: 0.5715167412436353
ArgId Recall: 0.6554592107591577
ArgId F1: 0.6106165512693703
full-structure Precision: 0.6920410226780298
full-structure Recall: 0.753538848694558
full-structure F1: 0.7214818161283036
-----processing time: 0hour:8min:20sec
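The ArgId F1 reported in each epoch block is consistent with the harmonic mean of the corresponding precision and recall; a quick arithmetic check against the epoch-3 numbers above:

```python
# ArgId precision and recall for epoch 3, copied from the log above.
p, r = 0.5715167412436353, 0.6554592107591577

# F1 as the harmonic mean of precision and recall.
f1 = 2 * p * r / (p + r)
assert abs(f1 - 0.6106165512693703) < 1e-9   # matches the reported ArgId F1
```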
model dir: /disk/data/models/framenet/enModel-with-exemplar/15/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/15/
/disk/data/models/framenet/enModel-with-exemplar/15/
...model is loaded
# EPOCH: 15
SenseId Accuracy: 0.8993148644623176
ArgId Precision: 0.5709492947007243
ArgId Recall: 0.6625376039639002
ArgId F1: 0.6133431625506818
full-structure Precision: 0.6926425564485266
full-structure Recall: 0.7590961518297158
full-structure F1: 0.7243483916153883
-----processing time: 0hour:17min:1sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/12/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/12/
/disk/data/models/framenet/enModel-with-exemplar/12/
...model is loaded
# EPOCH: 12
SenseId Accuracy: 0.8975275543640155
ArgId Precision: 0.5727511186545287
ArgId Recall: 0.6568748894001062
ArgId F1: 0.6119353775140125
full-structure Precision: 0.6939011465459101
full-structure Recall: 0.7551640977246513
full-structure F1: 0.7232375979112271
-----processing time: 0hour:25min:44sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/1/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/1/
/disk/data/models/framenet/enModel-with-exemplar/1/
...model is loaded
# EPOCH: 1
SenseId Accuracy: 0.8772713732499255
ArgId Precision: 0.530532666274279
ArgId Recall: 0.6380286674924792
ArgId F1: 0.5793363862778179
full-structure Precision: 0.6570267239768267
full-structure Recall: 0.7372863583936248
full-structure F1: 0.6948465833292158
-----processing time: 0hour:34min:28sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/11/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/11/
/disk/data/models/framenet/enModel-with-exemplar/11/
...model is loaded
# EPOCH: 11
SenseId Accuracy: 0.8987190944295502
ArgId Precision: 0.5731047619047619
ArgId Recall: 0.6655459210759158
ArgId F1: 0.6158758750562903
full-structure Precision: 0.6938746234399655
full-structure Recall: 0.7607738282478766
full-structure F1: 0.7257858804111336
-----processing time: 0hour:43min:12sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/2/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/2/
/disk/data/models/framenet/enModel-with-exemplar/2/
...model is loaded
# EPOCH: 2
SenseId Accuracy: 0.887846291331546
ArgId Precision: 0.5565603923973023
ArgId Recall: 0.6425411431605026
ArgId F1: 0.5964681724845995
full-structure Precision: 0.6786569594788773
full-structure Recall: 0.7428436615287827
full-structure F1: 0.7093011613936725
-----processing time: 0hour:51min:57sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/4/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/4/
/disk/data/models/framenet/enModel-with-exemplar/4/
...model is loaded
# EPOCH: 4
SenseId Accuracy: 0.8932082216264522
ArgId Precision: 0.5758770201024832
ArgId Recall: 0.6463457795080517
ArgId F1: 0.6090799182890732
full-structure Precision: 0.69646955417745
full-structure Recall: 0.7477718360071302
full-structure F1: 0.7212095163451573
-----processing time: 1hour:0min:43sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/9/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/9/
/disk/data/models/framenet/enModel-with-exemplar/9/
...model is loaded
# EPOCH: 9
SenseId Accuracy: 0.8976764968722073
ArgId Precision: 0.586978797193978
ArgId Recall: 0.6589099274464697
ArgId F1: 0.6208678977864854
full-structure Precision: 0.7045387994143485
full-structure Recall: 0.7568417741428122
full-structure F1: 0.7297543221110101
-----processing time: 1hour:9min:30sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/17/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/17/
/disk/data/models/framenet/enModel-with-exemplar/17/
...model is loaded
# EPOCH: 17
SenseId Accuracy: 0.8987190944295502
ArgId Precision: 0.5778161280200815
ArgId Recall: 0.6517430543266678
ArgId F1: 0.6125571725571725
full-structure Precision: 0.6986194827921447
full-structure Recall: 0.7534864213064905
full-structure F1: 0.725016395096605
-----processing time: 1hour:18min:17sec
model dir: /disk/data/models/framenet/enModel-with-exemplar/10/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/enModel-with-exemplar/10/
/disk/data/models/framenet/enModel-with-exemplar/10/
...model is loaded
# EPOCH: 10
SenseId Accuracy: 0.8969317843312481
ArgId Precision: 0.5759600614439324
ArgId Recall: 0.6635108830295523
ArgId F1: 0.6166433681440672
full-structure Precision: 0.6953973287210531
full-structure Recall: 0.7588340148893782
full-structure F1: 0.72573204973927
-----processing time: 1hour:27min:4sec
=====================================================
Using TensorFlow backend.
Epoch: 0%| | 0/50 [00:00<?, ?it/s]
../kaiser/src/utils.py:269: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
pred_logits = sm(masked_logit).view(1,-1)
Epoch: 100%|██████████| 50/50 [17:59:08<00:00, 1294.96s/it]  (carriage-return progress updates for epochs 1-49 elided)
### Korean FrameNet ###
# contact: hahmyg@kaist, hahmyg@gmail.com #
### loading Korean FrameNet 1.1 data...
# of instances in training data: 17838
# of instances in dev data: 2548
# of instances in test data: 5097
# of instances in trn: 17838
# of instances in dev: 2548
# of instances in tst: 5097
data example: [['태풍', 'Hugo가', '남긴', '피해들과', '회사', '내', '몇몇', '주요', '부서들의', '저조한', '실적들을', '반영하여,', 'Aetna', 'Life', 'and', 'Casualty', 'Co.의', '3분기', '<tgt>', '순이익이', '</tgt>', '182.6', '백만', '달러', '또는', '주당', '1.63', '달러로', '22', '%', '하락하였다.'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '이익.n', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', 'Earnings_and_losses', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'B-Time', 'X', 'O', 'X', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']]
FrameBERT(ko)
### TRAINING
MODEL: framenet
LANGUAGE: multi
PRETRAINED BERT: bert-base-multilingual-cased
training data:
(ko): 17838
BATCH_SIZE: 6
MAX_LEN: 256
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
original model: BERT-multilingual-base
your model would be saved at /disk/data/models/framenet/koModel/
retrain: False
### converting data to BERT input...
...is done: 0hour:0min:25sec
#of instance: 17838 17838
Train loss: 2.195627450050828
your model is saved: /disk/data/models/framenet/koModel/0/
Train loss: 0.9128789130787516
your model is saved: /disk/data/models/framenet/koModel/1/
Train loss: 0.6664674858848655
your model is saved: /disk/data/models/framenet/koModel/2/
Train loss: 0.5661887224880164
your model is saved: /disk/data/models/framenet/koModel/3/
Train loss: 0.5070173120419966
your model is saved: /disk/data/models/framenet/koModel/4/
Train loss: 0.4668457638453523
your model is saved: /disk/data/models/framenet/koModel/5/
Train loss: 0.4304788384700599
your model is saved: /disk/data/models/framenet/koModel/6/
Train loss: 0.39627595108066327
your model is saved: /disk/data/models/framenet/koModel/7/
Train loss: 0.3640767958590349
your model is saved: /disk/data/models/framenet/koModel/8/
Train loss: 0.33188922202801646
your model is saved: /disk/data/models/framenet/koModel/9/
Train loss: 0.3016223385491497
your model is saved: /disk/data/models/framenet/koModel/10/
Train loss: 0.27034127037957173
your model is saved: /disk/data/models/framenet/koModel/11/
Train loss: 0.2416004812817305
your model is saved: /disk/data/models/framenet/koModel/12/
Train loss: 0.2153437199137461
your model is saved: /disk/data/models/framenet/koModel/13/
Train loss: 0.18907982187788575
your model is saved: /disk/data/models/framenet/koModel/14/
Train loss: 0.17079247454049118
your model is saved: /disk/data/models/framenet/koModel/15/
Train loss: 0.14815829313785248
your model is saved: /disk/data/models/framenet/koModel/16/
Train loss: 0.13343456607538137
your model is saved: /disk/data/models/framenet/koModel/17/
Train loss: 0.11397622590411544
your model is saved: /disk/data/models/framenet/koModel/18/
Train loss: 0.10243707968620976
your model is saved: /disk/data/models/framenet/koModel/19/
Train loss: 0.0902837888956204
your model is saved: /disk/data/models/framenet/koModel/20/
Train loss: 0.08178547522787091
your model is saved: /disk/data/models/framenet/koModel/21/
Train loss: 0.0721877051928208
your model is saved: /disk/data/models/framenet/koModel/22/
Train loss: 0.061517586808024695
your model is saved: /disk/data/models/framenet/koModel/23/
Train loss: 0.05559405874892217
your model is saved: /disk/data/models/framenet/koModel/24/
Train loss: 0.050362159264089275
your model is saved: /disk/data/models/framenet/koModel/25/
Train loss: 0.0470086468679877
your model is saved: /disk/data/models/framenet/koModel/26/
Train loss: 0.04111084096825857
your model is saved: /disk/data/models/framenet/koModel/27/
Train loss: 0.039848367908784384
your model is saved: /disk/data/models/framenet/koModel/28/
Train loss: 0.03650298340730382
your model is saved: /disk/data/models/framenet/koModel/29/
Train loss: 0.034098718919847434
your model is saved: /disk/data/models/framenet/koModel/30/
Train loss: 0.031007983965822394
your model is saved: /disk/data/models/framenet/koModel/31/
Train loss: 0.030955727452570463
your model is saved: /disk/data/models/framenet/koModel/32/
Train loss: 0.030489776774386208
your model is saved: /disk/data/models/framenet/koModel/33/
Train loss: 0.028268350774485997
your model is saved: /disk/data/models/framenet/koModel/34/
Train loss: 0.028346573842075874
your model is saved: /disk/data/models/framenet/koModel/35/
Train loss: 0.02795905820614942
your model is saved: /disk/data/models/framenet/koModel/36/
Train loss: 0.028662245595391815
your model is saved: /disk/data/models/framenet/koModel/37/
Train loss: 0.02748710535839607
your model is saved: /disk/data/models/framenet/koModel/38/
Train loss: 0.02609411200946188
your model is saved: /disk/data/models/framenet/koModel/39/
Train loss: 0.02682277116978963
your model is saved: /disk/data/models/framenet/koModel/40/
Train loss: 0.026324015897049715
your model is saved: /disk/data/models/framenet/koModel/41/
Train loss: 0.025035734170918204
your model is saved: /disk/data/models/framenet/koModel/42/
Train loss: 0.022774856962376005
your model is saved: /disk/data/models/framenet/koModel/43/
Train loss: 0.023382833836358446
your model is saved: /disk/data/models/framenet/koModel/44/
Train loss: 0.024599555740036503
your model is saved: /disk/data/models/framenet/koModel/45/
Train loss: 0.023319912206109922
your model is saved: /disk/data/models/framenet/koModel/46/
Train loss: 0.025143837982126303
your model is saved: /disk/data/models/framenet/koModel/47/
Train loss: 0.02504264497199141
your model is saved: /disk/data/models/framenet/koModel/48/
Train loss: 0.02309913792688522
your model is saved: /disk/data/models/framenet/koModel/49/
...training is done
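The "Train loss / your model is saved" pairs above follow a simple pattern: after every epoch the mean training loss is printed and a full checkpoint is written to a numbered subdirectory, one per epoch. A minimal sketch of that checkpointing scheme (hypothetical trainer code, not the actual implementation):

```python
import os
import tempfile

def save_checkpoint(base_dir, epoch):
    """Create (if needed) and return the per-epoch directory, e.g. koModel/7/."""
    path = os.path.join(base_dir, str(epoch)) + os.sep
    os.makedirs(path, exist_ok=True)
    return path

# One numbered directory per epoch, mirroring the log lines above.
base = tempfile.mkdtemp()
saved = [save_checkpoint(base, e) for e in range(3)]
print(f"your model is saved: {saved[-1]}")
```

Saving every epoch (rather than only the best) is what makes the later evaluation runs over directories 1, 2, 3, 9, 10, ... possible.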
### loading Korean FrameNet 1.1 data...
# of instances in training data: 17838
# of instances in dev data: 2548
# of instances in test data: 5097
# of instances in trn: 17838
# of instances in dev: 2548
# of instances in tst: 5097
data example: [['태풍', 'Hugo가', '남긴', '피해들과', '회사', '내', '몇몇', '주요', '부서들의', '저조한', '실적들을', '반영하여,', 'Aetna', 'Life', 'and', 'Casualty', 'Co.의', '3분기', '<tgt>', '순이익이', '</tgt>', '182.6', '백만', '달러', '또는', '주당', '1.63', '달러로', '22', '%', '하락하였다.'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '이익.n', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', 'Earnings_and_losses', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'B-Time', 'X', 'O', 'X', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']]
FineTuning Multilingual
### TRAINING
MODEL: framenet
LANGUAGE: multi
PRETRAINED BERT: bert-base-multilingual-cased
training data:
(ko): 17838
BATCH_SIZE: 6
MAX_LEN: 256
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
original model: /disk/data/models/dict_framenet/enModel-with-exemplar/9/
your model would be saved at /disk/data/models/framenet/mulModel-100/
Epoch: 0%| | 0/50 [00:00<?, ?it/s]
Epoch: 100%|██████████| 50/50 [17:59:02<00:00, 1294.86s/it]  (carriage-return progress updates for epochs 1-49 elided)
retrain: True
### converting data to BERT input...
...is done: 0hour:0min:25sec
#of instance: 17838 17838
Train loss: 1.114669406212042
your model is saved: /disk/data/models/framenet/mulModel-100/0/
Train loss: 0.5482335628392196
your model is saved: /disk/data/models/framenet/mulModel-100/1/
Train loss: 0.38100157818467856
your model is saved: /disk/data/models/framenet/mulModel-100/2/
Train loss: 0.279914715312286
your model is saved: /disk/data/models/framenet/mulModel-100/3/
Train loss: 0.2118144831684375
your model is saved: /disk/data/models/framenet/mulModel-100/4/
Train loss: 0.16118122543137192
your model is saved: /disk/data/models/framenet/mulModel-100/5/
Train loss: 0.12799698540444676
your model is saved: /disk/data/models/framenet/mulModel-100/6/
Train loss: 0.10575426188230219
your model is saved: /disk/data/models/framenet/mulModel-100/7/
Train loss: 0.08645944007712467
your model is saved: /disk/data/models/framenet/mulModel-100/8/
Train loss: 0.07701514887298011
your model is saved: /disk/data/models/framenet/mulModel-100/9/
Train loss: 0.07017353976260039
your model is saved: /disk/data/models/framenet/mulModel-100/10/
Train loss: 0.061662140202273424
your model is saved: /disk/data/models/framenet/mulModel-100/11/
Train loss: 0.05402080997000951
your model is saved: /disk/data/models/framenet/mulModel-100/12/
Train loss: 0.04963090171638232
your model is saved: /disk/data/models/framenet/mulModel-100/13/
Train loss: 0.04647701758244227
your model is saved: /disk/data/models/framenet/mulModel-100/14/
Train loss: 0.04351052764824199
your model is saved: /disk/data/models/framenet/mulModel-100/15/
Train loss: 0.041062651484956594
your model is saved: /disk/data/models/framenet/mulModel-100/16/
Train loss: 0.037949613917576425
your model is saved: /disk/data/models/framenet/mulModel-100/17/
Train loss: 0.03542679761373485
your model is saved: /disk/data/models/framenet/mulModel-100/18/
Train loss: 0.03500061590802313
your model is saved: /disk/data/models/framenet/mulModel-100/19/
Train loss: 0.03415927137236863
your model is saved: /disk/data/models/framenet/mulModel-100/20/
Train loss: 0.031231175492930235
your model is saved: /disk/data/models/framenet/mulModel-100/21/
Train loss: 0.031673282379050205
your model is saved: /disk/data/models/framenet/mulModel-100/22/
Train loss: 0.03039947204805117
your model is saved: /disk/data/models/framenet/mulModel-100/23/
Train loss: 0.031125316373228216
your model is saved: /disk/data/models/framenet/mulModel-100/24/
Train loss: 0.026694852179922488
your model is saved: /disk/data/models/framenet/mulModel-100/25/
Train loss: 0.028328387479716333
your model is saved: /disk/data/models/framenet/mulModel-100/26/
Train loss: 0.026991923950265354
your model is saved: /disk/data/models/framenet/mulModel-100/27/
Train loss: 0.02705248723248412
your model is saved: /disk/data/models/framenet/mulModel-100/28/
Train loss: 0.02528035317505936
your model is saved: /disk/data/models/framenet/mulModel-100/29/
Train loss: 0.026225507797784865
your model is saved: /disk/data/models/framenet/mulModel-100/30/
Train loss: 0.0240961885286711
your model is saved: /disk/data/models/framenet/mulModel-100/31/
Train loss: 0.02535563805907137
your model is saved: /disk/data/models/framenet/mulModel-100/32/
Train loss: 0.023530700162745395
your model is saved: /disk/data/models/framenet/mulModel-100/33/
Train loss: 0.02671007565882754
your model is saved: /disk/data/models/framenet/mulModel-100/34/
Train loss: 0.024057988132321767
your model is saved: /disk/data/models/framenet/mulModel-100/35/
Train loss: 0.023660566026195196
your model is saved: /disk/data/models/framenet/mulModel-100/36/
Train loss: 0.023187386925156615
your model is saved: /disk/data/models/framenet/mulModel-100/37/
Train loss: 0.02207362492822517
your model is saved: /disk/data/models/framenet/mulModel-100/38/
Train loss: 0.02365834985894721
your model is saved: /disk/data/models/framenet/mulModel-100/39/
Train loss: 0.021602879860907073
your model is saved: /disk/data/models/framenet/mulModel-100/40/
Train loss: 0.02339855448185109
your model is saved: /disk/data/models/framenet/mulModel-100/41/
Train loss: 0.02256645742814479
your model is saved: /disk/data/models/framenet/mulModel-100/42/
Train loss: 0.020998091342834972
your model is saved: /disk/data/models/framenet/mulModel-100/43/
Train loss: 0.024528811216839988
your model is saved: /disk/data/models/framenet/mulModel-100/44/
Train loss: 0.02271598102919976
your model is saved: /disk/data/models/framenet/mulModel-100/45/
Train loss: 0.019349228300734177
your model is saved: /disk/data/models/framenet/mulModel-100/46/
Train loss: 0.022960911779975317
your model is saved: /disk/data/models/framenet/mulModel-100/47/
Train loss: 0.02199616368825198
your model is saved: /disk/data/models/framenet/mulModel-100/48/
Train loss: 0.019958225119685376
your model is saved: /disk/data/models/framenet/mulModel-100/49/
...training is done
Traceback (most recent call last):
File "evaluate_multilingual.ipynb", line 313, in <module>
"scrolled": true
NameError: name 'true' is not defined
Using TensorFlow backend.
### Korean FrameNet ###
# contact: hahmyg@kaist, hahmyg@gmail.com #
###DEVICE: cuda:0
###multilingual-for-en-masking
###multilingual-for-ko-masking
###multilingual-for-en-without-masking
###multilingual-for-ko-without-masking
###multilingual-for-ko-masking
### Your result would be saved to: /disk/data/models/results/framenet/koModel/ko_only_for_ko_with_masking_result.txt
### loading Korean FrameNet 1.1 data...
# of instances in training data: 17838
# of instances in dev data: 2548
# of instances in test data: 5097
# of instances in trn: 17838
# of instances in dev: 2548
# of instances in tst: 5097
data example: [['태풍', 'Hugo가', '남긴', '피해들과', '회사', '내', '몇몇', '주요', '부서들의', '저조한', '실적들을', '반영하여,', 'Aetna', 'Life', 'and', 'Casualty', 'Co.의', '3분기', '<tgt>', '순이익이', '</tgt>', '182.6', '백만', '달러', '또는', '주당', '1.63', '달러로', '22', '%', '하락하였다.'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '이익.n', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', 'Earnings_and_losses', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'B-Time', 'X', 'O', 'X', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']]
### EVALUATION
MODE: framenet
target LANGUAGE: ko
trained LANGUAGE: ko_only
Viterbi: False
masking: True
using TGT token: True
model dir: /disk/data/models/framenet/koModel/36/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/36/
/disk/data/models/framenet/koModel/36/
...model is loaded
../kaiser/src/utils.py:269: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
pred_logits = sm(masked_logit).view(1,-1)
# EPOCH: 36
SenseId Accuracy: 0.8077300372768295
ArgId Precision: 0.3867320819112628
ArgId Recall: 0.44859581838426327
ArgId F1: 0.4153731599747981
full-structure Precision: 0.5503433976934041
full-structure Recall: 0.6010472686102463
full-structure F1: 0.574578908205371
-----processing time: 0hour:6min:0sec
model dir: /disk/data/models/framenet/koModel/3/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/3/
/disk/data/models/framenet/koModel/3/
...model is loaded
# EPOCH: 3
SenseId Accuracy: 0.7510300176574456
ArgId Precision: 0.32383846056462845
ArgId Recall: 0.31646665841890387
ArgId F1: 0.3201101238893756
full-structure Precision: 0.5069044879171462
full-structure Recall: 0.4987262949334843
full-structure F1: 0.5027821372521043
-----processing time: 0hour:12min:12sec
model dir: /disk/data/models/framenet/koModel/15/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/15/
/disk/data/models/framenet/koModel/15/
...model is loaded
# EPOCH: 15
SenseId Accuracy: 0.8057680988816951
ArgId Precision: 0.3986439898873822
ArgId Recall: 0.42917233700358776
ArgId F1: 0.41334524873398865
full-structure Precision: 0.5641077852440101
full-structure Recall: 0.5880979337673365
full-structure F1: 0.5758531093019228
-----processing time: 0hour:18min:30sec
model dir: /disk/data/models/framenet/koModel/37/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/37/
/disk/data/models/framenet/koModel/37/
...model is loaded
# EPOCH: 37
SenseId Accuracy: 0.8094957818324504
ArgId Precision: 0.3876292968341866
ArgId Recall: 0.45898799950513425
ArgId F1: 0.42030134813639974
full-structure Precision: 0.5489644592175914
full-structure Recall: 0.6076988395131616
full-structure F1: 0.57684040838259
-----processing time: 0hour:24min:45sec
model dir: /disk/data/models/framenet/koModel/12/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/12/
/disk/data/models/framenet/koModel/12/
...model is loaded
# EPOCH: 12
SenseId Accuracy: 0.8026289974494801
ArgId Precision: 0.37383683185457695
ArgId Recall: 0.427440306816776
ArgId F1: 0.3988455988455989
full-structure Precision: 0.5420242384539797
full-structure Recall: 0.5854797622417209
full-structure F1: 0.5629145831207267
-----processing time: 0hour:31min:3sec
model dir: /disk/data/models/framenet/koModel/23/
srl model: framenet
language: multilingual
version: 1.1
using viterbi: False
using masking: True
pretrained BERT: bert-base-multilingual-cased
using TGT special token: True
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
...loaded model path: /disk/data/models/framenet/koModel/23/
/disk/data/models/framenet/koModel/23/
...model is loaded
Using TensorFlow backend.
### Korean FrameNet ###
# contact: hahmyg@kaist, hahmyg@gmail.com #
### loading Korean FrameNet 1.1 data...
# of instances in training data: 17838
# of instances in dev data: 2548
# of instances in test data: 5097
# of instances in trn: 17838
# of instances in dev: 2548
# of instances in tst: 5097
data example: [['태풍', 'Hugo가', '남긴', '피해들과', '회사', '내', '몇몇', '주요', '부서들의', '저조한', '실적들을', '반영하여,', 'Aetna', 'Life', 'and', 'Casualty', 'Co.의', '3분기', '<tgt>', '순이익이', '</tgt>', '182.6', '백만', '달러', '또는', '주당', '1.63', '달러로', '22', '%', '하락하였다.'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '이익.n', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', 'Earnings_and_losses', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_', '_'], ['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'I-Earner', 'B-Time', 'X', 'O', 'X', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']]
FineTuning Multilingual
### TRAINING
MODEL: framenet
LANGUAGE: multi
PRETRAINED BERT: bert-base-multilingual-cased
training data:
(ko): 17838
BATCH_SIZE: 6
MAX_LEN: 256
used dictionary:
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lu2idx.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_lufrmap.json
/disk/kaiser/kaiser/src/../koreanframenet/resource/info/mul_bio_frargmap.json
original model: /disk/data/models/dict_framenet/enModel-with-exemplar/9/
your model would be saved at /disk/data/models/dict_framenet/mulModel-100/
retrain: True
### converting data to BERT input...
...is done: 0hour:0min:25sec
#of instance: 17838 17838
Epoch: 0% | 0/50 [00:00<?, ?it/s]
../kaiser/src/utils.py:275: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
pred_logits = sm(masked_logit).view(1,-1)
Train loss: 2.0430601907676613
your model is saved: /disk/data/models/dict_framenet/mulModel-100/0/
Epoch: 2% | 1/50 [21:20<17:25:21, 1280.03s/it]
Train loss: 1.2216012973527575
your model is saved: /disk/data/models/dict_framenet/mulModel-100/1/
Epoch: 4% | 2/50 [43:06<17:10:23, 1288.00s/it]
Train loss: 0.8523951496952739
your model is saved: /disk/data/models/dict_framenet/mulModel-100/2/
Epoch: 6% | 3/50 [1:04:55<16:53:52, 1294.32s/it]
Train loss: 0.617459437998154
your model is saved: /disk/data/models/dict_framenet/mulModel-100/3/
Epoch: 8% | 4/50 [1:26:44<16:35:42, 1298.76s/it]
Train loss: 0.46581477113444014
your model is saved: /disk/data/models/dict_framenet/mulModel-100/4/
Epoch: 10% | 5/50 [1:48:33<16:16:14, 1301.66s/it]
Train loss: 0.3570377936582352
your model is saved: /disk/data/models/dict_framenet/mulModel-100/5/
Epoch: 12% | 6/50 [2:10:23<15:56:23, 1304.17s/it]
Train loss: 0.28856113556837626
your model is saved: /disk/data/models/dict_framenet/mulModel-100/6/
Epoch: 14% | 7/50 [2:32:12<15:35:43, 1305.66s/it]
Train loss: 0.23058658427807255
your model is saved: /disk/data/models/dict_framenet/mulModel-100/7/
Epoch: 16% | 8/50 [2:53:59<15:14:16, 1306.11s/it]
Train loss: 0.19215678063519187
your model is saved: /disk/data/models/dict_framenet/mulModel-100/8/
Epoch: 18% | 9/50 [3:15:46<14:52:38, 1306.31s/it]
Train loss: 0.1619501535879718
your model is saved: /disk/data/models/dict_framenet/mulModel-100/9/
Epoch: 20% | 10/50 [3:37:36<14:31:38, 1307.47s/it]
Train loss: 0.14102853276054797
your model is saved: /disk/data/models/dict_framenet/mulModel-100/10/
Epoch: 22% | 11/50 [3:59:25<14:10:08, 1307.92s/it]
Train loss: 0.12404432183359003
your model is saved: /disk/data/models/dict_framenet/mulModel-100/11/
Epoch: 24% | 12/50 [4:21:15<13:48:41, 1308.47s/it]
Train loss: 0.10590035958075254
your model is saved: /disk/data/models/dict_framenet/mulModel-100/12/
Epoch: 26% | 13/50 [4:43:03<13:26:55, 1308.53s/it]
Train loss: 0.09765679454083219
your model is saved: /disk/data/models/dict_framenet/mulModel-100/13/
Epoch: 28% | 14/50 [5:04:51<13:04:57, 1308.25s/it]
Train loss: 0.08665195872746859
your model is saved: /disk/data/models/dict_framenet/mulModel-100/14/
Epoch: 30% | 15/50 [5:26:36<12:42:29, 1307.14s/it]
Train loss: 0.07943220664465908
your model is saved: /disk/data/models/dict_framenet/mulModel-100/15/
Epoch: 32% | 16/50 [5:48:18<12:19:58, 1305.85s/it]
Train loss: 0.07674171336498294
your model is saved: /disk/data/models/dict_framenet/mulModel-100/16/
Epoch: 34% | 17/50 [6:10:03<11:57:57, 1305.37s/it]
Train loss: 0.07096644593107496
your model is saved: /disk/data/models/dict_framenet/mulModel-100/17/
Epoch: 36% | 18/50 [6:31:46<11:35:50, 1304.71s/it]
Train loss: 0.06573391464431425
your model is saved: /disk/data/models/dict_framenet/mulModel-100/18/
Epoch: 38% | 19/50 [6:53:30<11:13:57, 1304.45s/it]
Train loss: 0.06206626622058768
your model is saved: /disk/data/models/dict_framenet/mulModel-100/19/
Epoch: 40% | 20/50 [7:15:12<10:51:56, 1303.90s/it]
Train loss: 0.06225214267150625
your model is saved: /disk/data/models/dict_framenet/mulModel-100/20/
Epoch: 42% | 21/50 [7:36:53<10:29:46, 1302.98s/it]
Train loss: 0.055789347293129565
your model is saved: /disk/data/models/dict_framenet/mulModel-100/21/
Epoch: 44% | 22/50 [7:58:33<10:07:41, 1302.20s/it]
Train loss: 0.0580104508143946
your model is saved: /disk/data/models/dict_framenet/mulModel-100/22/
Epoch: 46% | 23/50 [8:20:12<9:45:28, 1301.04s/it]
Train loss: 0.054401203268632466
your model is saved: /disk/data/models/dict_framenet/mulModel-100/23/
Epoch: 48% | 24/50 [8:41:50<9:23:28, 1300.33s/it]
Train loss: 0.05093434202443514
your model is saved: /disk/data/models/dict_framenet/mulModel-100/24/
Epoch: 50% | 25/50 [9:03:29<9:01:32, 1299.72s/it]
Train loss: 0.04614005457591331
your model is saved: /disk/data/models/dict_framenet/mulModel-100/25/
Epoch: 52% | 26/50 [9:25:06<8:39:38, 1299.12s/it]
Train loss: 0.045692054481987064
your model is saved: /disk/data/models/dict_framenet/mulModel-100/26/
Epoch: 54% | 27/50 [9:46:44<8:17:48, 1298.62s/it]
Train loss: 0.04615123512868307
your model is saved: /disk/data/models/dict_framenet/mulModel-100/27/
Epoch: 56% | 28/50 [10:08:23<7:56:11, 1298.73s/it]
Train loss: 0.044389460784700255
your model is saved: /disk/data/models/dict_framenet/mulModel-100/28/
Epoch: 58% | 29/50 [10:30:03<7:34:40, 1299.05s/it]
Train loss: 0.04371812499340986
your model is saved: /disk/data/models/dict_framenet/mulModel-100/29/
Epoch: 60% | 30/50 [10:51:43<7:13:11, 1299.56s/it]
Train loss: 0.04533760472554658
your model is saved: /disk/data/models/dict_framenet/mulModel-100/30/
Epoch: 62% | 31/50 [11:13:25<6:51:42, 1300.11s/it]
Train loss: 0.040327722189281774
your model is saved: /disk/data/models/dict_framenet/mulModel-100/31/
Epoch: 64% | 32/50 [11:35:04<6:29:57, 1299.88s/it]
Train loss: 0.040875897337498114
your model is saved: /disk/data/models/dict_framenet/mulModel-100/32/
Epoch: 66% | 33/50 [11:56:47<6:08:30, 1300.61s/it]
Train loss: 0.04103026588071157
your model is saved: /disk/data/models/dict_framenet/mulModel-100/33/
Epoch: 68% | 34/50 [12:18:27<5:46:49, 1300.61s/it]
Train loss: 0.04240306345791001
your model is saved: /disk/data/models/dict_framenet/mulModel-100/34/
Epoch: 70% | 35/50 [12:40:06<5:24:58, 1299.93s/it]
Train loss: 0.03953588212324271
your model is saved: /disk/data/models/dict_framenet/mulModel-100/35/
Epoch: 72% | 36/50 [13:01:43<5:03:07, 1299.10s/it]
Train loss: 0.03654907728049771
your model is saved: /disk/data/models/dict_framenet/mulModel-100/36/
Epoch: 74% | 37/50 [13:23:22<4:41:28, 1299.10s/it]
Train loss: 0.03975163119958634
your model is saved: /disk/data/models/dict_framenet/mulModel-100/37/
Epoch: 76% | 38/50 [13:45:00<4:19:45, 1298.75s/it]
Train loss: 0.03613893439610997
your model is saved: /disk/data/models/dict_framenet/mulModel-100/38/
Epoch: 78% | 39/50 [14:06:37<3:58:02, 1298.41s/it]
Train loss: 0.036986272479208764
your model is saved: /disk/data/models/dict_framenet/mulModel-100/39/
Epoch: 80% | 40/50 [14:28:16<3:36:24, 1298.42s/it]
Train loss: 0.03570315815496242
your model is saved: /disk/data/models/dict_framenet/mulModel-100/40/
Epoch: 82% | 41/50 [14:49:54<3:14:44, 1298.27s/it]
Train loss: 0.035585971708651104
your model is saved: /disk/data/models/dict_framenet/mulModel-100/41/
Epoch: 84% | 42/50 [15:11:34<2:53:10, 1298.76s/it]
Train loss: 0.03245486886125649
your model is saved: /disk/data/models/dict_framenet/mulModel-100/42/
Epoch: 86% | 43/50 [15:33:16<2:31:39, 1299.93s/it]
Train loss: 0.03471557601375809
your model is saved: /disk/data/models/dict_framenet/mulModel-100/43/
Epoch: 88% | 44/50 [15:54:57<2:10:01, 1300.29s/it]
Train loss: 0.03426823178229669
your model is saved: /disk/data/models/dict_framenet/mulModel-100/44/
Epoch: 90% | 45/50 [16:16:38<1:48:22, 1300.53s/it]
Train loss: 0.034149167967135026
your model is saved: /disk/data/models/dict_framenet/mulModel-100/45/
Epoch: 92% | 46/50 [16:38:17<1:26:39, 1299.99s/it]
Train loss: 0.03404321651744593
your model is saved: /disk/data/models/dict_framenet/mulModel-100/46/
Epoch: 94% | 47/50 [16:59:58<1:05:00, 1300.13s/it]
Train loss: 0.03287793978847738
your model is saved: /disk/data/models/dict_framenet/mulModel-100/47/
Epoch: 96% | 48/50 [17:21:40<43:21, 1300.80s/it]
Train loss: 0.03238603348286498