Commit 76bf2c8 (parent 6c66672)

add first locally connected example

2 files changed: +551 −0 lines

P1B3/contrib/Fangfang/README.md

Lines changed: 104 additions & 0 deletions
```
Input dim = 29281
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
locallyconnected1d_1 (LocallyCon (None, 29281, 10)     585620      locallyconnected1d_input_1[0][0]
____________________________________________________________________________________________________
maxpooling1d_1 (MaxPooling1D)    (None, 292, 10)       0           locallyconnected1d_1[0][0]
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 2920)          0           maxpooling1d_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1000)          2921000     flatten_1[0][0]
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 1000)          0           dense_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 500)           500500      dropout_1[0][0]
____________________________________________________________________________________________________
dropout_2 (Dropout)              (None, 500)           0           dense_2[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 100)           50100       dropout_2[0][0]
____________________________________________________________________________________________________
dropout_3 (Dropout)              (None, 100)           0           dense_3[0][0]
____________________________________________________________________________________________________
dense_4 (Dense)                  (None, 50)            5050        dropout_3[0][0]
____________________________________________________________________________________________________
dropout_4 (Dropout)              (None, 50)            0           dense_4[0][0]
____________________________________________________________________________________________________
dense_5 (Dense)                  (None, 1)             51          dropout_4[0][0]
====================================================================================================
Total params: 4,062,321
Trainable params: 4,062,321
Non-trainable params: 0
____________________________________________________________________________________________________
Epoch 1/30
I tensorflow/core/common_runtime/gpu/gpu_device.cc:885] Found device 0 with properties:
name: Tesla P100-SXM2-16GB
major: 6 minor: 0 memoryClockRate (GHz) 0.405
pciBusID 0000:8a:00.0
Total memory: 15.90GiB
Free memory: 15.62GiB
I tensorflow/core/common_runtime/gpu/gpu_device.cc:906] DMA: 0
I tensorflow/core/common_runtime/gpu/gpu_device.cc:916] 0: Y
I tensorflow/core/common_runtime/gpu/gpu_device.cc:975] Creating TensorFlow device (/gpu:0) -> (device: 0, name: Tesla P100-SXM2-16GB, pci bus id: 0000:8a:00.0)
Epoch 1/30
2113709/2113700 [==============================] - 17376s - loss: 0.1550 - val_loss: 0.1306
Epoch 2/30
2113748/2113700 [==============================] - 17110s - loss: 0.1204 - val_loss: 0.1110
Epoch 3/30
2113737/2113700 [==============================] - 17116s - loss: 0.1030 - val_loss: 0.0921
Epoch 4/30
2113712/2113700 [==============================] - 17087s - loss: 0.0890 - val_loss: 0.0797
Epoch 5/30
2113709/2113700 [==============================] - 17230s - loss: 0.0782 - val_loss: 0.0701
Epoch 6/30
2113736/2113700 [==============================] - 17198s - loss: 0.0700 - val_loss: 0.0639
Epoch 7/30
2113730/2113700 [==============================] - 17219s - loss: 0.0639 - val_loss: 0.0580
Epoch 8/30
2113728/2113700 [==============================] - 17156s - loss: 0.0592 - val_loss: 0.0539
Epoch 9/30
2113703/2113700 [==============================] - 17145s - loss: 0.0555 - val_loss: 0.0510
Epoch 10/30
2113738/2113700 [==============================] - 17103s - loss: 0.0528 - val_loss: 0.0515
Epoch 11/30
2113723/2113700 [==============================] - 17754s - loss: 0.0506 - val_loss: 0.0477
Epoch 12/30
2113711/2113700 [==============================] - 18338s - loss: 0.0489 - val_loss: 0.0472
Epoch 13/30
2113732/2113700 [==============================] - 18728s - loss: 0.0473 - val_loss: 0.0469
Epoch 14/30
2113745/2113700 [==============================] - 18740s - loss: 0.0458 - val_loss: 0.0445
Epoch 15/30
2113754/2113700 [==============================] - 18613s - loss: 0.0448 - val_loss: 0.0439
Epoch 16/30
2113711/2113700 [==============================] - 18552s - loss: 0.0437 - val_loss: 0.0434
Epoch 17/30
2113740/2113700 [==============================] - 18645s - loss: 0.0428 - val_loss: 0.0425
Epoch 18/30
2113743/2113700 [==============================] - 18649s - loss: 0.0420 - val_loss: 0.0420
Epoch 19/30
2113703/2113700 [==============================] - 18796s - loss: 0.0412 - val_loss: 0.0425
Epoch 20/30
2113748/2113700 [==============================] - 18403s - loss: 0.0406 - val_loss: 0.0423
Epoch 21/30
2113746/2113700 [==============================] - 18722s - loss: 0.0401 - val_loss: 0.0412
Epoch 22/30
2113752/2113700 [==============================] - 18599s - loss: 0.0395 - val_loss: 0.0419
Epoch 23/30
2113715/2113700 [==============================] - 18243s - loss: 0.0389 - val_loss: 0.0409
Epoch 24/30
2113710/2113700 [==============================] - 18734s - loss: 0.0384 - val_loss: 0.0419
Epoch 25/30
2113744/2113700 [==============================] - 18820s - loss: 0.0380 - val_loss: 0.0399
Epoch 26/30
2113719/2113700 [==============================] - 18118s - loss: 0.0375 - val_loss: 0.0400
Epoch 27/30
2113753/2113700 [==============================] - 18785s - loss: 0.0372 - val_loss: 0.0397
Epoch 28/30
2113706/2113700 [==============================] - 18752s - loss: 0.0368 - val_loss: 0.0398
Epoch 29/30
2113749/2113700 [==============================] - 18731s - loss: 0.0364 - val_loss: 0.0405
Epoch 30/30
2113709/2113700 [==============================] - 18683s - loss: 0.0361 - val_loss: 0.0388
```
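Since the commit's source file is not shown here, one way to read the summary is to recompute its parameter counts from the printed shapes. This is a sketch under assumptions: the kernel size (1) and pool size (100) are inferred from the shape and parameter columns above, not taken from the actual code.

```python
# Recompute the per-layer parameter counts from the Keras summary above.
# Assumption: kernel_size=1 and pool_size=100 are inferred from the
# printed output shapes, not read from the commit's source.
input_len, in_ch, filters, kernel = 29281, 1, 10, 1

# LocallyConnected1D learns an unshared kernel at every output position:
# params = out_len * (kernel * in_ch * filters + filters)
lc_out_len = input_len - kernel + 1                       # 29281
lc_params = lc_out_len * (kernel * in_ch * filters + filters)

pool = 100
pooled_len = (lc_out_len - pool) // pool + 1              # 292 after pooling
flat = pooled_len * filters                               # 2920 flattened units

def dense_params(n_in, n_out):
    """Fully connected layer: one weight per input-output pair plus biases."""
    return n_in * n_out + n_out

dense = [dense_params(flat, 1000), dense_params(1000, 500),
         dense_params(500, 100), dense_params(100, 50),
         dense_params(50, 1)]

total = lc_params + sum(dense)
print(lc_params, dense, total)
# -> 585620 [2921000, 500500, 50100, 5050, 51] 4062321
```

Every number matches the summary's Param # column and the 4,062,321 total, which supports the inferred kernel and pool sizes.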
