
Commit 1146cc0

adding backpropagation algorithm with the technique of regularization weight decay
1 parent cc94465 commit 1146cc0

File tree

1 file changed: +7 -5 lines changed


neural_network/backpropagation_weight_decay.py

Lines changed: 7 additions & 5 deletions
@@ -8,7 +8,9 @@
 warnings.filterwarnings("ignore", category=DeprecationWarning)
 
 
-def train_network(neurons, x_train, y_train, epochs):
+def train_network(
+    neurons: int, x_train: np.array, y_train: np.array, epochs: int
+) -> tuple:
     """
     Code the backpropagation algorithm with the technique of regularization
     weight decay.
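As background for the docstring above: weight decay adds an L2 penalty on the weights to the error, which turns each gradient step into a shrinkage update. A minimal NumPy sketch of that update, using illustrative names, shapes, and hyperparameter values rather than the file's actual internals (w_co, bias_co, w_cs, bias_cs):

import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 3))      # a weight matrix (illustrative shape)
grad = rng.standard_normal((4, 3))   # dE/dw produced by backpropagation
lr, lam = 0.1, 0.01                  # learning rate and decay strength (assumed values)

# With an L2 penalty (lam / 2) * ||w||**2 added to the error, the gradient
# gains a lam * w term, so every step also shrinks the weights toward zero.
w = w - lr * (grad + lam * w)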
@@ -70,7 +72,7 @@ def train_network(neurons, x_train, y_train, epochs):
     return w_co, bias_co, w_cs, bias_cs, error
 
 
-def relu(x):
+def relu(x: np.array) -> np.array:
     """
     Relu activation function
     Hidden Layer due to it is less susceptible to vanish gradient
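Assuming the standard definition relu(x) = max(0, x), a one-line sketch of the function typed above (relu_sketch is an illustrative name, not the file's function). Its gradient is a constant 1 on the positive side, which is why the docstring calls it less susceptible to vanishing gradients:

import numpy as np

def relu_sketch(x: np.ndarray) -> np.ndarray:
    # Zero out negatives, pass positives through unchanged.
    return np.maximum(x, 0)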
@@ -80,7 +82,7 @@ def relu(x):
     return x
 
 
-def d_relu(x):
+def d_relu(x: np.array) -> np.array:
     """
     Relu Activation derivate function
     """
@@ -92,15 +94,15 @@ def d_relu(x):
     return x
 
 
-def sigmoid(x):
+def sigmoid(x: float) -> float:
     """
     Sigmoid activation function
     Output layer
     """
     return 1 / (1 + np.exp(-x))
 
 
-def d_sigmoid(x):
+def d_sigmoid(x: float) -> float:
     """
     Sigmoid activation derivate
     """
