Hi, thanks for the excellent work.
I have a few doubts because I am bad at math.
I notice that you use this:
```python
np.linalg.norm(input - weight, axis=-1)
```
to calculate the Euclidean distance matrix-wise. Could you explain this in more detail? How did you come up with this approach?
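As far as I can tell (this is my own reading, not from the repo), the trick is NumPy broadcasting: subtracting a 1-D input vector from a 3-D weight grid subtracts it from every node at once, and `axis=-1` takes the norm over the feature axis only, so you get one distance per node. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
weight = rng.random((5, 5, 3))   # a 5x5 grid of 3-dimensional nodes
x = rng.random(3)                # one input sample

# Broadcasting: x is subtracted from every (i, j) node, giving shape (5, 5, 3);
# the norm over the last axis collapses it to a (5, 5) distance map.
dist = np.linalg.norm(x - weight, axis=-1)

# Equivalent explicit double loop, node by node.
dist_loop = np.empty((5, 5))
for i in range(5):
    for j in range(5):
        dist_loop[i, j] = np.linalg.norm(x - weight[i, j])

print(np.allclose(dist, dist_loop))  # True
```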
Another part is the neighbourhood function:

Here you handle it with linear algebra. Could you point me to some material on how you derived it?
Your implementation is highly efficient compared with a regular implementation (from someone like me who is terrible at math 😢).
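The neighbourhood function itself didn't paste in above, so purely as an assumption I'll take the standard Gaussian neighbourhood h = exp(-d²/(2σ²)) over grid coordinates; the hypothetical `neighborhood_function` below (the name matches my code further down, not necessarily the repo's) evaluates it for the whole grid at once via `np.meshgrid`:

```python
import numpy as np

def neighborhood_function(X, Y, winner_idx, sigma):
    # Gaussian neighbourhood over a 2-D grid (assumed form, not the repo's code).
    # X, Y are meshgrid coordinate arrays; winner_idx is (row, col) of the winner.
    wi, wj = winner_idx
    # Squared grid distance of every node to the winner, computed for all
    # nodes at once via broadcasting -- no Python loop over the grid.
    d2 = (X - wj) ** 2 + (Y - wi) ** 2
    return np.exp(-d2 / (2 * sigma ** 2))

X, Y = np.meshgrid(np.arange(5), np.arange(5))
gau = neighborhood_function(X, Y, (2, 3), sigma=1.0)
print(gau[2, 3])  # 1.0 -- the winner gets the maximum weight
```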
Following are my implementations:
```python
import numpy as np
from scipy.spatial import distance

def winner(input, net):
    # naive way, extremely low performance:
    # dis_map = np.apply_along_axis(distance.euclidean, 2, net, input)
    dis_map = np.linalg.norm(input - net, axis=-1)
    return np.unravel_index(np.argmin(dis_map, axis=None), dis_map.shape)
```
Also, for the update you use the Einstein summation convention; it's hard for me to connect it to the way you calculate the update formula.
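If it helps anyone else, `np.einsum('ij,ijk->ijk', a, b)` just multiplies each scalar a[i, j] onto the vector b[i, j, :], which is the same as broadcasting with a trailing axis added. A small check (my own sketch, not code from the repo):

```python
import numpy as np

rng = np.random.default_rng(1)
gau = rng.random((4, 4))        # per-node scalar: neighbourhood * learning rate
delta = rng.random((4, 4, 2))   # per-node vector: x - w

# einsum: for every (i, j), scale the 2-vector delta[i, j] by gau[i, j].
upd_einsum = np.einsum('ij,ijk->ijk', gau, delta)

# Equivalent broadcasting: a trailing axis aligns shapes as (4,4,1) * (4,4,2).
upd_broadcast = gau[:, :, None] * delta

print(np.allclose(upd_einsum, upd_broadcast))  # True
```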

```python
from tqdm import tqdm

def process(x, net, sigma, lr, iterations):
    x = x.copy()
    np.random.shuffle(x)  # shuffle a copy in place (shuffling x.copy() alone has no effect)
    X, Y = np.meshgrid(np.arange(0, net.shape[0], 1), np.arange(0, net.shape[1], 1))
    for i in tqdm(range(iterations)):
        # competition
        t = i % len(x)  # current sample index (i % len(x) - 1 would start at -1)
        winner_idx = winner(x[t], net)
        # update
        eta_ = asymptotic_decay(lr, t, iterations)       # η(t) learning rate
        sigma_ = asymptotic_decay(sigma, t, iterations)  # σ(t) neighborhood size
        # neighborhood function (27, 27)
        gau = neighborhood_function(X, Y, winner_idx, sigma_)
        # naive way, very low performance:
        # broadcast subtraction delta (27, 27, 2)
        # delta = x[t] - net
        # for i in range(net.shape[0]):
        #     for j in range(net.shape[1]):
        #         net[i, j] += (gau * eta_)[i, j] * delta[i, j]
        net += np.einsum('ij,ijk->ijk', gau * eta_, x[t] - net)
    return net
```
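For completeness, `asymptotic_decay` isn't shown above; a common form (I believe this is roughly what MiniSom uses, but treat it as an assumption and check the repo for the exact definition) is:

```python
def asymptotic_decay(learning_rate, t, max_iter):
    # Assumed form -- decays toward zero as t grows, halving the
    # rate at t = max_iter / 2. Not necessarily the repo's exact code.
    return learning_rate / (1 + t / (max_iter / 2))

print(asymptotic_decay(0.5, 0, 100))   # 0.5
print(asymptotic_decay(0.5, 50, 100))  # 0.25
```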