Replies: 2 comments
-
Hi @buligar, the MI captures the statistical dependencies between two variables, just as Pearson or Spearman correlations do. I think the confusion here is about what x and y should be. To compute the MI between your two populations directly, you can use gcmi_nd_cc:

from frites.core import gcmi_nd_cc

# pop_1 = (n_neurons, n_potential, n_times)
# pop_2 = (n_neurons, n_potential, n_times)
# compute the MI across the n_neurons dimension (axis=0)
mi = gcmi_nd_cc(pop_1, pop_2, traxis=0)
# mi.shape = (n_potential, n_times)

Hope it is clearer like that,
Best,
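For reference, here is a minimal, self-contained sketch of the call above on random data. The array sizes are made up for illustration, and since the two inputs are independent the estimated MI should hover near zero; the point is only to show the expected shapes:

import numpy as np
from frites.core import gcmi_nd_cc

# illustrative sizes: 100 neurons, 1 potential variable, 999 time points
n_neurons, n_potential, n_times = 100, 1, 999
rng = np.random.default_rng(0)
pop_1 = rng.standard_normal((n_neurons, n_potential, n_times))
pop_2 = rng.standard_normal((n_neurons, n_potential, n_times))

# MI estimated across the n_neurons dimension (traxis=0)
mi = gcmi_nd_cc(pop_1, pop_2, traxis=0)
print(mi.shape)  # (1, 999), i.e. (n_potential, n_times)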
-
Hard to answer since I don't know the data, the ground truth, etc.
-
Good afternoon, I have a question about the calculation of mutual information. I would like to compute the mutual information between two populations of neurons, where each population is an array of shape (number of neurons, membrane potential, time). What are x and y, and what should they look like? Should one population go in x and the other in y, or should both populations go in x? And how should I convert vm1_res and vm2_res?
import numpy as np
import matplotlib.pyplot as plt
from frites.dataset import DatasetEphy
from frites.workflow import WfMi

# reshape each population to (time points, neurons)
vm1_res = vm1.reshape(999, 100)
vm2_res = vm2.reshape(999, 100)

# x, y: this is the question -- how should vm1_res / vm2_res go in here?
dataset = DatasetEphy(x, y=y, roi=ch, times=times)
wf = WfMi(mi_type='cc', inference='ffx', verbose=False)
mi, _ = wf.fit(dataset)

plt.figure()
# 999 samples; was np.arange(0.999), which yields a single value 0.0
# (dividing by 1000 assumes 1 kHz sampling so the axis is in seconds)
time = np.arange(999) / 1000.
plt.plot(time, mi)
plt.xlabel("Time (s)"), plt.ylabel("MI (bits)")
plt.title('I(C; C)')
plt.show()
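For context on the WfMi route itself: DatasetEphy expects x as a list with one array per subject of shape (n_epochs, n_channels, n_times), and with mi_type='cc' y is a list with one continuous value per epoch. The sketch below only illustrates those shapes with random data; all sizes, and the choice of a per-trial scalar as y, are assumptions rather than a confirmed answer (the reply above suggests gcmi_nd_cc for population-to-population MI):

import numpy as np
from frites.dataset import DatasetEphy
from frites.workflow import WfMi

rng = np.random.default_rng(0)
n_trials, n_channels, n_pts = 100, 1, 999

# population 1 as the neural data: one "subject", (n_trials, n_channels, n_times)
x = [rng.standard_normal((n_trials, n_channels, n_pts))]
# population 2 collapsed to one continuous value per trial (an assumption)
y = [rng.standard_normal((n_trials,))]
roi = [np.array(['pop1'])]          # channel label(s) for the single subject
times = np.arange(n_pts) / 1000.0   # assuming 1 kHz sampling

dataset = DatasetEphy(x, y=y, roi=roi, times=times)
wf = WfMi(mi_type='cc', inference='ffx', verbose=False)
mi, _ = wf.fit(dataset)             # mi: DataArray with (times, roi) dims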