Commit 1134136

DOC: remove discrete SINDy examples from feature overview
1 parent 6549ee7 commit 1134136

File tree

2 files changed (+455, -757 lines)


examples/1_feature_overview/example.ipynb

Lines changed: 452 additions & 674 deletions
Large diffs are not rendered by default.

examples/1_feature_overview/example.py

Lines changed: 3 additions & 83 deletions
@@ -114,7 +114,7 @@ def ignore_specific_warnings():
 x_dot_test_predicted = model.predict(x_test)
 
 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiation_method._differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)
 
 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
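
The change above (repeated in two more hunks below) replaces a call to the private `_differentiate` method with the differentiation object's public call interface. A minimal sketch of that interface, with illustrative data not taken from the example script:

    import numpy as np
    import pysindy as ps

    t = np.linspace(0, 1, 100)
    x = np.sin(t).reshape(-1, 1)

    # Differentiation objects are callable; calling one is the public
    # entry point that delegates to the private _differentiate method.
    fd = ps.FiniteDifference()
    x_dot = fd(x, t=t)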
@@ -149,30 +149,6 @@ def ignore_specific_warnings():
 
 fig.show()
 
-# %% [markdown]
-# ## Discrete time dynamical system (map)
-
-# %%
-
-
-def f(x):
-    return 3.6 * x * (1 - x)
-
-
-if __name__ != "testing":
-    n_steps = 1000
-else:
-    n_steps = 10
-eps = 0.001  # Noise level
-x_train_map = np.zeros((n_steps))
-x_train_map[0] = 0.5
-for i in range(1, n_steps):
-    x_train_map[i] = f(x_train_map[i - 1]) + eps * np.random.randn()
-model = ps.DiscreteSINDy()
-model.fit(x_train_map, t=1)
-
-model.print()
-
 # %% [markdown]
 # ## Optimization options
 # In this section we provide examples of different parameters accepted by the built-in sparse regression optimizers `STLSQ`, `SR3`, `ConstrainedSR3`, `MIOSR`, `SSR`, and `FROLS`. The `Trapping` optimizer is not straightforward to use; please check out Example 8 for some examples. We also show how to use a scikit-learn sparse regressor with PySINDy.
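
As context for the paragraph above: the named built-in optimizers all share one entry point, the `optimizer` argument of `ps.SINDy`. A minimal sketch with synthetic data (not the notebook's):

    import numpy as np
    import pysindy as ps

    # Synthetic trajectory: x = (sin t, cos t), so x0' = x1 and x1' = -x0.
    t = np.linspace(0, 10, 1000)
    x = np.stack([np.sin(t), np.cos(t)], axis=-1)

    # Any built-in sparse regressor can be swapped in here, e.g.
    # ps.SR3(), ps.SSR(), or ps.FROLS() in place of ps.STLSQ().
    model = ps.SINDy(optimizer=ps.STLSQ(threshold=0.1))
    model.fit(x, t=t)
    model.print()  # should recover x0' = 1.000 x1 and x1' = -1.000 x0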
@@ -782,7 +758,7 @@ def f(x):
 x_dot_test_predicted = model.predict(x_test)
 
 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiation_method._differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)
 
 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
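
Alongside the plotted comparison in this hunk, the fit can also be summarized numerically: `SINDy.score` returns an R² between the output of `model.predict` and derivatives estimated from the test data. A minimal sketch, reusing the synthetic setup from the previous example:

    import numpy as np
    import pysindy as ps

    t = np.linspace(0, 10, 1000)
    dt = t[1] - t[0]
    x = np.stack([np.sin(t), np.cos(t)], axis=-1)

    model = ps.SINDy()
    model.fit(x, t=dt)

    # R^2 between predicted and finite-difference derivatives of x.
    print(model.score(x, t=dt))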
@@ -905,7 +881,7 @@ def u_fun(t):
 x_dot_test_predicted = model.predict(x_test, u=u_test)
 
 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiation_method._differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)
 
 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
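
This hunk applies the same replacement inside the control (SINDYc) example, where both `fit` and `predict` take the control signal through the `u` keyword. A minimal control sketch; the forced linear system and the `u_fun` below are illustrative, not the notebook's:

    import numpy as np
    import pysindy as ps
    from scipy.integrate import solve_ivp

    def u_fun(t):
        return np.sin(2 * t)

    # Forced linear system: x' = -x + u(t).
    def rhs(t, x):
        return [-x[0] + u_fun(t)]

    t = np.linspace(0, 10, 1000)
    x = solve_ivp(rhs, (t[0], t[-1]), [1.0], t_eval=t).y.T
    u = u_fun(t)

    # Control inputs enter fit and predict through the `u` keyword.
    model = ps.SINDy()
    model.fit(x, u=u, t=t)
    x_dot_pred = model.predict(x, u=u)
    model.print()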
@@ -1053,62 +1029,6 @@ def u_fun(t):
 model.fit(x_train, t=t)
 model.print()
 
-# %% [markdown]
-# ## SINDy with control parameters (SINDyCP)
-# The control input in PySINDy can be used to discover equations parameterized by control parameters in conjunction with the `ParameterizedLibrary`. We demonstrate on the logistic map
-# $$ x_{n+1} = r x_n(1-x_n)$$
-# which depends on a single parameter $r$.
-
-# %%
-# Iterate the map and drop the initial 500-step transient. The behavior is chaotic for r>3.6.
-if __name__ != "testing":
-    num = 1000
-    N = 1000
-    N_drop = 500
-else:
-    num = 20
-    N = 20
-    N_drop = 10
-r0 = 3.5
-rs = r0 + np.arange(num) / num * (4 - r0)
-xss = []
-for r in rs:
-    xs = []
-    x = 0.5
-    for n in range(N + N_drop):
-        if n >= N_drop:
-            xs = xs + [x]
-        x = r * x * (1 - x)
-    xss = xss + [xs]
-
-plt.figure(figsize=(4, 4), dpi=100)
-for ind in range(num):
-    plt.plot(np.ones(N) * rs[ind], xss[ind], ",", alpha=0.1, c="black", rasterized=True)
-plt.xlabel("$r$")
-plt.ylabel("$x_n$")
-plt.show()
-
-# %% [markdown]
-# We construct a `parameter_library` and a `feature_library` to act on the input data `x` and the control input `u` independently. The `ParameterizedLibrary` is composed of products of the two libraries' output features. This enables fine control over the library features, which is especially useful in the case of PDEs like those arising in pattern formation modeling. See this [notebook](https://github.com/dynamicslab/pysindy/blob/master/examples/17_parameterized_pattern_formation/17_parameterized_pattern_formation.ipynb) for examples.
-
-# %%
-# use four parameter values as training data
-rs_train = [3.6, 3.7, 3.8, 3.9]
-xs_train = [np.array(xss[np.where(np.array(rs) == r)[0][0]]) for r in rs_train]
-
-feature_lib = ps.PolynomialLibrary(degree=3, include_bias=True)
-parameter_lib = ps.PolynomialLibrary(degree=1, include_bias=True)
-lib = ps.ParameterizedLibrary(
-    feature_library=feature_lib,
-    parameter_library=parameter_lib,
-    num_features=1,
-    num_parameters=1,
-)
-opt = ps.STLSQ(threshold=1e-1, normalize_columns=False)
-model = ps.DiscreteSINDy(feature_library=lib, optimizer=opt)
-model.fit(xs_train, u=rs_train, t=1, feature_names=["x", "r"])
-model.print()
-
 # %% [markdown]
 # ## PDEFIND Feature Overview
 # PySINDy now supports SINDy for PDE identification (PDE-FIND) (Rudy, Samuel H., Steven L. Brunton, Joshua L. Proctor, and J. Nathan Kutz. "Data-driven discovery of partial differential equations." Science Advances 3, no. 4 (2017): e1602614.). We illustrate a basic example on Burgers' equation:
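
The Burgers' example that follows in the file is not shown in this diff. As a rough sketch of the PDE-FIND interface it relies on (shapes and parameters follow the PySINDy example notebooks; the random field below is only a placeholder for real solution data):

    import numpy as np
    import pysindy as ps

    # Placeholder field u(x, t); a real run would use numerically solved
    # Burgers' data, shaped (n_x, n_t, 1).
    nx, nt = 256, 101
    spatial_grid = np.linspace(-8, 8, nx)
    t = np.linspace(0, 10, nt)
    u = np.random.default_rng(0).standard_normal((nx, nt, 1))

    # Candidate terms u and u^2, combined by the library with spatial
    # derivatives up to second order (u_x, u_xx, u*u_x, ...).
    lib = ps.PDELibrary(
        library_functions=[lambda u: u, lambda u: u * u],
        function_names=[lambda u: u, lambda u: u + u],
        derivative_order=2,
        spatial_grid=spatial_grid,
    )
    model = ps.SINDy(feature_library=lib, optimizer=ps.STLSQ(threshold=2.0))
    model.fit(u, t=t[1] - t[0])
    model.print()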
