Replies: 3 comments 1 reply
You can use the normal
Oh and just for posterity: you want to use Line 17 in d9f1057
Oh sorry, I didn't even read this part, my bad. Yeah, I would do `soft_abs(x::T) where {T} = sqrt(x^2 + T(1e-7))` and then `extra_sympy_mappings={"soft_abs": lambda x: sympy.sqrt(x ** 2 + 1e-7)}`.
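For intuition, the suggested operator can be checked numerically: `sqrt(x^2 + 1e-7)` is defined for every real input (including the `-100.0` that the operator check probes with) and stays very close to `abs(x)` away from zero. A minimal sketch in plain Python, standing in for the Julia operator the backend would actually use:

```python
import math

EPS = 1e-7  # small offset keeping the argument of sqrt strictly positive


def soft_abs(x: float) -> float:
    """Smooth approximation of abs(x): sqrt(x^2 + EPS).

    Unlike abs(x) it is differentiable at x = 0, and unlike sqrt(x)
    it is well-defined for negative inputs, so it passes the kind of
    probe (e.g. -100.0) that rejected the original operator.
    """
    return math.sqrt(x * x + EPS)


# Defined everywhere, including the probe value from the error message:
print(soft_abs(-100.0))  # ~100.0
print(soft_abs(0.0))     # sqrt(1e-7), a small positive number
print(soft_abs(0.5))     # ~0.5, i.e. close to abs(0.5)
```

The `extra_sympy_mappings` entry then just mirrors this same formula on the SymPy side so exported equations stay consistent with the search.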
What happened?
Dear,
I tried using PySR on a data set normalized between 0 and 1 for the x and y values, and I wanted to include a square root operator in the search. I first tried using only the absolute values of the variable, abs(x), but this led to non-continuous solutions. I therefore tried to incorporate a smooth square root operator, but PySR does not accept it, saying that it is not well defined. Even if I try to avoid values less than 0 (which they all should be), with only 5 iterations it does not run through.
I have attached how I am setting up the regressor here:
Version
0.16.3
Operating System
Windows
Package Manager
Conda
Interface
Jupyter Notebook
Relevant log output
```
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[13], line 35
     16 # Create a PySR model
     17 model = PySRRegressor(
     18     maxsize=20,
     19     niterations=iterations,  # < Increase me for better results
   (...)
     32     # ^ Custom loss function (julia syntax)
     33 )
---> 35 model.fit(input_CS.reshape(-1, 1), target_CS)

File c:\Users\XXXX\Anaconda3\envs\tristan\Lib\site-packages\pysr\sr.py:1970, in PySRRegressor.fit(self, X, y, Xresampled, weights, variable_names, X_units, y_units)
   1967 self._checkpoint()
   1969 # Perform the search:
-> 1970 self._run(X, y, mutated_params, weights=weights, seed=seed)
   1972 # Then, after fit, we save again, so the pickle file contains
   1973 # the equations:
   1974 if not self.temp_equation_file:

File c:\Users\XXXX\Anaconda3\envs\tristan\Lib\site-packages\pysr\sr.py:1800, in PySRRegressor._run(self, X, y, mutated_params, weights, seed)
   1796 y_variable_names = [f"y{_subscriptify(i)}" for i in range(y.shape[1])]
   1798 # Call to Julia backend.
   1799 # See https://github.com/MilesCranmer/SymbolicRegression.jl/blob/master/src/SymbolicRegression.jl
-> 1800 self.raw_julia_state_ = SymbolicRegression.equation_search(
   1801     Main.X,
   1802     Main.y,
   1803     weights=Main.weights,
   1804     niterations=int(self.niterations),
   1805     variable_names=self.feature_names_in_.tolist(),
   1806     display_variable_names=self.display_feature_names_in_.tolist(),
   1807     y_variable_names=y_variable_names,
   1808     X_units=self.X_units_,
   1809     y_units=self.y_units_,
   1810     options=options,
   1811     numprocs=cprocs,
   1812     parallelism=parallelism,
   1813     saved_state=self.raw_julia_state_,
   1814     return_state=True,
   1815     addprocs_function=cluster_manager,
   1816     progress=progress and self.verbosity > 0 and len(y.shape) == 1,
   1817     verbosity=int(self.verbosity),
   1818 )
   1820 # Set attributes
   1821 self.equations_ = self.get_hof()

RuntimeError: <PyCall.jlwrap (in a Julia function called from Python)
JULIA: The operator `sqrt_smooth` is not well-defined over the real line, as it threw the error `UndefVarError` when evaluating the input -100.0. You can work around this by returning NaN for invalid inputs. For example, `safe_log(x::T) where {T} = x > 0 ? log(x) : T(NaN)`.
Stacktrace:
  [1] error(s::String)
      @ Base .\error.jl:35
  [2] test_operator(op::typeof(sqrt_smooth), x::Float32, y::Nothing)
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\Configure.jl:8
  [3] test_operator(op::typeof(sqrt_smooth), x::Float32)
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\Configure.jl:4
  [4] assert_operators_well_defined(T::Type, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\Configure.jl:42
  [5] test_option_configuration(T::Type, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\Configure.jl:58
  [6] _equation_search(#unused#::Val{:multithreading}, #unused#::Val{1}, datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}}, niterations::Int64, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, verbosity::Int64, progress::Bool, #unused#::Val{true})
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\SymbolicRegression.jl:566
  [7] equation_search(datasets::Vector{Dataset{Float32, Float32, Matrix{Float32}, Vector{Float32}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}}; niterations::Int64, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, parallelism::String, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, verbosity::Int64, progress::Bool, v_dim_out::Val{1})
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\SymbolicRegression.jl:507
  [8] equation_search(X::Matrix{Float32}, y::Matrix{Float32}; niterations::Int64, weights::Nothing, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, variable_names::Vector{String}, display_variable_names::Vector{String}, y_variable_names::Nothing, parallelism::String, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, runtests::Bool, saved_state::Nothing, return_state::Bool, loss_type::Type{Nothing}, verbosity::Int64, progress::Bool, X_units::Nothing, y_units::Nothing, v_dim_out::Val{1}, multithreaded::Nothing, varMap::Nothing)
      @ SymbolicRegression C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\SymbolicRegression.jl:385
  [9] #equation_search#24
      @ C:\Users\XXXX\.julia\packages\SymbolicRegression\XKtla\src\SymbolicRegression.jl:414 [inlined]
 [10] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::Base.Pairs{Symbol, Any, NTuple{15, Symbol}, NamedTuple{(:weights, :niterations, :variable_names, :display_variable_names, :y_variable_names, :X_units, :y_units, :options, :numprocs, :parallelism, :saved_state, :return_state, :addprocs_function, :progress, :verbosity), Tuple{Nothing, Int64, Vector{String}, Vector{String}, Nothing, Nothing, Nothing, Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum, false, Optim.Options{Float64, Nothing}, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, Nothing, String, Nothing, Bool, Nothing, Bool, Int64}}})
      @ Base .\essentials.jl:731
 [11] _pyjlwrap_call(f::Function, args_::Ptr{PyCall.PyObject_struct}, kw_::Ptr{PyCall.PyObject_struct})
      @ PyCall C:\Users\XXXX\.julia\packages\PyCall\ilqDX\src\callback.jl:32
 [12] pyjlwrap_call(self_::Ptr{PyCall.PyObject_struct}, args_::Ptr{PyCall.PyObject_struct}, kw_::Ptr{PyCall.PyObject_struct})
      @ PyCall C:\Users\XXXX\.julia\packages\PyCall\ilqDX\src\callback.jl:44>
```
Extra Info
No response
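The error message itself points at the fix: a custom operator must return NaN rather than throw for inputs outside its domain, because the backend probes each operator with values like `-100.0` before starting the search. The Julia one-liner it suggests (`safe_log(x::T) where {T} = x > 0 ? log(x) : T(NaN)`) translates to this pattern, sketched here in Python purely for illustration:

```python
import math


def safe_sqrt(x: float) -> float:
    """Square root guarded over the whole real line.

    Returns NaN instead of raising for negative inputs, mirroring the
    NaN-guard pattern the SymbolicRegression.jl error message recommends
    for operators that are only partially defined.
    """
    return math.sqrt(x) if x >= 0.0 else math.nan


print(safe_sqrt(4.0))     # 2.0
print(safe_sqrt(-100.0))  # nan -- no exception, so the operator check passes
```

The actual operator passed to PySR would be the equivalent Julia definition; candidate expressions that produce NaN are simply penalized during the search instead of crashing it.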