Releases: itskalvik/sgp-tools
Release v2.0.7
Release v2.0.6
Packaging
- Capped `setuptools` to address `pkg_resources` deprecation issues.
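In practice, such a cap is a version bound in the project's dependency metadata. The release notes do not state the exact bound used, so the `pyproject.toml` fragment below is a hypothetical illustration only:

```toml
# Hypothetical pin; the release notes do not state the exact bound used.
# Newer setuptools versions deprecate the pkg_resources API, which can
# break downstream consumers that still import it.
[project]
dependencies = [
    "setuptools<81",
]
```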
Release v2.0.5
Changes
- Benchmarking: Updated the Attentive kernel benchmark to use `tf.Adam` for better performance, and added mutual information (MI) to benchmark outputs.
- Attentive kernel: Skip redundant representation computation when inputs match for faster execution.
- Metrics: Added mean squared error (MSE).
- Datasets: Added support for `.npy` datasets.
- TSP: Removed the planner-level time limit argument and improved solver behavior/options (initial path support, solution limits, and running the secondary solver only when a time limit is provided).
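The "skip redundant representation computation when inputs match" change is a last-input caching pattern. The sketch below is a hypothetical illustration of that idea in plain NumPy, not the library's actual Attentive kernel code:

```python
import numpy as np


class RepresentationCache:
    """Recompute a representation only when the input actually changes.

    Hypothetical sketch of input-matching memoization; not sgp-tools'
    actual AttentiveKernel implementation.
    """

    def __init__(self, compute_fn):
        self.compute_fn = compute_fn
        self._last_input = None
        self._last_output = None
        self.calls = 0  # counts actual recomputations

    def __call__(self, X):
        # Skip the (potentially expensive) computation when the
        # input matches the previously seen one.
        if self._last_input is None or not np.array_equal(X, self._last_input):
            self._last_output = self.compute_fn(X)
            self._last_input = np.array(X, copy=True)
            self.calls += 1
        return self._last_output
```

During optimization the same inputs are often evaluated repeatedly (e.g., within one training step), so caching on input equality avoids redundant work without changing results.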
Packaging
- Capped `setuptools` to address `pkg_resources` deprecation issues, tightened dependency constraints (including TensorFlow-related requirements), and added `numba`.
Release v2.0.4
Key Updates
- Fix default float type casting
Release v2.0.3
Key Updates
- Improve random seed pass-throughs
Release v2.0.2
Key Updates
- Add sanity check to catch `inf` values in the objective function output for CMA
- Update kernel function naming convention to match GPflow
- Update non-stationary kernel function tutorial to use the `get_kernel` and `get_method` interface
- Improve error descriptions for `get_kernel`, `get_method`, and `get_objective`
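The `inf` sanity check amounts to validating the objective's output before handing it to the optimizer, since CMA-ES can silently misbehave on non-finite fitness values. A minimal hypothetical sketch of that guard (not sgp-tools' actual code):

```python
import math


def checked_objective(objective_fn):
    """Wrap an objective so non-finite outputs fail fast with a clear
    error instead of silently corrupting the optimizer's state.

    Hypothetical sketch of the sanity check described above; not
    sgp-tools' actual implementation.
    """
    def wrapper(x):
        value = objective_fn(x)
        if not math.isfinite(value):
            raise ValueError(
                f"Objective returned non-finite value {value!r} for input {x!r}"
            )
        return value
    return wrapper
```

Failing fast here turns a hard-to-diagnose optimizer stall into an immediate, descriptive error at the point where the bad value is produced.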
Release v2.0.1
Key Updates
- Added `project_waypoints` method to project waypoints to be within the environment
- Added `get_kernel` method to get a kernel with a string
- Added new optimization objectives (compatible with CMA, BayesianOpt, and GreedyObjective methods)
- A-Optimal Design
- B-Optimal Design
- D-Optimal Design
- Updated mutual information methods to support caching
- Added Schur complement-based mutual information
- Added new method `DifferentiableObjective` that can optimize any objective using gradient-based approaches
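For jointly Gaussian variables, the mutual information between a set of sensing locations A and the remaining locations B follows from a Schur complement: the conditional covariance of A given B is K_AA − K_AB K_BB⁻¹ K_BA, and MI(A; B) = ½(log det K_AA − log det of that Schur complement). The NumPy sketch below illustrates the formula; it is not sgp-tools' implementation, and the function name is hypothetical:

```python
import numpy as np


def schur_mutual_info(K, idx_a):
    """MI(A; B) for a jointly Gaussian vector with covariance K,
    where A = idx_a and B = all remaining indices.

    MI = 0.5 * (logdet K_AA - logdet (K_AA - K_AB K_BB^{-1} K_BA));
    the Schur complement term is the conditional covariance of A
    given B. Illustrative sketch only, not sgp-tools' code.
    """
    idx_a = np.asarray(idx_a)
    idx_b = np.setdiff1d(np.arange(K.shape[0]), idx_a)
    K_aa = K[np.ix_(idx_a, idx_a)]
    K_ab = K[np.ix_(idx_a, idx_b)]
    K_bb = K[np.ix_(idx_b, idx_b)]
    # Conditional covariance of A given B (Schur complement of K_bb in K).
    cond = K_aa - K_ab @ np.linalg.solve(K_bb, K_ab.T)
    _, logdet_prior = np.linalg.slogdet(K_aa)
    _, logdet_cond = np.linalg.slogdet(cond)
    return 0.5 * (logdet_prior - logdet_cond)
```

When A and B are independent (block-diagonal K), the Schur complement equals K_AA and the mutual information is zero, which is a useful sanity check for any implementation.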
Release v2.0.0
Key Updates
- Added new unified interface for all SP/IPP methods
- Allows getting and running any method with `sgptools.methods.get_method` (e.g., `ContinuousSGP`, `CMA`)
- Allows selecting the optimizer with a string (e.g., `scipy.L-BFGS-B`, `tf.Adam`)
- Allows changing the backend objective function with a string (e.g., `MI`, `SLogMI`)
- Added new `Dataset` class for cleaner data management
- Improved code readability with type hints
- Improved documentation
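The unified interface described above is a string-keyed registry pattern. The sketch below illustrates that design in miniature; the class here is a stand-in dummy, and the real signatures may differ, so consult the sgp-tools documentation for the actual `sgptools.methods.get_method` API:

```python
# Minimal sketch of a string-based method registry, illustrating the
# kind of unified interface described above. Hypothetical stand-in,
# not sgp-tools' actual implementation.

_METHODS = {}


def register(name):
    """Class decorator that adds a method class to the registry."""
    def decorator(cls):
        _METHODS[name] = cls
        return cls
    return decorator


def get_method(name):
    """Look up a registered method class by its string name."""
    try:
        return _METHODS[name]
    except KeyError:
        raise ValueError(
            f"Unknown method {name!r}; available: {sorted(_METHODS)}"
        ) from None


@register("ContinuousSGP")
class ContinuousSGP:
    """Dummy method accepting optimizer/objective selection by string."""

    def __init__(self, optimizer="scipy.L-BFGS-B", objective="MI"):
        self.optimizer = optimizer
        self.objective = objective
```

Keying everything on strings lets callers swap methods, optimizers, and objectives from configuration without touching imports, which is what makes a single interface work across all SP/IPP methods.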
Release v1.2.0
Key Updates
- Added new Attentive Non-stationary Kernel
- Fixed Neural Kernel Non-stationary Kernel
- Updated SGPR's update method to support all kernel functions
Release v1.1.8
Key Updates
- Add option to skip inducing variables update in SSGP