Improve inference in solve pipeline #654
base: main
Conversation
This package makes heavy use of polyalgorithms. This provides a lot of flexibility and performance, but one downside is that it hinders inferrability. The lack of inferrability does not seem to cause serious performance issues, but it does stand in the way of being able to precompile the package via direct-chain-of-inference. (See #643 for issues that prevent usage of PrecompileTools.jl, which otherwise would be an easy solution because it circumvents the need for high-quality inference.)

The goal of this PR is to make at least a subset of the `solve` pipeline (specifically, the part I'm using!) inferrable. It targets the following usage:

```julia
solver, starts = solver_startsolutions(S, sols; start_parameters, target_parameters, ishomogeneous=false)
Ssolve = solve(solver, starts; show_progress=false)
```

where `S` is a `CompiledSystem`. The eventual goal is that if users write their code this way, the entire pipeline can be inferred and thus precompiled.

To get closer to this goal, this PR adds:

- the `ishomogeneous` keyword to bypass a non-inferrable branch in `parameter_homotopy`
- several `Base.@constprop :aggressive` annotations to help the compiler optimize branches that depend on keyword arguments

This does not put it over the finish line, but it is enough to let me use PrecompileTools in my own package, and that's really good news. (UPDATE: I was a bit hasty; the compiled IDs would need to populate `TSYSTEM_TABLE`, and currently I don't have that working.)

If you're curious about the current point of failure in inferrability, it's the construction of the `MatrixWorkspace` for the Jacobian (`src/linear_algebra.jl`, line 36 at 6a6ce86). It would seem that the best option is to allow the user to supply their own `MatrixWorkspace` to `Jacobian` and `TrackerState`, but that's starting to look pretty intrusive. So I decided to stop here and gauge thoughts from the maintainers.

Note: there's no urgency in merging this; best to wait until it gets a little more complete.
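(Editorial aside, not code from this PR: a minimal, self-contained sketch of why `Base.@constprop :aggressive` helps with branches that depend on keyword arguments. The function and type names below are hypothetical.)

```julia
# Hypothetical sketch: a keyword argument selects between branches that
# return different types, which normally makes the return type a Union.
struct HomogeneousPath{T}   # stand-in for a real homotopy type
    F::T
end
struct AffinePath{T}        # stand-in for a real homotopy type
    F::T
end

Base.@constprop :aggressive function make_homotopy(F; ishomogeneous::Bool=false)
    # Inferred from argument types alone, this returns
    # Union{AffinePath, HomogeneousPath}; with aggressive constant propagation,
    # a literal keyword value at the call site lets inference prune the dead branch.
    return ishomogeneous ? HomogeneousPath(F) : AffinePath(F)
end

# In a caller that passes a literal keyword value, the return type becomes
# concrete, which is what keeps the downstream pipeline inferrable.
build_affine(F) = make_homotopy(F; ishomogeneous=false)
```

The annotation does not guarantee propagation; it makes the compiler more willing to attempt it. Inspecting `@code_warntype build_affine(rand(3))` is one way to check whether the branch actually got pruned.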
Interesting. Thanks for the initiative. One question: According to the docs …
I think we want to improve time-to-first-execute (TTFX)? We also suffer a lot from this in HarmonicBalance.jl.
Yes to both. Here's the issue: if you can rely on …

So the alternative is to use …

So this PR begins the effort to try to make …

Now, keep in mind that if we could solve the issues surrounding reinitialization of …

Does that help clarify the goals of this PR?
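(Editorial aside, for readers unfamiliar with PrecompileTools.jl: below is roughly what a precompile workload looks like in a downstream package. The module name and the example system are made up; this is the route that the issues in #643 currently stand in the way of.)

```julia
module MyDownstreamPackage  # hypothetical package that depends on HomotopyContinuation

using PrecompileTools
using HomotopyContinuation

# Code inside @setup_workload runs during precompilation but is not itself cached;
# calls made inside @compile_workload are compiled and saved into the pkgimage.
@setup_workload begin
    @var x y
    F = System([x^2 + y^2 - 1, x - y])
    @compile_workload begin
        solve(F; show_progress = false)
    end
end

end # module
```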
BTW @PBrdng, if you're not familiar with the consequences of precompilation, my advice is to try this:

```
tim@hypnotic:/tmp$ mkdir pctest
tim@hypnotic:/tmp$ cd pctest
tim@hypnotic:/tmp/pctest$ julia -q --startup-file=no

julia> using Pkg; Pkg.activate("."); Pkg.add("GLMakie")
 ⋮

julia> using GLMakie; @time @eval(display(lines(randn(10))))
  6.904183 seconds (273.82 k allocations: 24.814 MiB, 6.94% compilation time)
GLMakie.Screen(...)
```

and then see what happens when you disable usage of pkgimages:

```
tim@hypnotic:/tmp/pctest$ julia --project -q --startup-file=no --pkgimages=no

julia> using GLMakie; @time @eval(display(lines(randn(10))))
 38.212825 seconds (1.84 M allocations: 87.696 MiB, 0.82% gc time, 82.30% compilation time)
GLMakie.Screen(...)
```

It's a big reduction in TTFX, thanks to the fact that the Makie developers have put some effort into their precompilation pipeline (e.g., https://github.com/MakieOrg/Makie.jl/blob/master/Makie/src/precompiles.jl). They clearly had some similar issues to those encountered here, since they use a mix of … The goal here is to confer some of the same benefits to packages that use HomotopyContinuation.
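(Editorial aside: the analogous first-call measurement for this package, with a made-up example system; the timing output is omitted because it depends on the machine, the Julia version, and how much of the pipeline ends up precompilable.)

```julia
julia> using HomotopyContinuation

julia> @time @eval begin
           @var x y
           solve(System([x^2 + y^2 - 1, x - y]); show_progress = false)
       end
```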
@timholy Indeed, I'm not familiar with precompilation and all that. But I'm interested in making it work. Would you have time for a meeting at some point this summer to explain it to me?
Yes, happy to do that. I'm at JuliaCon next week; any chance you'll be there? If not, sometime in August we could get together. Meanwhile, check the release announcement for Julia 1.9: https://julialang.org/blog/2023/04/julia-1.9-highlights/#caching_of_native_code
I will not be at JuliaCon. Our community has almost no intersection with the JuliaCon community (unfortunately!). Let's talk in August.