
Commit 814375b

ChrisRackauckas-Claude authored
Fix missing imports in CUDA extension (#742)
* Fix missing imports in CUDA extension

  Add missing imports for LU, CuVector, CuMatrix, and LinearVerbosity to fix compilation errors in LinearSolveCUDAExt. Fixes the issue reported at: https://discourse.julialang.org/t/linearsolvecudaext-fails-to-compile/131693/2

* Add sources section to LinearSolveAutotune Project.toml

  Add [sources] section with paths to the LinearSolveAutotune and LinearSolve packages for proper local development setup.

* Move sources section to main Project.toml following OrdinaryDiffEq pattern

  - Remove sources section from lib/LinearSolveAutotune/Project.toml
  - Add sources section to main LinearSolve.jl Project.toml
  - This follows the same pattern used in OrdinaryDiffEq.jl, where sources are defined in the main package Project.toml, not in individual lib packages

* Fix sources section: LinearSolveAutotune should source LinearSolve

  - Remove incorrectly added sources from main Project.toml
  - Add sources section to LinearSolveAutotune/Project.toml pointing to LinearSolve
  - This allows LinearSolveAutotune to use the local LinearSolve package during development

* Fix CUDA extension issues

  - Remove the stub init_cacheval for CudaOffloadLUFactorization from the base package to avoid a method-overwriting error during precompilation
  - Move CUDA from deps to weakdeps, since it is only needed for the extension

  These changes fix:
  1. Method overwriting error during module precompilation
  2. Aqua.jl stale deps test failure for CUDA

* Update src/factorization.jl

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude <[email protected]>
Co-authored-by: Christopher Rackauckas <[email protected]>
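Moving CUDA from `[deps]` to `[weakdeps]` relies on Julia's package-extension mechanism: the extension module only loads when the weak dependency is present in the user's environment. A minimal sketch of the relevant Project.toml sections after this change (the UUID shown is CUDA.jl's registered UUID; the exact layout in the repository may differ):

```toml
# CUDA is a weak dependency: LinearSolve does not load it itself,
# but if the user loads CUDA, the extension below is activated.
[weakdeps]
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"

# Maps the extension module (in ext/LinearSolveCUDAExt.jl)
# to the weak dependency that triggers it.
[extensions]
LinearSolveCUDAExt = "CUDA"
```

This is also why Aqua.jl's stale-deps check failed before the fix: CUDA sat in `[deps]` but was never imported by the base package.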
1 parent cec9e37 commit 814375b

File tree

3 files changed: +8 −2 lines


Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -122,8 +122,8 @@ RecursiveFactorization = "0.2.23"
 Reexport = "1.2.2"
 SafeTestsets = "0.1"
 SciMLBase = "2.70"
-SciMLOperators = "1"
 SciMLLogging = "1"
+SciMLOperators = "1"
 Setfield = "1.1.1"
 SparseArrays = "1.10"
 Sparspak = "0.3.9"
```

ext/LinearSolveCUDAExt.jl

Lines changed: 4 additions & 1 deletion
```diff
@@ -1,13 +1,16 @@
 module LinearSolveCUDAExt
 
 using CUDA
+using CUDA: CuVector, CuMatrix
 using LinearSolve: LinearSolve, is_cusparse, defaultalg, cudss_loaded, DefaultLinearSolver,
                    DefaultAlgorithmChoice, ALREADY_WARNED_CUDSS, LinearCache,
                    needs_concrete_A,
                    error_no_cudss_lu, init_cacheval, OperatorAssumptions,
                    CudaOffloadFactorization, CudaOffloadLUFactorization, CudaOffloadQRFactorization,
-                   SparspakFactorization, KLUFactorization, UMFPACKFactorization
+                   SparspakFactorization, KLUFactorization, UMFPACKFactorization,
+                   LinearVerbosity
 using LinearSolve.LinearAlgebra, LinearSolve.SciMLBase, LinearSolve.ArrayInterface
+using LinearAlgebra: LU
 using SciMLBase: AbstractSciMLOperator
 
 function LinearSolve.is_cusparse(A::Union{
```
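The explicit imports matter because a bare `using` only brings a package's exported names into scope; unexported names such as `LinearAlgebra.LU` must be imported explicitly before they can appear unqualified in method signatures. A minimal illustration of the failure mode (module name hypothetical, not from the commit):

```julia
module ImportDemo

using LinearAlgebra        # brings in exported names such as `lu`
using LinearAlgebra: LU    # `LU` is not exported, so import it explicitly

# Without the explicit import above, compiling this signature would
# raise `UndefVarError: LU not defined` — the same class of error
# this commit fixes in LinearSolveCUDAExt.
lower_size(fact::LU) = size(fact.L)

end
```

The same reasoning applies to `CuVector`, `CuMatrix`, and `LinearVerbosity`: each is now named in an explicit `using ... :` clause so the extension compiles regardless of what its dependencies export.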

lib/LinearSolveAutotune/Project.toml

Lines changed: 3 additions & 0 deletions
```diff
@@ -3,6 +3,9 @@ uuid = "67398393-80e8-4254-b7e4-1b9a36a3c5b6"
 authors = ["SciML"]
 version = "1.8.0"
 
+[sources]
+LinearSolve = {path = "../.."}
+
 [deps]
 Base64 = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f"
 BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
```

0 commit comments
