> All problems in computer science can be solved by another level of indirection
Using [PLT trampolines](https://en.wikipedia.org/wiki/Trampoline_(computing)) to provide a BLAS and LAPACK demuxing library. Watch a detailed [JuliaCon 2021 talk on libblastrampoline](https://www.youtube.com/watch?v=t6hptekOR7s).
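As a quick illustration from the Julia side, here is a minimal sketch (assuming Julia 1.7 or later, where libblastrampoline ships as the default BLAS/LAPACK layer) that inspects which backend the trampolines are currently forwarding to:

```julia
# Minimal sketch (assumes Julia >= 1.7, where libblastrampoline is the
# default BLAS/LAPACK interface): list the backends currently forwarded to.
using LinearAlgebra

config = BLAS.get_config()
for lib in config.loaded_libs
    # Each entry reports the backend library and its integer interface (:lp64 or :ilp64).
    println(lib.libname, " => ", lib.interface)
end
```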
These BLAS libraries are known to work with libblastrampoline (successfully tested in Julia); a sketch of switching between them follows the list:
1. [OpenBLAS](https://github.com/OpenMathLib/OpenBLAS) (supported by default in Julia)
2. [Intel oneMKL](https://www.intel.com/content/www/us/en/developer/tools/oneapi/onemkl.html) (use in Julia through [MKL.jl](https://github.com/JuliaLinearAlgebra/MKL.jl))
3. [Apple Accelerate](https://developer.apple.com/documentation/accelerate/blas) (use in Julia through [AppleAccelerate.jl](https://github.com/JuliaMath/AppleAccelerate.jl))
4. [BLIS](https://github.com/flame/blis/) (use in Julia through [BLISBLAS.jl](https://github.com/carstenbauer/BLISBLAS.jl))
5. Fujitsu BLAS (use in Julia through [FujitsuBLAS.jl](https://github.com/giordano/FujitsuBLAS.jl))
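Switching the active backend is then just a package load away. The snippet below is a minimal sketch assuming the MKL.jl package is installed; loading it re-points libblastrampoline's forwards from the bundled OpenBLAS to Intel oneMKL at runtime:

```julia
# Minimal sketch, assuming MKL.jl is installed in the active environment.
using LinearAlgebra

BLAS.get_config()   # before: typically shows the bundled libopenblas

using MKL           # MKL.jl instructs libblastrampoline to forward to oneMKL instead

BLAS.get_config()   # after: libmkl_rt is listed as the active backend
```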