Commit 5e2b820

Merge pull request #325 from JuliaGPU/tb/rm_device
Remove AbstractGPUDevice
2 parents fac0a81 + ac06d0f

File tree

6 files changed: +8 additions, −34 deletions

docs/src/interface.md

Lines changed: 3 additions & 10 deletions
@@ -9,15 +9,8 @@ one that actually can be instantiated on the device (i.e. in kernels).
 ## Device functionality
 
 Several types and interfaces are related to the device and execution of code on it. First of
-all, you need to provide a type that represents your device and exposes some properties of
-it:
-
-```@docs
-GPUArrays.AbstractGPUDevice
-GPUArrays.threads
-```
-
-Another important set of interfaces relates to executing code on the device:
+all, you need to provide a type that represents your execution back-end and a way to call
+kernels:
 
 ```@docs
 GPUArrays.AbstractGPUBackend
@@ -26,7 +19,7 @@ GPUArrays.gpu_call
 GPUArrays.thread_block_heuristic
 ```
 
-Finally, you need to provide implementations of certain methods that will be executed on the
+You then need to provide implementations of certain methods that will be executed on the
 device itself:
 
 ```@docs
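With this change, a back-end package only subtypes `AbstractGPUBackend` and maps its array type to a backend instance; there is no separate device type to describe. A minimal self-contained sketch of the dispatch pattern — `MyBackend` and `MyArray` are hypothetical stand-ins, and `AbstractGPUBackend`/`backend` are redeclared locally so the snippet runs without GPUArrays installed:

```julia
# Local stand-ins mirroring the GPUArrays interface after this commit
# (hypothetical; the real definitions live in GPUArrays.jl).
abstract type AbstractGPUBackend end

# A back-end defines a singleton backend type...
struct MyBackend <: AbstractGPUBackend end

# ...an array type...
struct MyArray{T,N} end

# ...and maps the array type to its backend, replacing the old
# device(x) / backend(devicetype) pair with a single method:
backend(::Type{<:MyArray}) = MyBackend()

backend(MyArray{Float32,1})  # returns MyBackend()
```

This mirrors how `src/reference.jl` below wires `JLArray` to `JLBackend` after the commit.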

src/GPUArrays.jl

Lines changed: 2 additions & 1 deletion
@@ -13,7 +13,6 @@ using AbstractFFTs
 using Adapt
 
 # device functionality
-include("device/device.jl")
 include("device/execution.jl")
 ## executed on-device
 include("device/abstractarray.jl")
@@ -38,5 +37,7 @@ include("host/uniformscaling.jl")
 # CPU reference implementation
 include("reference.jl")
 
+include("deprecated.jl")
+
 
 end # module

src/deprecated.jl

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+function device end
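The one-line `src/deprecated.jl` uses Julia's empty-generic-function declaration: `function device end` creates the function name with zero methods, so downstream code that still calls `GPUArrays.device` fails with a `MethodError` (and the name can later receive deprecation methods) rather than an `UndefVarError`. A standalone sketch of the behavior:

```julia
# Declare a generic function with no methods, as deprecated.jl does.
function device end

# The name exists, but nothing is callable yet.
length(methods(device))  # 0

# Calling it raises MethodError, not UndefVarError:
err = try
    device()
catch e
    e
end
err isa MethodError  # true
```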

src/device/device.jl

Lines changed: 0 additions & 12 deletions
This file was deleted.

src/host/abstractarray.jl

Lines changed: 0 additions & 3 deletions
@@ -15,9 +15,6 @@ const AbstractGPUVector{T} = AbstractGPUArray{T, 1}
 const AbstractGPUMatrix{T} = AbstractGPUArray{T, 2}
 const AbstractGPUVecOrMat{T} = Union{AbstractGPUArray{T, 1}, AbstractGPUArray{T, 2}}
 
-device(::AbstractGPUDevice) = error("Not implemented") # COV_EXCL_LINE
-backend(::Type{<:AbstractGPUDevice}) = error("Not implemented") # COV_EXCL_LINE
-
 
 # convenience aliases for working with wrapped arrays

src/reference.jl

Lines changed: 2 additions & 8 deletions
@@ -16,10 +16,6 @@ using Adapt
 # Device functionality
 #
 
-## device properties
-
-struct JLDevice <: AbstractGPUDevice end
-
 const MAXTHREADS = 256
 
 
@@ -266,7 +262,7 @@ struct JLArrayStyle{N} <: AbstractGPUArrayStyle{N} end
 JLArrayStyle(::Val{N}) where N = JLArrayStyle{N}()
 JLArrayStyle{M}(::Val{N}) where {N,M} = JLArrayStyle{N}()
 
-BroadcastStyle(::Type{<:AnyJLArray{T,N}}) where {T,N} = JLArrayStyle{N}()
+BroadcastStyle(::Type{JLArray{T,N}}) where {T,N} = JLArrayStyle{N}()
 
 # Allocating the output container
 Base.similar(bc::Broadcasted{JLArrayStyle{N}}, ::Type{T}) where {N,T} =
@@ -368,9 +364,7 @@ Random.randn!(A::AnyJLArray) = Random.randn!(GPUArrays.default_rng(JLArray), A)
 
 ## GPUArrays interfaces
 
-GPUArrays.device(x::AnyJLArray) = JLDevice()
-
-GPUArrays.backend(::Type{<:AnyJLArray}) = JLBackend()
+GPUArrays.backend(::Type{<:JLArray}) = JLBackend()
 
 Adapt.adapt_storage(::Adaptor, x::JLArray{T,N}) where {T,N} =
     JLDeviceArray{T,N}(x.data, x.dims)
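The `BroadcastStyle` edit in this file uses Julia's standard broadcasting-customization machinery: map the array type to a style, then define `similar` on `Broadcasted` to allocate the output container. A self-contained sketch of that pattern, with hypothetical `MyStyle`/`MyVec` stand-ins for `JLArrayStyle`/`JLArray` (and plain `AbstractArrayStyle` standing in for `AbstractGPUArrayStyle`):

```julia
import Base.Broadcast: AbstractArrayStyle, Broadcasted

# Hypothetical stand-in for JLArrayStyle.
struct MyStyle{N} <: AbstractArrayStyle{N} end
MyStyle{M}(::Val{N}) where {M,N} = MyStyle{N}()  # dimension promotion

# Hypothetical stand-in for JLArray, restricted to one dimension.
struct MyVec{T} <: AbstractVector{T}
    data::Vector{T}
end
Base.size(A::MyVec) = size(A.data)
Base.getindex(A::MyVec, i::Int) = A.data[i]
Base.setindex!(A::MyVec, v, i::Int) = (A.data[i] = v)

# Map the array type to its broadcast style, as the diff above does:
Base.BroadcastStyle(::Type{<:MyVec}) = MyStyle{1}()

# Allocate the output container for a broadcast over MyVec:
Base.similar(bc::Broadcasted{MyStyle{1}}, ::Type{T}) where {T} =
    MyVec{T}(Vector{T}(undef, length(axes(bc)[1])))

v = MyVec([1.0, 2.0, 3.0])
w = v .+ 1   # dispatches through MyStyle{1}, materializes into a MyVec
```

The narrowing from `AnyJLArray` to `JLArray` in the diff only changes which types hit this mapping; the mechanism itself is unchanged.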

0 commit comments