Taking `SDL_SetTextureColorMod` as an example, the SDL docs declare:

```c
bool SDL_SetTextureColorMod(SDL_Texture *texture, Uint8 r, Uint8 g, Uint8 b);
```

Currently, PySDL3's typing only allows `c_ubyte` to be passed:

```
((...) -> Any) | (def SDL_SetTextureColorMod(texture: Divergent, r: c_ubyte, g: c_ubyte, b: c_ubyte) -> c_bool)
```

This means that if I write the following, which works perfectly thanks to ctypes' automatic conversion:
```python
# `colour.r`, `colour.g` and `colour.b` are plain `int`s here
sdl3.SDL_SetTextureColorMod(self.texture, colour.r, colour.g, colour.b)
```

instead of:

```python
sdl3.SDL_SetTextureColorMod(self.texture, c_ubyte(colour.r), c_ubyte(colour.g), c_ubyte(colour.b))
```

I get a working program, but a type error:

```
Argument to function `SDL_SetTextureColorMod` is incorrect: Expected `c_ubyte`, found `int`
```
Doing the conversions explicitly into the ctypes types doesn't help much either: if a value is out of range in the first place (i.e. effectively putting 260 into a `c_uint8`), the call errors regardless of whether the conversion was explicit, so writing it out by hand adds little.
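For reference, both behaviours (plain ints being accepted, out-of-range values being rejected) can be demonstrated with ctypes alone, no SDL required. This is a minimal sketch using a `CFUNCTYPE` stand-in for the three-`c_ubyte` signature, not the real PySDL3 binding:

```python
import ctypes

# A stand-in mirroring SDL_SetTextureColorMod's colour parameters
# (assumption: only the three c_ubyte arguments are modelled here).
proto = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_ubyte, ctypes.c_ubyte, ctypes.c_ubyte)
set_mod = proto(lambda r, g, b: r + g + b)

# Plain Python ints are converted to c_ubyte automatically:
print(set_mod(10, 20, 30))  # → 60

# An out-of-range value errors whether or not the conversion is explicit:
try:
    set_mod(260, 0, 0)
except (TypeError, ctypes.ArgumentError) as e:
    print("rejected at call time:", e)

try:
    ctypes.c_ubyte(260)
except TypeError as e:
    print("rejected at construction:", e)
```

So the explicit `c_ubyte(...)` wrapping buys no extra safety; ctypes enforces the range either way.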
Therefore, would it be possible to widen the annotation to `c_ubyte | int` instead of just `c_ubyte`, and to do the same everywhere else it makes sense?