
[Raspberry Pi] Consider using the GPU for precise GPIO timings, rather than DMA #65

@Wallacoloo

Description

Gert showed that it is possible to dump over 50 million 16-bit values per second into the GPIOs using the Pi's GPU. His application was to have each 16-bit value represent one pixel (RGB565) and use a resistor DAC to send it to a VGA LCD, along with clock signals also generated by the GPU, all with timings accurate enough for video (correctly placing each pixel requires timing accurate to about 20 ns).
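The 20 ns figure follows directly from the stated rate, and the pixel format is standard RGB565. A quick sanity check of both (the `pack_rgb565` helper is illustrative, not taken from Gert's code):

```python
# At 50 million values per second, each value occupies 1 / 50e6 s = 20 ns,
# which is why pixel placement needs roughly 20 ns timing accuracy.
RATE_HZ = 50_000_000
period_ns = 1e9 / RATE_HZ
print(period_ns)  # 20.0

# RGB565 packs one pixel into 16 bits: 5 bits red, 6 green, 5 blue.
# Hypothetical helper for illustration only.
def pack_rgb565(r: int, g: int, b: int) -> int:
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff (white)
print(hex(pack_rgb565(255, 0, 0)))      # 0xf800 (red)
```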

Unfortunately, his code, at the time of writing, is only supplied as a binary blob and the documentation of the GPU features he used is nonexistent.

But there are other efforts to demystify the Pi's GPU. There is, for example, this tutorial for writing GPU code on the Pi. It may be possible to commandeer one GPU core permanently and use it to precisely drive the GPIO pins (not sure how much control we have over scheduling).

Useful documents:
- Herman Hermitage Unofficial Videocore Docs
- Official Broadcom Videocore Documentation
