Add audiodelays & audioeffects to unix coverage port
... and add a very basic audioeffects test showing that it plausibly
works.
I had to address several build errors in the Unix build, mostly
related to conversions from floating-point types to integral types
(fixed with explicit casts) and to
accidental mixing of plain and f-suffixed floating-point constants
(fixed with the MICROPY_FLOAT_CONST macro).
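For reference, a minimal sketch of the second class of fix. The `half_gain` helper and the `0.5` constant are hypothetical, used only to show the pattern; `MICROPY_FLOAT_CONST` itself is the existing MicroPython macro that appends an `F` suffix only when `mp_float_t` is single-precision:

```c
#include "py/mpconfig.h" // defines mp_float_t and MICROPY_FLOAT_CONST

// Before: the f suffix pins the constant to single precision, mixing
// float and double arithmetic when mp_float_t is double (as on the
// unix coverage port):
//   static mp_float_t half_gain(mp_float_t in) { return in * 0.5f; }

// After: MICROPY_FLOAT_CONST expands to 0.5F when mp_float_t is float
// and to 0.5 when it is double, so the constant always matches:
static mp_float_t half_gain(mp_float_t in) {
    return in * MICROPY_FLOAT_CONST(0.5);
}
```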
This change in particular could use consideration:
```diff
- self->max_echo_buffer_len = self->sample_rate / 1000.0f * max_delay_ms * (self->channel_count * sizeof(uint16_t)); // bytes
+ self->max_echo_buffer_len = (uint32_t)(self->sample_rate / 1000.0f * max_delay_ms) * (self->channel_count * sizeof(uint16_t)); // bytes
```
The buffer length is calculated in floating point from the
millisecond delay and the sample rate. The result can be a fractional
number of items, such as 529.2 for a 12 ms delay at 44.1 kHz. Multiplying
that fractional value by the size of each echo buffer item
(`(self->channel_count * sizeof(uint16_t))`) could yield a byte count
that doesn't correspond to a whole number of buffer items. I grouped
the float-to-int conversion so that the number of echo buffer
items is truncated to an integer first, then multiplied by the item size.
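To illustrate, here is a standalone sketch of the two cast placements. The sample values are hypothetical, and a 13 ms delay is used rather than the 12 ms above because at 12 ms both forms happen to truncate to the same 2116 bytes, while at 13 ms they diverge:

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    // Hypothetical parameters: 44.1 kHz stereo, 13 ms delay.
    uint32_t sample_rate = 44100;
    float max_delay_ms = 13.0f;
    uint32_t channel_count = 2;

    // Truncating the whole product (as the original implicit
    // conversion did): 573.3 items * 4 bytes/item = 2293.2, truncated
    // to 2293 bytes -- not a multiple of the 4-byte item size.
    uint32_t len_old = (uint32_t)(sample_rate / 1000.0f * max_delay_ms
        * (channel_count * sizeof(uint16_t)));

    // Truncating the item count first (the patched grouping): 573.3
    // items truncates to 573, then 573 * 4 = 2292 bytes -- always a
    // whole number of items.
    uint32_t len_new = (uint32_t)(sample_rate / 1000.0f * max_delay_ms)
        * (channel_count * sizeof(uint16_t));

    printf("old: %" PRIu32 " bytes, new: %" PRIu32 " bytes\n",
        len_old, len_new);
    return 0;
}
```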