@shensley I was wondering about the max sample length here …
If we have a total RAM size of 64MB, i.e. 67108864 bytes, and a float takes 4 bytes, we get 67108864 / 4 = 16777216 samples, and 16777216 / 48000 ≈ 349.5 sec of sample time.
Wouldn't it be possible to capture the raw input in 16-bit and get double the sample length here? Then convert back to float when reading the values stored in RAM? Or is that too slow?
Yeah, that approach could definitely work. I’ve actually done similar stuff to squeeze longer delays on boards with a lot less RAM.
Just a quick daisy::f2s16() before writing to the delay line, and then daisy::s162f() when reading back, would do the trick. The DaisySP DelayLine class is a template, so you can easily change any existing delay line to a 16-bit int type (i.e. int16_t, short, etc.) and you should be good to go.