Hi All,
I’ve been working with the Daisy Patch to design a delay / multi-grain-delay type effect, using Oopsy and gen~. The patch works and sounds good on my machine (2021 MBP, Apple Silicon), but when I load it onto my Daisy, the wet signal becomes very distorted. I noticed the same thing earlier when I tried to implement a waveshaping distortion (via a codebox and some 4th-order polynomials).

The way I’m indexing into my delay buffer is via the [sample] object (it takes a float between 0 and 1 and interpolates a sample from the corresponding position in the buffer). There are a few other places where I’m using floats to control timing, envelopes, etc. The CPU load on my Daisy hovers around 20%, so I don’t think it’s an overload issue.
I remember reading something about the Daisy and floating-point precision… Is that potentially the issue I’m running up against? Thanks, y’all.