Daisy Patch Distorted Signal

Hi All,

I’ve been working with the Daisy Patch to design a delay / multi-grain delay type effect, using oopsy and gen~. The patch works and sounds good on my machine (2021 MacBook Pro, Apple Silicon), but when I load it onto my Daisy, the wet signal becomes very distorted. I noticed the same thing earlier when I tried to implement a waveshaping distortion (via a codebox and some 4th-order polynomials).

The way I’m indexing into my delay buffer is via the [sample] object, which takes a float between 0 and 1 and interpolates a sample from the corresponding buffer. There are a few other places where I’m using floats to control timing, envelopes, etc. The CPU load on my Daisy hovers around 20%, so I don’t think it’s an overload issue.

I remember reading something about the Daisy and its floating-point processor… Is this potentially the issue I’m running up against? Thanks, y’all.
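For reference, here’s a rough C++ sketch of what I suspect is happening: gen~ runs in double precision on the desktop, while the compiled Daisy build stores the normalized 0–1 read position as a 32-bit float, so a long buffer can’t be addressed to full sample accuracy. The buffer length and read position below are made-up illustration values, not the ones in my patch:

```cpp
// Hypothetical illustration (not my actual patch): compare the delay-line index
// you get from a 64-bit phase vs. the same phase stored as a 32-bit float,
// which is roughly the situation when a gen~ patch developed in double
// precision runs on the Daisy's single-precision FPU.
#include <cstdio>

int main() {
    const double sampleRate    = 48000.0;
    const double bufferSeconds = 10.0;                       // made-up delay buffer length
    const double bufferSamples = sampleRate * bufferSeconds; // 480,000 samples

    // A read position 7.3 s into the buffer, expressed as a normalized 0..1
    // phase (the kind of value [sample] takes).
    double phase64 = (7.3 * sampleRate) / bufferSamples;
    float  phase32 = static_cast<float>(phase64);            // what a 32-bit build stores

    double idx64 = phase64 * bufferSamples;                      // double-precision index
    double idx32 = static_cast<double>(phase32) * bufferSamples; // single-precision index

    // A 32-bit float has ~24 bits of mantissa, so over a ~480k-sample buffer
    // the normalized phase only resolves the read position to within a few
    // hundredths of a sample; the error grows with buffer length and shows up
    // as noise on the wet signal when the position is modulated.
    std::printf("double-precision index: %.6f samples\n", idx64);
    std::printf("single-precision index: %.6f samples (error %.6f)\n",
                idx32, idx32 - idx64);
    return 0;
}
```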

Hey sando!

I’m sorry for the delayed response to your question. I hope all is well.

Would it be OK for you to share a portion of the gen~ patcher as a screenshot?
Hopefully, in the process of scaling the patch down for a screenshot, you may be able to pinpoint what’s causing the issue.

I also recommend sharing the screenshot or the patcher file over on our Daisy Discord.
There’s an oopsy channel, and we’d be happy to help there too.
