This is not implemented yet, but is very high in the priority list for new features.
Yeah, great to hear!
Yes, I’d love to see an example of this. In the meantime, if anyone has any pointers to explore how streaming audio over USB would work, that’d be great to see…
Totally agreed; USB audio would help with streaming and recording quality, plus connectivity to a DAW for mixing and post-recording cleanup. Hope the feature comes soon.
Looks like there is some STM32 Cube example code that enables USB audio streaming.
I might be messing with this soon too. Will update if I get anywhere
@tenkai I’ve run some of the discovery board USB audio examples before with success, and I think I got a few really basic tests running on one of the first Daisy prototypes. Excited to see what you come up with when you start messing around with it.
I’m planning on getting around to adding some libdaisy support (for Audio and for MIDI) in the next few months. It’s just after rounding out the basic peripheral support, and adding better support for QSPI (as storage, or extra code-space) on my list of features-to-get-to.
A few months sounds way too optimistic. I could believe it would appear in the second half of next year or so.
That said, I’ve got USB audio working on the OWL port for Daisy Patch. Truth be told, I’ve only added support for using 2 separate audio buffers for Patch; the rest was done in the upstream branch. This still hasn’t been released for any Rebeltech devices, because there are several unresolved minor issues.
I would expect libDaisy to get MIDI-only USB endpoint support first, for both device and host, before trying to get composite audio + MIDI working. Audio interface support would take more time to get working on every OS and may require changes to audio processing (e.g. sharing the DMA buffer with the codec).
Admittedly, I do have a habit of underestimating how long things will take. What I probably should have said is that I plan on starting work on it in a few months, once I’ve gotten through the other stuff I mentioned first.
Good points, though, and an interesting note on having it working on the OWL port already. I may take a look at that a bit before digging in on libdaisy.
And your expectation is accurate; the plan will be to add MIDI-only first, and then flesh out a composite class after that.
How can we speed this up?
What can the community do to help?
Unless you’re an expert in USB stacks on microcontrollers, all you can do is wait until it’s ready.
@shensley, just wanted to let you know that implementing a synchronous audio device would not work for everyone, in case you’ve been considering it good enough.
I definitely anticipate the actual audio-interface implementation taking a little while, and I don’t have enough experience to have a clear idea of what hurdles we may or may not face yet. Audio-class-based MIDI will be tackled first (as mentioned previously), and once that is stable, actual audio will begin getting worked on.
That said, before we get back to USB, QSPI, and SDMMC we will be doing a few things to libdaisy to create a clear versioning system and a dev branch with experimental or in-progress features.
That way, while some of these things are being worked on, community users will have a better way to test and play with features that are not completely polished. This will also make reporting bugs and announcing API-breaking changes a much clearer process, and will make it easier for users to plan when they want to upgrade and to access previous versions (right now the only way to do that is to keep track of the commit hash you’re using).
We are a fairly small team working hard to bring these features to fruition, but we have a lot going on. That said, we are always happy to see community-based contributions to the library from anyone who does have experience; it’s often easier to make time to review a contribution than to add more new features to our own plates while still resolving existing issues.
This was obvious to me.
Thanks for the reply,
but these were hot topics of the Kickstarter campaign, and I honestly believed that USB audio and USB MIDI were high-priority features. I do understand the complexity of building them, though.
(Update) Edited the post to remove the assumption about whether it’s possible to make Daisy Seed support USB High-speed mode for high-quality USB audio.
Thanks @antisvin for providing some information regarding USB support limitations on Daisy, which I think should be highlighted in this thread for anyone interested in using USB audio with Daisy:
- The USB port on the Daisy Seed board only supports full-speed operation (12 Mbit/s) with the on-chip PHY.
- USB 2.0 High-speed operation (480 Mbit/s) is possible with an external USB PHY, but none of the current Daisy boards provide one.
- AN4879 Application note: USB hardware and PCB guidelines using STM32 MCUs
- STM32H7 - USB OTG HS
In short, USB audio support on current Daisy boards is limited (even though Daisy boards carry a 96 kHz / 24-bit codec, USB transfers are capped at FS bandwidth). I’m not totally sure whether a custom PCB with an external PHY could give Daisy USB HS support, but it doesn’t look like the Daisy Seed exposes all the ULPI pins needed for USB OTG HS.
Hope that helps!
An HS PHY could only be connected to the USB host peripheral, so it wouldn’t help with the lack of bandwidth for audio devices.
Btw, the codec supports 192 kHz too, not that I expect it to be particularly useful. It would also require changing the SAI clock settings to be usable in Owlsy (ignoring the USB audio issue).