Running NAM on Embedded Hardware

Has anyone seen this? It looks super cool and is a great validation for the Daisy: Running NAM on Embedded Hardware: What We Learned · TONE3000

3 Likes

Thanks for the link, I hadn’t seen it yet. I also hadn’t seen the slimmable stuff; neat.

@keyth72 added NAM to My Daisy Guitar Pedal Designs on GitHub a while back, and I have been loving it; it sounds very good to my ears.

I will dig into this and see if there are any big wins/optimizations I can make using some of these results/library changes.

@kidproquo Yeah, running neural nets on microcontrollers is a really interesting problem. It sounds like they were able to run nano-size NAM models by trading the tanh activation for the computationally cheaper ReLU. In the DaisySeedProjects implementation I use a different library from the official NAM core, called RTNeural, since I was familiar with it from my GuitarML stuff. With that I’m still using tanh, but at a custom size smaller than nano that I unofficially call “pico”. I was able to get very good accuracy on most models that way.

There’s also another library called NeuralAudio that makes some optimizations, and using that I was able to run normal nano models on the Daisy Seed: GitHub - mikeoliphant/NeuralAudio: C++ library for neural amp modeler and other audio network models

I haven’t looked into the slimmable models, but that sounds very interesting for embedded. The test models mentioned in that TONE3000 article are likely comparable to what’s in DaisySeedProjects in terms of accuracy, but I haven’t tried NAM with the ReLU activation, so I’m not sure how it performs.

I think the binary models it mentions are similar to how I did it, where the model weights are extracted and compiled down to binary. Although if they can load custom ones at runtime, that would be cool; at the moment I’ve only done precompiled models built into the C code in header files.

Digging a little deeper, there’s another article linked from that first one with a ton of benchmark tests; they put a lot of work into this: João Felipe Santos | Bringing NAM A2 to Embedded Hardware

@keyth72 yes I am well aware of your “Pico” profiles and RTNeural based implementation for NAM - awesome work. My pedal’s code is based on bkshepherd’s project, which has your contributions for NAM. Thank you :slight_smile:

I tried running the linked nam-pedal code on my pedal, but couldn’t get it to work - it crashes in the audio callback. I will continue exploring/digging. Let’s see if anyone else has better luck.

My main goal is to enable easy use of model weights from Tone3000 via OTA downloads, so getting nano or nano-relu models dynamically loaded at runtime would be cool. I have used NeuralAudio for my Pi-based NAM project. I didn’t realize I could try it on the Daisy. Do you have any sample code to share?

Oh awesome. And yes, I haven’t put the code up on GitHub yet, but the conversation I had with Mike about integrating NeuralAudio on the Daisy can be found here, along with some code examples:

1 Like

I was able to load NAM nano models as-is from the SD card into the WaveNet model used in GuitarML Mercury ( @keyth72 you will soon have to go out of the solar system for pedal names :slight_smile: ). I switched from the tanh activation to an approximation. The code is pretty embedded in my JanMate pedal and quite messy, but if it helps anyone:

In the main function:

    num_nam_files_found = browse_nam_files(nam_file_list);
    rtneural_wavenet.prepare(BLOCK_SIZE);
    rtneural_wavenet.prewarm();

In the audio callback (block processing):

    rtneural_wavenet.forward(in1, out1, size);

When you want to update the model (with some optional monitoring):

    if(updateNAM)
    {
        int readWeights = parse_nam_weights(
            nam_file_list[NAM_FileIndex], BossOD3_weights_data, 1024);

        load_nano_weights(BossOD3_weights_data);
        hw.PrintLine("Loaded NAM file: %s", nam_file_list[NAM_FileIndex]);
        hw.PrintLine("Read %d weights from NAM file", readWeights);
        hw.PrintLine("First 5 weights: %f, %f, %f, %f, %f",
                     BossOD3_weights_data[0],
                     BossOD3_weights_data[1],
                     BossOD3_weights_data[2],
                     BossOD3_weights_data[3],
                     BossOD3_weights_data[4]);

        updateNAM      = false;
        weights_loaded = true;
    }

This can get you started. I will try to make a clean project with just the NAM part.

This is very modest on SRAM (I use APP_TYPE = BOOT_SRAM), but it consumed ~70% of the CPU. I am currently using blockSize = 256 for compatibility with my other FFT-based effects. Looking forward to the slimmable or more optimized models.
nam_model.zip (10.1 KB)

1 Like

OK. Major update from my good friend Claude.
I managed to put together all the recommendations from the TONE3000 benchmark, and finally dropped RTNeural and the overkill templated WaveNet model. I figure that since the Daisy won’t be running anything bigger than nano-sized NAM models, it makes sense to hardcode for this specific neural model and size (842 weights total). So it’s just a header and a cpp to add to the makefile: init, load the float* weights (read from file or hardcoded), and you get ~40% CPU at 48 kHz. I am still at 256 block size, so smaller blocks may cost more CPU. But this means you can run 2 NAM models if you wish (and of course IR, reverb, delay, and basically any other combination of common effects).
Have fun.
Daisy_NAM.zip (5.9 KB)

2 Likes

Wow! This is incredible! Let me try it out on my pedal. Or let me ask my buddy Claude to try it :slight_smile:

Are you running the nano-relu variant or plain nano from TONE3000?

plain nano

1 Like

I tried it, and it works on the Hoopi pedal! Here’s a demo video of my friend playing Green Day. Hoopi is running NAM, specifically the Orange OR120 (Dark Crunch) nano profile. Block size is 48.

You should post this on the nam-pedal GitHub repo. This is much better, since it fits in SRAM and the CPU usage leaves room for additional processing (on Hoopi, I have the 2nd channel doing vocal reverb). Thanks a lot for this.

1 Like