NeuralAmpModelerCore
Core DSP library for NAM plugins
Tanh uses a lot of compute (hence "fast tanh"), but this, for example: https://github.com/sdatkinson/NeuralAmpModelerCore/blob/846968710a670d662b15e449edba852d747d748e/NAM/activations.h#L75-L81 should be possible to implement more idiomatically with Eigen. It would also be nice for any...
Are there any plans to support inference on accelerators — say, using ONNXRuntime or TensorRT to free up CPU resources?
Following on from #125, this issue is to track improvements to the code for [gating activations in WaveNet layers](https://github.com/sdatkinson/NeuralAmpModelerCore/blob/e181f61efb8d05d34add45b5eecb3893ff21177c/NAM/wavenet.cpp#L38-L48).
I'd like to offer you a piece of work that I have recently been doing for the ToobAmp project. The net benefit: a 25% performance improvement for the most commonly...
I'm just putting this out here in case it's something you might consider merging, @sdatkinson, although I expect it would be a rather controversial change. I find the code style...
And use const_iterator rather than iterator
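A small illustration of the suggestion (generic example, not taken from the repository): when a loop only reads elements, a const_iterator documents that intent and lets the compiler enforce it.

```cpp
#include <cassert>
#include <vector>

// Read-only traversal: cbegin()/cend() yield const_iterators,
// so any accidental write through *it fails to compile.
int sum(const std::vector<int>& v)
{
  int total = 0;
  for (std::vector<int>::const_iterator it = v.cbegin(); it != v.cend(); ++it)
    total += *it;
  return total;
}
```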
I haven't properly checked whether this is the case here, but passing Eigen types as function parameters can result in unwanted temporary copies in some cases. The Eigen::Ref class exists...
See http://eigen.tuxfamily.org/dox-3.2/group__TopicStructHavingEigenMembers.html and act accordingly.