Wang Zeng
> > > @inisis @Kayzwer thanks guys! Do you have benchmarks or info on i.e. YOLOv8n models exported with both [onnx-simplifier](https://github.com/daquexian/onnx-simplifier.git) and [onnx-slim](https://github.com/tsingmicro-toolchain/OnnxSlim.git)?
> > > Are the number of...
> @initialencounter hey, thanks for adding to the discussion! Regarding applying onnx-simplifier and onnx-slim sequentially, you can definitely try using both tools on a model to maximize optimization. Each tool...
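
Here is a minimal sketch of that sequential workflow in Python (assuming `onnxsim` and `onnxslim` are installed via pip and that `yolov8n.onnx` is an existing export; the exact `onnxslim.slim()` signature may vary between releases):

```python
import onnx
from onnxsim import simplify  # onnx-simplifier
import onnxslim               # OnnxSlim

# Load an existing ONNX export, e.g. a YOLOv8n model.
model = onnx.load("yolov8n.onnx")
n_original = len(model.graph.node)

# Pass 1: onnx-simplifier (constant folding, shape inference, graph cleanup).
model_sim, ok = simplify(model)
assert ok, "onnx-simplifier could not validate the simplified model"

# Pass 2: OnnxSlim on the already-simplified graph.
model_slim = onnxslim.slim(model_sim)

onnx.save(model_slim, "yolov8n_sim_slim.onnx")

# Quick sanity check: compare node counts at each stage.
print("original nodes:        ", n_original)
print("after onnx-simplifier: ", len(model_sim.graph.node))
print("after onnxslim:        ", len(model_slim.graph.node))
```

Since two optimization passes stack their graph rewrites, it's worth verifying outputs numerically (e.g. running the original and doubly-optimized models through onnxruntime on the same input) before relying on the result.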
Why not give [onnxslim](https://github.com/inisis/OnnxSlim) a try? onnxslim is a fully Python-based ONNX optimizer that requires no compilation, making it easier to debug. It has already been adopted by repositories such...
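
For anyone who wants to try it, a basic usage sketch (the Python entry point and its path-in / ModelProto-out behavior are taken from the OnnxSlim README and may differ slightly between versions; `yolov8n.onnx` is a placeholder for any existing export):

```python
import os
import onnx
import onnxslim

SRC = "yolov8n.onnx"        # any existing ONNX export
DST = "yolov8n_slim.onnx"

# slim() accepts a model path (or an onnx.ModelProto) and returns the optimized ModelProto.
slimmed = onnxslim.slim(SRC)
onnx.save(slimmed, DST)

# Rough before/after comparison on disk.
print(f"{SRC}: {os.path.getsize(SRC) / 1e6:.2f} MB")
print(f"{DST}: {os.path.getsize(DST) / 1e6:.2f} MB")
```

The equivalent command-line call, per the project README, is `onnxslim yolov8n.onnx yolov8n_slim.onnx`.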
Thank you!