llama.cpp/ggml/src
Mengqing Cao d2d3200b38 cann : add Ascend NPU support (whisper/2336)
* enable Ascend NPU in src/whisper.cpp
* sync test-backend-ops with llama.cpp
2024-09-08 11:05:55 +03:00