zzh / vllm-npu-plugin
mirror of https://github.com/handsomezhuzhu/vllm-npu-plugin.git, synced 2026-02-20 19:50:15 +00:00
vllm-npu-plugin / vllm_npu / attention (at commit e22617f72e600072f8d5543cb3ffc0727641e6f4)
Latest commit e22617f72e by handsomezhuzhu: "feat: Add Ascend NPU attention backend for vLLM using FlashAttention operators." (2026-02-10 22:15:26 +08:00)
__init__.py        feat: initial vllm-npu-plugin for Ascend NPU adaptation                             2026-02-10 11:06:01 +08:00
attention_v1.py    feat: Add Ascend NPU attention backend for vLLM using FlashAttention operators.     2026-02-10 22:15:26 +08:00