Pull requests: flashinfer-ai/flashinfer
Open pull requests:
- chore: add __all__ exports to Python modules and document missing APIs (#2251), opened Dec 20, 2025 by yzh119
- bugfix: skip CUTLASS kernel generation when AOT cache exists (#2248), opened Dec 19, 2025 by yongwww
- feat: Support numLocalTokens=0 for moe All-to-all (#2247), opened Dec 19, 2025 by trevor-m
- misc: Add runtime validation for plan/run consistency in BatchMLAPagedAttentionWrapper (#2246), opened Dec 19, 2025 by bkryu
- refactor: pull trtllm-gen batch-gemm/gemm headers from artifactory; update tma descriptor shape init (#2235), opened Dec 17, 2025 by jimmyzho
- cicd/testing: Add xfails tracker script (#2227), opened Dec 16, 2025 by kahyunnam
- Fix: Add mask_indptr conversion in BatchPrefillWithPagedKVCacheWrapper.plan() (#2201), opened Dec 11, 2025 by Dutch-voyage
- Add CUDA graph buffers for persistent attention (#2185), opened Dec 7, 2025 by Edenzzzz
- [Flashinfer-Bench integration] HF end-to-end inference (#2151, draft), opened Nov 30, 2025 by sfc-gh-goliaro
- Enable Hopper FA3 FP8 attention in decode.py (#2148), opened Nov 28, 2025 by nvpohanh
- feat: BF16 GEMM using CUTLASS backend for SM100 (#2070), opened Nov 10, 2025 by raayandhar
- Refactor flashinfer/__init__.py so that applications can selectively pack submodules without modifying __init__.py (#2027), opened Nov 3, 2025 by bangshengtang
- chore: agentic workflow for automatic version bump (#1947), opened Oct 19, 2025 by yzh119