Pull requests: PrimeIntellect-ai/prime-rl
- feat: Vendor VeriHop env and add configs/verihop/rl.toml (#2068, opened Mar 21, 2026 by nevasini1)
- feat: make vision encoder optionally trainable (#2066, opened Mar 21, 2026 by hallerite; 4 tasks done)
- perf: run VLM image preprocessing in thread to unblock event loop (#2065, opened Mar 21, 2026 by hallerite; 3 tasks done)
- fix: skip multimodal samples that exceed seq_len instead of truncating (#2064, opened Mar 21, 2026 by hallerite; 5 tasks done)
- feat: add explicit vlm config flag for VLM detection (#2063, opened Mar 21, 2026 by hallerite; 5 tasks done)
- chore: consolidate VLM detection in trainer (#2062, opened Mar 21, 2026 by hallerite; 2 tasks done)
- chore: update dependency versions to match upstream (#2059, opened Mar 21, 2026 by S1ro1; 2 tasks)
- SFT distillation: bug fixes, VLM support, and pretokenization optimization (#2053, opened Mar 19, 2026 by eligotts; 6 tasks done)
- fix: wire default_chat_template_kwargs to chat handler and /tokenize (#2050, opened Mar 19, 2026 by mudithj; 4 tasks done)
- feat: add NemotronH (Nemotron-3-Super-120B-A12B) model support (#2046, opened Mar 19, 2026 by samsja; 6 of 7 tasks)
- feat: collect and log per-engine inference server metrics to W&B (#2043, opened Mar 18, 2026 by mikasenghaas; draft)
- vLLM monkey-patches for dcp support on tp > node gpu count (#2041, opened Mar 18, 2026 by JannikSt)
- feat: add sample-level loss normalization for SFT (#2032, opened Mar 16, 2026 by hallerite; 9 tasks done)