Jeffrey Morgan | 369eda65f5 | update llama.cpp submodule to `ceca1ae` (#3064) | 1 year ago
Jeffrey Morgan | 0e4669b04f | update llama.cpp submodule to `6cdabe6` (#2999) | 1 year ago
Jeffrey Morgan | 21347e1ed6 | update llama.cpp submodule to `c29af7e` (#2868) | 1 year ago
Jeffrey Morgan | 26b13fc33c | patch: always add token to cache_tokens (#2459) | 1 year ago
Daniel Hiltgen | de76b95dd4 | Bump llama.cpp to b2081 | 1 year ago
Daniel Hiltgen | 72b12c3be7 | Bump llama.cpp to b1999 | 1 year ago
Jeffrey Morgan | a64570dcae | Fix clearing kv cache between requests with the same prompt (#2186) | 1 year ago