kaitchup's Collections

Quantized Olmo 3

Verified models, all compatible with vLLM for fast inference. Prefer the 3.1 models, as they are more recent. A minimal loading sketch follows below.
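
As a minimal sketch of serving one of these quantized models with vLLM (the repository id below is a placeholder, not an actual model name from this collection; substitute the one you want from the list above):

```python
from vllm import LLM, SamplingParams

# Placeholder repo id -- replace with the actual quantized model
# you picked from the collection.
llm = LLM(model="kaitchup/placeholder-quantized-model")

# Basic sampling settings for a short completion.
sampling_params = SamplingParams(temperature=0.7, max_tokens=128)

# Generate a completion for a single prompt and print it.
outputs = llm.generate(["Explain what model quantization is."], sampling_params)
print(outputs[0].outputs[0].text)
```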