IQ3_KS is awesome!

#4
by mtcl - opened

My Config:

I can load the full IQ3_KS on 2x 6000 Pros, but I can also run it on a single 6000 Pro for testing while keeping another fast model on vLLM on the secondary GPU. Here are the results that most of the hybrid CPU/GPU people will be interested in.

System Specifications for Hybrid CPU/GPU Inference

System Overview

OS: Ubuntu 24.04.3 LTS
Kernel: Linux 6.14.0-37-generic


Motherboard Specifications

Model: ASUS Pro WS W790E-SAGE SE

  • Platform: Intel W790 Chipset

CPU Specifications

Processor: Intel Xeon w9-3495X

  • Cores: 56 physical cores / 112 threads
  • AMX (Advanced Matrix Extensions): Full support
  • NUMA Nodes: 1

GPU Specifications

Graphics Cards: 2x NVIDIA RTX PRO 6000 Blackwell

  • VRAM per GPU: 96 GB
  • Driver Version: 580.95.05
  • CUDA Version: 13.0
  • Power Consumption: 600W max each

Memory Configuration

Total RAM: 512 GB
Memory Type: DDR5 (8x DIMM slots populated)
Memory Speed: DDR5-4800 stock (stable at DDR5-6000 OC)
Swap: 0 GB (disabled)
Memory Architecture: 8-channel DDR5 configuration
Available for Inference: 503 GB total capacity for model loading
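For hybrid CPU/GPU inference, token generation for the CPU-resident expert layers is largely bound by system memory bandwidth. A rough back-of-envelope for the 8-channel configuration above (assuming the standard 64-bit DDR5 channel width):

```python
# Theoretical peak memory bandwidth of an 8-channel DDR5 system.
channels = 8
bytes_per_transfer = 8  # 64-bit channel width

# Stock DDR5-4800: MT/s * bytes = MB/s per channel; /1000 -> GB/s
peak_gb_s = channels * 4800 * bytes_per_transfer / 1000
print(f"DDR5-4800 x8: {peak_gb_s:.1f} GB/s theoretical peak")

# The post reports DDR5-6000 OC stable:
oc_gb_s = channels * 6000 * bytes_per_transfer / 1000
print(f"DDR5-6000 x8: {oc_gb_s:.1f} GB/s theoretical peak")
```

Real-world sustained bandwidth will land somewhat below these theoretical numbers, but they set the ceiling for how fast CPU-side experts can be streamed.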

(base) mukul@jarvis:~/dev-ai/ik_llama.cpp$ CUDA_DEVICE_ORDER=PCI_BUS_ID CUDA_VISIBLE_DEVICES="1" ./build/bin/llama-server \
    --model /media/mukul/data/models/ubergarm/GLM-4.7-GGUF/IQ3_KS/GLM-4.7-IQ3_KS-00001-of-00005.gguf \
    --alias ubergarm/GLM-4.7 \
    --ctx-size 131072 \
    -ger \
    --merge-qkv \
    -ngl 99 \
    --n-cpu-moe 65 \
    -ctk q8_0 -ctv q8_0 \
    -ub 8192 -b 8192 \
    --threads 56 \
    --parallel 1 \
    --host 0.0.0.0 \
    --port 10002 \
    --no-mmap \
    --jinja
main: n_kv_max = 131072, n_batch = 8192, n_ubatch = 8192, flash_attn = 1, n_gpu_layers = 99, n_threads = 56, n_threads_batch = 56

|    PP |     TG |   N_KV |   T_PP s | S_PP t/s |   T_TG s | S_TG t/s |
|-------|--------|--------|----------|----------|----------|----------|
|  8192 |   2048 |      0 |   13.942 |   587.57 |  100.606 |    20.36 |
|  8192 |   2048 |   8192 |   15.024 |   545.27 |  109.013 |    18.79 |
|  8192 |   2048 |  16384 |   16.048 |   510.48 |  116.656 |    17.56 |
|  8192 |   2048 |  24576 |   17.579 |   466.01 |  126.207 |    16.23 |
|  8192 |   2048 |  32768 |   19.308 |   424.29 |  140.504 |    14.58 |
|  8192 |   2048 |  40960 |   21.887 |   374.29 |  156.982 |    13.05 |
|  8192 |   2048 |  49152 |   24.145 |   339.28 |  173.008 |    11.84 |
|  8192 |   2048 |  57344 |   26.427 |   309.98 |  193.310 |    10.59 |
|  8192 |   2048 |  65536 |   28.705 |   285.39 |  219.126 |     9.35 |
|  8192 |   2048 |  73728 |   30.605 |   267.67 |  242.317 |     8.45 |
|  8192 |   2048 |  81920 |   32.814 |   249.65 |  263.043 |     7.79 |
|  8192 |   2048 |  90112 |   34.653 |   236.40 |  282.582 |     7.25 |
|  8192 |   2048 |  98304 |   37.128 |   220.64 |  303.721 |     6.74 |
|  8192 |   2048 | 106496 |   39.245 |   208.74 |  322.416 |     6.35 |
|  8192 |   2048 | 114688 |   41.300 |   198.35 |  342.002 |     5.99 |
|  8192 |   2048 | 122880 |   43.872 |   186.73 |  360.998 |     5.67 |
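For a quick read on the falloff, a small script with a few TG values copied from the table above shows decode speed dropping roughly 3.6x between an empty KV cache and ~123k tokens:

```python
# TG throughput (t/s) vs. KV-cache fill, values copied from the sweep above.
tg = {0: 20.36, 32768: 14.58, 65536: 9.35, 122880: 5.67}

empty = tg[0]
for n_kv, tps in tg.items():
    print(f"{n_kv:>7} tokens in KV: {tps:5.2f} t/s ({empty / tps:.1f}x vs empty)")
```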

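Since llama-server exposes an OpenAI-compatible HTTP API, a setup launched with the flags above should be reachable on port 10002. A sketch of a request (the model name matches the `--alias` from the command; adjust host/port to your setup):

```shell
curl http://localhost:10002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ubergarm/GLM-4.7",
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 64
      }'
```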