Quantum Readiness LLM (GUI Version)
A fine-tuned language model specialized in quantum computing readiness assessment, including post-quantum cryptography (PQC) vulnerability scanning and quantum feasibility analysis.
Model Description
This model is fine-tuned from Qwen/Qwen2.5-3B-Instruct using QLoRA (4-bit quantization) on a custom dataset of quantum security and quantum computing analysis examples.
Key Features
- Post-Quantum Cryptography (PQC) Analysis: Identifies quantum-vulnerable algorithms (RSA, ECDSA, ECDH, DSA, DH, DES, 3DES, RC4, MD5, SHA-1); an illustrative scanning sketch follows this list
- NIST Compliance Guidance: References FIPS 203 (ML-KEM/Kyber), FIPS 204 (ML-DSA/Dilithium), FIPS 205 (SLH-DSA/SPHINCS+)
- Quantum Feasibility Analysis (QFA): Identifies opportunities for quantum algorithms (QAOA, VQE, Grover's, HHL, QML)
- Migration Guidance: Provides code examples for transitioning to post-quantum alternatives
- Standalone Operation: Provides direct responses without requiring tool orchestration
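To illustrate the kind of pattern the PQC analysis flags, below is a minimal, hypothetical sketch of a regex-based scan for quantum-vulnerable primitives in Python source. It is not part of the model or its tooling (the model does not execute code); the patterns, file handling, and names are assumptions for demonstration only.

import re
from pathlib import Path

# Hypothetical, non-exhaustive patterns for quantum-vulnerable primitives
VULNERABLE_PATTERNS = {
    "RSA": r"rsa\.generate_private_key|\bRSA\b",
    "ECDSA/ECDH": r"\bECDSA\b|\bECDH\b|ec\.generate_private_key",
    "3DES": r"\bTripleDES\b|\bDES3\b",
    "MD5": r"hashlib\.md5|\bMD5\b",
    "SHA-1": r"hashlib\.sha1|\bSHA-?1\b",
}

def scan_file(path: Path) -> list[tuple[str, int]]:
    """Return (algorithm, line_number) hits for quantum-vulnerable primitives."""
    hits = []
    for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), start=1):
        for name, pattern in VULNERABLE_PATTERNS.items():
            if re.search(pattern, line):
                hits.append((name, lineno))
    return hits

if __name__ == "__main__":
    for py_file in Path(".").rglob("*.py"):
        for algo, lineno in scan_file(py_file):
            print(f"{py_file}:{lineno}: uses quantum-vulnerable {algo}")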
Model Details
| Property | Value |
|---|---|
| Base Model | Qwen/Qwen2.5-3B-Instruct |
| Parameters | ~3B |
| Fine-tuning Method | QLoRA (4-bit NF4) |
| LoRA Rank | 32 |
| LoRA Alpha | 64 |
| Training Epochs | 3 |
| Final Loss | 0.02 |
| Token Accuracy | 99% |
| Context Length | 4096 tokens |
| Format | GGUF (F16) |
| Size | ~6.2 GB |
Intended Use
Primary Use Cases
- Security Auditing: Scan codebases for quantum-vulnerable cryptographic implementations
- Compliance Assessment: Evaluate alignment with NIST post-quantum standards
- Migration Planning: Get guidance on transitioning to post-quantum algorithms
- Quantum Opportunity Analysis: Identify code patterns suitable for quantum acceleration
- Education: Learn about quantum threats and post-quantum cryptography
Example Prompts
# PQC Security Scanning
"What cryptographic algorithms are vulnerable to quantum attacks?"
"How do I replace RSA with post-quantum alternatives?"
"Explain the quantum threat to ECDSA"
# Quantum Feasibility
"What quantum algorithms could benefit my optimization problems?"
"Explain QAOA and its use cases"
"What is the Quantum Feasibility Index?"
# NIST Standards
"What are the NIST PQC standards?"
"Explain ML-KEM (Kyber) and FIPS 203"
"What's the difference between ML-DSA and SLH-DSA?"
How to Use
With Ollama (Recommended)
- Download the GGUF file and Modelfile
- Create the model:
ollama create quantum-readiness-gui -f Modelfile
- Run:
ollama run quantum-readiness-gui "What algorithms are vulnerable to quantum attacks?"
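Once the model is created, it can also be queried programmatically. The sketch below is an assumption-based example: it expects a local Ollama server on the default port (11434), the requests library installed, and uses Ollama's /api/generate endpoint.

import requests

# Query the locally created model through Ollama's HTTP API
# (assumes `ollama serve` is running on the default port)
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "quantum-readiness-gui",
        "prompt": "What algorithms are vulnerable to quantum attacks?",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=300,
)
print(response.json()["response"])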
Modelfile Configuration
FROM quantum-readiness-gui-f16.gguf
PARAMETER num_ctx 4096
PARAMETER temperature 0.7
PARAMETER top_p 0.9
PARAMETER repeat_penalty 1.1
SYSTEM """You are the ArcQubit Quantum Assistant, an AI-powered expert in quantum computing readiness, post-quantum cryptography (PQC), and quantum feasibility analysis."""
TEMPLATE """{{- if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{- range .Messages }}<|im_start|>{{ .Role }}
{{ .Content }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
With llama.cpp
./main -m quantum-readiness-gui-f16.gguf \
-p "<|im_start|>user\nWhat algorithms are vulnerable to quantum attacks?<|im_end|>\n<|im_start|>assistant\n" \
-n 512 --temp 0.7
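If you prefer Python bindings over the CLI, here is a minimal sketch assuming the llama-cpp-python package is installed; its chat-completion helper applies the model's chat template for you.

from llama_cpp import Llama

# Load the GGUF file with the same context length used for fine-tuning
llm = Llama(model_path="quantum-readiness-gui-f16.gguf", n_ctx=4096)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain the quantum threat to ECDSA"}],
    max_tokens=512,
    temperature=0.7,
)
print(result["choices"][0]["message"]["content"])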
With Transformers (Merged Model)
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged fine-tuned model and its tokenizer
model = AutoModelForCausalLM.from_pretrained("path/to/merged-model-gui")
tokenizer = AutoTokenizer.from_pretrained("path/to/merged-model-gui")

# Format the conversation with the model's chat template
messages = [
    {"role": "user", "content": "What is ML-KEM (Kyber)?"}
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Generate and decode the response
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Training Data
The model was trained on ~490 examples covering:
- PQC Vulnerability Scanning: RSA, ECDSA, ECDH, DSA, DH, DES, 3DES, RC4, MD5, SHA-1 detection and remediation
- Quantum Feasibility Analysis: QAOA, VQE, Grover's, HHL, QML opportunity identification
- Migration Guides: Step-by-step code examples for transitioning to post-quantum algorithms
- NIST Standards: FIPS 203, 204, 205 compliance guidance
Training Configuration
model:
  base_model: "Qwen/Qwen2.5-3B-Instruct"
  load_in_4bit: true
  bnb_4bit_quant_type: "nf4"

lora:
  r: 32
  lora_alpha: 64
  lora_dropout: 0.05
  target_modules: ["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"]

training:
  num_train_epochs: 3
  per_device_train_batch_size: 4
  gradient_accumulation_steps: 4
  learning_rate: 2.0e-4
  lr_scheduler_type: "cosine"
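For reference, a minimal sketch of how these hyperparameters might be expressed with the Transformers and PEFT libraries; the compute dtype, training loop, and dataset handling are not documented above and are assumptions here.

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# 4-bit NF4 quantization as listed in the training configuration
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,  # assumption; not specified above
)

model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-3B-Instruct", quantization_config=bnb_config
)

# LoRA adapter matching the listed rank, alpha, dropout, and target modules
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj", "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)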
NIST PQC Standards Reference
| Algorithm | Purpose | Standard | Status |
|---|---|---|---|
| ML-KEM (Kyber) | Key Encapsulation | FIPS 203 | Finalized |
| ML-DSA (Dilithium) | Digital Signatures | FIPS 204 | Finalized |
| SLH-DSA (SPHINCS+) | Stateless Signatures | FIPS 205 | Finalized |
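To make the migration target concrete, here is a minimal ML-KEM (FIPS 203) key-encapsulation sketch assuming the liboqs-python bindings (import name oqs); the algorithm identifier may differ between liboqs versions (older releases use "Kyber768").

import oqs

# ML-KEM-768 key encapsulation (FIPS 203); name may vary by liboqs version
ALG = "ML-KEM-768"

with oqs.KeyEncapsulation(ALG) as receiver, oqs.KeyEncapsulation(ALG) as sender:
    # Receiver generates a keypair and publishes the public key
    public_key = receiver.generate_keypair()

    # Sender encapsulates a shared secret against the public key
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver recovers the same shared secret from the ciphertext
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver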
Quantum Algorithm Categories
| Category | Algorithm | Use Case |
|---|---|---|
| Optimization | QAOA | Portfolio optimization, scheduling, logistics |
| Simulation | VQE | Chemistry, materials science |
| Search | Grover's | Database search, cryptanalysis |
| Linear Algebra | HHL | Machine learning, optimization |
| Machine Learning | QML | Classification, clustering |
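As a small illustration of the Search row, the sketch below builds a two-qubit Grover iteration with Qiskit (assuming the qiskit package is installed). It only constructs and prints the circuit; it does not run on hardware or a simulator.

from qiskit import QuantumCircuit

# Two-qubit Grover iteration that marks and amplifies the |11> state
qc = QuantumCircuit(2, 2)

qc.h([0, 1])                # uniform superposition over all 4 basis states

qc.cz(0, 1)                 # oracle: flip the phase of |11>

qc.h([0, 1])                # diffusion (inversion about the mean)
qc.x([0, 1])
qc.cz(0, 1)
qc.x([0, 1])
qc.h([0, 1])

qc.measure([0, 1], [0, 1])  # |11> is measured with probability ~1
print(qc.draw())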
Limitations
- Not a Security Tool: This model provides guidance but should not replace professional security audits
- General Patterns: File paths and line numbers in responses are illustrative, not from actual scans
- Knowledge Cutoff: Based on NIST standards as of early 2025
- No Code Execution: The model provides analysis but does not execute code or access repositories
Ethical Considerations
- Use responsibly for security improvement, not for identifying vulnerabilities to exploit
- Verify all recommendations with security professionals before implementation
- The model is intended for defensive security purposes only
Files
| File | Description | Size |
|---|---|---|
| quantum-readiness-gui-f16.gguf | GGUF model (F16 quantization) | ~6.2 GB |
| Modelfile | Ollama configuration | ~2 KB |
Citation
@misc{quantum-readiness-llm,
  title={Quantum Readiness LLM: A Fine-tuned Model for Post-Quantum Cryptography Analysis},
  author={ArcQubit},
  year={2024},
  publisher={Hugging Face},
  url={https://huggingface.co/tbowman/quantum-readiness-gui}
}
License
Apache 2.0
Acknowledgments
- Base model: Qwen/Qwen2.5-3B-Instruct
- NIST Post-Quantum Cryptography Standardization
- Hugging Face Transformers and PEFT libraries