Summit Health Data

âš›ī¸ Quantum LLM Training

Quantum Student Learning Pipeline

Train quantum-compressed models using knowledge distillation from pretrained classical teacher models.

Quantum Training Pipeline

1. Teacher Model: Load Pretrained Classical Model
2. Quantum Student: Initialize Quantum Circuit
3. Distillation: Knowledge Transfer
4. Compression: Quantum Compression
5. Training: Quantum-Enhanced Training
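The five stages above can be sketched as a plain-Python pipeline skeleton. Every function name and field below is hypothetical (a stand-in for the page's actual backend jobs), shown only to make the stage ordering concrete:

```python
# Hypothetical stage functions mirroring the five pipeline steps above.
def load_teacher(path):
    # Step 1: load the frozen, pretrained classical teacher checkpoint.
    return {"name": "teacher", "path": path, "frozen": True}

def init_quantum_student(n_qubits=29):
    # Step 2: initialize the parameterized quantum-circuit student.
    return {"name": "student", "n_qubits": n_qubits, "params": [0.0] * n_qubits}

def distill(teacher, student):
    # Step 3: attach the teacher whose soft targets the student will learn from.
    student["teacher"] = teacher["name"]
    return student

def compress(student, ratio=4):
    # Step 4: apply quantum-inspired compression (ratio is illustrative).
    student["compression"] = f"{ratio}x"
    return student

def train(student, epochs=3):
    # Step 5: run quantum-enhanced training on the compressed student.
    student["trained_epochs"] = epochs
    return student

def run_pipeline(checkpoint_path):
    teacher = load_teacher(checkpoint_path)
    student = init_quantum_student()
    return train(compress(distill(teacher, student)))
```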

âš›ī¸ Quantum Training Configuration

Recommended: the medical pretrained model (checkpoint-5000) available on the Teraq Backend instance. It is trained on medical data and is accessible from the quantum training instance. Located at: /home/ec2-user/Training_Data/models/tinyllama-1b-medical-phase1-48vcpu/checkpoint-5000
A higher distillation temperature produces softer teacher predictions.
Alpha balances the teacher and student losses (0.7 = 70% teacher, 30% student).
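How the temperature and alpha settings combine can be made concrete with a minimal, self-contained sketch of a standard distillation loss (Hinton-style). The function names and the T² scaling convention are illustrative assumptions, not the page's actual implementation:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 flattens ("softens") the distribution.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, label,
                      temperature=2.0, alpha=0.7):
    """alpha = 0.7 weights the loss 70% teacher (soft) / 30% student (hard)."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    # Soft loss: KL(teacher || student) on temperature-softened outputs,
    # scaled by T^2 so gradients keep a comparable magnitude.
    soft = (temperature ** 2) * sum(p * math.log(p / q) for p, q in zip(t, s))
    # Hard loss: cross-entropy of the student against the true label.
    hard = -math.log(softmax(student_logits)[label])
    return alpha * soft + (1 - alpha) * hard
```

With identical teacher and student logits the soft term vanishes, so only the hard-label term contributes; raising the temperature spreads the teacher's probability mass across more classes.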

📊 TensorBoard Status


📊 Training Logs Terminal


âš›ī¸ Quantum Distillation

Compress models using quantum-inspired algorithms for 2x-5x size reduction while maintaining performance. The quantum student learns from the classical teacher through knowledge distillation.
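One common quantum-inspired compression technique is truncating a weight matrix's singular values, which is closely related to truncating bond dimension in a tensor network. The snippet below is a generic NumPy sketch of that idea, not the page's actual compression code; the 4x ratio follows from the chosen shapes:

```python
import numpy as np

def compress_weight(W, rank):
    # Truncated SVD: keep only the top-`rank` singular values,
    # analogous to truncating bond dimension in a matrix product state.
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # (m, rank), columns scaled by singular values
    B = Vt[:rank, :]             # (rank, n)
    return A, B                  # W is approximated by A @ B

W = np.random.randn(256, 256)
A, B = compress_weight(W, rank=32)
ratio = W.size / (A.size + B.size)  # 65536 / 16384 = 4x fewer parameters
```

Lower ranks give higher compression (toward the 5x end) at the cost of a larger reconstruction error in `A @ B`.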

🔬 29-Qubit Quantum Circuits

Advanced quantum ansatz layers with 29 qubits for enhanced model training and optimization. Uses MPS (Matrix Product State) backend for efficient simulation.
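The page's actual simulator code is not shown, but a toy MPS makes it clear why this backend scales to 29 qubits: a weakly entangled state needs memory linear in the qubit count, versus 2^29 amplitudes for a dense statevector. All names below are illustrative:

```python
import numpy as np

N_QUBITS = 29

# Product state |00...0> as an MPS: one (1, 2, 1) tensor per qubit.
mps = [np.array([1.0, 0.0]).reshape(1, 2, 1) for _ in range(N_QUBITS)]

# Hadamard gate.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

def apply_single_qubit_gate(mps, gate, site):
    # Contract the 2x2 gate into the physical (middle) index of one tensor.
    mps[site] = np.einsum('ij,ajb->aib', gate, mps[site])

apply_single_qubit_gate(mps, H, 0)

mps_params = sum(t.size for t in mps)  # 29 tensors x 2 entries = 58 numbers
dense_params = 2 ** N_QUBITS           # 536,870,912 amplitudes
```

Entangling gates grow the bond dimension between neighboring tensors, so MPS simulation stays efficient exactly when entanglement stays moderate, which is the regime these ansatz layers target.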

🔄 Hybrid Classical-Quantum Training

Seamlessly integrate quantum and classical training methods. Start with a pretrained classical teacher model and distill knowledge into a quantum-compressed student model.
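A minimal sketch of the hybrid loop, under stated assumptions: the frozen classical teacher is reduced to a single fixed soft target, and the quantum student is a one-parameter toy circuit RY(theta) measured in Z, so its output is cos(theta). The parameter-shift rule (exact for this gate family) supplies the gradient; none of this is the page's actual training code:

```python
import math

def student(theta):
    # Toy quantum student: expectation <Z> after RY(theta) on |0> is cos(theta).
    return math.cos(theta)

TEACHER_OUTPUT = 0.5  # frozen classical teacher's soft target (assumed value)

def grad(theta):
    # Parameter-shift rule: d/dtheta <Z> = (f(theta+pi/2) - f(theta-pi/2)) / 2,
    # then the chain rule for the squared-error distillation loss.
    shift = math.pi / 2
    df = (student(theta + shift) - student(theta - shift)) / 2.0
    return 2.0 * (student(theta) - TEACHER_OUTPUT) * df

theta, lr = 0.1, 0.5
for _ in range(200):
    theta -= lr * grad(theta)  # classical optimizer, quantum-evaluated gradient
```

After training, the student's output matches the teacher's target, illustrating the division of labor: the circuit evaluates the model, a classical optimizer updates its parameters.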