LoRA Rank Calculator

Calculate LoRA parameters and memory savings.

LoRA Configuration

LoRA Parameters

8.39M trainable parameters (0.120% of the original model)
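The parameter count above can be reproduced directly. The displayed figures are consistent with a 7B-class transformer (32 layers, hidden size 4096) using rank r = 16 adapters on the query and value projections; those base-model values are assumptions inferred from the results, not settings shown in the export:

```python
def lora_params(r: int, d_in: int, d_out: int) -> int:
    # One LoRA adapter adds two matrices: A (r x d_in) and B (d_out x r).
    return r * (d_in + d_out)

# Assumed example configuration (inferred, not captured in the UI):
# 7B-class model, 32 layers, hidden size 4096, LoRA on Q and V projections.
r, d, layers, modules_per_layer = 16, 4096, 32, 2

per_module = lora_params(r, d, d)           # 131,072  ~ 131.1K
per_layer = modules_per_layer * per_module  # 262,144  ~ 262.1K
total = layers * per_layer                  # 8,388,608 ~ 8.39M

base_params = 7_000_000_000
print(f"{total / 1e6:.2f}M trainable "
      f"({100 * total / base_params:.3f}% of base)")
# → 8.39M trainable (0.120% of base)
```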

💾 LoRA Memory: 16.0 MB
📉 Memory Saved: 74.9%

LoRA Details

Scaling Factor (alpha/r): 2.00
Target Modules: Query + Value
Params per Layer: 262.1K
Params per Module: 131.1K
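The scaling factor governs how strongly the adapter perturbs the frozen weights: the LoRA forward pass computes h = Wx + (alpha/r)·B(Ax). A minimal dependency-free sketch (toy 2-dimensional matrices, not the calculator's actual values):

```python
def matvec(M, x):
    # Plain-Python matrix-vector product.
    return [sum(m_ij * x_j for m_ij, x_j in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha, r):
    # h = W x + (alpha / r) * B (A x)
    # W is frozen; only A (r x d_in) and B (d_out x r) are trained.
    base = matvec(W, x)
    update = matvec(B, matvec(A, x))
    scaling = alpha / r
    return [b + scaling * u for b, u in zip(base, update)]

# Toy example: d = 2, r = 1, alpha = 2, so scaling = 2.00.
h = lora_forward(W=[[1, 0], [0, 1]], A=[[1, 1]], B=[[1], [0]],
                 x=[1, 2], alpha=2, r=1)
print(h)  # → [7.0, 2.0]
```

Because B is conventionally initialized to zero, the adapted model starts out identical to the base model regardless of the scaling factor.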

Memory Comparison

Full Fine-tuning: 52.2 GB
LoRA Fine-tuning: 13.1 GB
Training Speedup: ~9.9x
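The comparison figures match a simple accounting model (an assumption on my part, not stated by the tool): full fine-tuning holds fp16 weights, fp16 gradients, and two fp16 Adam moment buffers (8 bytes per trainable parameter), while LoRA training keeps the base weights frozen in fp16 (2 bytes each) and applies the 8-byte cost only to the adapters; results are reported in GiB:

```python
GIB = 2**30

def training_mem_bytes(trainable, frozen=0):
    # Assumed accounting: 8 bytes per trainable parameter
    # (fp16 weight + fp16 grad + two fp16 Adam moments),
    # 2 bytes per frozen fp16 parameter. Real setups vary
    # (activations, mixed-precision master weights, etc.).
    return trainable * 8 + frozen * 2

base, adapters = 7_000_000_000, 8_388_608
full = training_mem_bytes(trainable=base)
lora = training_mem_bytes(trainable=adapters, frozen=base)
print(f"Full:  {full / GIB:.1f} GB")             # → Full:  52.2 GB
print(f"LoRA:  {lora / GIB:.1f} GB")             # → LoRA:  13.1 GB
print(f"Saved: {100 * (1 - lora / full):.1f}%")  # → Saved: 74.9%
```

The ~9.9x speedup is the tool's own heuristic; actual wall-clock gains depend on batch size, sequence length, and hardware.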

Common settings: r=8-16 for simple tasks, r=32-64 for complex tasks. Alpha is typically 2x the rank.
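Settings like these map onto the Hugging Face `peft` library's `LoraConfig`. A sketch (not runnable without `peft` installed; the module names `q_proj`/`v_proj` are Llama-style and vary by architecture):

```python
from peft import LoraConfig, get_peft_model  # requires the `peft` package

config = LoraConfig(
    r=16,                                 # rank
    lora_alpha=32,                        # typically 2x the rank
    target_modules=["q_proj", "v_proj"],  # query + value projections (Llama naming)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
# model = get_peft_model(base_model, config)
# model.print_trainable_parameters()  # reports counts like those above
```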