---
license: apache-2.0
tags:
- pruned
- python
- optimized
- wanda
base_model: LGAI-EXAONE/EXAONE-4.0-1.2B
pipeline_tag: text-generation
---
# EXAONE-4.0-1.2B-python-aggressive
> 🎯 **PYTHON-optimized** | 📦 **Aggressive** pruning | ⚡ **30% of weights pruned**
This model is an **aggressively pruned** version of [LGAI-EXAONE/EXAONE-4.0-1.2B](https://huggingface.co/LGAI-EXAONE/EXAONE-4.0-1.2B).
## Performance Comparison
| Category | Original | Pruned | Change (pp) |
|----------|----------|--------|-------------|
| **Python** | 76.9% | 61.5% ⭐ | ↓ 15.4% |
| HTML | 20.0% | 10.0% | ↓ 10.0% |
| Trivia | 86.7% | 53.3% | ↓ 33.3% |
| Math | 80.0% | 93.3% | ↑ 13.3% |
| Reasoning | 75.0% | 50.0% | ↓ 25.0% |
| Medical | 42.9% | 14.3% | ↓ 28.6% |
| Linux | 23.1% | 23.1% | → |
| Writing | 54.5% | 0.0% | ↓ 54.5% |
**Average**: 57.4% → 38.2% (−19.2 pp)
**Python Retention**: 80.0% (pruned score ÷ original score)
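
The summary numbers above follow directly from the table; a minimal sketch of how they are derived (values copied from the table, not recomputed from the original evaluation):

```python
original = {"python": 76.9, "html": 20.0, "trivia": 86.7, "math": 80.0,
            "reasoning": 75.0, "medical": 42.9, "linux": 23.1, "writing": 54.5}
pruned = {"python": 61.5, "html": 10.0, "trivia": 53.3, "math": 93.3,
          "reasoning": 50.0, "medical": 14.3, "linux": 23.1, "writing": 0.0}

avg_orig = sum(original.values()) / len(original)   # 57.4
avg_pruned = sum(pruned.values()) / len(pruned)     # 38.2
retention = pruned["python"] / original["python"]   # 0.80 -> 80.0%

print(f"Average: {avg_orig:.1f}% -> {avg_pruned:.1f}%")
print(f"Python retention: {retention:.1%}")
```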

## Quick Start
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the pruned model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-aggressive")
tokenizer = AutoTokenizer.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-aggressive")

# Tokenize a prompt and generate up to 100 new tokens
inputs = tokenizer("Your prompt here", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
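Since the model is Python-specialized, a coding-style prompt is the natural use case. Continuing from the Quick Start above (the prompt is illustrative, not taken from the evaluation set):

```python
# Ask for a small, well-defined coding task; greedy decoding keeps output deterministic
prompt = "Write a Python function that returns the n-th Fibonacci number:\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=150, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```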
## Technical Details
| Property | Value |
|----------|-------|
| Base Model | [LGAI-EXAONE/EXAONE-4.0-1.2B](https://huggingface.co/LGAI-EXAONE/EXAONE-4.0-1.2B) |
| Specialization | Python |
| Prune Mode | Aggressive |
| Pruning Method | Wanda |
| Sparsity | 30% of weights pruned |
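The `wanda` tag indicates Wanda-style pruning, which scores each weight by its magnitude times the L2 norm of its input activations (measured on calibration data) and zeros the lowest-scoring fraction within each output row. A minimal, illustrative sketch of the scoring rule, not the actual pruning script used for this model:

```python
import torch

def wanda_prune(weight: torch.Tensor, act_norm: torch.Tensor, sparsity: float = 0.30):
    """Zero out the lowest-importance weights of a linear layer, Wanda-style.

    weight:   (out_features, in_features) layer weight matrix
    act_norm: (in_features,) L2 norm of each input feature over calibration data
    """
    # Importance score per weight: |W_ij| * ||X_j||
    importance = weight.abs() * act_norm.unsqueeze(0)
    # Select the lowest-scoring weights within each output row
    k = int(weight.shape[1] * sparsity)
    _, idx = torch.topk(importance, k, dim=1, largest=False)
    # Build a keep-mask and zero the pruned entries
    mask = torch.ones_like(weight, dtype=torch.bool)
    mask.scatter_(1, idx, False)
    return weight * mask
```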
## License
This model inherits the Apache-2.0 license from the base model.