---
base_model: answerdotai/ModernBERT-base
library_name: transformers
pipeline_tag: text-classification
tags:
- text-classification
- legal
- locus
- modernbert
license: apache-2.0
datasets:
- LocalLaws/LOCUS-v1.0
---
# LocalLaws/LOCUS-Function

A ModernBERT classifier for the **Primary Function** axis of the LOCUS
(Local Ordinances Corpus, United States) dataset.
Fine-tuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on
[LocalLaws/LOCUS-v1.0](https://huggingface.co/datasets/LocalLaws/LOCUS-v1.0).
## Labels

- `Context`
- `Enforcement`
- `Process`
- `Rules`
- `Structural`
## Training

| Setting | Value |
|---|---|
| Base model | `answerdotai/ModernBERT-base` |
| Max length | 1024 |
| Classifier pooling | `mean` |
| Train / val / test | 79106 / 10447 / 10447 |
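The split sizes above sum to a round 100,000 examples; a quick arithmetic check of the proportions (plain Python, no model download needed):

```python
# Sanity-check the train/val/test split sizes reported in the table above.
splits = {"train": 79106, "val": 10447, "test": 10447}
total = sum(splits.values())
print(total)  # 100000

for name, n in splits.items():
    print(f"{name}: {n / total:.1%}")  # ~79.1% / 10.4% / 10.4%
```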
## Evaluation

| Metric | Value |
|---|---|
| Validation macro-F1 | 0.8443 |
| Test macro-F1 | 0.8428 |
| Test accuracy | 0.8849 |
```
              precision    recall  f1-score   support

     Context     0.8399    0.9138    0.8753      1033
 Enforcement     0.7561    0.8682    0.8083      1032
     Process     0.6038    0.7691    0.6765       654
       Rules     0.9308    0.8570    0.8924      4896
  Structural     0.9675    0.9555    0.9614      2832

    accuracy                         0.8849     10447
   macro avg     0.8196    0.8727    0.8428     10447
weighted avg     0.8940    0.8849    0.8876     10447
```
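The macro average is the unweighted mean of the five per-class F1 scores, while the weighted average weights each class by its support; both reported values can be verified directly from the per-class numbers:

```python
# Per-class F1 scores and supports from the classification report above.
f1 = {"Context": 0.8753, "Enforcement": 0.8083, "Process": 0.6765,
      "Rules": 0.8924, "Structural": 0.9614}
support = {"Context": 1033, "Enforcement": 1032, "Process": 654,
           "Rules": 4896, "Structural": 2832}

# Macro average: unweighted mean over classes.
macro_f1 = sum(f1.values()) / len(f1)
print(round(macro_f1, 4))  # 0.8428

# Weighted average: each class contributes in proportion to its support.
total = sum(support.values())
weighted_f1 = sum(f1[c] * support[c] for c in f1) / total
print(round(weighted_f1, 4))  # 0.8876
```

Note how the low-support `Process` class pulls the macro figure well below the weighted one.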
## Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("LocalLaws/LOCUS-Function")
model = AutoModelForSequenceClassification.from_pretrained("LocalLaws/LOCUS-Function")
model.eval()

text = "No person shall keep any swine within the city limits."
enc = tok(text, return_tensors="pt", truncation=True, max_length=1024)
with torch.no_grad():
    logits = model(**enc).logits
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```
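To report a confidence alongside the predicted label, apply a softmax over the logits. A minimal sketch of the math with made-up logit values (the real logits and their ordering come from the model's `id2label` config):

```python
import math

# Hypothetical logits for the five labels (illustrative values only).
labels = ["Context", "Enforcement", "Process", "Rules", "Structural"]
logits = [0.2, -1.1, 0.5, 3.4, -0.3]

# Softmax: exponentiate, then normalize so probabilities sum to 1.
exps = [math.exp(x) for x in logits]
probs = [e / sum(exps) for e in exps]

for label, p in sorted(zip(labels, probs), key=lambda t: -t[1]):
    print(f"{label}: {p:.3f}")

# The predicted label is the argmax, matching logits.argmax(-1) above.
print(max(zip(labels, probs), key=lambda t: t[1])[0])  # Rules
```

In the torch snippet above the same thing is `torch.softmax(logits, dim=-1)`.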