Instructions for using bumblebee-testing/tiny-random-Gemma3TextForSequenceClassification with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use bumblebee-testing/tiny-random-Gemma3TextForSequenceClassification with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="bumblebee-testing/tiny-random-Gemma3TextForSequenceClassification")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bumblebee-testing/tiny-random-Gemma3TextForSequenceClassification")
model = AutoModelForSequenceClassification.from_pretrained("bumblebee-testing/tiny-random-Gemma3TextForSequenceClassification")
```

- Notebooks
- Google Colab
- Kaggle
Upload Gemma3TextForSequenceClassification
- config.json +1 -1
- model.safetensors +1 -1
config.json
CHANGED
```diff
@@ -28,7 +28,7 @@
   "num_hidden_layers": 2,
   "num_key_value_heads": 2,
   "pad_token_id": 0,
-  "query_pre_attn_scalar":
+  "query_pre_attn_scalar": 8,
   "rms_norm_eps": 1e-06,
   "rope_local_base_freq": 10000.0,
   "rope_scaling": null,
```
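The change above sets `query_pre_attn_scalar`, which in Gemma-style attention replaces the usual per-head-dimension scaling: query states are multiplied by `query_pre_attn_scalar ** -0.5` before the query–key dot product. A minimal standalone sketch of that scaling (an illustration, not the transformers implementation):

```python
import math

def scale_queries(q: list[float], query_pre_attn_scalar: float = 8.0) -> list[float]:
    """Scale query vectors by query_pre_attn_scalar ** -0.5, as Gemma-style
    attention does in place of the usual 1/sqrt(head_dim) factor."""
    factor = query_pre_attn_scalar ** -0.5
    return [x * factor for x in q]

# With the config value 8, the factor is 1/sqrt(8)
print(scale_queries([1.0, 2.0, 4.0]))
```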
model.safetensors
CHANGED
```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:78fc8a3ab37254845fcd03d640819179cef305fb6e23d7ca380a1f1c95c159cf
 size 954424
```
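The model.safetensors entry above is a Git LFS pointer file rather than the weights themselves: a three-line text stub recording the spec version, the blob's SHA-256 digest, and its size in bytes. A minimal sketch of building such a pointer for arbitrary bytes (a hypothetical helper, not Git LFS tooling):

```python
import hashlib

def lfs_pointer(blob: bytes) -> str:
    """Build a Git LFS spec-v1 pointer file for the given blob bytes."""
    oid = hashlib.sha256(blob).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(blob)}\n"
    )

print(lfs_pointer(b"example weights"))
```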