Instructions for using JLB-JLB/Model_folder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use JLB-JLB/Model_folder with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="JLB-JLB/Model_folder")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("JLB-JLB/Model_folder")
model = AutoModelForImageClassification.from_pretrained("JLB-JLB/Model_folder")
```

- Notebooks
- Google Colab
- Kaggle
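The direct-loading route stops at loading the model: at inference time the model produces a logits vector, and the predicted class is the label with the highest score. A minimal, dependency-free sketch of that post-processing step, using made-up logits and labels in place of a real forward pass (a real run would take `model(**processor(image, return_tensors="pt")).logits` and `model.config.id2label`):

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Convert raw scores to probabilities (numerically stable form)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits and label map -- stand-ins for a real model's output.
logits = [0.3, 2.1, -1.0]
id2label = {0: "cat", 1: "parrot", 2: "dog"}

probs = softmax(logits)
predicted = id2label[max(range(len(probs)), key=probs.__getitem__)]
print(predicted)  # the highest logit wins -> "parrot"
```

The pipeline helper performs this same preprocessing and post-processing internally, which is why it can go straight from an image URL to labeled scores.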
- Xet hash: ec573f63e1606eb347528f8a212a2ce3625fc771e380737ed536825175860c4a
- Size of remote file: 343 MB
- SHA256: 67b9469d169f377e895430f3db2de9f63fe8ec369d24ad2a18f8ce1b1dfbb15f
Xet efficiently stores large files inside Git by splitting them into unique chunks, deduplicating storage and accelerating uploads and downloads.
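The chunking idea above can be illustrated with a toy content-defined chunker: boundaries are chosen from the data itself (here via a simple rolling value), so identical regions produce identical chunks regardless of their offset, and a store keyed by chunk hash holds each chunk only once. This is a sketch of the general technique, not Xet's actual algorithm, which uses larger chunks and a production-grade rolling hash:

```python
import hashlib

def chunk_bytes(data: bytes, mask: int = 0x3F, min_size: int = 16) -> list[bytes]:
    """Split data at content-defined boundaries using a toy rolling value.

    A boundary is declared when the low bits of the rolling value match
    `mask`; the low bits depend only on the last few bytes, so identical
    content chunks identically wherever it sits in the file.
    """
    chunks, start, rolling = [], 0, 0
    for i, b in enumerate(data):
        rolling = ((rolling << 1) + b) & 0xFFFFFFFF
        if i - start + 1 >= min_size and (rolling & mask) == mask:
            chunks.append(data[start:i + 1])
            start, rolling = i + 1, 0
    if start < len(data):
        chunks.append(data[start:])  # trailing partial chunk
    return chunks

def unique_chunks(files: list[bytes]) -> dict[str, bytes]:
    """Store each distinct chunk once, keyed by its SHA-256."""
    store = {}
    for f in files:
        for c in chunk_bytes(f):
            store[hashlib.sha256(c).hexdigest()] = c
    return store

v1 = bytes(range(256)) * 8   # original "file"
v2 = b"edited header" + v1   # same file with an edit near the front
store = unique_chunks([v1, v2])
# The edit only disturbs chunks near the front; the rest resynchronize,
# so the store holds far less than len(v1) + len(v2) bytes.
```

A fixed-size chunker would shift every boundary after the insertion and re-upload the whole file; content-defined boundaries are what make edits cheap.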