Instructions to use bigcode/starencoder with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use bigcode/starencoder with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForPreTraining

tokenizer = AutoTokenizer.from_pretrained("bigcode/starencoder")
model = AutoModelForPreTraining.from_pretrained("bigcode/starencoder")
```
- Notebooks
- Google Colab
- Kaggle
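If you want sentence- or snippet-level embeddings rather than the pretraining heads loaded above, a minimal sketch is to load the bare encoder with `AutoModel` and mean-pool the last hidden state over non-padding tokens. The pooling strategy here is an illustrative assumption, not something the model card prescribes:

```python
import numpy as np


def mean_pool(last_hidden_state, attention_mask):
    """Average token vectors across the sequence, ignoring padding.

    last_hidden_state: array of shape (batch, seq_len, hidden)
    attention_mask:    array of shape (batch, seq_len) with 1 for real tokens
    """
    mask = attention_mask[..., None].astype(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(axis=1)
    counts = mask.sum(axis=1).clip(min=1e-9)  # avoid division by zero
    return summed / counts


def embed(texts):
    """Encode a list of strings into fixed-size vectors with StarEncoder.

    Requires `transformers` and `torch`; imported lazily so the pooling
    helper above can be used and tested without them.
    """
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bigcode/starencoder")
    model = AutoModel.from_pretrained("bigcode/starencoder")
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return mean_pool(out.last_hidden_state.numpy(),
                     batch["attention_mask"].numpy())
```

The same `mean_pool` helper can be reused during fine-tuning to build a classification head on top of pooled embeddings.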
Fine-tuning on AWS SageMaker
#4
by dshah3 - opened
Hi, I have a set of p4 (A100) instances available through SageMaker training jobs. I would like to fine-tune StarEncoder on two tasks: text classification and code retrieval. Would I be able to use the Hugging Face "Train" SageMaker interface and the Transformers library to run a fine-tuning job, or would I have to write my own fine-tuning script?
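For the standard launch path, the SageMaker Python SDK provides a `HuggingFace` estimator that runs a training script you supply on a managed instance. A sketch follows; the entry-point name, framework versions, hyperparameter names, and S3 paths are all placeholders/assumptions, not values confirmed in this thread:

```python
# Hyperparameters are passed to the training script as CLI arguments,
# so their names must match what your train.py (or an example script
# such as Transformers' run_glue.py) expects.
hyperparameters = {
    "model_name_or_path": "bigcode/starencoder",  # model from the Hub
    "epochs": 3,
    "per_device_train_batch_size": 16,
    "learning_rate": 5e-5,
}


def launch_training_job(role):
    """Configure and start a SageMaker fine-tuning job on a p4 instance.

    `role` is an IAM role ARN with SageMaker permissions (assumed to
    exist in your AWS setup). Imported lazily so the config above can
    be inspected without the sagemaker SDK installed.
    """
    from sagemaker.huggingface import HuggingFace

    estimator = HuggingFace(
        entry_point="train.py",           # your fine-tuning script (placeholder)
        source_dir="./scripts",           # directory containing train.py
        instance_type="ml.p4d.24xlarge",  # A100-based p4 instance
        instance_count=1,
        role=role,
        transformers_version="4.26",      # pick versions with a matching DLC
        pytorch_version="1.13",
        py_version="py39",
        hyperparameters=hyperparameters,
    )
    # Starts the job; channels point at S3 prefixes with your data.
    estimator.fit({"train": "s3://your-bucket/train"})
    return estimator
```

So a short script is still needed as the `entry_point`, but it can be a stock Transformers example script rather than something written from scratch; the estimator handles provisioning, the training container, and artifact upload.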