Instructions to use nomic-ai/CodeRankEmbed with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use nomic-ai/CodeRankEmbed with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nomic-ai/CodeRankEmbed", trust_remote_code=True)

sentences = [
    "The weather is lovely today.",
    "It's so sunny outside!",
    "He drove to the stadium.",
]
embeddings = model.encode(sentences)

similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [3, 3]
```
- Notebooks
- Google Colab
- Kaggle
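For reference, `model.similarity` in sentence-transformers defaults to cosine similarity. A minimal sketch of the equivalent computation in plain PyTorch, using random vectors as stand-ins for the `model.encode` output:

```python
import torch
import torch.nn.functional as F

# Stand-in for model.encode(sentences): 3 hypothetical 8-dim embeddings.
embeddings = torch.randn(3, 8)

# Cosine similarity: L2-normalize each row, then take pairwise dot products.
normed = F.normalize(embeddings, p=2, dim=1)
similarities = normed @ normed.T

print(similarities.shape)  # torch.Size([3, 3])
```

Each diagonal entry is the similarity of a sentence with itself, which is 1 under cosine similarity.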
fix: Missing get_input_embeddings / set_input_embeddings on NomicBertModel
#3
by Pringled - opened
modeling_hf_nomic_bert.py
```diff
@@ -1050,6 +1050,12 @@ class NomicBertModel(NomicBertPreTrainedModel):
         self.apply(partial(_init_weights, initializer_range=config.initializer_range))
 
+    def get_input_embeddings(self):
+        return self.embeddings.word_embeddings
+
+    def set_input_embeddings(self, value):
+        self.embeddings.word_embeddings = value
+
     def forward(
         self,
         input_ids,
```
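The pattern the patch follows can be sketched in isolation: the model exposes a nested embedding table through a getter/setter pair, which is what generic code (e.g. utilities that swap or resize input embeddings) expects to find. The toy classes below are hypothetical stand-ins, not the actual NomicBertModel:

```python
import torch.nn as nn

class ToyEmbeddings(nn.Module):
    """Stand-in for the model's embeddings submodule."""
    def __init__(self, vocab_size: int, hidden: int):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, hidden)

class ToyModel(nn.Module):
    def __init__(self, vocab_size: int = 10, hidden: int = 4):
        super().__init__()
        self.embeddings = ToyEmbeddings(vocab_size, hidden)

    # Same accessor pattern as the patch: expose the nested table...
    def get_input_embeddings(self):
        return self.embeddings.word_embeddings

    # ...and allow callers to replace it wholesale.
    def set_input_embeddings(self, value):
        self.embeddings.word_embeddings = value

model = ToyModel()
new_emb = nn.Embedding(20, 4)  # e.g. a resized vocabulary
model.set_input_embeddings(new_emb)
assert model.get_input_embeddings() is new_emb
```

Without these two methods, code that calls `model.get_input_embeddings()` (as several Hugging Face utilities do) raises an error, which is the failure the PR title describes.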