Will you add embedding models any time soon?
E.g. Ollama embeddings, OpenAI embeddings, Gemini embeddings.
Hi Mohamed,

Currently we don’t have any plans to release embedding models, but I generally like using embedding models from HuggingFace or Mixedbread. Hope that helps!

Jan
Could you remove mention of it from your API clients then?
```python
from groq import Groq

client = Groq()
embedding = client.embeddings.create(
    #
    # Required parameters
    #
    # The input texts to embed.
    input=["hello world"],
    # The model to use.
    model="nomic-embed-text-v1.5",
    #
    # Optional parameters
    #
    # Format to return the embeddings in.
    # Only "float" is supported at the moment.
    encoding_format="float",
    # A unique identifier representing your end-user.
    user="user",
)
```

(File truncated.)
````ts
// File generated from our OpenAPI spec by Stainless. See CONTRIBUTING.md for details.

import { APIResource } from '../resource';
import * as Core from '../core';

export class Embeddings extends APIResource {
  /**
   * Creates an embedding vector representing the input text.
   *
   * @example
   * ```ts
   * const createEmbeddingResponse =
   *   await client.embeddings.create({
   *     input: 'The quick brown fox jumped over the lazy dog',
   *     model: 'nomic-embed-text-v1_5',
   *   });
   * ```
   */
  create(
    body: EmbeddingCreateParams,
````

(File truncated.)
It’s very confusing to see embeddings in the client code while they are completely missing from the API docs (API Reference – GroqDocs). The confusion also gives content farms an opening to generate slop, e.g.:
https://medium.com/towards-agi/how-to-implement-groq-embeddings-in-langchain-tutorial-9608fb417f6f