
BGE on Hugging Face

BGE models on Hugging Face are among the best open-source embedding models. They are created by the Beijing Academy of Artificial Intelligence (BAAI), a private non-profit organization engaged in AI research and development.

This notebook shows how to use BGE embeddings through Hugging Face.

%pip install --upgrade --quiet  sentence_transformers
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

model_name = "BAAI/bge-small-en"
model_kwargs = {"device": "cpu"}
encode_kwargs = {"normalize_embeddings": True}  # normalize so similarity can be computed with a dot product
hf = HuggingFaceBgeEmbeddings(
    model_name=model_name, model_kwargs=model_kwargs, encode_kwargs=encode_kwargs
)

Note that you need to pass query_instruction="" when using model_name="BAAI/bge-m3" (see the BGE M3 FAQ).
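For example, instantiating the M3 model with an empty query instruction might look like the following minimal sketch (the variable name hf_m3 is illustrative):

hf_m3 = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-m3",
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
    query_instruction="",  # per the note above, BGE-M3 is used without an instruction prefix
)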

embedding = hf.embed_query("hi this is harrison")
len(embedding)
384
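Multiple texts can be embedded in one call with embed_documents; the example strings below are placeholders:

doc_embeddings = hf.embed_documents(
    ["BGE is an embedding model from BAAI.", "It can be used through LangChain."]
)
len(doc_embeddings), len(doc_embeddings[0])

Each returned vector has the same dimensionality as the query embedding above (384 for bge-small-en).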
