Hugging Face embedding models can be run with OpenVINO through the OpenVINOEmbeddings class. If you have an Intel GPU, you can specify model_kwargs={"device": "GPU"} to run inference on it.
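For example, a minimal sketch (assuming langchain-community and its OpenVINO dependencies are installed; the model name and texts are illustrative):

```python
from langchain_community.embeddings import OpenVINOEmbeddings

model_name = "sentence-transformers/all-mpnet-base-v2"  # illustrative model id
model_kwargs = {"device": "GPU"}  # use "CPU" if no Intel GPU is available
encode_kwargs = {"mean_pooling": True, "normalize_embeddings": True}

ov_embeddings = OpenVINOEmbeddings(
    model_name_or_path=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
)

text = "This is a test document."
query_result = ov_embeddings.embed_query(text)
doc_result = ov_embeddings.embed_documents([text])
```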
Export IR model
It is possible to export your embedding model to the OpenVINO IR format with OVModelForFeatureExtraction, and then load the model from a local folder.
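A sketch of one possible export path, assuming optimum-intel is installed; the model id and the local folder name are illustrative:

```python
from pathlib import Path

from langchain_community.embeddings import OpenVINOEmbeddings
from optimum.intel import OVModelForFeatureExtraction
from transformers import AutoTokenizer

model_id = "sentence-transformers/all-mpnet-base-v2"  # illustrative model id
ov_model_dir = Path("all-mpnet-base-v2-ov")  # illustrative local export folder

# Export the checkpoint to OpenVINO IR and save it locally, together with its tokenizer
if not ov_model_dir.exists():
    ov_model = OVModelForFeatureExtraction.from_pretrained(model_id, export=True)
    ov_model.save_pretrained(ov_model_dir)
    AutoTokenizer.from_pretrained(model_id).save_pretrained(ov_model_dir)

# Point OpenVINOEmbeddings at the exported folder instead of the Hub model id
ov_embeddings = OpenVINOEmbeddings(
    model_name_or_path=str(ov_model_dir),
    model_kwargs={"device": "CPU"},
    encode_kwargs={"mean_pooling": True, "normalize_embeddings": True},
)
```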
BGE with OpenVINO
We can also access BGE embedding models with OpenVINO via the OpenVINOBgeEmbeddings class.
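A short sketch, again with an illustrative model name and query:

```python
from langchain_community.embeddings import OpenVINOBgeEmbeddings

model_name = "BAAI/bge-small-en-v1.5"  # illustrative BGE model
model_kwargs = {"device": "CPU"}
encode_kwargs = {"normalize_embeddings": True}

ov_bge_embeddings = OpenVINOBgeEmbeddings(
    model_name_or_path=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
)

embedding = ov_bge_embeddings.embed_query("hi this is harrison")
print(len(embedding))
```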