- Embedding
- Chat
- Completion
The examples below focus on Chat, corresponding to the package `langchain/chat_models` in langchain.
## API Initialization
To use the LLM services based on Baidu Qianfan, you must first initialize the required parameters. You can either set the AK and SK in environment variables or pass them as init params.

## Current supported models
- ERNIE-Bot-turbo (default model)
- ERNIE-Bot
- BLOOMZ-7B
- Llama-2-7b-chat
- Llama-2-13b-chat
- Llama-2-70b-chat
- Qianfan-BLOOMZ-7B-compressed
- Qianfan-Chinese-Llama-2-7B
- ChatGLM2-6B-32K
- AquilaChat-7B
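A minimal sketch of the environment-variable route (the `QIANFAN_AK`/`QIANFAN_SK` variable names are the ones the Qianfan integration reads; the values shown are placeholders to replace with your own credentials):

```python
import os

# Set the AK/SK as environment variables before constructing any Qianfan
# client. Placeholder values shown -- substitute your own credentials.
os.environ["QIANFAN_AK"] = "your_ak"
os.environ["QIANFAN_SK"] = "your_sk"
```

Alternatively, the AK/SK can be passed directly as init params when constructing the client.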
## Set up
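A minimal set-up sketch, assuming the `langchain-community` and `qianfan` packages are installed and the placeholder credentials below are replaced with real ones:

```python
import os

from langchain_community.chat_models import QianfanChatEndpoint

# Placeholder credentials -- replace with your own AK/SK.
os.environ["QIANFAN_AK"] = "your_ak"
os.environ["QIANFAN_SK"] = "your_sk"

# Construct a chat model client; ERNIE-Bot-turbo is used by default.
chat = QianfanChatEndpoint(streaming=True)
```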
## Usage
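A usage sketch (this makes a real network call, so it needs valid credentials in the environment):

```python
from langchain_community.chat_models import QianfanChatEndpoint
from langchain_core.messages import HumanMessage

# Reads QIANFAN_AK / QIANFAN_SK from the environment.
chat = QianfanChatEndpoint()
response = chat.invoke([HumanMessage(content="write a funny joke")])
print(response.content)
```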
## Streaming
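A streaming sketch under the same assumptions as above; `stream()` yields message chunks as the model produces them:

```python
from langchain_community.chat_models import QianfanChatEndpoint
from langchain_core.messages import HumanMessage

chat = QianfanChatEndpoint(streaming=True)
# Print each chunk as it arrives instead of waiting for the full reply.
for chunk in chat.stream([HumanMessage(content="write a funny joke")]):
    print(chunk.content, end="", flush=True)
```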
## Use different models in Qianfan
The default model is ERNIE-Bot-turbo. If you want to deploy your own model based on ERNIE Bot or a third-party open-source model, follow these steps:

- (Optional; skip this step if the model is included in the default models) Deploy your model in the Qianfan Console and get your own customized deploy endpoint.
- Set up the field called `endpoint` in the initialization:
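For example (a sketch: `your_deployed_endpoint_name` is a placeholder for the endpoint obtained from the Qianfan Console, while `model` selects one of the built-in models by name):

```python
from langchain_community.chat_models import QianfanChatEndpoint

# Built-in model: select it by name via the `model` field.
chat_bloom = QianfanChatEndpoint(model="BLOOMZ-7B", streaming=True)

# Custom deployment: point `endpoint` at your own deployed endpoint.
chat_custom = QianfanChatEndpoint(endpoint="your_deployed_endpoint_name")
```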
## Model Params
For now, only ERNIE-Bot and ERNIE-Bot-turbo support the model params below; more models may be supported in the future.
- temperature
- top_p
- penalty_score
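As a sketch, these params can be passed per call as keyword arguments (the values here are illustrative, and the call needs valid credentials):

```python
from langchain_community.chat_models import QianfanChatEndpoint
from langchain_core.messages import HumanMessage

chat = QianfanChatEndpoint()
# Override the model params for this single call.
response = chat.invoke(
    [HumanMessage(content="write a funny joke")],
    temperature=0.3,
    top_p=0.8,
    penalty_score=1.0,
)
print(response.content)
```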