| File | Latest commit message | Commit date |
| --- | --- | --- |
| `__init__.py` | feat: server multi models support (#799) | 2023-08-12 00:57:00 +08:00 |
| `anthropic_model.py` | feat: optimize anthropic connection pool (#1066) | 2023-08-31 16:18:59 +08:00 |
| `azure_openai_model.py` | feat: support basic feature of OpenAI new models (#1476) | 2023-11-07 04:05:59 -06:00 |
| `baichuan_model.py` | refactor advanced prompt core. (#1350) | 2023-10-18 20:02:52 +08:00 |
| `base.py` | feat: support basic feature of OpenAI new models (#1476) | 2023-11-07 04:05:59 -06:00 |
| `chatglm_model.py` | feat: hf inference endpoint stream support (#1028) | 2023-08-26 19:48:34 +08:00 |
| `huggingface_hub_model.py` | refactor advanced prompt core. (#1350) | 2023-10-18 20:02:52 +08:00 |
| `localai_model.py` | feat: add LocalAI local embedding model support (#1021) | 2023-08-29 22:22:02 +08:00 |
| `minimax_model.py` | feat: optimize minimax llm call (#1312) | 2023-10-11 07:17:41 -05:00 |
| `openai_model.py` | feat: [backend] vision support (#1510) | 2023-11-13 22:05:46 +08:00 |
| `openllm_model.py` | refactor advanced prompt core. (#1350) | 2023-10-18 20:02:52 +08:00 |
| `replicate_model.py` | feat: hf inference endpoint stream support (#1028) | 2023-08-26 19:48:34 +08:00 |
| `spark_model.py` | feat: hf inference endpoint stream support (#1028) | 2023-08-26 19:48:34 +08:00 |
| `tongyi_model.py` | fix: compatibility issues with the tongyi model. (#1310) | 2023-10-11 05:16:26 -05:00 |
| `wenxin_model.py` | feat: support weixin ernie-bot-4 and chat mode (#1375) | 2023-10-18 02:35:24 -05:00 |
| `xinference_model.py` | refactor advanced prompt core. (#1350) | 2023-10-18 20:02:52 +08:00 |
| `zhipuai_model.py` | fix: app config zhipu chatglm_std model, but it still use chatglm_lit… (#1377) | 2023-10-18 05:07:36 -05:00 |