
add `LiteLLM` to `README.md`

Jeffrey Morgan, 1 year ago
Commit
ed969d2a06
1 changed file with 1 addition and 0 deletions

+ 1 - 0
README.md

@@ -156,6 +156,7 @@ curl -X POST http://localhost:11434/api/generate -d '{
 
 - [LangChain](https://python.langchain.com/docs/integrations/llms/ollama) and [LangChain.js](https://js.langchain.com/docs/modules/model_io/models/llms/integrations/ollama) with a question-answering [example](https://js.langchain.com/docs/use_cases/question_answering/local_retrieval_qa).
 - [Continue](https://github.com/continuedev/continue) - embeds Ollama inside Visual Studio Code. The extension lets you highlight code to add to the prompt, ask questions in the sidebar, and generate code inline.
+- [LiteLLM](https://github.com/BerriAI/litellm) - a lightweight Python package to simplify LLM API calls
 - [Discord AI Bot](https://github.com/mekb-turtle/discord-ai-bot) - interact with Ollama as a chatbot on Discord.
 - [Raycast Ollama](https://github.com/MassimilianoPasquini97/raycast_ollama) - Raycast extension to use Ollama for local llama inference on Raycast.
 - [Simple HTML UI for Ollama](https://github.com/rtcfirefly/ollama-ui)
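
For context, community integrations like LiteLLM wrap the same Ollama REST endpoint shown in the curl example at the top of the diff (`POST http://localhost:11434/api/generate`). A minimal stdlib-only sketch of that call, under the assumption that an Ollama server is running locally with the default port (model name and prompt are placeholders):

```python
import json
import urllib.request

# Default local Ollama endpoint, as in the curl example in the README.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    # Same JSON body as the curl example, with streaming disabled so the
    # response arrives as a single JSON object instead of chunked lines.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the full completion in "response".
        return json.loads(resp.read())["response"]
```

Libraries such as LiteLLM add a uniform, OpenAI-style interface on top of calls like this, so the same client code can target Ollama or hosted LLM APIs.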