@@ -35,7 +35,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.
-- 🗃️ **Modelfile Builder**: Easily create Ollama modelfiles via the web UI. Create and add your own character to Ollama by customizing system prompts, conversation starters, and more.
+- 🧩 **Modelfile Builder**: Easily create Ollama modelfiles via the web UI. Create and add characters/agents, customize chat elements, and import modelfiles effortlessly through [OllamaHub](https://ollamahub.com/) integration.
- ⚙️ **Many Models Conversations**: Effortlessly engage with multiple models simultaneously, harnessing their unique strengths for optimal responses.
@@ -59,7 +59,7 @@ Also check our sibling project, [OllamaHub](https://ollamahub.com/), where you c
- 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.
-## 🔗 Also Check Out OllamaHub!
+## 🔗 Also Check Out OllamaHub!
Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/), where you can discover, download, and explore customized Modelfiles. OllamaHub offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! 🚀
@@ -121,6 +121,29 @@ docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name
While we strongly recommend using our convenient Docker container installation for optimal support, we understand that some situations may require a non-Docker setup, especially for development purposes. Please note that non-Docker installations are not officially supported, and you might need to troubleshoot on your own.
+### TL;DR 🚀
+
+Run the following commands to install:
+
+```sh
+git clone https://github.com/ollama-webui/ollama-webui.git
+cd ollama-webui/
+
+# Copying required .env file
+cp -RPp example.env .env
+
+# Building Frontend
+npm i
+npm run build
+
+# Serving Frontend with the Backend
+cd ./backend
+pip install -r requirements.txt
+sh start.sh
+```
+
+You should have the Ollama Web UI up and running at http://localhost:8080/. Enjoy! 😄
+
### Project Components
The Ollama Web UI consists of two primary components: the frontend and the backend (which serves as a reverse proxy, handles static frontend files, and provides additional features). Both need to run concurrently in the development environment via `npm run dev`. Alternatively, you can set `PUBLIC_API_BASE_URL` during the build process so the frontend connects directly to your Ollama instance, or build the frontend as static files and serve them with the backend.
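The build paths described above can be sketched roughly as follows (a sketch, assuming the repository layout from the TL;DR commands earlier; the Ollama API URL is an example placeholder, not a value taken from this guide):

```sh
# Development: run frontend and backend concurrently
npm run dev

# Alternative A: build the frontend so it connects directly to your
# Ollama instance (the URL below is an example placeholder)
PUBLIC_API_BASE_URL='http://localhost:11434/api' npm run build

# Alternative B: build static files and serve them with the backend
npm run build
cd ./backend
sh start.sh
```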
@@ -211,7 +234,6 @@ See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for information on how to troubles
Here are some exciting tasks on our roadmap:
-
- 🔄 **Multi-Modal Support**: Seamlessly engage with models that support multimodal interactions, including images (e.g., LLava).
- 📚 **RAG Integration**: Experience first-class retrieval augmented generation support, enabling chat with your documents.
- 🔐 **Access Control**: Securely manage requests to Ollama by utilizing the backend as a reverse proxy gateway, ensuring only authenticated users can send specific requests.