
doc: setup instructions updated

Timothy J. Baek · 1 year ago
commit 9ddde1f833
1 changed file with 13 additions and 11 deletions: README.md

README.md

@@ -57,13 +57,9 @@ ChatGPT-Style Web Interface for Ollama 🦙
 
 ## How to Install 🚀
 
-### Prerequisites
+### Installing Both Ollama and Ollama Web UI Using Docker Compose
 
-Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).
-
-#### Installing Both Ollama and Ollama Web UI Using Docker Compose
-
-If you don't have Ollama installed, you can also use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
+If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
 
 ```bash
 docker compose up --build
@@ -71,13 +67,19 @@ docker compose up --build
 
 This command will install both Ollama and Ollama Web UI on your system. Be sure to modify the `compose.yaml` file for GPU support if needed.
 
-#### Checking Ollama
+### Installing Ollama Web UI Only
+
+#### Prerequisites
+
+Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).
+
+##### Checking Ollama
 
-After installing, verify that Ollama is running by accessing the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
+After installing Ollama, verify that it is running by opening the following link in your web browser: [http://127.0.0.1:11434/](http://127.0.0.1:11434/). Note that the port number may differ based on your system configuration.
 
-### Using Docker 🐳
+#### Using Docker 🐳
 
-If Ollama is hosted on your local machine, run the following command:
+If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:
 
 ```bash
 docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
@@ -92,7 +94,7 @@ docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name o
 
 Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over your LAN. Enjoy! 😄
 
-### Accessing External Ollama on a Different Server
+#### Accessing External Ollama on a Different Server
 
 Change the `OLLAMA_API_BASE_URL` environment variable to match the external Ollama server URL:
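
The diff hunk ends here, so the command that follows the colon is not shown in this view. A minimal sketch of what such an invocation could look like, reusing the flags from the `docker run` command above and substituting a placeholder URL `https://example.com/api` (an assumption for illustration, not the project's actual example):

```bash
# Placeholder URL: replace https://example.com/api with your external
# Ollama server's API base URL before running.
docker run -d -p 3000:8080 \
  -e OLLAMA_API_BASE_URL=https://example.com/api \
  --name ollama-webui --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
```

The `-e` flag overrides the container's default API base URL, and `--add-host` is no longer needed since the Web UI is not reaching back to the Docker host.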