
added the drop capability and updated readme accordingly

Daniele Viti 1 year ago
parent
commit
7063f00b71
2 changed files with 18 additions and 7 deletions
  1. README.md (+12 −3)
  2. run-compose.sh (+6 −4)

README.md (+12 −3)

@@ -73,13 +73,22 @@ Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/)
 
 ### Installing Both Ollama and Ollama Web UI Using Docker Compose
 
-If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
+If you don't have Ollama installed yet, you can use the provided bash script for a hassle-free installation. Simply run the following command:
 
+For a CPU-only container:
 ```bash
-docker compose up -d --build
+chmod +x run-compose.sh && ./run-compose.sh
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+For a GPU-enabled container (this requires a GPU driver for Docker; it mostly works with NVIDIA, so see the official install guide: [nvidia-container-toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)):
+```bash
+chmod +x run-compose.sh && ./run-compose.sh --enable-gpu[count=1]
+```
+
+Note that both of the above commands use the latest production Docker image from the repository. To build the latest local version instead, append the `--build` parameter, for example:
+```bash
+./run-compose.sh --build --enable-gpu[count=1]
+```
 
 ### Installing Ollama Web UI Only
 

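The `--enable-gpu[count=1]` syntax used above embeds a `key=value` pair in square brackets. As a hedged sketch only — `parse_flag_value` is a hypothetical helper, not code from `run-compose.sh` — such flags could be read in bash like this:

```bash
#!/usr/bin/env bash
# Illustrative sketch: extract the value from a "--name[key=value]" flag.
# parse_flag_value is a hypothetical helper; the script's actual parsing
# logic may differ.
parse_flag_value() {
  local arg="$1"
  # Capture whatever follows "key=" inside the square brackets.
  if [[ "$arg" =~ \[[a-zA-Z]+=([^]]+)\] ]]; then
    echo "${BASH_REMATCH[1]}"
  fi
}

parse_flag_value "--enable-gpu[count=2]"    # prints: 2
parse_flag_value "--enable-api[port=11435]" # prints: 11435
```

Note that the brackets must survive shell globbing, which is why the examples work in directories with no matching filenames; quoting the flag is the safe habit.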
run-compose.sh (+6 −4)

@@ -80,10 +80,12 @@ usage() {
     echo "  -h, --help                 Show this help message."
     echo ""
     echo "Examples:"
-    echo "  $0 --enable-gpu[count=1]"
-    echo "  $0 --enable-api[port=11435]"
-    echo "  $0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000]"
-    echo "  $0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data]"
+    echo "  ./$0 --drop"
+    echo "  ./$0 --enable-gpu[count=1]"
+    echo "  ./$0 --enable-api[port=11435]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data] --build"
    echo ""
    echo "This script configures and runs a docker-compose setup with optional GPU support, API exposure, and web UI configuration."
    echo "About the gpu to use, the script automatically detects it using the "lspci" command."
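The help text says the GPU is auto-detected with `lspci`. As a minimal sketch of how such detection might look — `detect_gpu_vendor` is a hypothetical helper, not the script's actual logic:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of lspci-based GPU detection; run-compose.sh's real
# detection may differ.
detect_gpu_vendor() {
  # Look for an NVIDIA VGA or 3D controller in the PCI device list.
  if lspci | grep -qiE '(vga|3d).*nvidia'; then
    echo nvidia
  else
    echo none
  fi
}
```

On a machine without `lspci` in PATH (or inside a minimal container), this would fall through to `none`, so a real script would likely also check that the command exists.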