
Merge pull request #246 from gnscc/main

Refactor docker-compose configuration for modularity
Timothy Jaeryang Baek · 1 year ago
Commit 295ebb4f25
4 files changed, 33 insertions(+), 13 deletions(-)
  1. README.md (+13 −1)
  2. docker-compose.api.yml (+7 −0)
  3. docker-compose.gpu.yml (+13 −0)
  4. docker-compose.yml (+0 −12)

+ 13 - 1
README.md

@@ -79,7 +79,19 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
+```
 
 ### Installing Ollama Web UI Only
 
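
Since Compose merges every file passed with `-f` in order, the two override files introduced in this PR can also be applied together. An illustrative combined invocation (not part of the README text in this diff):

```shell
# Enable GPU support and expose the Ollama API in a single deployment
# by layering both override files on top of the base compose file
docker compose -f docker-compose.yml -f docker-compose.gpu.yml -f docker-compose.api.yml up -d --build
```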

+ 7 - 0
docker-compose.api.yml

@@ -0,0 +1,7 @@
+version: '3.6'
+
+services:
+  ollama:
+    # Expose Ollama API outside the container stack
+    ports:
+      - 11434:11434
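
With this override applied, the Ollama API becomes reachable from the host. One way to verify, assuming the stack is running with the default port mapping above:

```shell
# Query the exposed Ollama API from the host;
# /api/tags lists locally available models
curl http://localhost:11434/api/tags
```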

+ 13 - 0
docker-compose.gpu.yml

@@ -0,0 +1,13 @@
+version: '3.6'
+
+services:
+  ollama:
+    # GPU support
+    deploy:
+      resources:
+        reservations:
+          devices:
+            - driver: nvidia
+              count: 1
+              capabilities:
+                - gpu
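
To inspect how Compose merges this override into the base file without starting any containers, the effective configuration can be previewed:

```shell
# Print the merged configuration produced by the base file plus the GPU override
docker compose -f docker-compose.yml -f docker-compose.gpu.yml config
```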

+ 0 - 12
docker-compose.yml

@@ -2,20 +2,8 @@ version: '3.6'
 
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true