
Added mention of the NOPRUNE env var

Signed-off-by: Matt Williams <m@technovangelist.com>
Matt Williams, 1 year ago
parent commit 0d4fa34aee
1 file changed, 4 insertions, 0 deletions

docs/faq.md (+4 -0)

@@ -95,6 +95,10 @@ The manifest lists all the layers used in this model. You will see a `media type
 To modify where models are stored, you can use the `OLLAMA_MODELS` environment variable. Note that on Linux this means defining `OLLAMA_MODELS` in a drop-in `/etc/systemd/system/ollama.service.d` service file, reloading systemd, and restarting the ollama service.

+## I downloaded most of a model yesterday, but it's gone today. What happened?
+
+When the Ollama server starts, it scans the system for leftover fragments of partially downloaded models and deletes them. If your Internet connection can't complete a model download in a single session, this pruning can be frustrating. Setting the `OLLAMA_NOPRUNE` environment variable prevents the server from pruning incomplete files at startup.
+
 ## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?

 No. Anything you do with Ollama, such as generating a response from the model, stays with you. We don't collect any data about how you use the model. You are always in control of your own data.
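
On Linux under systemd, `OLLAMA_NOPRUNE` can be set the same way the FAQ describes for `OLLAMA_MODELS`: a drop-in file under `/etc/systemd/system/ollama.service.d`, followed by a daemon reload and a service restart. A sketch (the drop-in filename `noprune.conf` and the value `1` are assumptions, not prescribed by the commit):

```shell
# Create a systemd drop-in that sets OLLAMA_NOPRUNE for the ollama service.
# The filename "noprune.conf" is an arbitrary choice; any *.conf name works.
sudo mkdir -p /etc/systemd/system/ollama.service.d
printf '[Service]\nEnvironment="OLLAMA_NOPRUNE=1"\n' \
  | sudo tee /etc/systemd/system/ollama.service.d/noprune.conf

# Pick up the drop-in and restart the server so it skips pruning at startup.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

For a one-off run without systemd, setting the variable inline before launching the server (`OLLAMA_NOPRUNE=1 ollama serve`) has the same effect for that session.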