@@ -1,5 +1,10 @@
# FAQ

+- [How can I view the logs?](#how-can-i-view-the-logs)
+- [How can I expose Ollama on my network?](#how-can-i-expose-ollama-on-my-network)
+- [How can I allow additional web origins to access Ollama?](#how-can-i-allow-additional-web-origins-to-access-ollama)
+- [Where are models stored?](#where-are-models-stored)
+
## How can I view the logs?

On macOS:
@@ -74,8 +79,6 @@ systemctl restart ollama
- macOS: Raw model data is stored under `~/.ollama/models`.
- Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`.

-
-
Below the models directory you will find a structure similar to the following:

```shell
@@ -96,3 +99,11 @@ The manifest lists all the layers used in this model. You will see a `media type
### How can I change where Ollama stores models?

To modify where models are stored, you can set the `OLLAMA_MODELS` environment variable. Note that on Linux this means defining `OLLAMA_MODELS` in a drop-in file under `/etc/systemd/system/ollama.service.d`, then reloading systemd and restarting the ollama service.
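+
+For example, on Linux a drop-in might look like the following. This is a minimal sketch: the file name `override.conf` and the storage path `/data/ollama/models` are illustrative assumptions, not values from the original document.
+
+```shell
+# Create the drop-in directory and an override file for the ollama service
+sudo mkdir -p /etc/systemd/system/ollama.service.d
+sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF' >/dev/null
+[Service]
+# Assumed example path; point this at your preferred storage location
+Environment="OLLAMA_MODELS=/data/ollama/models"
+EOF
+
+# Reload systemd and restart the service so the change takes effect
+sudo systemctl daemon-reload
+sudo systemctl restart ollama
+```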
+
+## Does Ollama send my prompts and answers back to Ollama.ai to use in any way?
+
+No. Anything you do with Ollama, such as generating a response from the model, stays with you. We don't collect any data about how you use the model. You are always in control of your own data.
+
+## How can I use Ollama in VSCode to help me code?
+
+There is already a large collection of plugins that leverage Ollama, for VSCode as well as other editors. You can find the list of plugins at the bottom of the main repository README.
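+
+Under the hood, such plugins talk to the local Ollama server over its HTTP API. As a rough sketch of the kind of request an editor integration makes (assuming the default address `localhost:11434` and that a model such as `llama2` has already been pulled):
+
+```shell
+# Ask the locally running Ollama server to complete a coding prompt
+curl http://localhost:11434/api/generate -d '{
+  "model": "llama2",
+  "prompt": "Write a shell function that reverses a string"
+}'
+```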