@@ -13,7 +13,7 @@ Get up and running with large language models locally.
 
 ### macOS
 
-[Download](https://ollama.ai/download/Ollama-darwin.zip)
+[Download](https://ollama.ai/download/Ollama-darwin.zip)
 
 ### Linux & WSL2
 
@@ -23,7 +23,7 @@ curl https://ollama.ai/install.sh | sh
 
 [Manual install instructions](https://github.com/jmorganca/ollama/blob/main/docs/linux.md)
 
-### Windows
+### Windows
 
 coming soon
 
@@ -37,7 +37,7 @@ ollama run llama2
 
 ## Model library
 
-Ollama supports a list of open-source models available on [ollama.ai/library](https://ollama.ai/library "ollama model library")
+Ollama supports a list of open-source models available on [ollama.ai/library](https://ollama.ai/library 'ollama model library')
 
 Here are some example open-source models that can be downloaded:
 
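For context on the library link above: models are pulled and run by name from that library. A minimal usage sketch, using the `llama2` tag already shown in the surrounding hunk headers (any other library tag works the same way):

```
ollama pull llama2
ollama run llama2
```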
@@ -66,13 +66,13 @@ Ollama supports importing GGUF and GGML file formats in the Modelfile. This mean
 FROM ./vicuna-33b.Q4_0.gguf
 ```
 
-3. Create the model in Ollama
+2. Create the model in Ollama
 
 ```
 ollama create name -f path_to_modelfile
 ```
 
-5. Run the model
+3. Run the model
 
 ```
 ollama run name
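For concreteness, the renumbered import steps chain together roughly as follows; the model name `vicuna` and the `./Modelfile` path are illustrative choices, not values taken from the repository:

```
# assumes a Modelfile in the current directory containing: FROM ./vicuna-33b.Q4_0.gguf
ollama create vicuna -f ./Modelfile
ollama run vicuna
```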
@@ -109,7 +109,7 @@ ollama run mario
 Hello! It's your friend Mario.
 ```
 
-For more examples, see the [examples](./examples) directory. For more information on working with a Modelfile, see the [Modelfile](./docs/modelfile.md) documentation.
+For more examples, see the [examples](examples) directory. For more information on working with a Modelfile, see the [Modelfile](docs/modelfile.md) documentation.
 
 ## CLI Reference
 
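The `mario` transcript in this hunk comes from a customized prompt. A hedged sketch of the kind of Modelfile behind it, assuming the `FROM` and `SYSTEM` instructions documented in docs/modelfile.md:

```
FROM llama2
SYSTEM """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""
```

Creating and running it follows the same `ollama create` / `ollama run` pattern shown in the import hunk above.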
@@ -195,7 +195,7 @@ Finally, in a separate shell, run a model:
 
 ## REST API
 
-> See the [API documentation](./docs/api.md) for all endpoints.
+> See the [API documentation](docs/api.md) for all endpoints.
 
 Ollama has an API for running and managing models. For example to generate text from a model:
 
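The hunk cuts off just before the README's request example. As a rough illustration of such a call, assuming Ollama's default local endpoint on port 11434 and the `llama2` model used elsewhere in this diff:

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```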