@@ -200,18 +200,21 @@ brew install cmake go
```

Then generate dependencies:
+
```
go generate ./...
```
+
Then build the binary:
+
```
go build .
```

More detailed instructions can be found in the [developer guide](https://github.com/jmorganca/ollama/blob/main/docs/development.md)

-
### Running local builds
+
Next, start the server:

```
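For orientation, the build steps touched by the hunk above chain together as follows. This is a minimal sketch of a local macOS build, not part of the diff: it assumes you are in the repository root, and the final `./ollama serve` line is an assumption based on the "Running local builds" section, since the hunk ends just before the README shows the exact command.

```
# minimal local build-and-run sketch (macOS); run from the repository root
brew install cmake go   # prerequisites, from the hunk's context line
go generate ./...       # generate dependencies
go build .              # build the ollama binary in the current directory
./ollama serve          # assumed start command; see docs/development.md
```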
@@ -253,6 +256,7 @@ See the [API documentation](./docs/api.md) for all endpoints.
## Community Integrations

### Web & Desktop
+
- [Bionic GPT](https://github.com/bionic-gpt/bionic-gpt)
- [HTML UI](https://github.com/rtcfirefly/ollama-ui)
- [Chatbot UI](https://github.com/ivanfioravanti/chatbot-ollama)
@@ -267,7 +271,6 @@ See the [API documentation](./docs/api.md) for all endpoints.
- [Ollama-SwiftUI](https://github.com/kghandour/Ollama-SwiftUI)
- [MindMac](https://mindmac.app)

-
### Terminal

- [oterm](https://github.com/ggozad/oterm)
@@ -307,7 +310,6 @@ See the [API documentation](./docs/api.md) for all endpoints.
- [Haystack](https://github.com/deepset-ai/haystack-integrations/blob/main/integrations/ollama.md)
- [Ollama for R - rollama](https://github.com/JBGruber/rollama)

-
### Mobile

- [Enchanted](https://github.com/AugustDev/enchanted)
@@ -329,4 +331,4 @@ See the [API documentation](./docs/api.md) for all endpoints.
- [Obsidian BMO Chatbot plugin](https://github.com/longy2k/obsidian-bmo-chatbot)
- [Open Interpreter](https://docs.openinterpreter.com/language-model-setup/local-models/ollama)
- [twinny](https://github.com/rjmacarthy/twinny) (Copilot and Copilot chat alternative using Ollama)
-
+- [Wingman-AI](https://github.com/RussellCanfield/wingman-ai) (Copilot code and chat alternative using Ollama and HuggingFace)
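As background for the integration lists edited above (including the new Wingman-AI entry): these tools generally talk to a locally running Ollama server over its HTTP API, referenced in the hunk context line "See the [API documentation](./docs/api.md) for all endpoints." A minimal sketch of such a call, assuming the default port 11434 and a model (here `llama2`) that has already been pulled:

```
# assumes `ollama serve` is running and `ollama pull llama2` has completed
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?"
}'
```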