local wording was confusing people -- Ollama runs on cloud providers
@@ -6,7 +6,7 @@
[Discord](https://discord.gg/ollama)
-Get up and running with large language models locally.
+Get up and running with large language models.
### macOS