
Move Python docs to separate file

Jeffrey Morgan, 1 year ago
commit 55898a3382
2 changed files, 66 additions and 58 deletions:
  1. README.md (+2, -58)
  2. docs/python.md (+64, -0)

README.md (+2, -58)

@@ -1,6 +1,6 @@
 # Ollama
 
-Ollama is a tool for running any large language model on any machine. It's designed to be easy to use and fast, supporting the largest number of models possible by using the fastest loader available for your platform and model.
+Ollama is a tool for running large language models. It's designed to be easy to use and fast.
 
 > _Note: this project is a work in progress. Certain models that can be run with `ollama` are intended for research and/or non-commercial use only._
 
@@ -38,63 +38,7 @@ Or directly via downloaded model files:
 ollama run ~/Downloads/orca-mini-13b.ggmlv3.q4_0.bin
 ```
 
-## Python SDK
-
-### Example
-
-```python
-import ollama
-ollama.generate("orca-mini-3b", "hi")
-```
-
-### `ollama.generate(model, message)`
-
-Generate a completion
-
-```python
-ollama.generate("./llama-7b-ggml.bin", "hi")
-```
-
-### `ollama.models()`
-
-List available local models
-
-```python
-models = ollama.models()
-```
-
-### `ollama.load(model)`
-
-Manually load a model for generation
-
-```python
-ollama.load("model")
-```
-
-### `ollama.unload(model)`
-
-Unload a model
-
-```python
-ollama.unload("model")
-```
-
-### `ollama.pull(model)`
-
-Download a model
-
-```python
-ollama.pull("huggingface.co/thebloke/llama-7b-ggml")
-```
-
-### `ollama.search(query)`
-
-Search for compatible models that Ollama can run
-
-```python
-ollama.search("llama-7b")
-```
-
 ## Documentation
 
 - [Development](docs/development.md)
+- [Python SDK](docs/python.md)

docs/python.md (+64, -0)

@@ -0,0 +1,64 @@
+# Python SDK
+
+## Install
+
+```
+pip install ollama
+```
+
+## Example
+
+```python
+import ollama
+ollama.generate("orca-mini-3b", "hi")
+```
+
+## Reference
+
+### `ollama.generate(model, message)`
+
+Generate a completion
+
+```python
+ollama.generate("./llama-7b-ggml.bin", "hi")
+```
+
+### `ollama.models()`
+
+List available local models
+
+```python
+models = ollama.models()
+```
+
+### `ollama.load(model)`
+
+Manually load a model for generation
+
+```python
+ollama.load("model")
+```
+
+### `ollama.unload(model)`
+
+Unload a model
+
+```python
+ollama.unload("model")
+```
+
+### `ollama.pull(model)`
+
+Download a model
+
+```python
+ollama.pull("huggingface.co/thebloke/llama-7b-ggml")
+```
+
+### `ollama.search(query)`
+
+Search for compatible models that Ollama can run
+
+```python
+ollama.search("llama-7b")
+```
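
Taken together, the calls documented in the new `docs/python.md` compose into a full workflow: search for a model, pull it, generate against it, and unload it when done. The sketch below is a minimal, hypothetical example built only from the signatures above; it assumes `ollama.search()` and `ollama.models()` return lists of model names and that `ollama.generate()` returns the completion text, none of which this commit specifies. The model name is illustrative.

```python
import ollama

# Search the registry for compatible models
# (assumes a list of model names is returned).
candidates = ollama.search("orca-mini")
print(candidates)

# Download a model; the name here is illustrative.
ollama.pull("orca-mini-3b")

# The pulled model should now appear among local models.
print(ollama.models())

# Optionally pre-load the model so the first completion is fast.
ollama.load("orca-mini-3b")

# Generate a completion (assumes the completion text is returned).
print(ollama.generate("orca-mini-3b", "Why is the sky blue?"))

# Free the model's memory once finished.
ollama.unload("orca-mini-3b")
```

The explicit `load`/`unload` pair gives manual control over when model memory is allocated and released, which matters when working with large models on constrained machines.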