
docs: add tokenize and detokenize api

Yurzs, 8 months ago
commit 24613df094

1 file changed, 79 insertions, 0 deletions

docs/api.md (+79, -0)

@@ -13,6 +13,8 @@
 - [Push a Model](#push-a-model)
 - [Generate Embeddings](#generate-embeddings)
 - [List Running Models](#list-running-models)
+- [Tokenize Text](#tokenize-text)
+- [Detokenize Text](#detokenize-text)
 
 ## Conventions
 
@@ -1485,6 +1487,83 @@ A single JSON object will be returned.
 }
 ```
 
+## Tokenize Text
+
+Convert text into tokens using a model.
+
+```shell
+POST /api/tokenize
+```
+
+### Parameters
+
+- `model`: name of model to generate tokens from
+- `prompt`: the text to convert into tokens
+
+Advanced parameters (optional):
+
+- `options`: additional model parameters listed in the documentation for the [Modelfile](./modelfile.md#valid-parameters-and-values) such as `temperature`
+- `keep_alive`: controls how long the model will stay loaded into memory following the request (default: `5m`)
+
+### Examples
+
+#### Request
+
+```shell
+curl -X POST http://localhost:11434/api/tokenize -d '{
+  "model": "llama3.1:8b",
+  "prompt": "Why the sky is blue?"
+}'
+```
+
+#### Response
+
+```json
+{
+  "model": "llama3.1:8b",
+  "tokens": [10445,279,13180,374,6437,30]
+}
+```
+
+## Detokenize Text
+
+Convert tokens back into text using a model.
+
+```shell
+POST /api/detokenize
+```
+
+### Parameters
+
+- `model`: name of model to generate text from
+- `tokens`: the list of tokens to convert back into text
+
+Advanced parameters (optional):
+
+- `options`: additional model parameters listed in the documentation for the [Modelfile](./modelfile.md#valid-parameters-and-values) such as `temperature`
+- `keep_alive`: controls how long the model will stay loaded into memory following the request (default: `5m`)
+
+### Examples
+
+#### Request
+
+```shell
+curl -X POST http://localhost:11434/api/detokenize -d '{
+  "model": "llama3.1:8b",
+  "tokens": [10445,279,13180,374,6437,30]
+}'
+```
+
+#### Response
+
+```json
+{
+  "model": "llama3.1:8b",
+  "text": "Why the sky is blue?"
+}
+```
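+
+As a sketch of how the two endpoints fit together, the output of `/api/tokenize` can be fed straight into `/api/detokenize`. This example assumes a server on the default port and that `jq` is available; it is not part of the API itself.
+
+```shell
+# Tokenize a prompt, then pass the returned token list back to /api/detokenize.
+curl -s http://localhost:11434/api/tokenize -d '{
+  "model": "llama3.1:8b",
+  "prompt": "Why the sky is blue?"
+}' \
+  | jq -c '{model: "llama3.1:8b", tokens: .tokens}' \
+  | curl -s http://localhost:11434/api/detokenize -d @-
+```
+
+The second call should return the original text, as in the response example above.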
+
+
 ## Generate Embedding
 
 > Note: this endpoint has been superseded by `/api/embed`