
remove unnecessary `-X POST` from example `curl` commands

Jeffrey Morgan, 1 year ago
commit 81092147c4
2 files changed, 12 insertions and 13 deletions:
1. README.md (+1, -1)
2. docs/api.md (+11, -12)
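The rationale for this change: curl switches to POST automatically whenever a request body is supplied with `-d`/`--data`, so an explicit `-X POST` is redundant. A quick way to confirm this is the `%{method}` write-out variable (a sketch, assuming curl ≥ 7.72.0 for `%{method}` and Python 3 for a throwaway local server; the port is arbitrary):

```shell
# curl infers POST from -d, so the two command forms in this diff send an
# identical request on the wire. Verify against a disposable local server
# (it answers 501 to POST, but curl still reports the method it used).
python3 -m http.server 18765 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
# Prints the HTTP method curl actually sent: POST
curl -s -o /dev/null -w '%{method}\n' http://localhost:18765/ -d '{"x":1}'
kill $SERVER_PID
```

`-X` remains useful only when the method must differ from what curl would infer, e.g. `-X DELETE` with a body.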

README.md (+1, -1)

@@ -206,7 +206,7 @@ Ollama has a REST API for running and managing models.
 For example, to generate text from a model:
 
 ```
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "llama2",
   "prompt":"Why is the sky blue?"
 }'

docs/api.md (+11, -12)

@@ -58,7 +58,7 @@ Enable JSON mode by setting the `format` parameter to `json` and specifying the
 #### Request
 
 ```shell
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "llama2",
   "prompt": "Why is the sky blue?"
 }'
@@ -113,7 +113,7 @@ To calculate how fast the response is generated in tokens per second (token/s),
 #### Request (No streaming)
 
 ```shell
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "llama2",
   "prompt": "Why is the sky blue?",
   "stream": false
@@ -147,7 +147,7 @@ If `stream` is set to `false`, the response will be a single JSON object:
 In some cases you may wish to bypass the templating system and provide a full prompt. In this case, you can use the `raw` parameter to disable formatting and context.
 
 ```shell
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "mistral",
   "prompt": "[INST] why is the sky blue? [/INST]",
   "raw": true,
@@ -175,7 +175,7 @@ curl -X POST http://localhost:11434/api/generate -d '{
 #### Request (JSON mode)
 
 ```shell
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "llama2",
   "prompt": "What color is the sky at different times of the day? Respond using JSON",
   "format": "json",
@@ -224,7 +224,7 @@ The value of `response` will be a string containing JSON similar to:
 If you want to set custom options for the model at runtime rather than in the Modelfile, you can do so with the `options` parameter. This example sets every available option, but you can set any of them individually and omit the ones you do not want to override.
 
 ```shell
-curl -X POST http://localhost:11434/api/generate -d '{
+curl http://localhost:11434/api/generate -d '{
   "model": "llama2",
   "prompt": "Why is the sky blue?",
   "stream": false,
@@ -297,19 +297,18 @@ Create a model from a [`Modelfile`](./modelfile.md). It is recommended to set `m
 ### Parameters
 
 - `name`: name of the model to create
-- `path`: path to the Modelfile (deprecated: please use modelfile instead)
 - `modelfile`: contents of the Modelfile
 - `stream`: (optional) if `false` the response will be returned as a single response object, rather than a stream of objects
+- `path` (deprecated): path to the Modelfile
 
 ### Examples
 
 #### Request
 
 ```shell
-curl -X POST http://localhost:11434/api/create -d '{
+curl http://localhost:11434/api/create -d '{
   "name": "mario",
-  "path": "~/Modelfile",
-  "modelfile": "FROM llama2"
+  "modelfile": "FROM llama2\nSYSTEM You are mario from Super Mario Bros."
 }'
 ```
 
@@ -509,7 +508,7 @@ Download a model from the ollama library. Cancelled pulls are resumed from where
 #### Request
 
 ```shell
-curl -X POST http://localhost:11434/api/pull -d '{
+curl http://localhost:11434/api/pull -d '{
   "name": "llama2"
 }'
 ```
@@ -581,7 +580,7 @@ Upload a model to a model library. Requires registering for ollama.ai and adding
 #### Request
 
 ```shell
-curl -X POST http://localhost:11434/api/push -d '{
+curl http://localhost:11434/api/push -d '{
   "name": "mattw/pygmalion:latest"
 }'
 ```
@@ -649,7 +648,7 @@ Advanced parameters:
 #### Request
 
 ```shell
-curl -X POST http://localhost:11434/api/embeddings -d '{
+curl http://localhost:11434/api/embeddings -d '{
   "model": "llama2",
   "prompt": "Here is an article about llamas..."
 }'