Install dependencies:

```
pip install -r requirements.txt
```

Put your model in `models/` and run:

```
python3 ollama.py serve
```
To run the app:

```
cd desktop
npm install
npm start
```
If using Apple silicon, you need a Python version that supports arm64:

```
wget https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-MacOSX-arm64.sh
bash Miniforge3-MacOSX-arm64.sh
```
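To confirm the interpreter you end up with is a native arm64 build (and not running under Rosetta), a quick check:

```
python3 -c "import platform; print(platform.machine())"
```

This should print `arm64`; if it prints `x86_64`, the interpreter is an Intel build.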
Get the dependencies:

```
pip install -r requirements.txt
```
Then build a binary for your current platform:

```
python3 build.py
```

To package the desktop app:

```
cd desktop
npm run package
```
`GET /models`

Returns a list of available models.
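As an illustration, here is one way to call this endpoint from Python using only the standard library. The base URL is an assumption (the address and port depend on how `python3 ollama.py serve` is configured), so adjust it to whatever the server reports on startup.

```python
import json
from urllib.request import urlopen

# Assumed base URL; replace with the address `python3 ollama.py serve` reports.
BASE_URL = "http://127.0.0.1:5000"

# Ask the server which models it found in the models/ folder.
with urlopen(f"{BASE_URL}/models") as resp:
    models = json.load(resp)

print(models)
```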
`POST /generate`

Generates completions as a series of JSON objects.

- `model: string`: the name of the model to use in the `models` folder.
- `prompt: string`: the prompt to use.
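For illustration, a minimal client sketch in Python using only the standard library. The base URL and the model name are assumptions, and the sketch assumes each streamed JSON object arrives on its own line; adjust all of these to match your setup and the server's actual output.

```python
import json
from urllib.request import Request, urlopen

# Assumed base URL and model name; adjust both for your setup.
BASE_URL = "http://127.0.0.1:5000"
payload = {
    "model": "llama",                  # a model placed in the models/ folder
    "prompt": "Why is the sky blue?",  # the prompt to complete
}

req = Request(
    f"{BASE_URL}/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The endpoint streams a series of JSON objects; decode them as they arrive.
with urlopen(req) as resp:
    for line in resp:
        line = line.strip()
        if line:
            print(json.loads(line))
```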