add flags to server and client setup
This commit is contained in:
parent 01bc8235c5
commit 84039d271a
@ -36,3 +36,12 @@ poetry run 01
```bash
poetry run 01 --client
```

### Flags

- `--client`

  Run client.

- `--client-type TEXT`

  Specify the client type.

  Default: `auto`.
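Putting the client flags above together, an illustrative invocation (using only the flags and the `auto` default documented here, and assuming the package is installed via Poetry) looks like:

```shell
# Run the client, making the default client type explicit
poetry run 01 --client --client-type auto
```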
@ -3,9 +3,86 @@ title: "Setup"
description: "Get your 01 server up and running"
---
Setup (just run `start.py --server`; explain the flags, revealed via `start.py --help`)

- Interpreter
- Open Interpreter (explains i.py and how you configure your interpreter; covers the basic settings of OI, since that file is literally just modifying an interpreter from OI)
- Language Model (LLM setup via `interpreter.model` in i.py, or from the command line via `start.py --server --llm-service llamafile`)
- Voice Interface (explains that you can pass `--tts-service` and `--stt-service` to swap in different services, which live in /Services/Speech-to-text and /Services/Text-to-text)
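The Language Model and Voice Interface notes above map directly onto server flags. A sketch of the llamafile setup mentioned there as a single command, with the default OpenAI speech services written out explicitly (assumes the package is installed via Poetry):

```shell
# Start the server with the llamafile LLM service; TTS and STT stay on
# their documented defaults (openai), shown here for clarity
poetry run 01 --server --llm-service llamafile --tts-service openai --stt-service openai
```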

## Run Server

```bash
poetry run 01 --server
```

## Flags

- `--server`

  Run server.

- `--server-host TEXT`

  Specify the server host where the server will deploy.

  Default: `0.0.0.0`.

- `--server-port INTEGER`

  Specify the server port where the server will deploy.

  Default: `8000`.

- `--tunnel-service TEXT`

  Specify the tunnel service.

  Default: `ngrok`.

- `--expose`

  Expose server to internet.

- `--server-url TEXT`

  Specify the server URL that the client should expect. Defaults to server-host and server-port.

  Default: `None`.

- `--llm-service TEXT`

  Specify the LLM service.

  Default: `litellm`.

- `--model TEXT`

  Specify the model.

  Default: `gpt-4`.

- `--llm-supports-vision`

  Specify if the LLM service supports vision.

- `--llm-supports-functions`

  Specify if the LLM service supports functions.

- `--context-window INTEGER`

  Specify the context window size.

  Default: `2048`.

- `--max-tokens INTEGER`

  Specify the maximum number of tokens.

  Default: `4096`.

- `--temperature FLOAT`

  Specify the temperature for generation.

  Default: `0.8`.

- `--tts-service TEXT`

  Specify the TTS service.

  Default: `openai`.

- `--stt-service TEXT`

  Specify the STT service.

  Default: `openai`.

- `--local`

  Use recommended local services for LLM, STT, and TTS.

- `--install-completion [bash|zsh|fish|powershell|pwsh]`

  Install completion for the specified shell.

  Default: `None`.

- `--show-completion [bash|zsh|fish|powershell|pwsh]`

  Show completion for the specified shell, to copy it or customize the installation.

  Default: `None`.

- `--help`

  Show this message and exit.
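The note under `--server-url` says it defaults to server-host and server-port. A minimal sketch of that fallback, assuming the default URL is simply the host and port joined with a colon (the exact format is not stated in this doc):

```shell
# Hypothetical sketch of how --server-url could fall back to
# --server-host and --server-port when left as None
server_host="0.0.0.0"   # default --server-host
server_port=8000        # default --server-port
server_url="${server_host}:${server_port}"
echo "$server_url"
```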