update readme and webui launch

Former-commit-id: c66ffa57323ef6ea78a9b75ec5122d9ea25fd420
This commit is contained in:
hiyouga
2024-05-04 00:43:02 +08:00
parent 99125c8825
commit 37bcbf72b4
3 changed files with 12 additions and 8 deletions


@@ -344,11 +344,12 @@ To enable FlashAttention-2 on the Windows platform, you need to install the prec
#### Use local environment
```bash
export CUDA_VISIBLE_DEVICES=0 # `set CUDA_VISIBLE_DEVICES=0` for Windows
export GRADIO_SERVER_PORT=7860 # `set GRADIO_SERVER_PORT=7860` for Windows
llamafactory-cli webui
```
> [!TIP]
> To modify the default settings of the LLaMA Board GUI, you can use environment variables, e.g., `export CUDA_VISIBLE_DEVICES=0 GRADIO_SERVER_NAME=0.0.0.0 GRADIO_SERVER_PORT=7860 GRADIO_SHARE=False` (use the `set` command on Windows).
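The tip above relies on standard shell environment-variable behavior. A minimal sketch (using `sh -c 'echo …'` as a stand-in for `llamafactory-cli webui`, and a hypothetical `DEMO_ONLY` variable) shows the difference between `export`ing a variable for the whole session and setting it inline for a single command:

```shell
# Exported variables persist for all subsequent commands in this shell
export GRADIO_SERVER_PORT=7860
sh -c 'echo "exported: $GRADIO_SERVER_PORT"'   # prints: exported: 7860

# Inline assignments apply only to that one command (POSIX behavior)
DEMO_ONLY=1 sh -c 'echo "inline: $DEMO_ONLY"'  # prints: inline: 1
echo "after: ${DEMO_ONLY:-unset}"              # prints: after: unset
```

Either form works for configuring the GUI; `export` is convenient when you restart the server repeatedly, while inline assignment keeps the variables scoped to a single launch.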
<details><summary>For Alibaba Cloud users</summary>
If you encounter display problems in LLaMA Board on Alibaba Cloud, try setting the following environment variables before starting LLaMA Board:
@@ -392,7 +393,8 @@ docker compose -f ./docker-compose.yml up -d
See [examples/README.md](examples/README.md) for usage.
> [!TIP]
> Use `llamafactory-cli train -h` to display argument descriptions.
### Deploy with OpenAI-style API and vLLM