1011 Commits
v0.8.3 ... main

Author SHA1 Message Date
Pory
10a446e373 [model] ktransformers qwen3 support (#9485)
Co-authored-by: unknown <xiongchenhui@hisense.ad>
2025-11-13 20:09:44 +08:00
jiaqiw09
0aa4a051af [test] support slow skip and device skip in UTs (#9484) 2025-11-13 20:08:22 +08:00
Yaowei Zheng
8173a88a26 [assets] update readme (#9477) 2025-11-12 16:15:41 +08:00
Kingsley
fef86fa7fe [data] fix qwen3omni audio length calculation (#9467) 2025-11-12 10:37:15 +08:00
taohongsheng
5afa851f71 [misc] Modify pip install command for huggingface_hub (#9463) 2025-11-10 23:04:00 +08:00
MyungHa Kwon
a711bce664 [data] add openai format (#9449) 2025-11-06 20:10:20 +08:00
魅影
bd24350cbf [v1] add pair data converter (#9360)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-11-06 14:05:58 +08:00
Peilin Li
bd30c0003b [train] fix denominator of ga in ksft loss (#9409) 2025-11-05 20:53:23 +08:00
魅影
8edd2622ce [docker] update npu dockerfile (#9407)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-11-05 18:28:32 +08:00
Yaowei Zheng
eaf963f67f [model] update kt code (#9406) 2025-11-05 15:27:22 +08:00
Kingsley
56f45e826f [train] fix MPO re-weight (#9405) 2025-11-04 21:10:41 +08:00
魅影
14abb75126 [model] enable using FA on NPU (#9397)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-11-04 19:32:30 +08:00
한송민
5a9939050e [model] add deepstack_merger_list to Qwen3-VL vision_model_keys (#9399) 2025-11-04 19:27:34 +08:00
Peilin Li
934b3084ee [train] KTransformers SFT as backend engine for LLaMA-Factory (#9400)
Co-authored-by: jimmy128 <jimmy128@noreply.gitcode.com>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-11-04 15:54:12 +08:00
Yaowei Zheng
3ae15da9c0 [misc] lint code (#9395) 2025-11-03 22:08:59 +08:00
魅影
215580c77d [data] fix mm plugin for qwen omni video training (#9388)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-11-03 11:44:27 +08:00
魅影
767b344fb4 [model] remove npu sdpa patch (#9368)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-10-30 16:26:35 +08:00
Kingsley
3057db15c3 [readme] update mcore readme (#9352) 2025-10-27 21:23:31 +08:00
Kingsley
13170577b2 [feat] support megatron-LM training by mcore_adapter (#9237)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-10-26 16:21:30 +08:00
Xiaosu Zhu
129e918106 [data] Fix Qwen3VL plugin (#9297)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
Co-authored-by: kingsley <kingsleydodonow@gmail.com>
2025-10-26 16:07:04 +08:00
Yaowei Zheng
9c0d033a15 [model] add qwen3vl 2b & 32b (#9343) 2025-10-24 13:22:36 +08:00
Yaowei Zheng
2a822178de [deps] fix yanked packages (#9333) 2025-10-22 20:54:51 +08:00
Kingsley
b842457ef4 [ci] revert mac os ci setup (#9316) 2025-10-21 18:26:12 +08:00
魅影
2c6aded5d4 [v1] kernel plugin (#9274)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-10-18 18:02:14 +08:00
Yaowei Zheng
d9d67ba62d [misc] fix import error (#9299) 2025-10-17 17:46:27 +08:00
Yaowei Zheng
a442fa90ad [misc] fix import error (#9296) 2025-10-17 10:54:30 +08:00
wyfdgg
8c341cbaae [model] support hunyuan-mt model (#9284)
Co-authored-by: wyfdgg <liwenkun0812@163.com>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-10-17 10:33:09 +08:00
Yaowei Zheng
47a7dc1698 [deps] upgrade vllm (#9293) 2025-10-16 23:20:26 +08:00
Yaowei Zheng
1037f63311 [model] add qwen3vl 4b + 8b (#9275) 2025-10-15 15:00:36 +08:00
Ximing Xing
c867e28093 [model] adds semantic initialization support for special tokens (#9267)
Co-authored-by: ximingxing <ximingxing@tencent.com>
2025-10-14 17:00:48 +08:00
Peter-Hamster
3dbca4b533 [data] add new reason tool calls demo data (#9249)
Co-authored-by: unknown <Peter Zeng>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-10-13 17:16:47 +08:00
Yaowei Zheng
9d1acbc191 [ci] fix ci (#9265) 2025-10-13 16:24:40 +08:00
Yaowei Zheng
52e46e162e [v1] add data converter (#9263) 2025-10-13 15:54:47 +08:00
Jiayi Mao
48974783da [model] add ernie4_5_moe support for DeepSpeed Zero3 training (#9262) 2025-10-13 13:13:31 +08:00
Yaowei Zheng
575e4099df [misc] add qwen bench script (#9259)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-13 11:45:25 +08:00
Yaowei Zheng
9687b71d3a [v1] init data plugins (#9248)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-09 22:36:48 +08:00
Yaowei Zheng
1c35db60d6 [v1] support read dataset (#9243) 2025-10-09 17:16:33 +08:00
Yaowei Zheng
10146029ba [v1] add v1 launcher (#9236)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-07 22:34:48 +08:00
Wu Wenhao
95b7188090 Merge commit from fork
* fix LFI and SSRF

* move utils to common

---------

Co-authored-by: d3do <chamlinx@outlook.com>
Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
2025-10-07 20:55:29 +08:00
Yaowei Zheng
d5bb4e6394 [assets] update readme (#9232) 2025-10-05 16:42:19 +08:00
Yaowei Zheng
3fe6f0febd [ci] update docker workflow (#9231) 2025-10-05 02:50:55 +08:00
Yaowei Zheng
40d3691e9e [misc] fix moe models (#9230) 2025-10-05 02:41:02 +08:00
Yaowei Zheng
af8437095a [ci] Change macOS version (#9229) 2025-10-05 02:18:30 +08:00
codingma
2e2f92701f [model] add qwen3-vl-30b (#9227) 2025-10-04 14:12:37 +08:00
Yaowei Zheng
7d60b840ef [v1] support switch v1 backend (#9226) 2025-10-02 15:59:19 +08:00
Yaowei Zheng
1d96c62df2 [v1] add v1 folders (#9225) 2025-10-02 15:25:57 +08:00
Yaowei Zheng
a0d44c650a [misc] add data files (#9224) 2025-10-02 14:02:07 +08:00
Yaowei Zheng
bcc2c1fd8f [misc] move wechat out (#9223) 2025-10-02 02:06:09 +08:00
Yaowei Zheng
7dd910f067 [misc] lint (#9221) 2025-10-01 22:58:58 +08:00
krli
d10d65e4ce [docker] update Dockerfile to set no_proxy and fix pydantic version (#8651) 2025-10-01 14:33:47 +08:00
Ben Feuer
1c44b60e3e [feat] fp8 training (#8960)
Co-authored-by: Benjamin Feuer <penfever@gmail.com>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-10-01 14:32:53 +08:00
Yaowei Zheng
e2b1594d31 [data] fix reasoning template (#9219) 2025-09-30 18:11:45 +08:00
h7878778h
09dedf144f [npu] Redirect SDPA to torch_npu.npu_fusion_attention (opt-in, ZeRO-3 safe, no impact off NPU) (#8972) 2025-09-30 18:11:31 +08:00
魅影
a04d777d7f [cli] support lazy import (#9217)
Co-authored-by: frozenleaves <frozen@Mac.local>
2025-09-30 18:02:26 +08:00
Yaowei Zheng
6ffebe5ff7 [data] fix qwen omni plugin (#9204)
Co-authored-by: kingsley <kingsleydodonow@gmail.com>
2025-09-28 01:02:29 +08:00
xvxuopop
0761a4448f [model] add qwen3-vl/qwen3-omni (#9196)
Co-authored-by: kingsley <kingsleydodonow@gmail.com>
2025-09-27 01:21:47 +08:00
wangshaofei
abc3b1e1c4 [docs] update ling-v2 to the readme (#9188) 2025-09-24 15:23:21 +08:00
Hertz
344c760cc1 [model] supported ERNIE4.5 Text Models (#9165)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-09-22 11:48:26 +08:00
Yaowei Zheng
80fe3a172d [model] add dots ocr (#9176) 2025-09-21 23:34:19 +08:00
Yaowei Zheng
800934b507 [assets] update readme (#9143) 2025-09-16 17:04:19 +08:00
Yaowei Zheng
e2ba32598d [assets] update readme (#9137) 2025-09-15 23:45:57 +08:00
Yaowei Zheng
812720909e [model] add qwen3 next (#9130) 2025-09-14 03:16:25 +08:00
Yaowei Zheng
260b5625c3 [assets] update wechat (#9129) 2025-09-14 03:05:08 +08:00
Yaowei Zheng
52488ac974 [deps] upgrade transformers to 4.56.1 (#9128) 2025-09-14 02:26:39 +08:00
Kingsley
610a3f1094 [data] Fix qwen_2vl with valuehead (#9078) 2025-09-14 02:22:20 +08:00
wangshaofei
a22dab97fd [data] bailing template v2 & openai data converter (#9112) 2025-09-14 02:20:34 +08:00
Yaowei Zheng
db223e3975 [misc] update readme (#9071) 2025-09-03 17:22:54 +08:00
Kingsley
7e710c6d3e [misc] update InternVL constants (#9046) 2025-08-29 13:30:28 +08:00
Kingsley
185f0556d4 [model] support Internvl3_5 (#9028) 2025-08-28 17:12:00 +08:00
Kingsley
1c675522fd [data] Fix glm4v mmplugin when not expand (#9029) 2025-08-27 11:45:24 +08:00
tc-mb
6c777375b7 [model] Support MiniCPM-V 4.5 (#9022) 2025-08-26 19:27:28 +08:00
Kingsley
9c433f6b41 [model] fix kimivl (#9018) 2025-08-25 16:32:23 +08:00
Yaowei Zheng
ec41ef08aa [misc] fix constants (#9008) 2025-08-23 23:04:30 +08:00
Kingsley
0ab0be9df2 [model] support Seed-OSS (#8992)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-08-23 22:38:24 +08:00
Zeju Qiu
c14a5fefee [assets] fix readme error with OFT (#9000)
Co-authored-by: Zeju <zqiu@g003.internal.cluster.is.localnet>
2025-08-23 21:32:19 +08:00
Haian Huang(深度眸)
1664657d80 [model] Support Intern-S1-mini (#8976) 2025-08-20 23:52:51 +08:00
Kingsley
022a326ca4 [misc] update glm4v ligerkernel (#8978) 2025-08-20 23:39:56 +08:00
Yaowei Zheng
c1e1f24f5f [misc] fix gemma template (#8970) 2025-08-20 03:18:18 +08:00
Yaowei Zheng
2c31279316 [assets] update wechat (#8962) 2025-08-19 02:55:09 +08:00
Zeju Qiu
003a2acb1a [feature] adding orthogonal finetuning (OFT) to llama factory (#8623)
Co-authored-by: Zeju <zqiu@g003.internal.cluster.is.localnet>
Co-authored-by: Zeju <zqiu@login2.is.localnet>
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-08-18 18:22:47 +08:00
XLXW
1ada15981a [feature] add support for dft loss (#8917) 2025-08-15 23:29:57 +08:00
Kingsley
936f4fd78e [feature] Support MPO (#8930) 2025-08-15 15:09:59 +08:00
Kingsley
41648020db [data] Fix MiniCPMO default system prompt (#8906) 2025-08-13 23:05:20 +08:00
Yaowei Zheng
b8272a874b [assets] update wechat & readme (#8899) 2025-08-12 22:45:55 +08:00
Yaowei Zheng
e695fdfa70 [model] add qwen3 nothink (#8869) 2025-08-11 23:17:32 +08:00
Kingsley
893edb26d0 [model] support GLM4.5V (#8876) 2025-08-11 21:45:14 +08:00
Yaowei Zheng
dc61e78e77 [hparams] fix data args (#8863) 2025-08-08 15:35:50 +08:00
golangboy
ef507ae8e0 [file] Resolve file lock issue when deleting safetensors on Windows (#8839) 2025-08-08 14:59:54 +08:00
Kingsley
c244b1edb9 [model] add qwen3-4b-i/t (#8851) 2025-08-07 19:39:26 +08:00
kahlun
8a5d6c8a74 [data-loader] Allow dataset_dir to accept a dict for in-memory dataset_info (#8845) 2025-08-07 16:26:59 +08:00
Yaowei Zheng
b523543994 [data] fix template (#8827) 2025-08-06 06:58:09 +08:00
Yaowei Zheng
4dfad24902 [model] add gpt oss (#8826) 2025-08-06 05:56:46 +08:00
davidlightmysterion
c709c0378d [train] fix adjusting logits size after adding special tokens (#8823) 2025-08-05 20:35:07 +08:00
Yaowei Zheng
b5071f4b2c [assets] update readme & supporters (#8818) 2025-08-05 11:00:53 +08:00
Dennis Huang
4a9ca24122 [model] Support MiniCPM-V 4.0 (#8813) 2025-08-04 12:57:21 +08:00
Yaowei Zheng
cb03eb422d [model] Qwen3-30B-A3B-Thinking-2507 (#8788) 2025-07-30 23:05:12 +08:00
Yaowei Zheng
a416ab48d8 [deps] upgrade vllm to 0.10.0 (#8787) 2025-07-30 22:26:38 +08:00
Yaowei Zheng
d64651a637 [assets] update readme (#8784) 2025-07-30 17:57:17 +08:00
Yaowei Zheng
7c223c432b [model] add qwen3 2507 model (#8783) 2025-07-30 17:19:19 +08:00
Kingsley
52882d01c3 [model] support keye-vl-8b (#8776) 2025-07-29 21:24:08 +08:00
Kingsley
4e0bf35eb4 [model] update glm4.5 (#8770) 2025-07-29 19:57:29 +08:00
Yaowei Zheng
8efa506c16 [model] add qwen3 2507 models (#8750) 2025-07-25 20:21:47 +08:00
Kingsley
d6767f355a [model] add glm4moe (#8689) 2025-07-25 19:53:45 +08:00
Yaowei Zheng
c6e2871944 [assets] update readme (#8739) 2025-07-24 23:13:24 +08:00
Steven sun
9d6565d1a8 [model] support granite4 (#8680) 2025-07-21 14:15:36 +08:00
Kingsley
1639e4b587 [data] fix gemma3n mmplugin (#8627) 2025-07-14 13:32:57 +08:00
Kingsley
9c9b307d33 [model] add Devstral-Small-2507 (#8614) 2025-07-11 18:59:53 +08:00
Yaowei Zheng
cf13964c4c [webui] fix elems (#8587) 2025-07-08 21:16:42 +08:00
Kingsley
542fa97a72 [data] support glm4.1v video training (#8571) 2025-07-08 16:29:44 +08:00
Yaowei Zheng
7f8e5f52f9 [webui] fix abort finish (#8569) 2025-07-07 23:07:46 +08:00
Yaowei Zheng
12ed792db9 [webui] support other hub (#8567) 2025-07-07 22:18:48 +08:00
Yaowei Zheng
4b0ec83928 [deps] bump transformers to 4.49.0 (#8564) 2025-07-07 20:31:50 +08:00
Redwood-Digital
1b1ec9bfb6 [docs] add nvidia-container-toolkit to Linux Docker setup instructions (#8557) 2025-07-06 19:37:08 +08:00
Vivek Iyer
e0dfdb7dbb Revert "[model] add lora dropout to unsloth" - requested feature already exists (#8554)
Co-authored-by: viyer <vivek_iyer2@apple.com>
2025-07-05 11:25:31 +08:00
wjunLu
17ab40793b [ci] Add workflow for building NPU image (#8546)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-07-04 20:56:59 +08:00
Vivek Iyer
0686206020 [model] add lora dropout to unsloth (#8548)
Co-authored-by: viyer <vivek_iyer2@apple.com>
2025-07-04 14:56:36 +08:00
Ze-Yi LIN
16f13d304b [tracking] fix swanlab hparams (#8532)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-07-02 22:08:44 +08:00
Yaowei Zheng
57524751e0 [assets] update issue template (#8530) 2025-07-02 19:29:58 +08:00
Yaowei Zheng
906b31fd47 [assets] update readme (#8529) 2025-07-02 17:42:27 +08:00
Kingsley
bede213da7 [assets] update readme (#8519) 2025-07-02 15:38:38 +08:00
Kingsley
e9f70daabe [model] add gemma3n (#8509) 2025-07-01 22:37:24 +08:00
Yaowei Zheng
cbb65567a9 [assets] update wechat (#8517) 2025-07-01 21:23:48 +08:00
Injae Ryou
a5a93597b1 [parser] update config loading to use OmegaConf #7793 (#8505) 2025-07-01 21:05:13 +08:00
Kingsley
d17a672251 [model] add GLM-4.1V (#8462) 2025-06-30 01:09:41 +08:00
Liu Jiajun
4f0da0aec9 [data] fix gemma2 eos token (#8480)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-06-27 18:19:15 +08:00
Yaowei Zheng
2c26ce6ac4 Merge commit from fork 2025-06-26 13:55:42 +08:00
Yaowei Zheng
abc6ce6168 [assets] update readme (#8461) 2025-06-25 22:15:03 +08:00
Yaowei Zheng
4407231a3b [webui] upgrade webui and fix api (#8460) 2025-06-25 21:59:58 +08:00
Yaowei Zheng
f276b9a963 [model] do not force load processor (#8457) 2025-06-25 19:43:00 +08:00
Yaowei Zheng
48897e5b16 [data] fix audio reader (#8448) 2025-06-24 20:53:20 +08:00
Yaowei Zheng
9cd81aa424 [misc] fix ci (#8441) 2025-06-24 16:37:32 +08:00
Kingsley
ecbccb4c5d [model] Add mistral-small 3.2 & kimi-dev (#8433) 2025-06-24 14:59:47 +08:00
Yaowei Zheng
9af7915f7b [model] add kimi vl 2506 (#8432) 2025-06-23 17:56:48 +08:00
Vivek Iyer
7b252b2368 [model] unsloth resume from checkpoint bug (#8423)
Co-authored-by: viyer <vivek_iyer2@apple.com>
2025-06-23 16:43:54 +08:00
Dhia Eddine Rhaiem
88a92be808 [model] add support for Falcon H1 (#8403) 2025-06-18 16:51:23 +08:00
Yaowei Zheng
c6c764388c [assets] update readme (#8396) 2025-06-17 16:15:20 +08:00
Yaowei Zheng
3e392473d1 [misc] set dev version (#8389) 2025-06-17 01:39:25 +08:00
Yaowei Zheng
ad345ec054 [ci] add docker version (#8390) 2025-06-17 01:37:47 +08:00
Yaowei Zheng
ca75f1edf3 [model] fix vlm utils (#8388) 2025-06-17 01:08:49 +08:00
Yaowei Zheng
3a3bae1cfe [data] fix qwen2vl pos ids (#8387) 2025-06-17 00:48:54 +08:00
Yaowei Zheng
31874e4f62 [version] release v0.9.3 (#8386) 2025-06-16 19:21:32 +08:00
Yaowei Zheng
9a2d1dec62 [assets] update wechat (#8385) 2025-06-16 18:23:22 +08:00
Aman Gupta
8e4ac78607 [trainer] Add LD-DPO objective (#8362) 2025-06-12 16:10:38 +08:00
Yaowei Zheng
44f1b9b5ad [misc] tiny fixes (#8348) 2025-06-10 15:30:58 +08:00
阿丹(adan)
b41697c9b6 [model] support MiniCPM4 (#8314) 2025-06-10 14:38:39 +08:00
Kingsley
31bca4d172 [model] support Mistral3.1 small 2503 (#8335) 2025-06-09 10:37:42 +08:00
Chenhao Zhang
fa4360dca7 [assets] Add awesome works used LLaMA-Factory (#8333) 2025-06-09 10:21:17 +08:00
Yaowei Zheng
9acab4949d [model] fix model generate (#8327) 2025-06-07 08:47:50 +08:00
Vivek Iyer
32b4574094 [model] pushing FFT with unsloth (#8325)
Co-authored-by: viyer <vivek_iyer2@apple.com>
2025-06-07 08:20:58 +08:00
Yaowei Zheng
03a93ec513 [data] fix empty template (#8312) 2025-06-06 13:50:50 +08:00
Yaowei Zheng
bcb6b94658 [setup] fix uv (#8311) 2025-06-06 11:54:15 +08:00
Yaowei Zheng
c0710be6d7 [assets] update readme (#8303) 2025-06-05 23:23:15 +08:00
Kingsley
212a8006dc [tests] add visual model save test (#8248)
Co-authored-by: Yaowei Zheng <hiyouga@buaa.edu.cn>
2025-06-05 20:38:01 +08:00
Yaowei Zheng
ed70f8d5a2 [assets] fix npu docker (#8298) 2025-06-05 19:09:20 +08:00
Butui Hu
1a33d65a56 [launcher] Add elastic and fault-tolerant training support (#8286)
Signed-off-by: Butui Hu <hot123tea123@gmail.com>
2025-06-05 16:40:03 +08:00
Kingsley
69c9e379d5 [script] add Script description for qwen_omni_merge (#8293) 2025-06-05 13:22:01 +08:00
Yaowei Zheng
e9fe9cee29 [assets] update docker files (#8291) 2025-06-04 23:30:46 +08:00
Yaowei Zheng
cb7ab69783 [assets] update readme (#8288) 2025-06-04 17:46:12 +08:00
Yaowei Zheng
c1ed76e109 [assets] add icon (#8276) 2025-06-03 20:36:21 +08:00
Kingsley
c224d17cb2 [data] support nested images input for videos (#8264) 2025-06-03 20:26:29 +08:00
Ze-Yi LIN
c4e51d40e0 [tracking] swanlab add llamafactory tag (#8258) 2025-06-03 18:42:29 +08:00
Kingsley
554e89ff02 [model] add MIMO_VL (#8249) 2025-06-01 03:54:54 +08:00
Yaowei Zheng
fee2122f09 [deps] upgrade transformers to 4.52.4 (#8245) 2025-05-31 16:51:40 +08:00
Akshat Sehgal
c7e63bead7 [model] add smollm2 support (#8220) 2025-05-31 16:29:01 +08:00
hoshi-hiyouga
3e1a7fcb9c [assets] update readme (#8235) 2025-05-30 16:52:12 +08:00
Kingsley
2aaede8ef4 [scripts] specify model class for qwen_omni merge (#8227) 2025-05-30 14:20:12 +08:00
hoshi-hiyouga
42bebc341d [model] add deepseek 0528 models (#8215) 2025-05-29 21:37:07 +08:00
hoshi-hiyouga
83a9ff5853 [assets] fix docker images (#8203) 2025-05-28 22:26:05 +08:00
yzoaim
519bab86e6 [workflow] auto push docker images (#8181)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-05-28 20:21:15 +08:00
hoshi-hiyouga
dbc9f5a5d9 [assets] update Dockerfile (#8201) 2025-05-28 20:20:59 +08:00
hoshi-hiyouga
9b152d9cb5 [webui] fix skip args (#8195) 2025-05-28 18:11:07 +08:00
Youngwoo Kim
6c3cd400b5 [data] Reading files from cloud is broken (#8182) (#8183) 2025-05-28 15:50:44 +08:00
hoshi-hiyouga
4d3ffa2ec4 [assets] fix docker image (#8180) 2025-05-27 19:01:31 +08:00
hoshi-hiyouga
2bf8e993ab [data] fix shared file system (#8179) 2025-05-27 18:36:03 +08:00
hoshi-hiyouga
d4a413eb37 [webui] add extra args to export (#8178) 2025-05-27 18:25:31 +08:00
hoshi-hiyouga
00974a3169 [assets] update docker files (#8176) 2025-05-27 18:15:23 +08:00
hoshi-hiyouga
46ccf84aaa [webui] add infer extra args (#8167) 2025-05-27 12:04:00 +08:00
hoshi-hiyouga
07343ca83d [webui] fix input args (#8162) 2025-05-27 02:05:54 +08:00
hoshi-hiyouga
3c7dc66a92 [model] add smollm2 and medgemma (#8161) 2025-05-26 23:19:58 +08:00
hoshi-hiyouga
ba032828e2 [deps] upgrade transformers (#8159) 2025-05-26 22:03:58 +08:00
Akshat Sehgal
501e7d8a8f feat: add smollm support (#8050) 2025-05-26 19:47:54 +08:00
wangzhan
12292e4283 [api] support repetition_penalty and align presence_penalty with OpenAI Client (#7958) 2025-05-26 18:45:11 +08:00
Kingsley
f08b748199 [data] fix internvl plugin when using PIL images (#8129) 2025-05-22 01:32:59 +08:00
hoshi-hiyouga
d2a3036a23 [misc] update data readme (#8128) 2025-05-21 22:41:18 +08:00
hoshi-hiyouga
9ae17cd173 [deps] update to transformers 4.52 (#8125) 2025-05-21 05:16:18 +08:00
hoshi-hiyouga
56926d76f9 [data] llama3 multi tool support (#8124) 2025-05-21 02:01:12 +08:00
hoshi-hiyouga
c2f6f2fa77 [assets] update readme (#8110) 2025-05-20 02:44:18 +08:00
hoshi-hiyouga
9b5baa97f0 [data] qwen3 fixes (#8109) 2025-05-20 02:00:30 +08:00
hoshi-hiyouga
45030ff803 [model] switch to gptqmodel (#8108) 2025-05-19 22:25:40 +08:00
piamo
bc7f00f2c7 [model] update rope kwargs for yarn (#8101) 2025-05-19 20:07:54 +08:00
hoshi-hiyouga
beae231af6 [doc] add no build isolation (#8103) 2025-05-19 19:25:13 +08:00
Ma, Xiaochen
a0b4b91577 [trainer] fix KeyError at end of pretrain (#8099) 2025-05-19 18:01:26 +08:00
Biao Wang
90492f3582 [misc] fix cli (#8095)
Co-authored-by: wangbiao11 <wangbiao11@baidu.com>
2025-05-19 17:59:39 +08:00
Saiya
ab41f7956c [infer] support lora adapter for SGLang backend (#8067) 2025-05-16 23:33:47 +08:00
Kingsley
52b23f9e56 [data] add forward compatibility for video_utils in Transformers 4.52.0 (#8077) 2025-05-16 17:41:04 +08:00
Eric Tang
a9aa392ba4 [data] support loading folder from remote (#8078) 2025-05-16 15:35:38 +08:00
Shawn Tao
0b773234e5 [infer] Modify vllm_infer.py to batch preprocessing to avoid 'too many open files' errors (#8051)
Co-authored-by: Kingsley <82590017+Kuangdd01@users.noreply.github.com>
2025-05-15 10:54:35 +08:00
hoshi-hiyouga
712c57f3b4 [assets] update windows installation (#8042) 2025-05-13 17:01:56 +08:00
hoshi-hiyouga
dc080399c6 [model] add seed coder and qwen3 quant models (#8039) 2025-05-13 15:59:55 +08:00
hoshi-hiyouga
68fc068cab [data] fix kimi vl template (#8015) 2025-05-11 20:45:19 +08:00
Kingsley
9620825892 [scripts] add video params for vllm infer (#7992) 2025-05-09 21:16:52 +08:00
yunhao-tech
26cbb03a5f [data] Avoid repetitive tool description wrap (#8000)
Co-authored-by: chenyunhao <chenyunhao@wps.cn>
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-05-09 21:16:37 +08:00
tpoisonooo
5f4b793e04 [docs] add GraphGen (#7974) 2025-05-07 12:23:11 +02:00
hoshi-hiyouga
994ab6424a [misc] update liger kernel patch (#7966) 2025-05-06 20:32:16 +02:00
hoshi-hiyouga
aa9ed4db59 [example] update examples (#7964) 2025-05-06 17:24:25 +02:00
Kingsley
ef86a53063 [model] add mimo7b (#7946) 2025-05-06 17:10:30 +02:00
hoshi-hiyouga
bf0286e1e3 [misc] fix qwen2 omni (#7962) 2025-05-06 15:39:13 +02:00
hoshi-hiyouga
ce7032e1b3 [model] add qwen2 omni 3b (#7945) 2025-05-03 16:36:51 +08:00
Eric Chen
5763017cea [assets] Warp Support README Update (#7887) 2025-05-02 00:08:48 +08:00
hoshi-hiyouga
13b05e74f1 [hparam] add enable think argument (#7928) 2025-04-30 17:21:30 +08:00
hoshi-hiyouga
c566e39b7d [data] fix base plugin (#7924) 2025-04-30 16:28:05 +08:00
hoshi-hiyouga
052ca871bd [data] optimize qwen3 loss computation (#7923) 2025-04-30 16:18:00 +08:00
hoshi-hiyouga
73198a6645 [misc] fix uv (#7913) 2025-04-30 07:45:03 +08:00
hoshi-hiyouga
d4ee44bdef [data] add eval_on_each_dataset arg (#7912) 2025-04-30 06:56:43 +08:00
hoshi-hiyouga
6d2cde43e7 [data] replace eos token for base models (#7911) 2025-04-30 06:52:28 +08:00
hoshi-hiyouga
11295cdea0 [data] improve mm plugin (#7910) 2025-04-30 06:34:28 +08:00
hoshi-hiyouga
98f23c6584 [model] add qwen3 (#7885) 2025-04-29 09:34:05 +08:00
Kingsley
db9559456c [data] fix qwen2.5 omni template (#7883) 2025-04-29 00:58:23 +08:00
hoshi-hiyouga
3ae5da2a04 [model] fix dsv3 leaf node (#7879) 2025-04-28 18:11:09 +08:00
hoshi-hiyouga
d173cb50f5 [data] fix qwen2 omni plugin (#7875) 2025-04-28 14:22:41 +08:00
zhaop-l
df27d7e48a [trainer] make projector trainable in freeze training (#7872)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-28 13:19:37 +08:00
hoshi-hiyouga
bb5b83352b [data] fix minicpmo vllm infer (#7870) 2025-04-28 01:59:53 +08:00
Kingsley
1157f4e246 fix attn patch for kimivl (#7867) 2025-04-27 23:12:28 +08:00
Eric Tang
ef03832cd4 [ray] add storage filesystem to ray config (#7854) 2025-04-27 22:12:40 +08:00
hoshi-hiyouga
2233b739fa [model] fix vit gradient checkpointing (#7830) 2025-04-23 22:48:48 +08:00
hoshi-hiyouga
091d2539e8 Merge commit from fork 2025-04-23 16:38:27 +08:00
hoshi-hiyouga
c1a7f2ebb2 [model] fix moe zero3 (#7826) 2025-04-23 15:30:49 +08:00
Kingsley
fa0eb91f1f [data] fix internvl plugin (#7817) 2025-04-23 00:58:22 +08:00
hoshi-hiyouga
49f9ed0232 [assets] update model readme (#7804) 2025-04-22 16:43:56 +08:00
Kingsley
2a564c25d1 [model] add arch check for InternVL (#7803) 2025-04-22 16:38:05 +08:00
Kingsley
7500e761d3 [misc] update internvl constants (#7801) 2025-04-22 15:53:08 +08:00
hoshi-hiyouga
fddcd43c88 [trainer] support early stop (#7797) 2025-04-22 01:59:33 +08:00
hoshi-hiyouga
0e4ce039ee [data] improve mmplugin (#7795) 2025-04-22 01:25:33 +08:00
hoshi-hiyouga
b07628dea5 [example] add bash usage (#7794) 2025-04-22 00:25:51 +08:00
Juanxi Tian
12ada72ed4 [trainer] Add Muon Optimizer (#7749)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-21 23:38:37 +08:00
hoshi-hiyouga
416853dd25 [parser] support omegaconf (#7793) 2025-04-21 23:30:30 +08:00
Changrui Chen
bd7bc31c79 [data] Fix wrong position ids with packed attention masks (#7754)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-21 23:19:36 +08:00
flashJd
0ac641326b [misc] fix adding new tokens (#7253)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-21 23:19:02 +08:00
ddddng
c5ba9106ec [model] fix gemma3 export (#7786)
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-21 23:07:11 +08:00
Sachin Beldona
3b2d3794a5 [misc] fix bug in constant (#7765)
Co-authored-by: Sachin Beldona <sbeldona@cs.cmu.edu>
2025-04-21 23:06:31 +08:00
hoshi-hiyouga
b605c20768 [assets] update wechat (#7792) 2025-04-21 21:29:42 +08:00
hoshi-hiyouga
39169986ef [trainer] fix pt loss (#7748)
* fix pt loss

* robust

* fix

* test
2025-04-17 03:15:35 +08:00
hoshi-hiyouga
86ebb219d6 [breaking] bump transformers to 4.45.0 & improve ci (#7746)
* update ci

* fix

* fix

* fix

* fix

* fix
2025-04-17 02:36:48 +08:00
hoshi-hiyouga
d222f63cb7 [infer] set env for vllm ascend (#7745) 2025-04-17 01:08:55 +08:00
Kingsley
2e518f255f [model] support intern-VL 2.5-3 series (#7258)
* add internvl and rebase

* fix for internvl2&3

* remove lines

* fix video_inputs & lint

* nit

* add constants

* remove lines

* fix

* fix error

* pass ci

* pass ci

* skip internvl & nit
2025-04-17 00:31:30 +08:00
ENg-122
8f88a4e6a4 [misc] improve entrypoint (#7345)
* Purely a cleanup of the entrypoint code, since there were too many if/else branches

* Update cli.py

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-16 21:48:23 +08:00
leo-pony
b9263ff5ac [infer] support vllm-ascend (#7739) 2025-04-16 20:06:47 +08:00
hoshi-hiyouga
ee2ab093a7 [api] fix chat messages (#7732) 2025-04-15 16:39:08 +08:00
hoshi-hiyouga
3df021d4d7 [deps] upgrade vllm (#7728) 2025-04-15 14:57:40 +08:00
Joe Schoonover
e252abf051 [docker] patch docker-rocm (#7725)
* Update Dockerfile

* Fix typo

* Fix syntax for /bin/sh conditional

* Add build args to docker-compose

* Change shell to /bin/bash

This is required for "==" syntax in conditional string comparison
2025-04-15 13:36:39 +08:00
hoshi-hiyouga
1134baeedd [assets] update model readme (#7724) 2025-04-15 00:41:09 +08:00
Kingsley
2101399c94 [model] Support Kimi_VL thinking/instruct (#7719)
* add kimi_vl

* patch config

* check version

* Update mm_plugin.py

* Update mm_plugin.py

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-15 00:21:58 +08:00
hoshi-hiyouga
3f91a95250 [misc] fix env vars (#7715) 2025-04-14 16:04:04 +08:00
hoshi-hiyouga
7c61b35106 [misc] upgrade cli (#7714) 2025-04-14 15:41:22 +08:00
hoshi-hiyouga
f518bfba5b [deps] upgrade transformers (#7704) 2025-04-13 18:11:34 +08:00
Yuxuan Zhang
8162f94db5 [model] add GLM-4-0414 (#7695)
* Update README_zh.md

* update
2025-04-13 17:10:45 +08:00
hoshi-hiyouga
1f0c52b73c [deps] fix uv conflicts (#7686)
* fix #7678

* Update setup.py

* Update tests.yml

* Update publish.yml

* Update Makefile
2025-04-11 18:02:24 +08:00
Eric Tang
a8caf09c7f [data] support for specifying a dataset in cloud storage (#7567)
* add support for loading datasets from s3/gcs

* add comments to readme

* run linter and address comments

* add option to pass in kwargs to ray init (i.e. runtime env)

* address comment

* revert mixed up changes
2025-04-10 11:31:35 +08:00
Eric Tang
bb8d79bae2 [ray] allow for specifying ray.init kwargs (i.e. runtime_env) (#7647)
* ray init kwargs

* Update trainer_utils.py

* fix ray args

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-10 11:31:05 +08:00
Dain Kim
1c436c9f25 [bugfix] enable_gemma_liger_kernel (#7660)
- The `enable_liger_kernel` function for the Gemma model series was not executed because of the existing `if` statement in the code.
- Changed the line to an `elif` statement so that the `apply_liger_kernel` function runs properly.

resolved: #7628
2025-04-10 11:27:30 +08:00
jilongW
1b0934bccb [misc] fix cuda warn on intel GPU (#7655) 2025-04-09 21:37:54 +08:00
hoshi-hiyouga
4eec541857 [data] add coig-p dataset (#7657) 2025-04-09 21:18:25 +08:00
hoshi-hiyouga
89a4f9ec7f [assets] update readme (#7654) 2025-04-09 18:27:38 +08:00
hoshi-hiyouga
1abd71b551 [assets] update readme (#7644) 2025-04-09 01:06:06 +08:00
Kingsley
349c56c51c [data] Fix bugs of use_audio_in_video in Qwen2.5 Omni (#7638)
* cache _mm_inputs

* nit

* support for use_audio_in_video

* remove cache

* fix data

* Update mllm_video_audio_demo.json
2025-04-08 18:40:10 +08:00
Shawn Tao
acb09fa3a3 [trainer] fix key error (#7635) 2025-04-08 18:39:50 +08:00
Adarsh Shirawalmath
f75b91077b [sglang] support transformers 4.51.0 (#7639) 2025-04-08 18:39:23 +08:00
hoshi-hiyouga
c3c0efbaa0 [misc] fix packing and eval plot (#7623) 2025-04-07 18:20:57 +08:00
hoshi-hiyouga
5115dc8c7f [assets] update readme (#7612) 2025-04-06 13:58:49 +08:00
hoshi-hiyouga
831e7f1cfd [model] add llama4 (#7611) 2025-04-06 13:42:31 +08:00
Kingsley
d4cfa9507e [data] fix qwen2.5 omni plugin (#7578)
* specific entry

* Update mm_plugin.py

* fix fps cal

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-04-02 23:58:39 +08:00
Kingsley
d32c6c014d [data] fix qwen2.5 omni plugin (#7573)
* align key with qwen2vl

* nit && change scripts
2025-04-02 21:28:52 +08:00
gechengze
7b9deb9410 [trainer] fix batch processing in PPO trainer (#7576) 2025-04-02 21:17:48 +08:00
hoshi-hiyouga
5e22597ff1 [infer] vllm video/audio inference (#7566) 2025-04-02 02:27:04 +08:00
hoshi-hiyouga
2bfcad2394 [model] fix kv cache (#7564) 2025-04-01 23:07:46 +08:00
Yu Shi Jie
a13b1bb49a [model] fix use_cache patching for gemma3 multimodal (#7500) 2025-04-01 16:06:48 +08:00
Ritesh Goru
d10467d178 [data] specify position_ids in PackedSupervisedDatasetProcessor for neat_packing (#7318)
* use position_ids for neat_packing with fa2

* revert fa2 changes
2025-04-01 16:03:13 +08:00
taoharry
aac70663fd [webui] fix launch with proxy (#7332) 2025-04-01 15:52:56 +08:00
Billy Cao
00409ff28a [data] shard the dataset to allow multiprocessing when streaming is enabled (#7530)
* Shard the dataset when streaming to allow multiprocessing

* Allow user to not set dataset_shards to ensure backward compatibility
2025-04-01 15:36:23 +08:00
Hao
d70b3b4bc5 [trainer] new kto mismatch pair creation strategy (#7509) 2025-04-01 15:21:53 +08:00
hoshi-hiyouga
e76eba051d [data] fix qwen2.5 omni collator (#7553) 2025-04-01 00:15:12 +08:00
Kingsley
7eed496336 [model] add Qwen2.5-Omni model (#7537)
* preserve image_sizes

* preserve image_sizes

* init plugin

* support audio-text2text lora

* nit

* support image/video-text2text, audio-text2text

* remove args

* remove lines

* add docs && nit

* remove some comments

* fix && add merge part script

* add license
2025-03-31 20:39:35 +08:00
hoshi-hiyouga
0f8296626a [deps] pin pydantic to 2.10.6 (#7546) 2025-03-31 14:42:28 +08:00
Kingsley
8da1d2fa71 [data] fix pixtral plugin (#7505)
* preserve `image_sizes`

* add comments
2025-03-27 17:06:40 +08:00
Xu-pixel
b578a7d5b6 [3rdparty] support swanlab lark notification (#7481) 2025-03-27 01:52:01 +08:00
Kdump
24afceddb7 [trainer] fix wsd scheduler (#7304)
* [trainer] Warmup_stable_decay supports setting the number of stable and decay steps according to warmup_ratio

* Update trainer_utils.py

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-03-26 15:25:02 +08:00
hoshi-hiyouga
0583d06676 [model] add qwen2vl 32b & upgrade peft (#7469)
* add qwen2vl 32b

* fix ci

* upgrade peft to 0.15

* fix ci

* fix ci
2025-03-25 12:15:58 +08:00
GuoCoder
ec6a261568 [model] fix lora on quant models (#7456)
Co-authored-by: root <root@ai>
2025-03-25 11:59:46 +08:00
Xiaosu Zhu
6b3b97c738 [misc] update liger-kernel's monkey patch (#7453)
* Update liger_kernel.py

* Update setup.py
2025-03-25 11:58:52 +08:00
AbdelKarim ELJANDOUBI
6d3748f727 [misc] enable liger kernel for gemma3 text and paligemma (#7466)
* add gemma3 text

* add paligemma (1,2 and 2 mix)
2025-03-25 09:27:43 +08:00
Kenny Lam
7c890170e3 [misc] enable liger kernel for gemma3 (#7462) 2025-03-24 19:09:59 +08:00
hoshi-hiyouga
ca42c0c406 [assets] fix gemma3 readme (#7449) 2025-03-24 10:31:25 +08:00
hoshi-hiyouga
7203365b80 [trainer] fix vlm loss for transformers 4.49 (#7448) 2025-03-24 10:24:05 +08:00
rumichi
3612946dd9 [docker] upgrade to torch 2.6 (#7442) 2025-03-23 21:18:08 +08:00
hoshi-hiyouga
3aa4f32e9c [misc] fix ci (#7441)
* fix ci

* improve ci
2025-03-23 21:09:35 +08:00
hoshi-hiyouga
304796b803 [misc] fix license (#7440) 2025-03-23 19:31:56 +08:00
SnowFox4004
7cfd6e4bb0 [scripts] support compute score on vllm's predictions (#7419)
* enable manual bleu&rouge eval by adding `scripts/eval_bleu_rouge.py`

* added libraries check

* update: use the datasets library's multiprocessing to speed up processing

* update:
- use fire.Fire
- reformat code

* Update eval_bleu_rouge.py: correctly uses fire

Deleted the code that used sys.argv

* Update eval_bleu_rouge.py

---------

Co-authored-by: SnowFox4004 <manba@out>
Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-03-23 19:21:01 +08:00
hoshi-hiyouga
05b19d6952 [deps] upgrade transformers to 4.50.0 (#7437)
* upgrade transformers

* fix hf cache

* fix dpo trainer
2025-03-23 17:44:27 +08:00
hoshi-hiyouga
919415dba9 [deps] upgrade vllm to 0.8 (#7436) 2025-03-23 14:32:22 +08:00
Guo, Quan
a959c2a509 [misc] fix sglang deps (#7432)
* feat: Add transformer version requirement for sglang

* feat: add srt to sglang which is required for running sglang

Other options are srt_hip, srt_xpu, srt_npu, srt_hpu, srt_cpu, for different computation architectures.
2025-03-23 14:07:10 +08:00
Eric Tang
db0a08db6f [3rdparty] fix redundant process group destroy for ray (#7395)
* fix redundant process group destroy for ray

* Update tuner.py

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
2025-03-21 10:56:47 +08:00
hoshi-hiyouga
a306f0f5a2 [version] fix minicpmo (#7378) 2025-03-20 16:59:31 +08:00
hoshi-hiyouga
63752fccf7 [assets] update wechat (#7361) 2025-03-18 21:31:09 +08:00
hoshi-hiyouga
1f9773395b [misc] set dev version (#7351) 2025-03-18 00:10:53 +08:00
hoshi-hiyouga
128b5b12b3 [data] fix template (#7349) 2025-03-17 23:45:20 +08:00
hoshi-hiyouga
d5915a7dd7 [assets] update videos (#7340)
* Update README.md

* Update README_zh.md
2025-03-17 15:48:02 +08:00
Hertz
ec1154662b [model] support hunyuan 7b (#7317)
* [Model] supported tencent-hunyuan model

* [Model] supported tencent-hunyuan model (fix)

* [Model] supported tencent-hunyuan model (fix)
2025-03-15 20:55:24 +08:00
Qiaolin Yu
a44a53ebec [inference] support sglang backend (#7278)
* Mimic SGLang offline Engine

* Add more tests and args

* Pass all current tests

* Clean Code

* fix sample_params

* clean code

* Fix Stream Chat

* change sglang from engine mode to server mode

* fix

* Fix Review Issues

* Use SGLang Built-In Utilities

* Fix test SGLang

* Some Doc Issue

* fix sglang engine

* add readme

---------

Co-authored-by: Jin Pan <jpan236@wisc.edu>
Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
2025-03-15 04:37:58 +08:00
hoshi-hiyouga
93e6184cbe [data] gemma3 plugin pan and scan (#7294)
* gemma3 pan and scan

* add test case

* fix test
2025-03-13 23:29:23 +08:00
hoshi-hiyouga
0be0d7796a [assets] update video (#7287) 2025-03-13 18:45:47 +08:00
Ritesh Goru
480369a9f2 [data] efficient 4d_attention_mask creation in neat_packing (#7272) 2025-03-13 03:31:12 +08:00
hoshi-hiyouga
650a9a9057 [misc] update format (#7277) 2025-03-13 02:53:08 +08:00
hoshi-hiyouga
4b9d8da5a4 [model] support gemma3 (#7273) 2025-03-13 01:35:23 +08:00
hoshi-hiyouga
e6159ad730 [misc] upgrade deps (#7257) 2025-03-12 00:33:47 +08:00
hoshi-hiyouga
264538cb26 [misc] upgrade format to py39 (#7256) 2025-03-12 00:08:41 +08:00
hoshi-hiyouga
5995800bce [ci] update workflow (#7255) 2025-03-11 22:57:49 +08:00
hoshi-hiyouga
bf8b483186 [core] release v0.9.2 (#7254) 2025-03-11 22:42:23 +08:00
hoshi-hiyouga
e2299e261b Merge pull request #7242 from hiyouga/hiyouga/release
[release] release v0.9.2

Former-commit-id: 6b25268990bf225d84e29d4067595cf720fa12d8
2025-03-11 15:28:45 +08:00
hoshi-hiyouga
8a44dce326 Merge pull request #7247 from hiyouga/hiyouga/commit
[misc] support print commit info

Former-commit-id: 0f7ec4f8529a5d7ea2153b881335821038307bb7
2025-03-11 15:28:04 +08:00
hoshi-hiyouga
6d9233833b Merge pull request #7244 from hiyouga/hiyouga/token
[data] avoid exit after saving preprocessed data

Former-commit-id: dcbf01b0035062fa14187e5bdbb925080d349501
2025-03-11 15:17:15 +08:00
hiyouga
d019603835 support commit info
Former-commit-id: a7d89a6dc10579deaf9f45825cc18405a27cade6
2025-03-11 15:13:59 +08:00
hiyouga
478e8194d9 remove exit in preprocess
Former-commit-id: f369b6ef41ffd9586ba568b88c5ff32a1af4bace
2025-03-11 15:08:25 +08:00
hiyouga
1890d3dafe release v0.9.2
Former-commit-id: e7ed1782d4a006400de6fc0f864abd01f7fadeea
2025-03-11 14:49:13 +08:00
hoshi-hiyouga
522a3e8493 [infer] fix vllm args (#7235)
Former-commit-id: 999be5b4512890b8cf4f45874a77e35cf35626f5
2025-03-11 01:15:35 +08:00
Ze-Yi LIN
18968405d0 [tracking] add swanlab_logdir param (#7219)
* feat: add swanlab_logdir param

* fix

Former-commit-id: 9215ad488b6ac6cd57fe8fa4acdacceb63f68ca5
2025-03-11 00:53:07 +08:00
hoshi-hiyouga
71a1c1321a [config] update args (#7231)
Former-commit-id: f71a901840811bf560df671ec63a146ff99140c6
2025-03-10 23:04:43 +08:00
hoshi-hiyouga
cf58a6d860 [config] fix export max len (#7230)
Former-commit-id: 211c0b3e8f3340acd2fae1762d9152a09f19ba34
2025-03-10 16:46:08 +08:00
hoshi-hiyouga
9adc0a2c3f [assets] update readme (#7209)
Former-commit-id: d1631b38dad9ba3d41aebbb00e3500eb79b9e8e9
2025-03-07 17:27:49 +08:00
hoshi-hiyouga
16419b2834 [data] fix loader (#7207)
* fix dataloader

* add test case

* fix type

* fix ci

* fix ci

* fix ci

* disable overwrite cache in ci

Former-commit-id: e84af0e140b1aafd1a6d6fe185a8e41c8fc5f831
2025-03-07 17:20:46 +08:00
hoshi-hiyouga
82a2bac866 [misc] fix ds config (#7205)
Former-commit-id: b478fa1d9de1858075769f86f57126fde92db813
2025-03-07 15:21:28 +08:00
ZhangChuanhui
151ef48b40 [data] fix function formatter (#7201)
Co-authored-by: zhangchuanhui <zhangchal@digitalchina.com>
Former-commit-id: 3efb32b986170d2839e526640f85ba230715879a
2025-03-07 15:17:23 +08:00
hoshi-hiyouga
a255c3a476 [misc] fix cli (#7204)
Former-commit-id: 999f57133ca163c7108d2d5ee8194eca9b2109b4
2025-03-07 15:01:18 +08:00
hoshi-hiyouga
f4ec4fa6ad [script] fix vllm version (#7193)
Former-commit-id: ababdde597b2b9bf0ab3f30f036bc8d97de07f03
2025-03-06 17:14:17 +08:00
hoshi-hiyouga
2635794727 [webui] support escape html (#7190)
Former-commit-id: cf9840374f171359c828b0d6f7a2aa9893c8f701
2025-03-06 16:52:21 +08:00
hoshi-hiyouga
d2f845d70d [deps] upgrade vllm (#7183)
Former-commit-id: 37678a3d64668c3b4a4bfefc054e3b9b40427c1a
2025-03-06 15:25:08 +08:00
hoshi-hiyouga
bb8aba5abf [data] fix mm template (#7181)
Former-commit-id: 648616d473c81d393592806307e3e25b159cb278
2025-03-06 15:18:32 +08:00
hoshi-hiyouga
9f16c50155 [model] add QwQ 32b (#7179)
Former-commit-id: 8897e48b8cd55407812453ddd4ff98ac7bdc4e91
2025-03-06 11:58:36 +08:00
Ze-Yi LIN
25bb9f5ad9 [trainer] fix swanlab callback (#7176)
Former-commit-id: 6d9acf4bd30db24499118aee16bd19cb19ba9e3d
2025-03-06 00:33:37 +08:00
hoshi-hiyouga
7b985f55db [trainer] update config (#7174)
Former-commit-id: 9f535d0e3c4ee3cd0f1b65218c2eee5d03f43c6f
2025-03-05 23:32:54 +08:00
sirui.li
fd0357a26d [data] fix qwen2audio plugin (#7166)
* Update pairwise.py

[data] repair multimodal model dpo training

* Update pairwise.py

[data] repair multimodal model dpo training using deepcopy

* Update pairwise.py

* Update mm_plugin.py

Former-commit-id: 86763dfdb8e9e5668c1ddd7e924e4be76bf78368
2025-03-05 18:03:36 +08:00
hoshi-hiyouga
31f9daa362 [data] use bicubic resampler (#7143)
Former-commit-id: c708f19ab0ab57526134952afddaa90aae8decbf
2025-03-04 00:17:06 +08:00
hoshi-hiyouga
15ea576246 [webui] fix webui (#7142)
Former-commit-id: d07281f8a45ad8a38d390181d01dcadbcf9aa1b9
2025-03-04 00:01:49 +08:00
rabbit
19a6916d80 [data] bailing template (#7117)
* add bailing template

* add bailing template

* add bailing template

---------

Co-authored-by: chengshiwen.csw@antgroup.com <chengshiwen.csw@antgroup.com>
Former-commit-id: 4a36f5e0abb5a63f4b3b81560bb1ad0e6832d379
2025-03-03 15:33:22 +08:00
hoshi-hiyouga
585c475f71 [inference] fix hf_engine (#7120)
Former-commit-id: f8cf5319cb5d6e06a1b0d8b8db2b678627f2271e
2025-03-01 05:22:49 +08:00
hoshi-hiyouga
e62dae37fe [assets] update wechat (#7106)
Former-commit-id: 0ea430060994631e9fdb18fbbca0dd565a04fd66
2025-02-28 12:01:04 +08:00
Ze-Yi LIN
11672f760d [webui] display swanlab exp link (#7089)
* webui add swanlab link

* change callback name

* update

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 27a4b93871c63b839c92940766bd7e0177972c9b
2025-02-27 19:40:54 +08:00
leo-pony
b9f84900ee [npu] update cann base image and torch 2.4 (#7061)
* Update base npu container image version: the Python version required for Hugging Face Transformers is >= python3.10

* Fix the bug: arg type of INSTALL_DEEPSPEED should be a string now.

* Update Ascend CANN, CANN-Kernel and corresponding torch and torch-npu version

* Upgrading torch-npu requires these package versions: torch==2.1.0 and torch-npu==2.4.0.post2

Former-commit-id: d6dafada58412b0c801e576ef4d8d96203f792af
2025-02-25 23:32:01 +08:00
hoshi-hiyouga
5f65558088 [misc] fix project toml (#7067)
Former-commit-id: 28a668ff4e0beebfe5387362f5518c1d9343666f
2025-02-25 23:22:48 +08:00
JieShen
0f54a78144 [script] add seed args (#7058)
* add seed args

* add seed args

* update seed

Former-commit-id: eb9770b2c01a840b6a0ac119210c22bdbb81e18b
2025-02-25 19:44:57 +08:00
Kingsley
2986bef530 [model] add paligemma2-mix series (#7060)
Former-commit-id: 0c0196306d343242ee5e6f22c55562f9a74aa782
2025-02-25 18:51:16 +08:00
hoshi-hiyouga
065f7fb5da [data] fix mllama (#7053)
* fix mllama

* fix test

Former-commit-id: f5af20a63f3d59a6a68d323a7c6f68e551edb3a3
2025-02-24 22:05:38 +08:00
hoshi-hiyouga
c1d5073bd3 [model] add models (#7054)
* add qwen25vl awq models

* add moonlight

Former-commit-id: ae3be2970fea8a35907202a313ab767381c44916
2025-02-24 22:05:13 +08:00
hoshi-hiyouga
ee46011b34 [assets] update readme (#7051)
Former-commit-id: c89a39bfc6a3f0aaa376cd1b221320f466aba617
2025-02-24 20:45:06 +08:00
hoshi-hiyouga
d55f420206 [assets] update wechat (#7019)
Former-commit-id: 3d102fe7e0bfc23db7d75f90ebaf53216c54cc85
2025-02-20 20:32:33 +08:00
Zhangchi Feng
fcf75633a0 [data] fix MiniCPMV plugin (#6998)
* fix template

* fix bug in messages processing

Former-commit-id: f98b828f53968fb9c72bff9e45510ad5586c4fab
2025-02-19 19:36:04 +08:00
hoshi-hiyouga
e77ced045d [webui] update css (#6985)
Former-commit-id: 760a1dfb8193de418d7aa1063c0d111a3a64ae0f
2025-02-18 18:27:57 +08:00
hoshi-hiyouga
331f53381f [data] add r1 distill dataset (#6983)
Former-commit-id: 1da5ee4edaa3896593b9cae488f0ac5917c3243e
2025-02-18 17:25:09 +08:00
hoshi-hiyouga
1d675a287d [version] support transformers 449 (#6982)
* support transformers 449

* fix mm plugin

Former-commit-id: e9118a9df0839d24f6ddff5a0b55ef101a1d3d22
2025-02-18 17:05:40 +08:00
hoshi-hiyouga
be33ef67fb [misc] fix script (#6977)
Former-commit-id: 775efa1d8cbdb1b7d122be2a986d47f85214e0a1
2025-02-18 17:00:46 +08:00
hoshi-hiyouga
f5cd17881e [data] update vlm args (#6976)
Former-commit-id: c28e710636a0286d4b8a1d494529b25168a8f3ab
2025-02-18 02:12:51 +08:00
hoshi-hiyouga
c09b648934 [data] add min resolution option (#6975)
Former-commit-id: 76bd9a98a2fb00f1a1d881e6e1364c02fd36d327
2025-02-18 01:40:46 +08:00
hoshi-hiyouga
f2fd9d1b25 [data] fix predict dataset (#6972)
Former-commit-id: f9a82e527877b1ed47cabb3d34f4d155705f4048
2025-02-17 20:29:40 +08:00
Zhangchi Feng
167342af8a [data] fix minicpmo template (#6946)
Former-commit-id: 09e4438b58d5c1a5fdde37ff781c3d79461c4743
2025-02-15 00:37:41 +08:00
Eric Tang
76f9bd1820 [ray] specify ray storage path (#6920)
Former-commit-id: 4be6b66b1eaa79955e936ce2b747a8837ecd1e49
2025-02-14 21:55:41 +08:00
hoshi-hiyouga
a893505924 [misc] fix lora regex (#6944)
* fix lora regex

* fix

Former-commit-id: 1d0ecbaee1b72f1e03154ddd4fcc8b7876e01f89
2025-02-14 21:38:43 +08:00
hoshi-hiyouga
ed25e051a9 [misc] fix grad ckpt (#6931)
Former-commit-id: deae1fc9a0bea5c8b8be1564cf9c81c9c02a0b3a
2025-02-13 23:27:51 +08:00
hoshi-hiyouga
5e5fc337f9 [model] add liger kernel to qwen2_5 vl (#6930)
* add liger kernel to qwen2_5 vl

* fix patch

* fix patch

Former-commit-id: 828776d155986166498dfc907194f64436571106
2025-02-13 23:05:54 +08:00
Billy Cao
58e9ca8aa0 [trainer] fix gen_kwarg to eval during training (#5451)
* Correctly pass gen_kwarg to eval during model runs

* fix

* fix

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 845d16122496311e08263610a6a922f82604de7b
2025-02-13 02:35:06 +08:00
SrWYG
a4c4b8496f [data] evaluate on each dataset (#5522)
* [Update] loader.py, evaluate will run separate evaluations on each dataset.

`If you pass a dictionary with names of datasets as keys and datasets as values, evaluate will run separate evaluations on each dataset. This can be useful to monitor how training affects other datasets or simply to get a more fine-grained evaluation`

Seq2SeqTrainer supports eval_dataset as a dict.

* fix format

* fix

* fix

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: cf00f78650a442c85678ce805e030d2b96cbecd7
2025-02-13 02:19:03 +08:00
Noah
38c9641777 [data] improve error handling (#6128)
* sync from upstream

* update

* update

* fix

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 1569e6096fec07da5583f1a3435b0d23ae09b5ba
2025-02-13 01:39:41 +08:00
hoshi-hiyouga
8b8fdb3a85 [misc] update readme (#6918)
Former-commit-id: f5823479bd51c39db668b68056be749af09894d1
2025-02-13 01:01:41 +08:00
hoshi-hiyouga
290057069e [misc] update readme (#6917)
Former-commit-id: 6bbed1d8c4189fb7bea40230e278c40bb5336fbd
2025-02-13 00:58:10 +08:00
hoshi-hiyouga
46203856fc [breaking change] refactor data pipeline (#6901)
* refactor data

* rename file

Former-commit-id: 7a1a4ce6451cb782573d0bd9dd27a5e443e3a18b
2025-02-13 00:39:20 +08:00
Eric Tang
80b89978d9 [misc] support for launching LLaMA-Factory with uv run (#6907)
* yay

* uv with ray temporary commit

* remove ray specific code for now

* cleanup

Former-commit-id: 1a9cab6de49e300bf9c747eefbb11d693592b477
2025-02-13 00:38:44 +08:00
Eric Tang
5a221d91f9 [example] fix path to ray example (#6906)
Former-commit-id: e9bee3ef045d85051da04e6ad581a23a9e1a9551
2025-02-13 00:29:32 +08:00
hoshi-hiyouga
3a3f4072e5 [misc] fix grad ckpt func (#6916)
Former-commit-id: 35e069a52b3d7cfd9b0107574b09265eb2290f0b
2025-02-13 00:17:18 +08:00
marko1616
0c0cdc26bc [trainer] fix llama3.2 vision kto train (#6904)
Former-commit-id: 1563e89adc8988fc6e4250634a3f1e385979b0e5
2025-02-12 19:09:14 +08:00
hoshi-hiyouga
2581cc844b [data] feat: auto template (#6905)
* support auto template

* add unittest

Former-commit-id: 0c6c9150db6414a5a05527ea486dce6633dff4b3
2025-02-12 00:22:53 +08:00
hoshi-hiyouga
d58fcd094e [misc] update readme (#6903)
Former-commit-id: 830d028939149d54bc91b6bda110dfa5de949483
2025-02-11 22:51:26 +08:00
hoshi-hiyouga
86063e27ea [data] fix ollama template (#6902)
* fix ollama template

* add meta info

* use half precision

Former-commit-id: 1304bbea69d8c8ca57140017515dee7ae2ee6536
2025-02-11 22:43:09 +08:00
hoshi-hiyouga
88eafd865b [misc] support export ollama modelfile (#6899)
* support export ollama modelfile

* update config

* add system and num ctx

Former-commit-id: 8c2af7466f4015f300b51841db11bcd2505ebf20
2025-02-11 19:52:25 +08:00
hoshi-hiyouga
3f7bd98bfa [data] refactor template (#6896)
Former-commit-id: f78d5a3eca947ed965ca2f6c87d60441b1a59867
2025-02-11 17:59:25 +08:00
codingma
b72c4bd118 support ollama modelfile export (#4686)
Former-commit-id: 15cca102a7fc0d08b5d049cf264acc6fa576b104
2025-02-11 17:52:24 +08:00
hoshi-hiyouga
808ff89a2d [data] refactor mm plugin (#6895)
* refactor plugin

* lint

Former-commit-id: 1c8dcc3adca4a2e78f514f8bb70573dd1ca08746
2025-02-11 16:34:49 +08:00
HJ
6d7f1299bd [data] fix qwen_2_5_vl video processing (#6868)
* fix qwen_2_5_vl video processing

* Update mm_plugin.py

* Update mm_plugin.py

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 35f326dabdc8e84036296d2e3de1c84c67b8def8
2025-02-11 16:14:50 +08:00
hoshi-hiyouga
0420a608ca [assets] update wechat (#6892)
Former-commit-id: 0b268cc903a583ae78cb7e63d2bdc4602d7220fc
2025-02-11 13:56:26 +08:00
Zhangchi Feng
2047eab723 [data] fix minicpmv plugin (#6890)
* fix template name

* tiny fix

* support minicpm-o-2.6

* support inference of minicpmv

* update readme

* support dpo of minicpmv

* update init audio

* update init audio

* [model] fix image processing in minicpmo

* fix no mm inputs

Former-commit-id: cdd19ccd8cec460606b4545e886e932c1c5c5fe1
2025-02-11 13:30:44 +08:00
HJ
e11b40c344 [data] fix: sharegpt converter (#6879)
* fix-sharegpt-format

* fix

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: ae8f8151ff750839998b50446f127061f240d41a
2025-02-10 21:59:12 +08:00
hoshi-hiyouga
b869506a57 [data] fix mllama collator (#6874)
Former-commit-id: c694fa3d66651c6ce547fa72c8260c46a406126b
2025-02-09 22:42:25 +08:00
hoshi-hiyouga
72d5b06b08 [test] align test cases (#6865)
* align test cases

* fix function formatter

Former-commit-id: a68f5e22d0391c80a9a826dc83967255be572032
2025-02-09 01:03:49 +08:00
hoshi-hiyouga
94726bdc8d [dataset] add openthought (#6866)
Former-commit-id: 20c748a4f108c0087f0d85377a4aa99126a0beb0
2025-02-09 00:53:01 +08:00
hoshi-hiyouga
4d1791e905 [deps] upgrade vllm (#6857)
Former-commit-id: 4bd50f65a3d62528768561019fda2723d045c7fd
2025-02-08 15:02:28 +08:00
hoshi-hiyouga
528e06ccaa fix qwen2vl plugin (#6855)
Former-commit-id: fd13b7138ab3f4da0a429a327b9d076bcb70b944
2025-02-08 10:59:10 +08:00
hoshi-hiyouga
fec641ec82 [misc] allow extra args (#6831)
Former-commit-id: 0fd3a5295cb4e08a4e57e860e82103364c28fba8
2025-02-06 12:38:08 +08:00
Zhangchi Feng
8f401e37f8 [model] support audio (#6701)
* support qwen2_audio

* improve code

* lint

* fix

* fix

* fix

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 5eacb5629e4d7733cd992a63747a1335f2c6a929
2025-02-05 04:59:09 +08:00
Yueqi Song
9feb78e7b4 [data] allow thought in function call (#6797)
* Update template.py

* Update template.py

* use formatter

* fix regex

---------

Co-authored-by: hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 3a31af6e920683ec074da93b1719e29f5d4cffd6
2025-02-05 02:26:23 +08:00
hoshi-hiyouga
c2022431aa [misc] update license year & fix llama pro (#6814)
* fix llamapro script

* change year

Former-commit-id: d9ae594178796994d400a5f207d6499712816f89
2025-02-05 01:53:33 +08:00
Yueqi Song
0817c24c04 [data] fix qwen tool template (#6796)
* Update tool_utils.py

* fix unittest

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 02bb78a792112f5151b3a96ddde2528823855288
2025-02-05 00:02:00 +08:00
Zhangchi Feng
cfb926fb84 [data] fix minicpmv plugin (#6801)
* fix template name

* tiny fix

* support minicpm-o-2.6

* support inference of minicpmv

* update readme

* support dpo of minicpmv

* update init audio

* update init audio

* [model] fix image processing in minicpmo

Former-commit-id: 8f704c8b6228ef50f828014f85dce67fda868660
2025-02-04 21:20:15 +08:00
neavo
34746d6151 [readme] update flash attention installation instruction on win platform (#6788)
* Update README_zh.md

* Update README.md

Former-commit-id: e48d1327fb39cc95f8fbfc746494f67a79471893
2025-02-01 12:43:29 +08:00
hoshi-hiyouga
5bb447b118 [misc] update workflows (#6787)
Former-commit-id: 15add6b250149e2aeabdc62d7dca69fc06054e01
2025-02-01 04:54:42 +08:00
hoshi-hiyouga
a28261a866 [model] add mistral small models (#6786)
Former-commit-id: e5e95c39bc4199fa89c67e34f9adaaa987058744
2025-02-01 04:31:38 +08:00
hoshi-hiyouga
800de98dc8 [model] add qwen2.5 vl models (#6779)
Former-commit-id: ed46fb4f6194c30060b908092464dded12e5787c
2025-01-31 03:00:29 +08:00
hoshi-hiyouga
222423bcef [breaking] support transformers 4.48 (#6628)
Former-commit-id: f154ab175c513a4d7bb866bf2cffc34b77b50508
2025-01-31 01:36:33 +08:00
hoshi-hiyouga
e71737351f [webui] improve webui & reasoning mode (#6778)
Former-commit-id: 3f17fc0d7163372e0446f1a38792ff761e99b739
2025-01-31 00:09:21 +08:00
qvlehao
4f298894da [model] add deepseek-R1 & show think process (#6767)
Former-commit-id: 4dccb724af51208a001c96fefbdbf226be09e50c
2025-01-29 12:16:26 +08:00
yinpu
a8fae3869d fix: avoid redundant normalization in DPO's SFT loss calculation (#6722)
Former-commit-id: 971a8ccbdacf130763d40c7ef82a711b2fc1292f
2025-01-21 13:38:02 +08:00
engchina
db9b977e4f [webui] support ja (#6698)
* add support for japanese language

* add support for japanese language

---------

Co-authored-by: engchina <atjapan2015@gmail.com>
Former-commit-id: 88692e403f9b5085dd0c7c2b2c68656c5da50dd4
2025-01-20 19:46:38 +08:00
hoshi-hiyouga
87d685b59f [model] support yarn (#6693)
Former-commit-id: 8c412abc44a4c61b683465e36c6288580d980250
2025-01-18 13:56:09 +08:00
hoshi-hiyouga
e4046bdd1f [assets] update wechat (#6692)
Former-commit-id: 70dba5fab6f4c9225758cafb646113d8e80ac084
2025-01-18 12:35:03 +08:00
hoshi-hiyouga
5baa3add8c [misc] update mm plugin (#6691)
Former-commit-id: 00303338d6927b1fda58b23340a31a8fa009f706
2025-01-17 23:04:26 +08:00
hoshi-hiyouga
332f637592 disable valset by default (#6690)
Former-commit-id: a1a94f364e33d1d73852f74eda4fa581e6b16533
2025-01-17 21:09:30 +08:00
hoshi-hiyouga
31daa6570b [webui] upgrade to gradio 5 (#6688)
Former-commit-id: 9df7721264ddef0008d7648e6ed173adef99bd74
2025-01-17 20:15:42 +08:00
hoshi-hiyouga
33525a34b6 fix qwen2 moe (#6684)
Former-commit-id: ab624419fa0ab23ef7a331a0ec14e393328772b5
2025-01-17 13:46:09 +08:00
Zhangchi Feng
3607caa2ad [data] Fix minicpmv/o dpo training (#6657)
* fix template name

* tiny fix

* support minicpm-o-2.6

* support inference of minicpmv

* update readme

* support dpo of minicpmv

Former-commit-id: 8d9f47b98047f370637d1c96c2f3440dcc738ef3
2025-01-15 17:30:37 +08:00
steveepreston
0fc2e19279 Update val_size english description (#6653)
* Update `val_size` Description in locales.py

* Update `val_size` Description in data_args.py

* Remove extra space in data_args.py

Former-commit-id: f1ba5158091446dce540dd796284037bdd724c38
2025-01-15 16:00:20 +08:00
hoshi-hiyouga
ef994600db update readme (#6648)
Former-commit-id: b47467276ab3174c50329b3c8b76823bc0a2249c
2025-01-15 11:06:19 +08:00
hoshi-hiyouga
7638f1070e [optim] clean apollo (#6645)
* clean apollo code

* update readme

Former-commit-id: 38b8ec4a99189483124b54df9d6bc6b0d318855a
2025-01-15 01:42:50 +08:00
zhuHQ
c2120432db [optim] add support to APOLLO (#6617)
Former-commit-id: 5a252e5a458457adbd19da3b68a3897ad2962824
2025-01-15 00:24:56 +08:00
Zhangchi Feng
66184762e8 update readme of MiniCPM-o (#6642)
* fix template name

* tiny fix

* support minicpm-o-2.6

* support inference of minicpmv

* update readme

Former-commit-id: 68604050ae2c98aeef5e9a6b4d2c11a4eb609bfa
2025-01-14 21:22:35 +08:00
hoshi-hiyouga
41a9e231cb lint (#6641)
Former-commit-id: 79731ae13ecd17eb8646fb53162c81dddfef3b00
2025-01-14 18:40:07 +08:00
Haian Huang(深度眸)
1bb06e06df Support InternLM3 Dense 8B Model (#6640)
* support internlm3

* update

* update

* update

* add hint

Former-commit-id: 24ab7ae0944c5f373e9cac60f0332e704824a057
2025-01-14 18:07:27 +08:00
Xiaosu Zhu
381f7120e6 Fix tokenizer max length (#6632)
Former-commit-id: 1807c7ba033985490aa7c8c39d880da6af983b92
2025-01-14 17:35:54 +08:00
Zhangchi Feng
f7857c83e1 Support Inference of MiniCPM-V-2.6 and MiniCPM-o-2.6 (#6631)
* fix template name

* tiny fix

* support minicpm-o-2.6

* support inference of minicpmv

Former-commit-id: 7f3c64e853a7cdd49d02bf85e237611941ac7fa8
2025-01-14 17:34:58 +08:00
hoshi-hiyouga
d0da6f40b0 [model] fix mllama any image (#6637)
* fix mllama any image

* reorder classes

Former-commit-id: 1242a1c4b4a465c06363fdc59302e80e5c4c96e6
2025-01-14 16:47:58 +08:00
hoshi-hiyouga
28d145a066 pin vllm version to 0.6.5 (#6629)
Former-commit-id: 26097ca0adf25ebb7d9e8eec2d2cef673c6cfe88
2025-01-14 02:44:02 +08:00
Zhangchi Feng
ae32c148d1 Support new features of MiniCPM-V (#6626)
* fix template name

* tiny fix

* support minicpm-o-2.6

Former-commit-id: 53034a61c7654358f46916cbc370910fb2aeff3b
2025-01-14 00:26:19 +08:00
hoshi-hiyouga
2a05941b14 [inference] fix stop token for object detection (#6624)
* fix stop token

* update minicpm data pipeline

* fix npu qlora examples

Former-commit-id: 844919fadaa8a61dfae47020971ea80730b2346f
2025-01-13 21:34:20 +08:00
codingma
11c38b9173 add nf4 qlora support on Ascend NPU (#6601)
* add nf4 qlora support on Ascend NPU

* add transformers version check

* add python>=3.10 requirement description for npu

* tiny fix

---------

Co-authored-by: hoshi-hiyouga <hiyouga@buaa.edu.cn>
Former-commit-id: 7912d1acac5f10dab22145fe729a90c57aad8d85
2025-01-13 19:43:36 +08:00
Zhangchi Feng
73c1c15b62 Fix template name of MiniCPM-V (#6620)
* fix template name

* tiny fix

Former-commit-id: 94dea52cef709a7e6f1cdc0b78e83e0422bd65d3
2025-01-13 16:46:48 +08:00
hoshi-hiyouga
7f58bf984f Merge pull request #6598 from BUAADreamer/minicpmv
[model] Support MiniCPM-V

Former-commit-id: 251e82bec12eaea6cf13608de191c096c63d1214
2025-01-13 15:24:02 +08:00
fzc8578
ec552372ba remove tests
Former-commit-id: 51addcd7ab81548a9952064dd8c95a8542252003
2025-01-13 15:08:35 +08:00
fzc8578
17d32fb5c7 fix tests
Former-commit-id: 582a17a12010943c7ca1cc0e25ebc8d125d10b45
2025-01-13 15:01:39 +08:00
fzc8578
4b61610b12 fix style
Former-commit-id: 76a36d9acecbf36b6959a14caacfed1d32bcee41
2025-01-13 14:19:38 +08:00
fzc8578
07798e4aad fix system prompt and tests
Former-commit-id: 955efca677b299749f3d40d587ee310951537543
2025-01-13 14:18:06 +08:00
fzc8578
6d6acd0213 add some
Former-commit-id: 5ad8ef3ec434f53f6fc494474becb034a3aca0ca
2025-01-11 15:03:20 +08:00
fzc8578
a789e0f263 add cpm_o test
Former-commit-id: 53cade69caed82b470fdb249274f03ee34af3100
2025-01-11 11:55:30 +08:00
fzc8578
f9ee00b6b6 add cpm_o test
Former-commit-id: 81dc0f678a7609c834581d956387bde42652755d
2025-01-11 11:49:03 +08:00
fzc8578
31bfdb08cd fix format
Former-commit-id: 964e18be5a824950164bc7232d35822a8b116d1a
2025-01-11 01:27:40 +08:00
fzc8578
12c83e00fc add some
Former-commit-id: 6233764d18f31365e9ba450408306fad55567ffc
2025-01-11 01:10:24 +08:00
fzc8578
9dc7b6c7ac adapt to new mllm_param
Former-commit-id: 0775b71965863c2618c117726a1046a36d6d85b8
2025-01-11 00:16:34 +08:00
Zhangchi Feng
627548bf7f Merge branch 'main' into minicpmv
Former-commit-id: 8a9c90759feda975faadc5858bd44b7ea116e7fb
2025-01-11 00:01:36 +08:00
hiyouga
dc65ecdf09 refactor mllm param logic
Former-commit-id: b895c190945cf5d991cb4e4dea2ae73cc9c8d246
2025-01-10 15:45:48 +00:00
fzc8578
e577990eb2 add minicpmv2.6
Former-commit-id: 1ab0aea54b54066cad500b7969b86a0e952d396d
2025-01-10 23:45:44 +08:00
fzc8578
1f3b729a4b add some
Former-commit-id: 58f50b8729083e9ea0fdcf07042b06261670ad57
2025-01-10 23:29:06 +08:00
fzc8578
0aa7ac210f add some
Former-commit-id: 3acd151a0f8efdd230c0b0980550795d204a69f7
2025-01-10 21:25:32 +08:00
fzc8578
40382f1387 fix some
Former-commit-id: 1eb7118db3ad6054cfd59d5f16a5d882e40e9057
2025-01-10 20:55:52 +08:00
fzc8578
75b3819e43 fix version
Former-commit-id: 834903fbf7a0fc8ac110f62f4df7c13819dd3c68
2025-01-10 20:31:04 +08:00
fzc8578
e63c2df0b1 fix some
Former-commit-id: cd5a1a8b9c6eb59d6e95f79573f60ad8668f1942
2025-01-10 20:27:06 +08:00
fzc8578
25d4889789 tiny fix
Former-commit-id: f088e580d3bacd0eecd0c3bf17e928eb49832ba1
2025-01-10 20:15:39 +08:00
Zhangchi Feng
8c0a721c4c Merge branch 'main' into minicpmv
Former-commit-id: d8840ae416660e23f1d615ffd404f519360151d9
2025-01-10 20:12:07 +08:00
fzc8578
9e972bc9ec add some
Former-commit-id: fede563aeb716ba5d1e368fd3e1182e4e580d248
2025-01-10 20:01:22 +08:00
hoshi-hiyouga
1675712a4c Merge pull request #6588 from hiyouga/hiyouga/upd_issue_temp
[gh] update issue template

Former-commit-id: 0a2626f996ce61559e93bedf19083aac5c861666
2025-01-10 03:03:48 +08:00
hiyouga
e0c9012f7f update issue template
Former-commit-id: 2bfca993588d8087dfd118f6f02486bbe752b166
2025-01-09 18:58:53 +00:00
hoshi-hiyouga
a25024bd0c Merge pull request #6585 from hiyouga/hiyouga/add_phi4
[model] add phi4 model

Former-commit-id: 0ae6a9b7bf9f1d6d844b97406b4795363bf75e78
2025-01-10 02:39:17 +08:00
hiyouga
867980196e improve template, add phi4 model
Former-commit-id: a785b6796e445a3adba45c5b6947166a2ff99871
2025-01-09 18:27:54 +00:00
hoshi-hiyouga
4e25d037c8 Merge pull request #6564 from stephen-nju/fix_ray
Fix ray

Former-commit-id: d4566839369726023f1b6e8f4b2332bda0c715cc
2025-01-08 18:14:18 +08:00
hoshi-hiyouga
6ba6926221 Merge pull request #6565 from hiyouga/hiyouga/improve_log
[misc] improve log

Former-commit-id: 538bf7b839c63d6a6758522fa08999d9b78e9db2
2025-01-08 18:08:21 +08:00
zhubin
b6b53b61f7 fix get ray args when args is not a dict
Former-commit-id: 5e5398cd5b117b2378107172d3f91cfb0321e842
2025-01-08 10:06:02 +00:00
hiyouga
647c51a772 improve log
Former-commit-id: a6abf375975ffea3d51e1b944c9855b5f62ffac8
2025-01-08 09:56:10 +00:00
hoshi-hiyouga
3b843ac9d4 Merge pull request #6542 from erictang000/et/ray-integration
Ray Train integration with LLaMA-Factory

Former-commit-id: 4e34ee0a8e0aa90b535e53608b51c5c0804db34e
2025-01-08 11:46:03 +08:00
hiyouga
0ef1f981da fix llamaboard with ray
Former-commit-id: bd8a432d6a980b1b24a551626304fe3d394b1baf
2025-01-07 09:59:24 +00:00
hiyouga
944a2aec4d refactor ray integration, support save ckpt
Former-commit-id: 2f50b27e608b2092bfceab6c6e84e6631e973ee2
2025-01-07 09:39:10 +00:00
Eric Tang
4f31ad997c run style check
Former-commit-id: 5ec33baf5f95df9fa2afe5523c825d3eda8a076b
2025-01-07 08:55:44 +00:00
Kourosh Hakhamaneshi
8683582300 drafting ray integration
Signed-off-by: Kourosh Hakhamaneshi <kourosh@anyscale.com>

Former-commit-id: 19c12ddae9350f6e25a270fe3372f5b9094cf960
2025-01-07 08:55:44 +00:00
hoshi-hiyouga
5ccc607222 Merge pull request #6547 from hiyouga/hiyouga/fix_pixtral_dpo
[trainer] fix pixtral dpo

Former-commit-id: 920bb2a8922847fa544e2c260c67161e64cf5d50
2025-01-07 14:38:55 +08:00
hiyouga
d8bd46f1bf fix #6546
Former-commit-id: 6fcf2f10faf3b1614896b091591eeef96d717e64
2025-01-07 06:30:44 +00:00
fzc8578
8c2a712247 add some
Former-commit-id: b4790c66c126567bd193de52a564e3ce11c94769
2025-01-06 19:32:39 +08:00
hoshi-hiyouga
53e41bf2c7 Merge pull request #6528 from hiyouga/hiyouga/upd_wechat
[assets] update wechat

Former-commit-id: 3ceedf44896b5ebc406d6398b3f15e74e4710fbe
2025-01-04 16:01:21 +08:00
hiyouga
0eeae9061c update wechat
Former-commit-id: 11a9d96a042e8afd972e0bf2fa3e51f95e4799ec
2025-01-04 07:59:57 +00:00
Zhangchi Feng
08729dbefc Merge branch 'hiyouga:main' into minicpmv
Former-commit-id: 873b2d5888038e2328a12a6eb7c84099ba7ca1f3
2025-01-04 11:20:33 +08:00
fzc8578
2c120aa0df add some
Former-commit-id: 81176fe226da89eace89cb202bad68e73b7c2a02
2025-01-04 11:11:15 +08:00
hoshi-hiyouga
cca6286b6f Merge pull request #6524 from hiyouga/hiyouga/upd_scripts
[misc] update scripts

Former-commit-id: 6ba3ec45fc369c095ab9a1fbd9847dc66cf24ca4
2025-01-03 23:52:26 +08:00
hiyouga
8516054e4d update scripts
Former-commit-id: 05aa52adde8905ca892f1ed5847d6f90b1992848
2025-01-03 10:50:32 +00:00
hoshi-hiyouga
d1a8cd67d2 Merge pull request #6515 from hiyouga/hiyouga/misc
[misc] update model name

Former-commit-id: f92eea4090351dcd3c364e10a9eec0d17d480e12
2025-01-02 20:20:02 +08:00
hiyouga
8a5b4bdfd4 update model name
Former-commit-id: bf627d9f1ac117f040adbfd7630b5283f0db556a
2025-01-02 12:19:21 +00:00
hoshi-hiyouga
3bceef02ee Merge pull request #6514 from hiyouga/hiyouga/add_project
[readme] add project

Former-commit-id: 0bd0c373183731302f1af9f33a1f8ff70ba743e2
2025-01-02 20:16:15 +08:00
hoshi-hiyouga
166a830938 Merge pull request #6513 from hiyouga/hiyouga/add_gpt2
[model] add gpt2 model

Former-commit-id: 859c37f43c8a49eea4f118d0d00ee2a554f6bd4f
2025-01-02 20:15:55 +08:00
hiyouga
18767fe026 add project
Former-commit-id: 3b7e745d271e36b4cfe8826820b23254e1debfe9
2025-01-02 12:15:41 +00:00
hiyouga
18a1a4b9da add gpt2 model
Former-commit-id: 37d5e3639fcf5ae6e58cc435e0fa9dee0d6e4ead
2025-01-02 12:07:38 +00:00
hoshi-hiyouga
6015fe700e Merge pull request #6512 from hiyouga/hiyouga/fix_gen_logic
[trainer] fix generate logic

Former-commit-id: b97759421c535560ade631a7fa0a57b7c0da50f1
2025-01-02 19:36:54 +08:00
hoshi-hiyouga
369dae8dd3 Merge pull request #6462 from shibingli/main
Add ARG HTTP_PROXY in Dockerfile to support HTTP proxy during image building

Former-commit-id: 1e72bb24253bb07da874f3a37ccfa4fddaaf6978
2025-01-02 19:34:17 +08:00
hiyouga
2aaf3697d7 fix #6499
Former-commit-id: dffc607220ff6dac15cf501ac9a3cdbe80c25211
2025-01-02 11:28:54 +00:00
hoshi-hiyouga
5504b5254c Merge pull request #6492 from hiyouga/hiyouga/add_deepseek3
[model] add deepseek3 model

Former-commit-id: 0a6d1244a51f3cc8fe141b32f39bffce4c924a8c
2024-12-30 21:50:13 +08:00
hiyouga
b2e4f11602 add deepseek3 model
Former-commit-id: 611779d412f31e25b1ed38049050eee2da61dde5
2024-12-30 13:39:20 +00:00
hoshi-hiyouga
e3f95abca7 Merge pull request #5507 from piamo/main
Add deepseek-v2.5 template

Former-commit-id: 8a4911d201e219465fe0835a3ceb967f8b80dc0e
2024-12-30 21:08:25 +08:00
hoshi-hiyouga
2f44f70c2c Merge pull request #6483 from hiyouga/hiyouga/fix_paligemma_infer
[model] update vllm & fix paligemma dtype

Former-commit-id: 03ad6d44805a965764aaa51376964972b9b7da3d
2024-12-30 16:34:32 +08:00
hiyouga
f8f05a883b fix #6482
Former-commit-id: 8577f52b4152efe6cc7a8b5f6d37b4f9ba6684e7
2024-12-30 06:03:07 +00:00
hoshi-hiyouga
5f473e2696 Merge pull request #6465 from hiyouga/hiyouga/fix_eval_loss
[trainer] fix eval loss

Former-commit-id: fa8110b2052a74b4bd0dcf391a54207e1e31056d
2024-12-28 01:02:56 +08:00
hiyouga
88b1874c04 fix #6448
Former-commit-id: 04f78e85af5af14b4c195936623e426a6a128af2
2024-12-27 16:54:39 +00:00
shibingli@yeah.net
58bc6943dc Add ARG HTTP_PROXY in Dockerfile to support HTTP proxy during image building.
Former-commit-id: c46af4c45f96f1942dfaf77bdbdbe5d0fe85a387
2024-12-27 18:31:14 +08:00
shibingli@yeah.net
2dedf7b401 Add ARG HTTP_PROXY in Dockerfile to support HTTP proxy during image building. This commit introduces an ARG parameter named HTTP_PROXY in the Dockerfile, allowing an HTTP proxy to be configured and facilitating image building in environments with network restrictions.
Former-commit-id: d59fe30bca636bc2ca132d50172dba0032cecb6b
2024-12-27 18:17:17 +08:00
hoshi-hiyouga
5769a553d2 Merge pull request #6457 from youkaichao/module-run
[misc] enable module run

Former-commit-id: 813881a5d13dd1d5a526a85d41032196e0d46f04
2024-12-26 23:41:37 +08:00
youkaichao
552816e04b Update cli.py
Former-commit-id: 18e65bbd3ae07af3b9eed7f293c345815776c325
2024-12-26 23:22:09 +08:00
hoshi-hiyouga
b5fa1044b8 Merge pull request #6443 from hiyouga/hiyouga/add_qvq
[model] add qvq

Former-commit-id: 2010e80b1a939d21efa13d54df5f5d648ea640de
2024-12-25 15:53:19 +08:00
hiyouga
3c55976a0e add qvq #6439
Former-commit-id: 4dbfa142d899dd6e4d1a9d4db125765af5580a4f
2024-12-25 07:52:41 +00:00
hoshi-hiyouga
4611f67fae Merge pull request #6426 from hiyouga/hiyouga/update_readme
[assets] update readme

Former-commit-id: 2309c431090d1f3b573d113bbedeabee2b01fdf2
2024-12-23 22:17:19 +08:00
hiyouga
a5346041bb update readme
Former-commit-id: 1deda4750e0df6c46aeb33cf3f8b35baa537cc1d
2024-12-23 14:08:59 +00:00
hoshi-hiyouga
df42e438c1 Merge pull request #5922 from Tuyohai/main
support granite3 models

Former-commit-id: a9087bc0549f7f16e5b4c39e324043755b1618c8
2024-12-23 16:46:02 +08:00
hoshi-hiyouga
7dbfd7dff6 Merge pull request #6418 from hiyouga/hiyouga/add_report
[trainer] add custom args to experimental logger

Former-commit-id: 5e5a7ba73c1a386f025d75c10b102306bcb98674
2024-12-22 05:47:55 +08:00
hiyouga
a897d46049 support report custom args
Former-commit-id: d41254c40a1c5cacf9377096adb27efa9bdb79ea
2024-12-21 21:42:45 +00:00
hiyouga
adff887659 fix paligemma infer
Former-commit-id: d272455d6118c1d670c70cfe3458d8dab111da6c
2024-12-21 20:24:32 +00:00
hoshi-hiyouga
eba78f2159 Merge pull request #6416 from Zeyi-Lin/main
docs: use swanlab
Former-commit-id: 0759b576a36cde120ccb8cadd96fca4d871be130
2024-12-22 04:08:26 +08:00
ZeYi Lin
ec05c8cdb4 docs: use swanlab
Former-commit-id: 33509ea7bcd5f698a8393379bb3941c3c32f7fd6
2024-12-21 20:59:25 +08:00
hoshi-hiyouga
0a869c4ed4 Merge pull request #6401 from Zeyi-Lin/hiyouga/swanlab
feat: add swanlab for experiment tracking and visualization.
Former-commit-id: e65fe507f7643bf40b0fc462805c7b7f8ef6b738
2024-12-21 14:09:33 +08:00
ZeYi Lin
f792eaf8d4 fix: project blank
Former-commit-id: 3a0939572b0bfc7da0ee1a7244b6b3fbf567aba0
2024-12-20 18:26:02 +08:00
ZeYi Lin
8a41c96761 fix: by hiyouga suggestion
Former-commit-id: 41195f1bc69e4b5da7a265369d368b06754362cf
2024-12-20 16:43:03 +08:00
ZeYi Lin
e5d9d8c55d feat: ui improve
Former-commit-id: 6a1effb1741a13ae5238b0e9b429b4cbe3b6534f
2024-12-20 11:03:02 +08:00
ZeYi Lin
3e44c8fe3a fix: text
Former-commit-id: 52fe8d61eba7b7d8f66df09a03d40f25cc9c5b44
2024-12-19 21:26:02 +08:00
ZeYi Lin
925e421bde fix: bugs
Former-commit-id: a2297f97f7587c77d55fbce9ffa81dc60d0b04a1
2024-12-19 21:08:16 +08:00
hoshi-hiyouga
bbb636bdba Merge pull request #6395 from hiyouga/hiyouga/fix_genkwargs
[generate] fix generate kwargs

Former-commit-id: 1193594f2d06df38ec0aef7f591c74651cf1353c
2024-12-19 20:24:17 +08:00
ZeYi Lin
a30bdbb1c0 docs: config framework
Former-commit-id: 9cad21df82754170900e3ea74476f674754159b3
2024-12-19 20:22:36 +08:00
ZeYi Lin
95b7e10a06 fix: string
Former-commit-id: 73e1da5ab07c96a6faa9738e83c4dd9297f34b14
2024-12-19 20:18:59 +08:00
hiyouga
0385c60177 fix #6391
Former-commit-id: 067ba6e6cb4d8a1d95bba0a108f73008416a2865
2024-12-19 12:16:38 +00:00
ZeYi Lin
44895ebe36 feat: optimize frontend
Former-commit-id: 4a78603c141d9bd78bcaf81261b443cf082bf51f
2024-12-19 19:04:19 +08:00
ZeYi Lin
44dfbf9dbd feat: swanlab params
Former-commit-id: 761b3bdb03e27826fde2ca86d4e37b53c2bbc777
2024-12-19 18:47:27 +08:00
hoshi-hiyouga
0a465fc3ca Merge pull request #6388 from hiyouga/hiyouga/shuffle_control
[trainer] support disable shuffling

Former-commit-id: 3243e74a2ed3b1f7fa818842955f91386b591a9c
2024-12-19 17:00:12 +08:00
hiyouga
01eeae50b5 support disable shuffling
Former-commit-id: 9d8c35fd6b838ede0bd6827c6c6121f2cba2b11b
2024-12-19 08:53:21 +00:00
hiyouga
7eeeffdb8a add swanlab
Former-commit-id: c85a77c8a8824a56a67d56b97b4877fcd6edeb3d
2024-12-19 07:12:31 +00:00
hoshi-hiyouga
eca06531c3 Merge pull request #6384 from hiyouga/hiyouga/fix_webui
[webui] fix webui args

Former-commit-id: 94294c4e356b3ac5546f897d6e3255ee8c2a260f
2024-12-19 14:57:52 +08:00
hiyouga
d90b40b60f fix webui
Former-commit-id: 7152fde4a026e67f15885814c1900f3911d04ee8
2024-12-19 06:48:03 +00:00
hoshi-hiyouga
1898c1e9a6 Merge pull request #6379 from hiyouga/hiyouga/add_paligemma2
[model] add paligemma2

Former-commit-id: abe3ff3fe0b113e949bf6d2bd10e4c125fb8fe75
2024-12-18 17:03:11 +08:00
hiyouga
8d2f8b0dd8 add paligemma2
Former-commit-id: dafbc31684cb2566ef23c79e171cdfd02d6d396b
2024-12-18 08:57:26 +00:00
hoshi-hiyouga
df42281256 Merge pull request #6313 from ge-xing/main
support telechat2 model

Former-commit-id: 282d0619b1047ba48f9bc3ac837d2ed40b7df307
2024-12-18 16:16:17 +08:00
hoshi-hiyouga
896cf476d5 Merge pull request #6369 from hiyouga/hiyouga/template
[template] support qwen2 tool template

Former-commit-id: e1e133635f05f5b83869bc02340d6ea46976f318
2024-12-18 04:23:49 +08:00
hiyouga
37961d5f06 support qwen tool format
Former-commit-id: cbef4cb501fa1b50fa611e7054a856ce2c5ed10e
2024-12-17 20:12:06 +00:00
hiyouga
bb047bc844 change default replace jinja to false
Former-commit-id: bfe6625f6f6aa294933fa9056a4bfedee4fbe5e2
2024-12-17 19:27:10 +00:00
hoshi-hiyouga
448adedf6a Merge pull request #5473 from AlongWY/mistral
Support Mistral format tools

Former-commit-id: 4838427310d49e5942138e4578d2483baa005471
2024-12-18 03:23:24 +08:00
ylfeng
469c7cd462 Support Mistral format tools
Former-commit-id: e42d0e54b7a64a3f017a09e99846d174db7b438f
2024-12-17 19:13:26 +00:00
hoshi-hiyouga
ebf6a07681 Merge pull request #6368 from hiyouga/hiyouga/fix_llama_template
[template] fix llama3 tool template

Former-commit-id: 7c6763c4f3287f758077191361d5b0354741f84a
2024-12-18 01:10:48 +08:00
hiyouga
53f0fff513 fix llama3 tool template
Former-commit-id: 63f28a594a44c011f2e6d418f22ddbfc445db163
2024-12-17 17:05:10 +00:00
hoshi-hiyouga
ab7567693d Merge pull request #6367 from hiyouga/hiyouga/add_model
[model&template] add llama3.3 & support llama3 tool prompt

Former-commit-id: c32012c5e4943a30c3061716ed780d6124b6c90d
2024-12-18 00:13:28 +08:00
hiyouga
1b8aab0723 support llama3 tool prompt
Former-commit-id: dc45d2f56669fd99935a68cda1ec0e8f36229f7f
2024-12-17 15:52:37 +00:00
hoshi-hiyouga
30ebe61914 Merge pull request #5819 from yafshar/remote_code
Add trust_remote_code Parameter and Set Default to False

Former-commit-id: e82099350a2fb6d8ddf9c80ba0b18173057d4dcf
2024-12-17 21:10:24 +08:00
Yaser Afshar
6f1c8dacea Add missing key to init_kwargs
Former-commit-id: 03fc4621dad132164596a58d3e8693787b7e1aca
2024-12-17 12:34:05 +00:00
Yaser Afshar
8881237475 Add trust_remote_code parameter and remove True
- Introduced a new model parameter `trust_remote_code`
- Set the default value of `trust_remote_code` to `False`
  to enhance security


Former-commit-id: 4bf23f406cf5235c16f9f8139850c53354901814
2024-12-17 12:25:12 +00:00
zhaohu xing
584755be4b support telechat2 model
Former-commit-id: 15a069d85c07842cd28d65845af93c3cf70ef1f4
2024-12-17 12:15:33 +00:00
hoshi-hiyouga
3d3324be5c Merge pull request #6364 from hiyouga/hiyouga/control_reenterent_gc
[model] support non-reentrant-gc

Former-commit-id: a8a13cb360980bb4acd493e33ed405e07460fe73
2024-12-17 19:58:36 +08:00
hiyouga
4196d5b4d6 support non-reentrant-gc & fix #6358
Former-commit-id: 20446141e408885eb36d512bfb2dfb62bbc0c20d
2024-12-17 11:41:59 +00:00
hoshi-hiyouga
101c95ce65 Merge pull request #6363 from hiyouga/hiyouga/control_skip_eos
[infer] support control eos

Former-commit-id: 963640cff370be9f2fab649c88a120a645e6992e
2024-12-17 19:35:40 +08:00
hiyouga
19ebc0e7a2 support control eos, fix #6345
Former-commit-id: cb0f8399356bf372f3b7963f2565c3d504be0923
2024-12-17 10:42:05 +00:00
hoshi-hiyouga
1ce15b5d9e Merge pull request #6362 from hiyouga/hiyouga/mllm_packing
[model] generalized packing

Former-commit-id: b85f77a2687f7e0d11f7d2e49de54c544e39e3d5
2024-12-17 18:41:48 +08:00
hiyouga
d670d62a66 generalized packing & fix #6343
Former-commit-id: 3b1e4194616cacd5c24f08b328e31a008bddcf29
2024-12-17 10:26:19 +00:00
hoshi-hiyouga
6522467ddb Merge pull request #6359 from hiyouga/hiyouga/fix_qwen2vl_infer
[model] fix qwen2vl infer

Former-commit-id: 419cba5fae31a3c88305fe424b8aae9d59e3941a
2024-12-17 18:15:23 +08:00
hiyouga
aacd9642f5 fix #6348
Former-commit-id: 83e552320909f4775377889f1512994b7e638a7e
2024-12-17 10:06:46 +00:00
hoshi-hiyouga
4446c92517 Merge pull request #6334 from hiyouga/hiyouga/add_examples
[assets] update wechat and examples

Former-commit-id: 7725e7ac7d21ad844e8424a920e8bece6f38af19
2024-12-15 01:37:01 +08:00
hiyouga
8c65548b10 update assets
Former-commit-id: 7b9bd552b2bf97b72976511094eb51dfde5d1017
2024-12-14 17:36:03 +00:00
hiyouga
fb22651faf fix mrope
Former-commit-id: 55bee1d333549ca19858b3f5c1b7b86926e5fb09
2024-12-12 15:08:17 +00:00
hoshi-hiyouga
cfff136b2a Merge pull request #6253 from hiyouga/hiyouga/qwen2vl_mm_proj
[model] support qwen2vl train proj only

Former-commit-id: 0b0012142ab683da1e0558e6240310bf90f39150
2024-12-05 20:25:33 +08:00
hiyouga
bac2c64f87 support qwen2vl train proj only
Former-commit-id: 0e949ef03455726e907c6f1039e93ebe480c897a
2024-12-05 10:37:42 +00:00
hoshi-hiyouga
be1ec97c8e Merge pull request #6251 from hiyouga/hiyouga/vllm_qwen2vl_infer
[infer] support qwen2vl vllm infer

Former-commit-id: df76f7d6e124131ce7628c31cce01de4f8e6014c
2024-12-05 18:26:19 +08:00
hiyouga
bbd432415d support qwen2vl vllm infer
Former-commit-id: 03ddd2555fb97488cd4daab11e8b672d36150c5a
2024-12-05 10:17:26 +00:00
hoshi-hiyouga
1fef702382 Merge pull request #6246 from hiyouga/hiyouga/update_examples
[examples] update examples

Former-commit-id: ecb688bdb3e940651d64bc1edc85ce4568f3eabe
2024-12-05 16:49:30 +08:00
hiyouga
39865d8a1f update examples
Former-commit-id: bcb010be7732ae137f156932100ee4d02a93725c
2024-12-05 08:48:25 +00:00
hoshi-hiyouga
c7b27bd70b Merge pull request #6242 from hiyouga/hiyouga/fix_script
[script] fix scripts

Former-commit-id: cf254ea0891ea2e6522fdbefcccf409ff7aafd99
2024-12-05 11:54:46 +08:00
hiyouga
86e4fab0d5 fix scripts
Former-commit-id: f94f55d20283298cb7d90d0573992a62df414a8f
2024-12-05 03:47:32 +00:00
hoshi-hiyouga
ff3e40e4a5 Merge pull request #6160 from village-way/pr_dataloader
fix:tokenized_path not None and load_from_disk return Dataset Trigger…
Former-commit-id: 63de20970c8062aeebed5f366f1675beb12e05bf
2024-12-04 22:18:19 +08:00
hoshi-hiyouga
ea830cad0c lint
Former-commit-id: 191ccc585399ad4c6c2c4f280b144b2c0a4869f3
2024-12-04 22:08:27 +08:00
hoshi-hiyouga
225e270fd5 Merge pull request #6238 from hiyouga/hiyouga/vllm_batchinfer
[infer] feat: support batch infer in vllm

Former-commit-id: 886752801ba8a5bf6fc4853ed618817185950c11
2024-12-04 21:59:13 +08:00
hiyouga
c1768cfb14 support batch infer in vllm
Former-commit-id: 3ef5ed3b9a44eed2f7e3ff221dfc343d0a97c0b5
2024-12-04 13:50:00 +00:00
hoshi-hiyouga
53edd62f8b Merge pull request #6190 from JieShenAI/main
add vllm_infer script

Former-commit-id: 09c7ea700c83dcf8d75796a1e28a36197f62cab4
2024-12-04 21:19:23 +08:00
hoshi-hiyouga
41a7e128b6 Merge pull request #6170 from hykilpikonna/main
[+] Show the hostname in webui title

Former-commit-id: 1cb2f9da317a8db8f45e887ab57cdfdc0e8b9412
2024-12-04 18:07:29 +08:00
hoshi-hiyouga
6b8c41c3ac Merge pull request #6233 from hiyouga/hiyouga/vlm_zero3
[data] fix vlm zero3 training

Former-commit-id: b0cbd5e3464a8a1a0f1cf709fb107b23a61f34ff
2024-12-04 17:51:10 +08:00
hiyouga
2f09c34980 fix vlm zero3 training
Former-commit-id: 86fe7fe71b51077310357b7b1895522258f9bc7a
2024-12-04 09:40:39 +00:00
JieShen
76dc69ce36 add async call api
Former-commit-id: 0f728386d88cf8253250c6650555d41578114a0c
2024-12-01 22:18:05 +08:00
JieShen
6c9d05539a add vllm_infer script
Former-commit-id: 4daab843a3aa096b35e5d3832c01fac4271e4604
2024-11-29 14:22:20 +08:00
Azalea
b6bc17f730 [U] Compute hostname differently
Former-commit-id: fbc735972af6facdaba169603a4c77e613b2e8d7
2024-11-28 22:23:41 -05:00
hoshi-hiyouga
c07ba8ccc0 Merge pull request #6175 from hiyouga/hiyouga/add_qwq
[model] add QwQ

Former-commit-id: da8f565c359004d811481b8b85f2a36f30e95e23
2024-11-28 17:01:53 +08:00
hiyouga
ed86f621a0 add qwq
Former-commit-id: acad977356a7f2e729eb6f2cb919a416b18f8add
2024-11-28 08:50:57 +00:00
Azalea
c6a3175bbf [+] Show the hostname
Former-commit-id: 410847656a760fe4c2c310b0d770072392d7aefb
2024-11-28 12:25:02 +08:00
wangdepeng
452291417d fix: training gets stuck when tokenized_path is not None and load_from_disk returns a Dataset
Former-commit-id: cbf9da35728daaf98d92e699e891e334c74af1e5
2024-11-27 16:44:42 +08:00
hoshi-hiyouga
ab9db8b7c7 Merge pull request #6156 from hiyouga/hiyouga/add_o1
[data&model] add marco-o1, skywork-o1 and openo1

Former-commit-id: fa8aa1a3bcb49357799ec30fbb3f143a015e5d58
2024-11-27 14:36:01 +08:00
hiyouga
877e2ea791 fix dataset
Former-commit-id: d4a2d299414984a4043d30034c5c95e2d717a49e
2024-11-27 06:27:44 +00:00
hiyouga
6ea42d5b63 add skywork o1
Former-commit-id: 272a6fe972de926e5841c1570995f4e6fed9f28d
2024-11-27 05:51:59 +00:00
hiyouga
31c117e696 Merge remote-tracking branch 'origin/main' into hiyouga/add_o1
Former-commit-id: 5da8c00b233f96e51cf3bac7f25e3e61659d0cb7
2024-11-27 05:36:41 +00:00
hoshi-hiyouga
04f057334f Merge pull request #6157 from hiyouga/hiyouga/fix_ci
[ci] pin tokenizers version

Former-commit-id: 0357d7530d16699e728bc648abd08ea309e84865
2024-11-27 13:33:04 +08:00
hiyouga
99a54d06ca pin tokenizers version
Former-commit-id: 2b747737f0be2caeb737fe87dad6bf5902b4a588
2024-11-27 05:24:58 +00:00
hiyouga
8332c85f37 add marco-o1 and openo1 dataset
Former-commit-id: 51d49e075470951f109bcdde136203f972450c2e
2024-11-27 04:20:23 +00:00
hoshi-hiyouga
fcf1a3df62 Merge pull request #6152 from hiyouga/hiyouga/add_num_proc_in_data_load
[data] add num_proc in load_dataset

Former-commit-id: d8258ba7e792d5f17ae80d5e8b303e8fa820f162
2024-11-27 00:16:15 +08:00
hoshi-hiyouga
f4f52ae67d Merge pull request #6151 from hiyouga/hiyouga/fix_mllama
[model] fix mllama cross mask

Former-commit-id: 7e64661c1fc53c4d3d9fd915162b762e403b1991
2024-11-27 00:07:54 +08:00
hiyouga
0b08d5882a fix #6149
Former-commit-id: b581b272793314a9602f4dc2fb646a988a6249df
2024-11-26 16:03:02 +00:00
hiyouga
62eeafaba6 fix mllama cross_mask
Former-commit-id: c33967308bebd99489d28bd5a879525cf304c1f9
2024-11-26 15:56:58 +00:00
hoshi-hiyouga
5a52e41399 Merge pull request #6141 from hiyouga/hiyouga-patch-1
[misc] chore: lint

Former-commit-id: ba2b94c68eb08798792be76f95b94b358ce69f44
2024-11-25 23:02:11 +08:00
hoshi-hiyouga
e8083f8f3f lint
Former-commit-id: 57c3cf1f498d5ffafdc8c06e0f8713f8ff77de81
2024-11-25 22:55:56 +08:00
hoshi-hiyouga
338b3a03f0 Merge pull request #6140 from hiyouga/hiyouga/fix_mllama
[data] fix mllama plugin

Former-commit-id: b7e220a7d82db26cbe7ced9ed30332418cc4fa20
2024-11-25 22:32:07 +08:00
hoshi-hiyouga
c8b01b41ac fix #6139
Former-commit-id: a4e9552b9ade6ebb22d782f0412003279ddca23c
2024-11-25 22:22:06 +08:00
hoshi-hiyouga
6d08a418ed Merge pull request #6137 from hiyouga/hiyouga/fix_mllama
[model] fix mllama hidden_size

Former-commit-id: 54f1d3f4064b9d37261883e8399c8e7909178857
2024-11-25 20:17:33 +08:00
hoshi-hiyouga
e3066d1489 fix visual patch
Former-commit-id: ac51fa37cc23518b30a6123e188964dce39be82f
2024-11-25 20:06:06 +08:00
hoshi-hiyouga
487e3f2507 fix #6136
Former-commit-id: b84e5d91a070c473ea820c379bf9b5abbca6df2c
2024-11-25 19:43:42 +08:00
hoshi-hiyouga
b82a53cad8 Merge pull request #6127 from hiyouga/hiyouga/dev_version
[misc] set dev version

Former-commit-id: cb0a51031324c9fdf0c1fedf237692a40c2091d9
2024-11-25 01:42:29 +08:00
hiyouga
5bec82ca9d set dev version
Former-commit-id: a0aea74100a9505664023f6a46fc290e332dfa40
2024-11-25 01:36:49 +08:00
hoshi-hiyouga
57354fc990 Merge pull request #6124 from hiyouga/hiyouga/release
[release] release v0.9.1

Former-commit-id: f61cdd99fd282612884c92d36e111ad46b4e0d00
2024-11-25 00:20:02 +08:00
hoshi-hiyouga
89f240805c Merge pull request #6126 from hiyouga/hiyouga/fix_vllm
[inference] fix vllm

Former-commit-id: c5025c3ee6e67e62724cc3f34fbf8aa9968590f5
2024-11-25 00:19:54 +08:00
hoshi-hiyouga
27bbea886c Merge pull request #6010 from XYZliang/fix-#4316
Increase shm_size to 16GB in docker-compose.yml

Former-commit-id: 73194233f9f1aa8299be1360deb25b753338e168
2024-11-25 00:16:42 +08:00
hoshi-hiyouga
3ec3dda33a Merge pull request #6125 from hiyouga/hiyouga/fix_cli
[cli] remove shell=True in cli

Former-commit-id: cf3ec28baa9a9f1ba342fe3a627e85d8799a1912
2024-11-25 00:07:35 +08:00
hiyouga
ae9f338bf7 fix vllm
Former-commit-id: 9ce0e4b07e3733c015137bc93c7e6d53bf25b08e
2024-11-25 00:07:24 +08:00
hiyouga
bf44f76dc7 fix cli
Former-commit-id: 9338c287cc15c0cad8d5ddbdadfb6f64d383c034
2024-11-24 23:56:21 +08:00
hiyouga
c18581f0a4 release v0.9.1
Former-commit-id: a134ad42c65dc4d72e3083c932ddfaaa687c513d
2024-11-24 23:48:41 +08:00
hoshi-hiyouga
9f6c5c4798 Merge pull request #6123 from hiyouga/hiyouga/fix_qwen2vl_vllm
[inference] fix qwen2vl vllm infer

Former-commit-id: 5d886f99e3bd20795d5313dccf9f045d37a0aefc
2024-11-24 23:42:11 +08:00
hiyouga
7bc03ac986 fix qwen2vl vllm infer
Former-commit-id: 3ac98847fdc23129912c8994ed19a8c66fe00b8c
2024-11-24 23:27:24 +08:00
hoshi-hiyouga
85d7e4f4ab Merge pull request #6121 from hiyouga/hiyouga/readme
[readme] update readme

Former-commit-id: d603650a671c3a323f29001fd0cc53563d28f3e0
2024-11-24 03:28:09 +08:00
hiyouga
bf69747f40 update readme
Former-commit-id: 48423afe53d6f6de1a257a33019909009626a42e
2024-11-23 19:27:18 +00:00
hoshi-hiyouga
f1146bf7b6 Merge pull request #6120 from hiyouga/hiyouga/fix_ci
[test] fix ci

Former-commit-id: 573a0978b82986ec45aae16637edb6ff4af54a35
2024-11-24 03:21:11 +08:00
hiyouga
9efd1fec90 fix ci
Former-commit-id: 91c672f0147bb6eb998871a42f8a89992af88528
2024-11-23 19:13:32 +00:00
hoshi-hiyouga
3b91839a55 Merge pull request #5555 from marko1616/feat/llama3.2vl
Support llama3.2 vision

Former-commit-id: 8151dc488585d1cec6d4a0c9c6dcd46a6a57e9f0
2024-11-24 02:49:07 +08:00
hiyouga
bc4421eeef add forbidden modules
Former-commit-id: c9f4d051d0eca7515bab201afdef17f1ac1b3cb9
2024-11-23 18:34:15 +00:00
hiyouga
5003820a6a fix inputs
Former-commit-id: 7d535bb8cdf7e81edda81152e63c8cfe6c9dcc9f
2024-11-23 18:26:02 +00:00
marko1616
cd2485f28d Linter.
Former-commit-id: 719d124f65ebb18ba0a1212751da9909160fb6f1
2024-11-23 16:09:04 +00:00
marko1616
918a367378 Tiny fix.
Former-commit-id: 4c1cef12d812832eed58b5da562ba083104756d3
2024-11-23 16:09:01 +00:00
marko1616
3d35aeca72 Support llama3.2vl.
Former-commit-id: 664229d7d1f7994e1ae68c5d197ab81f081bcd2e
2024-11-23 16:07:35 +00:00
hoshi-hiyouga
53b1e5fd1d Merge commit from fork
[patch] Patch remote OS command injection vulnerability

Former-commit-id: 960897b950e29aa440afa45b4deb9d42d2f6e941
2024-11-21 22:39:44 +08:00
hoshi-hiyouga
b852c895cf do not split save_cmd ret value
Former-commit-id: 1e312072fb4a9f472e2d3fa7e6b4fb0aec00b566
2024-11-21 22:30:23 +08:00
superboy-zjc
aaa7ed8712 [patch] Patch remote OS command injection vulnerability
Former-commit-id: 4678ceea4ce334a8289caf87d86047e67c67c603
2024-11-21 01:52:12 -05:00
hoshi-hiyouga
205aca5b03 Merge pull request #6078 from wtmlon/support-efficient-tokens-calculation
support effective tokens calculation on sft/dpo

Former-commit-id: d0510e6d49b43c5ffadd8af653c3bdecc1582417
2024-11-20 13:43:15 +08:00
Ting
87b1f851f1 code refactor
Former-commit-id: ee3f85aa9677d0aeecb3bc396530d2cd7c50dce5
2024-11-19 20:33:18 +08:00
Ting
fca814b30d update
Former-commit-id: 516ed0ea5fed8c74fe3669a7e85dd89b5a0ec3c2
2024-11-19 19:12:10 +08:00
Ting
a20c2b6ecf update
Former-commit-id: a3e8ca53e654136242197a2da872cc0e5cf67880
2024-11-19 19:10:07 +08:00
Ting
fee94e1c54 support efficient tokens calculation on sft/dpo
Former-commit-id: b157d5cccdeb42412b8b440d25d5bdfa8a50be68
2024-11-19 17:15:47 +08:00
hoshi-hiyouga
047a596542 Merge pull request #6065 from hiyouga/hiyouga-patch-1
[misc] fix dep package version

Former-commit-id: 34a09e6cd1a8b1c2acddf837f1c787978bc526f5
2024-11-18 21:13:59 +08:00
hoshi-hiyouga
3d45606984 fix #6061
Former-commit-id: 4eb0b6763f0a1b3cde89bd5c69760178bb35d303
2024-11-18 20:56:44 +08:00
hoshi-hiyouga
310c107d56 Merge pull request #6052 from hiyouga/hiyouga-patch-1
[trainer] fix DPO metrics

Former-commit-id: 94add263fe874d2be1b37110faf5da7a5096df6d
2024-11-16 16:20:12 +08:00
hoshi-hiyouga
089e4d9e96 fix #6050
Former-commit-id: 028ea3d9b4fa4ab74a969ac80e61a449d6c15e74
2024-11-16 16:11:16 +08:00
hoshi-hiyouga
ae56c3cf49 Merge pull request #6046 from hiyouga/hiyouga/add_code_model
[model] add qwen-coder and opencoder

Former-commit-id: 5b485671aee8dd2f775371d0b9ff3d0d043159f3
2024-11-15 21:58:03 +08:00
hiyouga
0a0288a286 add qwen-coder and opencoder
Former-commit-id: 9669a42704cd40bdfc76ca278cc6a562549bc27d
2024-11-15 21:48:38 +08:00
XYZliang
25da686758 Increase shm_size to 16GB in docker-compose.yml to optimize shared memory allocation for large-scale model fine-tuning tasks.
This pull request increases the shm_size parameter in docker-compose.yml to 16GB. The goal is to enhance the LLaMA-Factory framework’s performance for large model fine-tuning tasks by providing sufficient shared memory for efficient data loading and parallel processing.

This PR also addresses the issues discussed in [this comment](https://github.com/hiyouga/LLaMA-Factory/issues/4316#issuecomment-2466270708) regarding Shared Memory Limit error.


Former-commit-id: de2616d103b4bdc2458874068b1a223c7de82b4e
2024-11-13 10:13:59 +08:00
hoshi-hiyouga
e2da3cc9fa Merge pull request #5990 from hiyouga/hiyouga/dev_vllm
[generate] fix vllm config args

Former-commit-id: ee0745022bd7484f4f2e6b183088f55d5e60c085
2024-11-11 14:10:35 +08:00
hoshi-hiyouga
c42e5cf401 fix #5988
Former-commit-id: 9e08e206a8ea9926768b0f1d5ff9d7e3e216c269
2024-11-11 13:57:14 +08:00
hoshi-hiyouga
9943cd1c96 Merge pull request #5982 from hiyouga/hiyouga/vllm_args
[args] add vllm config

Former-commit-id: 07d3de5c8376d3c4147411ec603da4254885d2d7
2024-11-10 21:37:18 +08:00
hiyouga
1e6f96508a add vllm config
Former-commit-id: 95365f0ce4f362bde7de8b679b54b548d7055bfb
2024-11-10 21:28:18 +08:00
hoshi-hiyouga
d401974f69 Merge pull request #5973 from JJJJerry/fix_vllm_generate
fix VllmEngine: replace the inputs parameter with prompt

Former-commit-id: d3271416a316e6b92aea3026f6941f6967215a7b
2024-11-10 21:04:38 +08:00
hoshi-hiyouga
09b2dbe859 Update vllm_engine.py
Former-commit-id: 5638fae81c180b7d91eb6aebe6629640beb217d8
2024-11-10 20:57:00 +08:00
JJJJerry
7f8ef8c132 fix VllmEngine: replace the inputs parameter with prompt
Former-commit-id: 5affb1d20921afd3fe48802ff80785e412e2e3aa
2024-11-09 11:45:59 +08:00
hoshi-hiyouga
fcb6283a72 Merge pull request #5971 from hiyouga/hiyouga/fix_webui
[webui] fix extra args

Former-commit-id: d04e21d69e60ab4a350e70da7d1abbf11cfeed0e
2024-11-09 00:25:24 +08:00
hiyouga
0027f46ccc fix extra args
Former-commit-id: 2c98a1bc3d885170f8298872c2ea2e24427fb447
2024-11-09 00:24:27 +08:00
hoshi-hiyouga
967a27695e Merge pull request #5970 from hiyouga/hiyouga/fix_beam
[generation] fix vllm v0.6.3

Former-commit-id: 571d4538568272fd59cc5621e56113329c857546
2024-11-08 23:58:15 +08:00
hiyouga
3ce8a326c6 fix #5966
Former-commit-id: a9a99b545609083533cca1fd1e5480c60ea68750
2024-11-08 23:49:16 +08:00
hoshi-hiyouga
91b56b7baf Merge pull request #5927 from hiyouga/hiyouga/dev_fixmmchat
[fix] chat engines

Former-commit-id: e9c22e2d089927eee3bce052bbf7d6502d0ac544
2024-11-04 16:36:23 +08:00
hiyouga
e2fa961302 add image input type
Former-commit-id: 6fe260e35ff12662b72f26ec9df44e87b9693551
2024-11-04 08:27:20 +00:00
hiyouga
87d6d7dc61 fix chat engines
Former-commit-id: 3a220b7992d265c77d9a1a406ef86eefbc699cfe
2024-11-04 08:18:12 +00:00
hoshi-hiyouga
00019e2ca4 Merge pull request #5926 from hiyouga/hiyouga/dev_deps
[version] update datasets version

Former-commit-id: 4a24e8fc8e1c229ef8751bd7eafe024661d46661
2024-11-04 16:04:00 +08:00
hiyouga
b104739d63 update datasets version
Former-commit-id: feba2c6418a15715fee77a34428fa3cf47fcee5b
2024-11-04 07:52:26 +00:00
steven
6ef0d13e42 support granite3 models
Former-commit-id: 8cff612e55eb7df116e51c4dd21e7a42543e7a1f
2024-11-04 10:35:03 +08:00
hoshi-hiyouga
b238d1aa04 Merge pull request #5914 from hiyouga/hiyouga/dev_read
[misc] update readme

Former-commit-id: 2897696bad6bcc2d826845750c0c913882449829
2024-11-02 21:44:10 +08:00
hoshi-hiyouga
aa497d5d96 Merge pull request #5475 from menibrief/main
Fix phi-3-small issues 

Former-commit-id: c1daf49a967f6c0b641c9639a78971275aaa7cae
2024-11-02 21:31:34 +08:00
hiyouga
fecf04b2f4 fix phi3 template
Former-commit-id: b62131a3c5b4ff6f2969a8041e6e7b9cf2c444ed
2024-11-02 21:31:23 +08:00
hiyouga
3f157e2f6f update readme
Former-commit-id: 94bae8360b1aa124cc57dca481b9e686ba559f31
2024-11-02 21:28:04 +08:00
hoshi-hiyouga
c7c558562e update template
Former-commit-id: 3559ef6115a831dcd1adf7210995ffd62890cff6
2024-11-02 21:21:22 +08:00
hoshi-hiyouga
c2ea5fb618 Merge branch 'main' into main
Former-commit-id: 154f504fc2cebaae2b58c0121d6d8d8016db1bb2
2024-11-02 21:20:27 +08:00
hoshi-hiyouga
fa9c32bb8d Merge pull request #5913 from hiyouga/hiyouga/dev_metrics
[train] support gather DPO metrics, fix return output

Former-commit-id: a17ac67f22c4de7699a8f2c1d4980af4babd2c7e
2024-11-02 21:13:43 +08:00
hiyouga
c610deb5a2 fix webchat
Former-commit-id: 071fe40f209156f994c069507a2d53cc4f586d67
2024-11-02 21:04:18 +08:00
hiyouga
2bb3255e74 fix dpo metrics
Former-commit-id: 57029280da825a39fbf5a05097921b861f126669
2024-11-02 20:59:01 +08:00
hoshi-hiyouga
b28b74c71e Merge pull request #5880 from sd3ntato/make-image-parametric
make base image parametric.

Former-commit-id: e2ea7c8b67cf598bba2b2b298e638b23712f14b3
2024-11-02 20:26:14 +08:00
hoshi-hiyouga
1ed921bff7 Update Dockerfile
Former-commit-id: 89a1c1eb6d717b20107c06a645652b87fba388e8
2024-11-02 20:20:26 +08:00
hoshi-hiyouga
80f634cc95 Merge pull request #5910 from Cuiyn/index
Support Index series models.

Former-commit-id: b74d9fa8efeb4f52ba0e20538ad90c8b40492e29
2024-11-02 20:16:54 +08:00
Cuiyn
a3eb5e200c fix: rename to Index-1.9B-Character-Chat and Index-1.9B-Chat-32K
Former-commit-id: 95ab64749155a781ab5e55b989388ccd9e094c8d
2024-11-02 20:04:14 +08:00
hoshi-hiyouga
2d02c0e22d Merge pull request #5912 from hiyouga/hiyouga/dev_logging
[misc] support rank0 logger

Former-commit-id: ed34a6322814f302f050ba8ca4ecc53689f4d646
2024-11-02 18:48:41 +08:00
hiyouga
093eda2ad6 support rank0 logger
Former-commit-id: 84528eabe560091bfd866b6a0ca864085af7529b
2024-11-02 18:31:04 +08:00
Cuiyn
dbaf621f57 Add support for Index
Former-commit-id: 4e6dba16ca1755235d2ae117b53b68c5ae2f239a
2024-11-02 13:45:27 +08:00
hoshi-hiyouga
ceb701c2d4 Merge pull request #5909 from hiyouga/hiyouga/dev2
[data] support auto convert for single image, add image_dir argument

Former-commit-id: ced43fa0c84f7d0792694721d2c5e572c0d0e718
2024-11-02 13:43:04 +08:00
hoshi-hiyouga
29ad3783f5 Merge pull request #5907 from hiyouga/hiyouga/dev
[data] fix template replace behavior

Former-commit-id: 0a51c0bfdd9b193d2a3ac34a62fe8b073569c41a
2024-11-02 13:42:53 +08:00
hiyouga
fa2386e73c fix #5904
Former-commit-id: 079ebe038b11f36a11681dc8688f8ea48bccf324
2024-11-02 13:08:15 +08:00
hiyouga
e0045e8386 fix #5883
Former-commit-id: 73b93caa9ac16ffd8d3faae24d16210d85ae9754
2024-11-02 13:06:34 +08:00
hoshi-hiyouga
b94c941196 Merge pull request #5906 from hiyouga/dev
[test] update tests

Former-commit-id: f95f2824b3c078508408da23e1958292dc96d0fa
2024-11-02 12:50:43 +08:00
hiyouga
ba66ac084f update tests
Former-commit-id: 4e92b656e324725048d914946e70867be20032ff
2024-11-02 12:41:44 +08:00
hoshi-hiyouga
83479c9ef0 Merge pull request #5895 from hiyouga/dev
[inference] support multiple images

Former-commit-id: 491132e5db483fd00aa9f3cbc201b8fb83693f57
2024-11-01 16:52:55 +08:00
hiyouga
df8ac15ef0 add examples
Former-commit-id: 9eff9625adba643263bc6cba480f30edc6bb086a
2024-11-01 08:41:54 +00:00
hiyouga
8cea5cd967 support multiimage inference
Former-commit-id: 8083e4607549e805eb308c4e93c8aa256202f438
2024-11-01 07:25:20 +00:00
Valerio Mariani
a2d7d6a518 make base image parametric.
default `BASE_IMAGE` is nvcr.io/nvidia/pytorch:24.02-py3 for backward compatibility


Former-commit-id: db8d00536acb02b29d10a3d735438d194656ece3
2024-10-30 21:53:32 +01:00
hoshi-hiyouga
a63e624eca Merge pull request #5873 from hiyouga/dev
[misc] update readme

Former-commit-id: e02c3bea981dff6beae45a9428d5d88d210db5e1
2024-10-30 17:14:44 +08:00
hiyouga
8596c321ce update readme
Former-commit-id: b3d3b440e8879198603da042441d4b4f84296109
2024-10-30 09:14:01 +00:00
hoshi-hiyouga
54cd799aa0 Merge pull request #5871 from hiyouga/dev
[loss&ui] fix incorrect loss of vlms, add extra args to ui

Former-commit-id: 5f4a62b600ab47db6aab3a1f831ecfe1df4335d9
2024-10-30 17:13:17 +08:00
hiyouga
8185eb1890 fix incorrect loss value for vlms
Former-commit-id: 0aa29a71ce958343a2086090d647eb63b8f5f5be
2024-10-30 08:56:46 +00:00
hiyouga
03213984ec tiny fix
Former-commit-id: b8f4b145506851cf5488cd8551a04d1c7603019b
2024-10-30 08:56:29 +00:00
hiyouga
aeeee9d4b5 support extra args in llamaboard
Former-commit-id: da0a5fd612e2214cc4bcb72516efd768fbe18a20
2024-10-30 08:55:54 +00:00
hoshi-hiyouga
c8a1fb99bf Merge pull request #5581 from Kuangdd01/pixtral-patch
[WIP] Support Pixtral-12B

Former-commit-id: fcddf4ec5c2914f73e23eeda2dbf67b048246669
2024-10-29 22:29:10 +08:00
hoshi-hiyouga
f0181a41ff fix bug
Former-commit-id: e69665746d9fcd17a92ace7d5d9c8de1fc0c29b7
2024-10-29 22:19:04 +08:00
hoshi-hiyouga
f6b06d0c6f Update mm_plugin.py
Former-commit-id: 830315cb438e75b589017fd57f70d0a513780a53
2024-10-29 22:16:22 +08:00
hoshi-hiyouga
1047217f78 Update template.py
Former-commit-id: 99a01547ca31adade1c48feae5796e06b73d387c
2024-10-29 22:11:21 +08:00
hoshi-hiyouga
16a9a44849 Update visual.py
Former-commit-id: 6f1db7b9abfbdea1781452388d66df3e9f9a5dd9
2024-10-29 22:10:29 +08:00
hoshi-hiyouga
58fb24ce41 Update collator.py
Former-commit-id: 941fa8a0d9c3a9106ad0af6e776db7e57f69548f
2024-10-29 22:03:42 +08:00
hoshi-hiyouga
a9afffa246 Update hf_engine.py
Former-commit-id: 7412a8b95678ca6827a8c42c9f4d38115fede897
2024-10-29 22:00:59 +08:00
hoshi-hiyouga
1fdd053022 Update README_zh.md
Former-commit-id: e14535aa97062d0e57bbf1230c050f2c56a45556
2024-10-29 21:58:03 +08:00
hoshi-hiyouga
0a833968a0 Update README.md
Former-commit-id: 65be32f6b12c2be80a12a4e903001820f64a0833
2024-10-29 21:57:28 +08:00
hoshi-hiyouga
58b681de78 Merge pull request #5801 from NLPJCL/main
Projects using LLaMA Factory: RAG-Retrieval uses LLaMA-Factory as the fine-tuning framework for reranker tasks with generative methods.

Former-commit-id: cc9995cc99a7d7ba2958094bcd3d597eddc349e3
2024-10-29 21:20:16 +08:00
hoshi-hiyouga
22d5fc5f4c Update README_zh.md
Former-commit-id: 9e356805aa631810fd5897cb6a6cfc1fe0e939ab
2024-10-29 21:19:17 +08:00
hoshi-hiyouga
cc0119f698 Update README.md
Former-commit-id: 9181486c630bca23f68868128c9b0e04a0d7cea4
2024-10-29 21:18:15 +08:00
hoshi-hiyouga
580cedebde Merge pull request #5857 from hiyouga/dev
[train] fix saving processor

Former-commit-id: 5aaa90124483c8b54225797fa91065ed072d171a
2024-10-29 21:12:04 +08:00
hiyouga
43bd1b070c fix #5749
Former-commit-id: c36c5c61fc022b3f144d4c798ec584c4954b0181
2024-10-29 13:02:13 +00:00
Kingsley
42aa9c65be Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: 438302edfdb66b6397266b8b17ac66f60a89300c
2024-10-29 21:01:25 +08:00
hoshi-hiyouga
b0b87fa33f Merge pull request #5852 from hiyouga/dev
[misc] several important updates

Former-commit-id: 5bc5ddf3b62abc132df08be477ffb46e9257e2ba
2024-10-29 20:30:02 +08:00
hiyouga
22912eba1a fix pissa
Former-commit-id: 4ac65a318b87249d42ffa73cbd3b33f0934f2afa
2024-10-29 12:18:45 +00:00
hiyouga
e2748fa967 fix #5747
Former-commit-id: 26d07de349c98b547cd6a6166ea20616d08ba343
2024-10-29 10:47:04 +00:00
hiyouga
248d5daaff use pre-commit
Former-commit-id: 7cfede95df22a9ff236788f04159b6b16b8d04bb
2024-10-29 09:07:46 +00:00
hiyouga
8f5921692e update requires
Former-commit-id: cae0e688ddcead370821e126c192bddc53ff6017
2024-10-29 16:10:07 +08:00
grok
e880eb8844 Update README_zh.md
Former-commit-id: e0c4aa091e71bcb4be44f5a07bdda5df6b949af2
2024-10-23 23:50:56 +08:00
grok
dc076c4e52 Update README.md
update english readme

Former-commit-id: c295a8b549603ec1d58f460c041401e1393d18b5
2024-10-23 23:49:47 +08:00
grok
8306e93ef3 Update README_zh.md
Former-commit-id: 77e39e7c34410a24055ab63cc088e6ec768d49c7
2024-10-23 23:36:14 +08:00
hoshi-hiyouga
6a2cd129c0 fix #5797
Former-commit-id: 71d23ed3444f24b31785d9f0f6dd711f6f516731
2024-10-23 20:49:44 +08:00
KUANGDD
30d7f6a22e rm comment
Former-commit-id: 80b58eaaec1996571d24b2dc2b73859cc28911a1
2024-10-23 15:50:59 +08:00
KUANGDD
5440ebbae6 rm useless code
Former-commit-id: 2dc337a49a8646ce916981b2914718e7472b5946
2024-10-23 15:38:11 +08:00
KUANGDD
22dbe694e9 Merge branch 'pixtral-patch' of https://github.com/Kuangdd01/LLaMA-Factory-X into pixtral-patch
Former-commit-id: 10c58488558549c382f9bba43c487d7f9222f16e
2024-10-23 15:32:50 +08:00
KUANGDD
64ac6ca396 rm import torch
Former-commit-id: 561a0f8155afca20ac699e124320b0eaef6dac07
2024-10-23 15:32:33 +08:00
Kingsley
377d37fa7f Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: f3ad96aea6f2602981bf5f27d2bbd1f729d11aa0
2024-10-23 15:30:03 +08:00
KUANGDD
55296744a8 Merge branch 'pixtral-patch' of https://github.com/Kuangdd01/LLaMA-Factory-X into pixtral-patch
Former-commit-id: 3c1694157d61d88fd53fb3c9197196013b98e0e7
2024-10-23 15:28:19 +08:00
KUANGDD
d0889012c2 modify style & minor changes
Former-commit-id: c988477d14dc656450d5fec31895781b7f9f7dce
2024-10-23 15:24:07 +08:00
hoshi-hiyouga
3a8b2890eb fix test
Former-commit-id: a0a23f79d2d94d68e3bf1e90b95beff817bc409c
2024-10-22 12:35:36 +08:00
hoshi-hiyouga
5b2284a51d fix #5768
Former-commit-id: 9f9e3fd186ce917f0b323c8cd42cf050ed238c58
2024-10-22 11:06:22 +08:00
hoshi-hiyouga
4807d8a4ef Update misc.py
Former-commit-id: fe9a927f1ea8e44e0429b437e5feecf13e34e9aa
2024-10-17 19:48:51 +08:00
hoshi-hiyouga
c6e1313977 Update loader.py
Former-commit-id: 3b229a27a108b840e6bed3c8684737f51ce9faf4
2024-10-17 19:48:12 +08:00
hoshi-hiyouga
66819fd3ee Update README_zh.md
Former-commit-id: a829d4a28fae77b08a6ea451479c71578b3b552f
2024-10-17 19:47:33 +08:00
hoshi-hiyouga
bd85e370be Update README.md
Former-commit-id: f62b0682e476dd62a4a3ac5620f8fc244e8bf150
2024-10-17 19:46:36 +08:00
BUAADreamer
cc097174cc tiny fix [skip ci]
Former-commit-id: 937f69190e529fe7bf0fdf58d7bbb39017854c5e
2024-10-16 15:55:30 +08:00
KUANGDD
7d135bbdb8 remove useless codes
Former-commit-id: 01247fcdde215398ec67cbd6cf1bc6cfb512a9ba
2024-10-16 01:14:51 +08:00
KUANGDD
4845a76535 fix bug for webui infer
Former-commit-id: 17768832908cc59ab64ed72522b2954c575ce21d
2024-10-16 01:09:33 +08:00
Kingsley
67645c0db8 Merge branch 'pixtral-patch' of https://github.com/Kuangdd01/LLaMA-Factory-X into pixtral-patch
Former-commit-id: 995eae4333f4346734d76f7d18cfffb5147e2f7b
2024-10-15 17:09:56 +08:00
Kingsley
f463b3f038 add extra test for pixtral mm_input
Former-commit-id: c706ec8a5dbd3c72ab15a709668624c0c7bbd8ce
2024-10-15 17:09:24 +08:00
BUAADreamer
01defc2779 tiny fix [skip ci]
Former-commit-id: 95f968eec2628cb26b3c4f4d4e81a9536e23cc31
2024-10-15 13:53:33 +08:00
Kingsley
c9e77ab352 Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: da6eb7bab2b4e551366d33b81083773cfd45ec08
2024-10-15 13:41:10 +08:00
BUAADreamer
c3de160d1c fix some
Former-commit-id: c9b644693996f96d234349823911fc267635acb9
2024-10-15 13:30:41 +08:00
KUANGDD
3693d7b571 plugin test & check
Former-commit-id: 76c7c8c5a729b8b43e3a31efc44f2c9c2678bf3d
2024-10-15 12:12:46 +08:00
hiyouga
a63144c28f fix #5705
Former-commit-id: 0c85fd253f860eee3c7b9b5a4e77ffbf93af372a
2024-10-15 10:10:16 +08:00
KUANGDD
2b3b0473cd required transformers version
Former-commit-id: d9915db327a038c93b5e3421c90b1f218fb23f92
2024-10-14 21:11:09 +08:00
Kingsley
9d929897ce remove bs condition
Former-commit-id: bf3520178ab66058c62a9cf31b42f36a9d88ce20
2024-10-14 16:55:59 +08:00
Kingsley
313a5e1494 Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: 28696e2f945a9f55e4ca9e9dc5ebd8af9df45d8b
2024-10-13 17:42:02 +08:00
hiyouga
74dd25224a fix #5668
Former-commit-id: 116f2946201d55305f6b57b3f926670a3e2173c8
2024-10-12 01:24:43 +08:00
hiyouga
c7efc7f2ed tiny fix
Former-commit-id: 1fe424323b212094856f423351dc2a15774d39c3
2024-10-11 23:51:54 +08:00
hoshi-hiyouga
c71c78da50 Merge pull request #5665 from johnnynunez/main
vllm 0.6.3

Former-commit-id: 6f8a9581fa406e255ca6955794f16cc06b5cf287
2024-10-11 23:45:58 +08:00
hoshi-hiyouga
f4897da009 Merge pull request #5642 from huniu20/main
[hub support] add modelers hub support

Former-commit-id: ea96c8ba3f81546df1311ca738ff961aa4ef7446
2024-10-11 23:45:17 +08:00
huniu20
a6951db970 bugs fixed
Former-commit-id: 5457ba7512d70564ea784b9ec6bdb86cfd2d7e3d
2024-10-11 19:56:13 +08:00
Johnny
9d27aaa38f Update parser.py
Former-commit-id: 60b13c86f4feaffbb43f5a23a28376fe416ed118
2024-10-11 12:29:33 +02:00
Johnny
3b19b6f31b Update setup.py
Former-commit-id: f85b756ffafa241304624819b7612603ad5e0ee3
2024-10-11 12:29:09 +02:00
huniu20
5b15ca0b0b add om_hub_token argument
Former-commit-id: b3214e69d32067a1c22dbd60c2cde1545ba75b19
2024-10-10 17:16:46 +08:00
huniu20
aad79127e6 1. add model and dataset info to support webui
Former-commit-id: 92f6226f3fecbd9af744a7232dda2c68b2bb0d86
2024-10-10 16:46:34 +08:00
huniu20
c42dcab32b 1. add modelers hub support
Former-commit-id: 14678eb444d8181176745d18d4a6865fd6860f58
2024-10-09 17:21:37 +08:00
Kingsley
be519c84d9 Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: 2076d00dfbe1279a91207157fd6d9a118427626a
2024-10-08 21:04:08 +08:00
hiyouga
b2dc6dc59a tiny fix
Former-commit-id: d8ddd07c2ed14d871fb25743c20265fc99e3e221
2024-10-08 17:48:56 +08:00
hoshi-hiyouga
9df626dc18 Merge pull request #5546 from chengchengpei/cpei/refactor
1, log exceptions in details; 2, check processor is None before calling it

Former-commit-id: 81c23ebdd7ef46102437b1d352818fe205fa3851
2024-10-08 17:46:54 +08:00
hoshi-hiyouga
8d4b9200a1 Merge branch 'main' into cpei/refactor
Former-commit-id: c2951f17f726470bcd5dff6bf7028ec90212442e
2024-10-08 17:31:17 +08:00
hoshi-hiyouga
7806df46ba Merge pull request #5615 from johnnynunez/patch-1
Update setup.py (Compatible with Jetson)

Former-commit-id: baa3cd4c0db2502cf8a606e034df20492a83e6b2
2024-10-07 16:50:34 +08:00
hoshi-hiyouga
bba026a212 Update parser.py
Former-commit-id: e7d291605f184f6ac48429015e15755192d2f274
2024-10-07 16:27:23 +08:00
hoshi-hiyouga
6e111eb29f Update setup.py
Former-commit-id: 4c017fe014b708d79c65eff24329b9c324399461
2024-10-07 16:26:50 +08:00
Johnny
2b69ae0eb2 Update parser.py
Former-commit-id: 55c449b54aec04e2141bffe75d4016cbac9ef4c5
2024-10-07 10:17:45 +02:00
Johnny
13d73574ef Update setup.py
Former-commit-id: 73d3f93496712edace38711613e14768922d6c96
2024-10-07 10:16:53 +02:00
hiyouga
bc264807ae update readme
Former-commit-id: 915f25e9b34fc4554fd1198a383f96a2536fec60
2024-10-07 11:31:18 +08:00
Johnny
f9815dd20a Update parser.py
Former-commit-id: f832edc8dc0e2b78c12dc8edd702fe147a0a5292
2024-10-06 20:34:19 +02:00
Johnny
1f58943b32 Update setup.py
Former-commit-id: b4de2c84b078194bb6358697fd6815d622843f58
2024-10-06 08:53:55 +02:00
hiyouga
6476507429 fix #5611
Former-commit-id: 3bef07ecf0557999bb0b33b650a778addc8e5b91
2024-10-06 10:34:55 +08:00
hiyouga
35862d19ec fix #5611
Former-commit-id: 76c813d37c1d945a8bb6d3e4168e15fbe97c7a87
2024-10-06 10:33:11 +08:00
Kingsley
1272cb00df Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: 9372ac93f304db438383d539ccd00bffe7415dbc
2024-10-01 00:52:31 +08:00
Kingsley
e9ac26db4c unfactor md
Former-commit-id: 1a79d61f8d25a4c1127c2f393418e14ab9d2abd4
2024-09-30 23:36:16 +08:00
hiyouga
20ee1d2e19 fix #5542
Former-commit-id: cf28e7418c2eb07e86923a53ef832ef218e45af1
2024-09-30 23:28:55 +08:00
Kingsley
cbc1dd0c88 sync with former
Former-commit-id: f8707e52586182144c4fb70c7c0de8bf7044ef5e
2024-09-30 20:27:05 +08:00
Kingsley
870bbabbc4 register model fix
Former-commit-id: 077d8e3c0344d944705254cc5a2cd06c9f5dc116
2024-09-30 20:04:47 +08:00
Kingsley
8fd84c375e fix some errors due to inconsistency of model cards
Former-commit-id: dd83265b9b8768eb8732f59ace128dfe4aac1c47
2024-09-30 19:58:34 +08:00
Kingsley
32b5364051 Merge branch 'hiyouga:main' into pixtral-patch
Former-commit-id: df0baeaa3fd093433d92b7921d3a57d88061d6d4
2024-09-30 19:33:29 +08:00
hiyouga
cf72aec098 add patch processor func
Former-commit-id: 0cd6327da6a044b4a62f203a662e5bb6068d9c29
2024-09-30 17:07:43 +08:00
hiyouga
87849d12d2 lint
Former-commit-id: d7564365f4008e468f89102879d6e65c627ad447
2024-09-30 17:00:33 +08:00
hoshi-hiyouga
a19512436f Merge pull request #5585 from shing100/main
Support EXAONE3.0 Model

Former-commit-id: 2fba28d586757bbb3ac57e4dd10c756381766b51
2024-09-30 16:56:08 +08:00
hoshi-hiyouga
6c89d93aea Update constants.py
Former-commit-id: 7c04e1caea38fd1e1e9abcf8ed1bbdc24ddd6df1
2024-09-30 16:47:52 +08:00
hoshi-hiyouga
345f40a660 Update template.py
Former-commit-id: d893289b595c0530b5aeb8902369885118809b86
2024-09-30 16:39:48 +08:00
Zhangchi Feng
8b9a814653 Merge branch 'main' into pixtral-patch
Former-commit-id: 0cf52d48fbc505e2fba29e5df0f2e6722db7ac79
2024-09-30 12:37:03 +08:00
shing100
05fabf9095 fix chat template Exaone3.0
Former-commit-id: 2e32864b59c1ef1a78f3eb1c28fbf578cfaa19cd
2024-09-30 09:44:21 +09:00
Geun, Lim
95eede911a Update README_zh.md
Former-commit-id: c4bf9d86e14a9d7a5ed5f9c49d73006d13df2707
2024-09-30 09:25:02 +09:00
Geun, Lim
7bc7f7d673 Update README.md
Former-commit-id: d014eb931cd9ed70abb8a466281668a0b00ba9f9
2024-09-30 09:24:44 +09:00
shing100
054fdbe186 update docs Support model Exaone3.0
Former-commit-id: e6fbf8fd7c84cfb11a0a4a173657b1541806b5f9
2024-09-30 09:19:27 +09:00
shing100
f0f80819a0 add Exaone3.0 template
Former-commit-id: f7478af1d04353ab13236323e3bfb96fd2870fce
2024-09-30 09:18:25 +09:00
hoshi-hiyouga
e702678252 Merge pull request #5574 from BUAADreamer/main
support llava-next(video)/video-llava

Former-commit-id: bf7611e15a7e7ee9fb870efeba9bdac358c6d462
2024-09-30 00:22:43 +08:00
hoshi-hiyouga
553579986a Update common.py
Former-commit-id: 7f7f4b67b8b757e3787a78993cf083552cd5fbbd
2024-09-29 23:58:09 +08:00
hoshi-hiyouga
622cb04f27 Update README_zh.md
Former-commit-id: 01ee426c745f522bd0dee79ace2c6b2eb52d0510
2024-09-29 23:56:32 +08:00
hoshi-hiyouga
f3ba11a432 Update README.md
Former-commit-id: 45b79a78f62a1d916083f8c74ebf08ad0fb8fe6f
2024-09-29 23:55:55 +08:00
hoshi-hiyouga
8b1f53bca5 Update README.md
Former-commit-id: 0bcf6a30ae95d5c76e477f829f6ba633d9ccdd64
2024-09-29 23:55:21 +08:00
hoshi-hiyouga
ac25fef80e Update constants.py
Former-commit-id: a0dd90fa41fc10d7944521d95a312631be64af8f
2024-09-29 23:45:34 +08:00
hoshi-hiyouga
15f819d273 Update test_mm_plugin.py
Former-commit-id: 8490ba1bb3b429d10c5a1cf791aa1bfe3547fd5f
2024-09-29 22:59:47 +08:00
BUAADreamer
f2d1c43d28 fix template
Former-commit-id: cfd05bb009895a936c59f3d97afebf2ed8006f84
2024-09-29 22:56:36 +08:00
BUAADreamer
464acc7d6c fix template
Former-commit-id: 6291c933448022ae80fd85d7f1d785bf6c0fcb25
2024-09-29 22:55:45 +08:00
BUAADreamer
a96c5da737 fix constants
Former-commit-id: e66a338410be6812064a119d8c6a6644e0f035d1
2024-09-29 22:40:43 +08:00
BUAADreamer
28d09b81c9 Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory
Former-commit-id: 2358bdde973dfde3abff251d02f7622e9c144e4d
2024-09-29 22:00:35 +08:00
BUAADreamer
a769d0e3d4 fix constants
Former-commit-id: 69309a23598995aa1937fd8d80732a018c18db87
2024-09-29 22:00:01 +08:00
hoshi-hiyouga
1b98b5e65c Update requirements.txt
Former-commit-id: bd3b235904aae267ead8db1809d06d6935d2ea30
2024-09-29 21:51:23 +08:00
BUAADreamer
3cc5408da7 fix style
Former-commit-id: dc1bdcb69e6f2c605a2c533dab15613affc902f4
2024-09-29 21:39:37 +08:00
Zhangchi Feng
689f5c4554 Merge branch 'main' into main
Former-commit-id: 7566589b820e6030269523e9d08c312594f893ae
2024-09-29 21:32:54 +08:00
BUAADreamer
ab5d042cd3 add more llava-next series template
Former-commit-id: 93f64f2aebf41582d39aa8a2c6059e562ca694b0
2024-09-29 21:29:29 +08:00
BUAADreamer
4d43317aa1 Merge branch 'main' of https://github.com/BUAADreamer/LLaMA-Factory
Former-commit-id: bf6d6eb0bfe00453a77bbe42a3842b856dd2e47f
2024-09-29 20:55:23 +08:00
BUAADreamer
ed3b0c5b40 fix readme_zh
Former-commit-id: b663d664793b79c02db1b91d206dea2beb168e26
2024-09-29 20:55:18 +08:00
hoshi-hiyouga
67a97794ee Update mm_plugin.py
Former-commit-id: 507de0df036e39eae3a3887ded9165bd918ee48f
2024-09-29 20:54:04 +08:00
hoshi-hiyouga
2c7c93cb9b Update mm_plugin.py
Former-commit-id: b8be270f9c97bfcaf431bbd9f06c4c0b83980539
2024-09-29 20:53:34 +08:00
BUAADreamer
4d4fe08d14 fix readme_zh
Former-commit-id: 4621cc3e0b8a5dc7fcfa7cf2d60ff1838aef9a1a
2024-09-29 20:46:47 +08:00
BUAADreamer
85a919b6f7 fix readme
Former-commit-id: 867e7e70dbff207dbd78668af09a638654937f71
2024-09-29 20:45:02 +08:00
BUAADreamer
fe2abe20fc tiny fix
Former-commit-id: 0c7c875d55bc45795a41c0b8a5c407d72b1f3d8d
2024-09-29 20:38:46 +08:00
BUAADreamer
12444720db fix style
Former-commit-id: 7b922803586c05981cd095cfb730061091f0204c
2024-09-29 20:30:57 +08:00
BUAADreamer
510faf5805 fix tests
Former-commit-id: e932907f6f6473bd6917d61a464366cc9918f66c
2024-09-29 18:00:45 +08:00
BUAADreamer
722e01c8ab fix some
Former-commit-id: aeca8c0f978cb9754e0526b40cd431aaf867044f
2024-09-29 17:55:40 +08:00
hoshi-hiyouga
6050e6cff9 update readme
Former-commit-id: e5c8634cbd4e00459894c031ef0e10fcc6ef5775
2024-09-29 05:02:44 +00:00
hoshi-hiyouga
c8abbe4fc3 Merge pull request #5580 from amrear/main
made a small change to a warning about fa2 for gemma2 models.

Former-commit-id: 5e2d90ab976dd55b8c61a68e929d7e5b3583156c
2024-09-29 12:45:03 +08:00
BUAADreamer
f2881c9d4a fix some params of visual regularize
Former-commit-id: 15cbc35af4559dad73c09317e82a63571a8c3540
2024-09-29 12:38:25 +08:00
hoshi-hiyouga
1ded3abdf1 Update attention.py
Former-commit-id: 2adf79c195053bb4541e0317573a2c89da28b5bc
2024-09-29 10:47:41 +08:00
Kingsley
e641f1215a Tiny fix
Former-commit-id: ae66e1a545f4cd209a57fd824f9bfb7e94436cba
2024-09-29 00:00:23 +08:00
Amirreza A
ca736bcab7 made a small change to a warning about fa2 for gemma2 models.
Former-commit-id: e0695a026d822c896cb4f5b33e0c4f88441d75e9
2024-09-28 19:03:36 +03:30
Kingsley
bddb2646bd tiny fix
Former-commit-id: 35bc71b2a68fd303798c35fe22ad29ceea87cf9b
2024-09-28 22:50:53 +08:00
Kingsley
e4c57f54f8 remove some unnecessary if conditions
Former-commit-id: 482d3e5ff3338385da664475fee88c7dc623c993
2024-09-28 02:14:06 +08:00
BUAADreamer
6de82ca843 fix some
Former-commit-id: 12e509da85af76ccf1e9a879a78e450a7b70cc4b
2024-09-28 01:15:33 +08:00
BUAADreamer
b2c02df555 modify some style
Former-commit-id: 36bc408b8296cfc6d565b2f968fb1059bc6d1305
2024-09-28 01:07:38 +08:00
BUAADreamer
ca86d6361e add tests
Former-commit-id: f0ed66bf6f9b45e0c3fddb5179a93363f5a4194f
2024-09-28 00:59:14 +08:00
BUAADreamer
b6fb00e046 add llava-next/llava-next-video/video-llava
Former-commit-id: a4e4239931b0b0e3fd12c9f9bbfd2c201cbc78ca
2024-09-28 00:57:03 +08:00
Zhangchi Feng
86c84972c8 Merge branch 'hiyouga:main' into main
Former-commit-id: 2695dcdf468f9e39e3aeec7892eb3dad399736ee
2024-09-27 18:14:39 +08:00
Kingsley
9390927875 add pixtral template
Former-commit-id: c7b4e47e0fda955272ccd6340b2047fd92acbfcf
2024-09-26 17:14:51 +08:00
Kingsley
c4a585f232 Merge branches 'pixtral-patch' and 'pixtral-patch' of https://github.com/Kuangdd01/LLaMA-Factory-X into pixtral-patch
Former-commit-id: 197bb14e6308bdf9af65eafe7bf06b36dbf96df6
2024-09-26 12:18:25 +08:00
Kingsley
300feb3245 add pixtral template
Former-commit-id: e0bcaa6c6e902e29361438a6d215bbc2535b648f
2024-09-26 12:11:58 +08:00
Chengcheng Pei
cacafb0038 address comments
Former-commit-id: 6311bb2ca266ce156537cfa477202b2904921593
2024-09-25 21:07:51 -07:00
hoshi-hiyouga
6509114259 Merge pull request #5547 from marko1616/chore/llama3.2
Chore: Support llama3.2.
Former-commit-id: 979ecc92a0db6b90ed8249d9a17120d5ed18b6aa
2024-09-26 11:38:34 +08:00
hoshi-hiyouga
7d4cb79822 add modelscope models
Former-commit-id: 4de3081eea9cede78a1f2db65cf22a5731c54447
2024-09-26 11:22:48 +08:00
marko1616
b867e164fe Chore: Support llama3.2.
Former-commit-id: 2741ac784c1a776bd545fa6dffc07b6346273519
2024-09-25 16:08:44 -04:00
Chengcheng Pei
26bbfc084d 1, log exceptions in details; 2, check processor is None before calling it.
Former-commit-id: 0f0a4813db9ca4e9bb5762a781a0a214129284a6
2024-09-25 12:59:48 -07:00
hiyouga
c376eed31d fix ci
Former-commit-id: f354593ca9b13e542fccd8fe2b64ea0ec4db78b2
2024-09-25 23:14:17 +08:00
hoshi-hiyouga
7c595abc38 Merge pull request #5533 from StrangeBytesOrg/add-docker-args
Add additional install options to Dockerfiles

Former-commit-id: c52aa3d5323e270f6b50a51d97a92e79138b7293
2024-09-25 23:04:57 +08:00
hiyouga
c428ab68d8 optionally replace jinja template
Former-commit-id: f15dec3001f785eeac1ed9cc545fab96bac2c4fd
2024-09-25 23:02:02 +08:00
hiyouga
968b9f1852 update readme
Former-commit-id: 826a47909f22b72228cd8944875a13f5f65232b1
2024-09-25 20:13:04 +08:00
hiyouga
018266c66e update readme
Former-commit-id: fe482183ae9d19cc42f78b5cd144ef21b93ec8d1
2024-09-25 19:39:52 +08:00
StrangeBytesDev
111c644bf1 Add additional install options to Dockerfiles
Former-commit-id: 5310af2f2ac8d226b95785d6b1eb0632312871a7
2024-09-24 16:54:46 -07:00
huangpan.foo
ed5c641e8b Add deepseek-v2.5 template
Former-commit-id: e80c1fe798fb2e076c0891a64300f1b6710176b6
2024-09-21 19:33:30 +08:00
hoshi-hiyouga
de72d1f0e7 Merge pull request #5483 from whybeyoung/main
fix: report a clear error and abort training when a function_call value in a function call dataset is invalid JSON.
Former-commit-id: 9e36ebebd087cd3b128b9426255d420f3c94353c
2024-09-19 17:01:52 +08:00
hoshi-hiyouga
8bfb856923 flat string
Former-commit-id: f1e7731075e6ded4a5ecac7ef46ca4a318b91597
2024-09-19 16:43:42 +08:00
hoshi-hiyouga
8fdbaab95d lint
Former-commit-id: dd94fdd69c8f36df80d6d70d63ab7403a0e55d46
2024-09-19 16:21:43 +08:00
hoshi-hiyouga
a01668bbe8 fix bug
Former-commit-id: b6d0ee1fd8b555bc6aac8b8686c9a3eea784c3a8
2024-09-19 16:21:21 +08:00
hoshi-hiyouga
3385616a37 improve error message
Former-commit-id: e7735dd487ae4e31c34dcd8e2ea9af0a39d1cf9e
2024-09-19 16:06:00 +08:00
ybyang
1f0d89328d fix: report a clear error and abort training when a function_call value in a function call dataset is invalid JSON.
Former-commit-id: 625a0cd7cb5725a0f76c8c19cd23d6c0275bd146
2024-09-19 15:00:10 +08:00
menibrief
a7feab45d5 fix phi-small template
Former-commit-id: 48fb6bae6245dc6d5f72ebfc1c2bd9ffacd51b86
2024-09-18 23:52:30 +03:00
menibrief
f34322afd7 Update README.md
update readme to phi-small template

Former-commit-id: e9df26aa45f916ab0756db3329dff48dcdfce1f1
2024-09-18 23:51:36 +03:00
hoshi-hiyouga
3815fa40b7 tiny fix
Former-commit-id: 1f45d18a780c2aa501f060688a09ff04071379b9
2024-09-19 02:20:24 +08:00
hoshi-hiyouga
c43050b3fa Update README_zh.md
Former-commit-id: 750c57cbcee3ecdd6a9096f1569b9bee282d5ac7
2024-09-19 02:17:59 +08:00
hoshi-hiyouga
3e152872ad Update README.md
Former-commit-id: 40b0e51092289dbf1f2a112cd8c36df399314c8b
2024-09-19 02:16:16 +08:00
hoshi-hiyouga
ae6ad55758 fix webui
Former-commit-id: aa6e65b24451fe9f65d58e5eca5a56eb9aba71e8
2024-09-19 02:13:39 +08:00
hoshi-hiyouga
0118a2fc04 add qwen2.5 models
Former-commit-id: 408a7d7b2e1a2316cbeefade872b732c88191b75
2024-09-19 02:07:54 +08:00
hoshi-hiyouga
4dd81976f4 Merge pull request #5438 from aliencaocao/patch-1
Add qwen_vl to liger kernel supported list

Former-commit-id: c706ff61dc3e5c152a10789c7524844e2be554a2
2024-09-16 13:40:02 +08:00
Billy Cao
2b4da8baf6 Add qwen_vl to liger kernel supported list
Former-commit-id: 053b2d832450cb6cd6af673b9fc51404f1fb1e41
2024-09-14 19:28:20 +08:00
hoshi-hiyouga
7d1b4071e8 Merge pull request #5427 from HardAndHeavy/update-rocm
Update the ROCm version to 6.2

Former-commit-id: 5dcdf5d16590b59004be9d728887781729344ea0
2024-09-13 10:25:47 +08:00
HardAndHeavy
8fc5377f50 update the ROCm version to 6.2
Former-commit-id: a6eda6a500daa4f3383a7868f6abe2434f967b1d
2024-09-12 23:46:33 +03:00
hiyouga
e5812f261d update ci
https://github.com/huggingface/transformers/pull/33436

Former-commit-id: c723f16cdb919cedbf938d51d422ad49b9c6eecf
2024-09-11 20:44:42 +08:00
hiyouga
f7e85cd7de set dev version
Former-commit-id: 39edf597f050bcb2099a10d6f6018f96e29b7e65
2024-09-11 18:56:37 +08:00
hiyouga
749395420b remove windows in ci
Former-commit-id: 56046767c086853b6d40fbc42e0ed9662546de6b
2024-09-11 18:14:39 +08:00
hiyouga
7d536d1d75 fix ci
Former-commit-id: 627f30200068f58d06eb53b1b4797ed426c9c1f1
2024-09-11 18:01:09 +08:00
hiyouga
7fd0d2fc2f fix #5411
Former-commit-id: 392bdaf1ea9e5baf6289f2d4415a175dd55a479d
2024-09-11 17:36:42 +08:00
BUAADreamer
ec696bbcdd try to pass test
Former-commit-id: 2db97e1e5e06370375f4f5c577671524e399321f
2024-09-10 13:29:09 +08:00
BUAADreamer
df24345d65 try to pass test
Former-commit-id: 76a4cfcb84b55467792318dc15a5fbcd6807b674
2024-09-10 13:25:30 +08:00
Zhangchi Feng
386dd26097 Merge branch 'hiyouga:main' into main
Former-commit-id: 8619ad7dc124c50e254b1bb2e173ff99ca4f0e22
2024-09-10 13:20:24 +08:00
BUAADreamer
514f976cc1 try to pass test
Former-commit-id: 3b6bfae0e5fe795a70d530b2765f27d95c5862f8
2024-09-10 13:12:51 +08:00
BUAADreamer
66b870fd08 try to pass test
Former-commit-id: 808a4bd77daca4dd92423652878d8262f3a6f2a4
2024-09-10 12:56:12 +08:00
BUAADreamer
24d3c7e378 resolve conflict
Former-commit-id: d6168da2a1f74424b83416cbcbf685861e76ff5f
2024-09-10 12:39:17 +08:00
BUAADreamer
484128b641 support llava-next(video)
Former-commit-id: 27e94593ac467e56e3a7f5c64f4ff6cee81f4b47
2024-09-10 12:31:53 +08:00
hiyouga
588ea95732 update accelerate ver for schedule_free optimizers
Former-commit-id: 2de74e79049ce8e50f605f649275b1dbfb899c8c
2024-09-09 22:51:08 +08:00
hiyouga
800567cde7 fix mm plugin
Former-commit-id: 6a3549c6c1a8c40de61e748f0b280bfc9e1279a2
2024-09-09 22:41:28 +08:00
hiyouga
7a3ba5a25d fix qwen2vl preprocess
Former-commit-id: 52ddd42b7d2ae9e1aa08c15fd5c13ddad96f1b74
2024-09-09 22:33:33 +08:00
hiyouga
dfff411e1a release v0.9.0 (real)
Former-commit-id: 8ff781c8ae5654680f738f69a6db9d7b95d76baf
2024-09-09 01:00:25 +08:00
hiyouga
e20baa4218 fix constants
Former-commit-id: fce6671d2764d7a2b77c44401fc5582c7cbb77aa
2024-09-08 23:52:30 +08:00
hiyouga
d1ab9b501a release v0.9.0
Former-commit-id: 594c450f648ad326ef39c0f4d70d67cda5f36159
2024-09-08 23:43:35 +08:00
hiyouga
3cbc9109ea tiny fix
Former-commit-id: 76177039c8f9ef5a63724a339dae6195d89fa215
2024-09-08 23:18:08 +08:00
hiyouga
3259397f89 update scripts
Former-commit-id: 51d087cbc14bf3c7dfa06b8b66052cd80a6081be
2024-09-08 14:17:41 +08:00
hiyouga
eb5af3d90b support vllm 0.6.0
Former-commit-id: e39470ec51a9c74ad901871eb816df10e851f351
2024-09-08 02:26:20 +08:00
hiyouga
b6810b209a fix test case
Former-commit-id: b075b2971c6acb2c6039b36420a296f1f4e1b91b
2024-09-08 01:50:51 +08:00
hiyouga
158e0e1f63 add test case
Former-commit-id: c452d65e1551074dddd1d87517c0d44dc014c6aa
2024-09-08 01:40:49 +08:00
hiyouga
294a103ead support activation offloading via unsloth gc
Former-commit-id: d3d0dd0feba3ca6f0ae970d5856bec989d26ef67
2024-09-08 01:22:19 +08:00
hiyouga
7f71276ad8 add docstrings, refactor logger
Former-commit-id: c34e489d71f8f539028543ccf8ee92cecedd6276
2024-09-08 00:56:56 +08:00
hoshi-hiyouga
93d4570a59 Merge pull request #5388 from yzoaim/cal_mfu_update
update cal_mfu.py

Former-commit-id: fe5eac2cb6a4646b653232d7c68d535105b60f3a
2024-09-08 00:49:28 +08:00
hoshi-hiyouga
527ba2eb2e fix
Former-commit-id: 53a74cbc3afec58b36c2265e080061bcdf702f98
2024-09-08 00:41:45 +08:00
hoshi-hiyouga
3021b31cf3 Update cal_mfu.py
Former-commit-id: 0c391b2e59943b0ca9dd4e8561398e7c856a4b29
2024-09-08 00:39:48 +08:00
-.-
9f2427907e update cal_mfu.py
Former-commit-id: 1cdbb4c774d463969c6be14fb08d92c7a0bdb565
2024-09-07 23:21:35 +08:00
hoshi-hiyouga
570ce100c1 fix #5384
Former-commit-id: 2e86c54f381f7403c30ba78d2acf5003aab6e049
2024-09-07 01:21:14 +08:00
hiyouga
27547355e6 tiny fix
Former-commit-id: c0e9c0484dae6db93cef5048bad827ff22b1986a
2024-09-05 23:41:16 +08:00
hiyouga
c5ef52a67a fix ci
Former-commit-id: b5ffca5a190f3aed8ba8c49bd8cf3239fb787bf5
2024-09-05 22:39:47 +08:00
hiyouga
b48b47d519 fix ci
Former-commit-id: cf0758b03e9b8b4931ba790a9726b8256ee4286c
2024-09-05 22:27:48 +08:00
hiyouga
9bdba2f6a8 add e2e tests
Former-commit-id: 0156a37450604641c4f5f9756ad84324698fc88c
2024-09-05 21:52:28 +08:00
hoshi-hiyouga
d6ce902d80 Merge pull request #5372 from LDLINGLINGLING/main
add support for MiniCPM 3.0

Former-commit-id: 2e3c221d9c87bd59f48648be8878b7b50347280f
2024-09-05 21:35:42 +08:00
liudan
ce6dcf3600 revise code to follow the code style guidelines
Former-commit-id: fe5351980b42e0e38175b0da2401a61b3807fa7c
2024-09-05 20:17:55 +08:00
hoshi-hiyouga
e7f92d16d8 fix #5366
Former-commit-id: b0a4964846dd5be7aa2c54d43f28ba62985587f1
2024-09-05 18:08:09 +08:00
hiyouga
abd26f5f67 update data readme
Former-commit-id: 0af5f054b7b8da8b39eb44b1dfa76050f0c45667
2024-09-05 04:44:49 +08:00
hiyouga
4d35ace75e update data readme
Former-commit-id: 81adb153b7d0b30e6cd50c9bf4ca1ccf17458611
2024-09-05 04:25:27 +08:00
hiyouga
72222d1598 support Yi-Coder models
Former-commit-id: ea3f1659e70541c4fa8b7079a0a8c94fce9a41c8
2024-09-05 03:12:24 +08:00
hiyouga
26d914b8fc fix ci
Former-commit-id: 280c0f3f2cea4dfced797cc0e15f72b8b3a93542
2024-09-05 03:02:59 +08:00
hiyouga
7b01c0676c fix ci
Former-commit-id: 7899b44b19c3d0a70706d987bb7d2e0e3536014b
2024-09-05 02:49:22 +08:00
hiyouga
571a9b8669 update ci
Former-commit-id: e24bf7345442701ca874d439f0ca3da49fa59a84
2024-09-05 02:26:10 +08:00
hoshi-hiyouga
ed35eb1e9e Merge pull request #5365 from hiyouga/video_finetuning
Support Qwen2-VL Fine-Tuning on Video Datasets

Former-commit-id: 178cc3fbc48bf2c68723b487681db04e660b12fa
2024-09-05 02:24:58 +08:00
hiyouga
d291e0d60d tiny fix
Former-commit-id: 9da6e084e1e5daf7403e7fabeaaec686167fb11f
2024-09-05 02:16:49 +08:00
hiyouga
1874d579c5 video datasets
Former-commit-id: 33f28ce82d9e44d2615909250dc56d6a4a03cd99
2024-09-05 02:04:17 +08:00
liudan
c692339020 add support for MiniCPM 3.0
Former-commit-id: 4ad3a761af2452ef3f6c61190b7e47c9ea5227b9
2024-09-04 23:10:05 +08:00
hiyouga
2c1eef34cb fix test
Former-commit-id: 553a83aff9f9da35c9a0eca81f7d2b0bf2adf6ff
2024-09-04 22:38:26 +08:00
hiyouga
af178cbcd1 update get template
Former-commit-id: 21ea0d0786f91c0bce79630963e66b815a6792a0
2024-09-04 22:36:20 +08:00
hoshi-hiyouga
5d85be31ca Merge pull request #5323 from naem1023/feat/add-dataset-map-batch-size-argument
Add batch size of map function in the preprocessed dataset

Former-commit-id: c3428c5807500d087cdee4386798e10e39c9cf30
2024-09-04 22:09:36 +08:00
hoshi-hiyouga
372b71c847 fix #5228
Former-commit-id: 0d332ca8d0987c0331361934ab110fafa6402a7e
2024-09-04 19:10:30 +08:00
hiyouga
41a9c415e1 fix #5252
Former-commit-id: 73f30b4dfffb260e24f9e2332617b8ca2c249ed5
2024-09-04 03:17:54 +08:00
hiyouga
915e32a5f8 add vl_feedback dataset
Former-commit-id: 6ff34ad2db383b5fbd51008bcc5eec880658811e
2024-09-04 03:13:03 +08:00
hiyouga
f4dd429cbf fix #5344
Former-commit-id: 9d445c0b5be5ccc0e6d1979e76a869ddf92d9534
2024-09-04 03:06:06 +08:00
hoshi-hiyouga
7435cde2ef Merge pull request #5346 from hiyouga/lazy_image
[exp] Lazyload for multimodal inputs

Former-commit-id: 4bbd721361a8c5888b28f5fcdcbb2a4ad2305445
2024-09-04 03:00:53 +08:00
hiyouga
7056087e92 lazy image load
Former-commit-id: cdd733b575411e003bc5ffd6560dd8eff8aa09cf
2024-09-04 02:27:08 +08:00
hiyouga
fed7ae5661 fix #5334
Former-commit-id: a5ea0f83f00c81d128a1f50ce244866ce38ee15f
2024-09-03 19:09:42 +08:00
hiyouga
5019c6148b fix #5338
Former-commit-id: a66ddfea218feefde50fa097d20b4bcbe89ab791
2024-09-03 17:45:17 +08:00
hiyouga
2e1396cd6b lint
Former-commit-id: d821d933e6cb982d648a69f85f6ad01d0560ed70
2024-09-03 00:46:25 +08:00
hiyouga
b5e9df5df8 fix #5324
Former-commit-id: f7aa06c9c0b18c28419ea5792410915d3f322cbf
2024-09-02 23:56:21 +08:00
naem1023
3622856994 feat: add batch size of map function in the preprocessed dataset
Former-commit-id: 94b6cf06c2f84d0619b1a2dccaf8abb51de9951c
2024-09-02 13:52:47 +09:00
hoshi-hiyouga
7367c6ec21 fix trainer predict
Former-commit-id: 2790790cd26c6743105555a60523b89f367ebce3
2024-09-02 10:15:29 +08:00
hoshi-hiyouga
6579ec8c4c remove .cpu()
Former-commit-id: 35c57cc9dcba305d40282a9757ddc23968c210ac
2024-09-02 10:10:53 +08:00
hiyouga
a7fbae47d5 fix mm inference
Former-commit-id: fa782c15a07ed40f8a6381acdf2da395377efd04
2024-09-02 01:47:40 +08:00
hiyouga
f203a9d78e tiny fix
Former-commit-id: 8b4f408da110d74285bae20bbd969013a979964b
2024-09-02 01:33:22 +08:00
hiyouga
bae73e676c add image num check
Former-commit-id: 15201113bf16b748c0a758c7a5b363da8272e0e6
2024-09-02 01:31:36 +08:00
hiyouga
806e1061d4 add pokemon dataset
Former-commit-id: 06680158a0f0a1e3c542e77af92ac877fbe357c5
2024-09-02 01:02:25 +08:00
hiyouga
f920091667 update readme
Former-commit-id: 25a05d9f96718e06ce83f5bb1f41d2c001790295
2024-09-01 23:32:39 +08:00
hiyouga
801979f779 update wechat
Former-commit-id: 7f88dfe080db10ff12d1fb80b43099a356c899ea
2024-09-01 23:30:57 +08:00
hoshi-hiyouga
df2d32e7aa Merge pull request #5317 from ByronHsu/patch-1
Add liger kernel link

Former-commit-id: a319b3cf9119fd49cbcfb17b963e111a2f86bb51
2024-09-01 23:30:12 +08:00
hiyouga
60cf12727b add rlhf-v dataset
Former-commit-id: 3fd18fc34a0c994a738504746abfd5548e002437
2024-09-01 22:57:41 +08:00
hiyouga
7621526d22 tiny fix
Former-commit-id: 8ccaae3871d8d1fe3ea4633d427aecb2ab3addec
2024-09-01 21:15:44 +08:00
hiyouga
559b84dceb fix bug
Former-commit-id: 6e19e56000dd18d5faf84ceabce8d7708ff21e4d
2024-09-01 21:07:49 +08:00
hiyouga
7e4c5d4bb3 fix mixed mm inputs and rlhf-v
Former-commit-id: 7c248fac20bf85d57a91132ce7a793c7f84e9218
2024-09-01 20:52:47 +08:00
Byron Hsu
2a4ed6610e Add liger kernel link
Former-commit-id: 4f313044cf8efd9c6ebcbd4741f6f38d56804b7f
2024-08-30 17:16:16 -07:00
hiyouga
1d8e9c7897 fix ci (temp)
Former-commit-id: 9ebaafd2e5c16ecef0243e4df77344ed7c823e57
2024-08-31 02:03:56 +08:00
hiyouga
43654028eb add test mm plugin
Former-commit-id: ddea5cca5a3174de1dcc7fdee8ec69e77700b6bf
2024-08-31 01:53:38 +08:00
hiyouga
2f6fc27c8b remove visual_inputs, fix qlora
Former-commit-id: be30c01c4f1482520ece770bd54c6a4837c26f0a
2024-08-31 00:24:51 +08:00
hiyouga
d789b667d7 optimize predict vram
Former-commit-id: a577e44eee351b3ed8011a33ae01cd713354ff97
2024-08-30 23:08:45 +08:00
hiyouga
66a1abac6a add examples
Former-commit-id: 169c68921b1b8ac279834b060d9e7d38a56fe1aa
2024-08-30 21:43:19 +08:00
hiyouga
665db18661 tiny fix
Former-commit-id: 830511a6d0216da99520aee8b3a753d347a71fa9
2024-08-30 03:21:50 +08:00
hiyouga
30d97ca879 fix #5307
Former-commit-id: 63c19ddfe483a16c1c9afc2f1441e8070bb0f7e4
2024-08-30 02:45:40 +08:00
hiyouga
c62a6ca59d refactor mm training
Former-commit-id: 179c0558699e287cbf38a2d73bff47e86d589c5a
2024-08-30 02:14:31 +08:00
hoshi-hiyouga
77c2c7076b Merge pull request #5290 from simonJJJ/qwen2_vl
support qwen2-vl

Former-commit-id: 7156f832af8505b26371559d340c0e69eb962bbc
2024-08-30 02:10:36 +08:00
hoshi-hiyouga
7466fd4387 fix bug
Former-commit-id: 365e6df71509569f59c40743c115f1a4b945ef0f
2024-08-30 02:05:26 +08:00
hiyouga
c1369a1ec9 update liger kernel
Former-commit-id: d6bf6ca2161c99dd5d644e31d2b1df451017b68c
2024-08-29 20:46:08 +08:00
hiyouga
d677fe053d fix #5292
Former-commit-id: dd81ce8ce5fdf450027c5f9634abb6ac2cd52128
2024-08-29 20:37:47 +08:00
hiyouga
7c6785d3df fix #5295
Former-commit-id: c76873b0eb8225f6e6bfc7223c6012387dceb8ed
2024-08-29 20:30:18 +08:00
hiyouga
77341ee3c4 fix #5305
Former-commit-id: a710ebaf97c258c802f24e508d83f1f3f10edc6d
2024-08-29 20:16:01 +08:00
simonJJJ
5b4b60cfb5 update
Former-commit-id: a968a416d5e513320c97109229ca1e6ddc003cb1
2024-08-28 20:22:46 +08:00
simonJJJ
0f3d54d8a0 initial-commit
Former-commit-id: b6a39847a10b417b09db4b5512dd835e9e4ce928
2024-08-28 16:51:35 +08:00
hiyouga
7272792f65 update wechat
Former-commit-id: ef91752cc6f53088eaf7fc2f64f7148821d82ec2
2024-08-27 12:55:23 +08:00
hiyouga
4cc8e16595 add extra requires
Former-commit-id: c47511773ae9886aae4e5ea1841866d2125abc34
2024-08-27 12:52:12 +08:00
hiyouga
ca5a759f94 tiny fix
Former-commit-id: d2cede7023bbe28525ef8b4ad27247445d8c22e5
2024-08-27 12:49:32 +08:00
hoshi-hiyouga
be51e56a2e Merge pull request #5237 from marko1616/patch-1
Fix mllm api

Former-commit-id: 017703c7ab7f3dc566792619537c3202ca4f4bb7
2024-08-27 12:24:43 +08:00
marko1616
3a9171e275 ruff pass.
Former-commit-id: c2f817772f8e7d947dca04f546befc70001abe64
2024-08-27 11:30:16 +08:00
marko1616
bd0f3b4050 Update chat.py
Former-commit-id: 4e5893a5c4a47ff3cb989bbef0841effc713fc08
2024-08-27 11:27:56 +08:00
hiyouga
206a8364d4 support liger kernel
Former-commit-id: 0f4e54abf6c5feb2329855a4047597ad5147720a
2024-08-27 11:20:14 +08:00
marko1616
097d031066 Force re check.
Former-commit-id: 5f04452f7d65e535d0af08944f7b9e29e85f51d7
2024-08-23 14:43:18 +08:00
marko1616
2674b42b59 Update chat.py
Former-commit-id: 206a16c17d253956afb96daea6f24478e17334fc
2024-08-22 12:24:34 +08:00
marko1616
edf2e51bbc Update chat.py
Former-commit-id: edf6dc1995daa6c3635c3fda1052b340693a04f5
2024-08-22 12:14:34 +08:00
MengqingCao
47877acc2a update npu base image
Former-commit-id: 20819f7707cfff6b951484e91fc7ecda2bf68528
2024-08-21 09:12:38 +00:00
hiyouga
d111a324bc tiny fix
Former-commit-id: 23961bdf6fdbcde64e7b943f699fdeb4ac024043
2024-08-20 00:10:52 +08:00
hoshi-hiyouga
388f0a6e05 Merge pull request #5156 from YeQiuO/main
fix Llama-template's system prompt bug

Former-commit-id: 0b57175d3bd029675dae2f55995b7eeb4e9adc7a
2024-08-20 00:09:03 +08:00
hoshi-hiyouga
8c13c02c55 Update template.py
Former-commit-id: f5a075cb1c90f05bb0de26c6aea718f556c54623
2024-08-20 00:03:33 +08:00
hoshi-hiyouga
a101fde917 Merge pull request #5163 from liu-zichen/fix_ppo_optim
fix lr not change

Former-commit-id: f3c03ec6a89bf57f290820fa31eda24291355e4e
2024-08-19 23:56:24 +08:00
hoshi-hiyouga
1f4373b6e5 Merge pull request #5185 from chenhuiyu/feature/add-sailorllm-template
Add SailorLLM template

Former-commit-id: 28387d6b2f9e3bcc6321345c46b525c8180ebf7e
2024-08-19 23:51:49 +08:00
hoshi-hiyouga
525747b472 Merge pull request #5188 from Zxilly/main
fix: report correct device count for intel xpu
Former-commit-id: cd3c536cb3936061d905256850b0e57df4498010
2024-08-19 23:51:39 +08:00
hoshi-hiyouga
472f12c985 Merge pull request #5193 from Ricardo-L-C/main
_is_bf16_available judgment supports npu

Former-commit-id: 18b9ac49c45af773a2ea563f5e1852dc4b775db8
2024-08-19 23:40:59 +08:00
hoshi-hiyouga
b681f24f43 Update template.py
Former-commit-id: c6822a217e1c296f4aedd9a2c7610acd1dbd443e
2024-08-19 23:40:16 +08:00
hiyouga
fd02b089b6 update readme
Former-commit-id: 756e438866876fa54495cf557dd1e299b17a42fb
2024-08-19 23:32:04 +08:00
Ricardo
57d4c4a4f8 _is_bf16_available judgment supports npu
Former-commit-id: 50a1e892a1005b4cdd82dca1005f71db08ed89a2
2024-08-16 02:58:22 +00:00
Zxilly
3595d26846 fix: report correct device count for intel xpu
Former-commit-id: 0618f660b6511599365bd9be64499dbab41a79ba
2024-08-15 08:30:43 +00:00
Huiyu Chen
22a79c169d Add SailorLLM template
Former-commit-id: a594abe0321a718394a97b5a48ded16e2012c1f0
2024-08-15 15:10:14 +08:00
liu-zichen
75dfe259cf fix lr not change
Former-commit-id: 387dd2d51b5d8cd666459040fdd16525b34720d9
2024-08-13 16:33:34 +08:00
codingma
2e257d6af0 add tutorial and doc links
Former-commit-id: 4f6072562a34e0ec97471210ff54244cf0d0f3df
2024-08-13 16:13:10 +08:00
“Wzw”
e734222373 fix Llama-template's system prompt bug
Former-commit-id: 2e3eddcd0918b0c968ded0df7c82e3dcff870381
2024-08-12 19:22:12 +08:00
hiyouga
6a351b9912 update readme
Former-commit-id: 4fecc5ee56873a7ab4941e46a5168cfe2ecb4bb6
2024-08-10 10:17:35 +08:00
hiyouga
cfc04aa162 update readme
Former-commit-id: fa7bc9f1c7347153f9092ffbbb8e88c6b2f59632
2024-08-09 20:46:02 +08:00
hiyouga
943c795318 add magpie ultra dataset
Former-commit-id: 3317b24329b87e30f13a78936ac5554f211abf7a
2024-08-09 20:28:55 +08:00
hiyouga
7fb61bad04 add qwen2 math models
Former-commit-id: 72ff43a1772c9de5ff914d5e1c8bdc8dea9ae0c8
2024-08-09 20:20:35 +08:00
hiyouga
47efcdb1dd update examples
Former-commit-id: d5c57c8b7f64afe8061045ec9689abbac45c1175
2024-08-09 20:13:46 +08:00
hiyouga
59cbce1a46 add adam_mini to readme
Former-commit-id: d610c6bcf8a8ba6f4236f5d11f79571b83f4fb11
2024-08-09 20:02:03 +08:00
hoshi-hiyouga
7e755e9cac Merge pull request #5095 from relic-yuexi/feat-optimizer
Feat optimizer

Former-commit-id: f08390d252d42a812b71a08daba7339cc40889b7
2024-08-09 19:51:33 +08:00
hiyouga
9d1e2c3c1f update scripts
Former-commit-id: dabf5a1dc661a6581474c6a5ec115322d168ed5f
2024-08-09 19:16:23 +08:00
hiyouga
5af32ce705 follow #5115
Former-commit-id: 7d917e03e2df570139bae18227d9c7303a12de2a
2024-08-09 18:03:00 +08:00
hoshi-hiyouga
4e8861e653 Merge pull request #5115 from YeQiuO/main
fix: `Train on the last turn only` truncate bug
Former-commit-id: 2c6dae45f7a7b72c961489ac407b1b444ab7752e
2024-08-09 17:58:27 +08:00
hoshi-hiyouga
d4d7ffb17c Merge pull request #5072 from relic-yuexi/main
fix the deepseekcoder template to avoid repeat problem

Former-commit-id: 2ae7d5c91725eab9f994015d8d3577894c7978b6
2024-08-09 16:35:21 +08:00
hoshi-hiyouga
46f834ec75 Update template.py
Former-commit-id: ae2a5221c109ae3474d219c37433be767abbee91
2024-08-09 16:27:42 +08:00
“Wzw”
6ec64a7e56 mask_history args verify valid
Former-commit-id: 2f8388b4f4195d934400ad9267d72e10ca4105a3
2024-08-08 10:12:01 +08:00
“Wzw”
d71446e387 fix mask_history tiny bug
Former-commit-id: cac07aac6196be026f723b2397a343d4fb675973
2024-08-08 10:09:33 +08:00
codingma
eada49e56b fix eval_dataset in example
Former-commit-id: e1ffc54f7e58419cc8da958a4d3c2697e18d5583
2024-08-07 18:24:19 +08:00
moontidef
8f42d7df56 feat: add support for adammini
Former-commit-id: a2d5fafb705ff44db1711e972490f0abebc2012b
2024-08-07 10:08:22 +08:00
moontidef
33a90b9026 fix: rename optimzer to optimizer
Former-commit-id: 186dc1fde822e6a603ac273538741ea3853f243e
2024-08-07 10:05:01 +08:00
moontidef
710902b0d0 Merge branch 'hiyouga:main' into main
Former-commit-id: d1b23283e0e4286f126d38d7bdc55802f74c8922
2024-08-06 00:18:45 +08:00
moontidef
7b4f5d3b21 fix: fix the deepseekcoder template to avoid repeat problem
Former-commit-id: 56294831115f095135f72490a8a435434b2f0a11
2024-08-05 23:55:45 +08:00
hiyouga
13093963b1 fix #5048
Former-commit-id: 71a6861667ae68c1fd6a69acf68e1359b858cf1b
2024-08-05 23:48:19 +08:00
hoshi-hiyouga
2e477e7458 Merge pull request #5037 from codemayq/feature-gemma-2-2b
support gemma-2-2b

Former-commit-id: 6af51fadff92cd3e665c556ac073a1876f792ada
2024-08-05 23:27:37 +08:00
codingma
4b6252151e support gemma-2-2b
Former-commit-id: 7037192cf6049fd7d675aed4a6237ed929c6b170
2024-08-01 13:45:48 +08:00
hoshi-hiyouga
f3765d1996 Merge pull request #5010 from Eruly/main
Add Korean web UI (llamafactory-cli webui)

Former-commit-id: 2050806aa826028df45c0c746b4314afe178dcd3
2024-07-30 01:55:54 +08:00
hoshi-hiyouga
1f5cdd66b7 Merge pull request #4996 from LDLINGLINGLING/main
Added MiniCPM to the supported models list on the homepage; MiniCPM's official GitHub also added a friendly link to LLama_factory

Former-commit-id: a86a776fb0f75697b0fee7694a5a0d6bd04fee0a
2024-07-30 01:55:30 +08:00
hoshi-hiyouga
5b0ddbb835 Update README_zh.md
Former-commit-id: 922906faf2d432def7cfdac82f90472fa1bb24a9
2024-07-30 01:55:13 +08:00
hoshi-hiyouga
4f92b56f06 Update README.md
Former-commit-id: 6bc7f71940be0a8f1614f9036b9c539ce46d34e1
2024-07-30 01:53:19 +08:00
hoshi-hiyouga
a1f6ff92be Update README.md
Former-commit-id: 54eecdec0da06677ea55847c74642d0fc12d8908
2024-07-30 01:52:35 +08:00
hoshi-hiyouga
ef98e91618 Merge pull request #4995 from codemayq/fix-pissa
fix pissa callback

Former-commit-id: 052c0f6bd9e872ea325b5a6aef98c4c070733384
2024-07-30 01:47:25 +08:00
eruly
9fdf800750 Add Korean web UI (llamafactory-cli webui)
Former-commit-id: 357a035f2aeb9548368c230c5a17dcdfa4844b17
2024-07-29 13:47:13 +00:00
liudan
32c698e4c2 Added MiniCPM to the supported models list on the homepage; MiniCPM's official GitHub also added a friendly link to LLama_factory
Former-commit-id: f482a6e2fd30aff5113e53f3f07b4649982bcc2e
2024-07-29 10:58:28 +08:00
codingma
75e80fa820 fix pissa save
Former-commit-id: 25a1dad7c8df79c15efecb8c6f871a13a327f57a
2024-07-29 10:44:34 +08:00
hiyouga
f8329bc632 tiny fix
Former-commit-id: 183d8bd500a8e9513a077161ba8e8d61bea9200f
2024-07-26 11:51:00 +08:00
hoshi-hiyouga
9f74d36ba4 Merge pull request #4892 from piamo/main
update deepseek template

Former-commit-id: 3233efc8404972098665286d9dec7312dd6ecfab
2024-07-26 11:49:34 +08:00
hoshi-hiyouga
fc2435f135 Merge pull request #4950 from liuwwang/main
fix: Repair the issue where quantization failed after merging the adapter.
Former-commit-id: 93a68ea1f4372973f745a2c250250ecaac515e27
2024-07-26 11:48:56 +08:00
hoshi-hiyouga
0636519ba3 Merge pull request #4970 from HardAndHeavy/add-rocm
Add ROCm support

Former-commit-id: c0f21d869bce6e59825d57c66bce3fe54f50065f
2024-07-26 11:41:23 +08:00
hoshi-hiyouga
573bf03a6f Update README_zh.md
Former-commit-id: 86a27a97ff67b0d4bcd671c62759cd049542dc1b
2024-07-26 11:30:57 +08:00
hoshi-hiyouga
9e529be4e7 Update README.md
Former-commit-id: 1c167bb2ea3a47bdeeccc044a653662132c61698
2024-07-26 11:29:28 +08:00
hoshi-hiyouga
7af4ffa6cc Update README.md
Former-commit-id: d6e7a69c274c3756587e18a039637dd37fa152b2
2024-07-26 11:29:09 +08:00
HardAndHeavy
5b67ccd1c6 Add ROCm support
Former-commit-id: cf9df10a24936efd420b0fdac541fd6c0808a327
2024-07-25 21:29:28 +03:00
khazic
5166dbbcd3 Added the reference address for TRL PPO details.
Former-commit-id: 509c55608643eae3a6456683d425a7c636cfc3e9
2024-07-25 09:03:21 +08:00
hiyouga
21adb09730 fix #4959
Former-commit-id: 96e8a1d47874708c6157865c78be4cd6c533e01b
2024-07-24 23:44:00 +08:00
hiyouga
28b5f656db update webui
Former-commit-id: 463edec1b1c1345afc791e225deb33f118f3582e
2024-07-24 21:11:51 +08:00
hoshi-hiyouga
68ee2d512f Update README_zh.md
Former-commit-id: 1443e876697e18108573387e501a7453ba9fc06c
2024-07-24 21:08:42 +08:00
hoshi-hiyouga
a5f7e0efc6 Update README.md
Former-commit-id: 07d86e38cfd857d1dfa898541f3e5bd9c6f11581
2024-07-24 21:07:14 +08:00
hiyouga
211038584a tiny fix
Former-commit-id: 28cac0e325bfd7a6c0c344ad2d46511613190cd7
2024-07-24 18:33:39 +08:00
hiyouga
ff5ba97970 fix #4928
Former-commit-id: 6d557e8959678f9d4edbcb3d5a6dfba14b429b18
2024-07-24 17:00:29 +08:00
hiyouga
27f2c3cae1 fix #4925
Former-commit-id: 79c336e2339974471627487858d59e4ed2152370
2024-07-24 16:56:58 +08:00
hiyouga
48f0819327 fix #4944
Former-commit-id: 9e8cf3b21a0b12d1413c3c7f3d60399784909242
2024-07-24 16:42:51 +08:00
hiyouga
5c6d88e91c add mistral nemo model
Former-commit-id: 428bb49f53b32947bc0a62ca19ab10844154c07c
2024-07-24 16:25:53 +08:00
hiyouga
0a04d9470f add llama3.1
Former-commit-id: 3c433890f9b61c520572f5233aae70584da0f330
2024-07-24 16:20:11 +08:00
Liuww
f0408c0dde fix: Repair the issue where quantization failed after merging the adapter.
Former-commit-id: 8109561b7f577d448f8bca7e569f7f443cf6bb52
2024-07-24 14:31:29 +08:00
hiyouga
a041f4a111 tiny fix
Former-commit-id: bf6a2f032c598f969708c1c3db4875d6239c41a9
2024-07-22 21:10:15 +08:00
hoshi-hiyouga
cdf9dae53e fix #4917
Former-commit-id: e26919aafd8436489d065789c9c25d72c8d05a6d
2024-07-22 11:28:31 +08:00
hiyouga
1917f431f5 tiny fix
Former-commit-id: 9133316e558a3c8744f5eb6ab8678686bf4859ed
2024-07-22 00:06:03 +08:00
hiyouga
a770afbff2 fix flashattn + packing
Former-commit-id: 4adc6ce4abc718c25f39b316bfc3352d0d01ed1e
2024-07-21 17:07:45 +08:00
huangpan.foo
b1a5bf025b update deepseek template
Former-commit-id: f5ca86ec95bb301df42ffaa6923fc3037a224e34
2024-07-19 15:02:54 +08:00
hiyouga
adff3e5050 set dev version
Former-commit-id: 0b9a2275dc533b65578278f979ce053e95a644b3
2024-07-19 02:01:46 +08:00
381 changed files with 77011 additions and 7836 deletions


@@ -3,10 +3,12 @@
 .github
 .venv
 cache
+data
 docker
 saves
 hf_cache
+ms_cache
+om_cache
+shared_data
 output
 .dockerignore
 .gitattributes

.env.local Normal file

@@ -0,0 +1,45 @@
# Note: actually we do not support .env, just for reference
# api
API_HOST=
API_PORT=
API_KEY=
API_MODEL_NAME=
API_VERBOSE=
FASTAPI_ROOT_PATH=
MAX_CONCURRENT=
# general
DISABLE_VERSION_CHECK=
FORCE_CHECK_IMPORTS=
ALLOW_EXTRA_ARGS=
LLAMAFACTORY_VERBOSITY=
USE_MODELSCOPE_HUB=
USE_OPENMIND_HUB=
USE_RAY=
USE_KT=
RECORD_VRAM=
OPTIM_TORCH=
NPU_JIT_COMPILE=
# torchrun
FORCE_TORCHRUN=
MASTER_ADDR=
MASTER_PORT=
NNODES=
NODE_RANK=
NPROC_PER_NODE=
# wandb
WANDB_DISABLED=
WANDB_PROJECT=
WANDB_API_KEY=
# gradio ui
GRADIO_SHARE=
GRADIO_SERVER_NAME=
GRADIO_SERVER_PORT=
GRADIO_ROOT_PATH=
GRADIO_IPV6=
# backend
USE_MCA=
# setup
ENABLE_SHORT_CONSOLE=
# reserved (do not use)
LLAMABOARD_ENABLED=
LLAMABOARD_WORKDIR=
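The variables above are documented for reference only (the header notes that `.env` files are not actually loaded), so in practice they would be exported in the shell before launching the CLI. A minimal sketch, where the values are illustrative assumptions rather than defaults:

```shell
# Hypothetical values: none of these are defaults, just an illustration
# of exporting the documented variables before running llamafactory-cli.
export API_PORT=8000
export NPROC_PER_NODE=4
export FORCE_TORCHRUN=1
echo "serving on port ${API_PORT} with ${NPROC_PER_NODE} processes per node"
```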


@@ -19,3 +19,49 @@ There are several ways you can contribute to LLaMA Factory:
### Style guide
LLaMA Factory follows the [Google Python Style Guide](https://google.github.io/styleguide/pyguide.html), check it for details.
### Create a Pull Request
1. Fork the [repository](https://github.com/hiyouga/LLaMA-Factory) by clicking on the [Fork](https://github.com/hiyouga/LLaMA-Factory/fork) button on the repository's page. This creates a copy of the code under your GitHub user account.
2. Clone your fork to your local disk, and add the base repository as a remote:
```bash
git clone git@github.com:[username]/LLaMA-Factory.git
cd LLaMA-Factory
git remote add upstream https://github.com/hiyouga/LLaMA-Factory.git
```
3. Create a new branch to hold your development changes:
```bash
git checkout -b dev_your_branch
```
4. Set up a development environment by running the following command in a virtual environment:
```bash
pip install -e ".[dev]"
```
If LLaMA Factory was already installed in the virtual environment, remove it with `pip uninstall llamafactory` before reinstalling it in editable mode with the `-e` flag.
5. Check code before commit:
```bash
make commit
make style && make quality
make test
```
6. Submit changes:
```bash
git add .
git commit -m "commit message"
git fetch upstream
git rebase upstream/main
git push -u origin dev_your_branch
```
7. Create a merge request from your branch `dev_your_branch` at [origin repo](https://github.com/hiyouga/LLaMA-Factory).

.github/ISSUE_TEMPLATE/1-bug-report.yml vendored Normal file

@@ -0,0 +1,61 @@
name: "\U0001F41B Bug / help"
description: Create a report to help us improve the LLaMA Factory
labels: ["bug", "pending"]
body:
- type: markdown
attributes:
value: |
Issues included in **[FAQs](https://github.com/hiyouga/LLaMA-Factory/issues/4614)** or those with **insufficient** information may be closed without a response.
已经包含在 **[常见问题](https://github.com/hiyouga/LLaMA-Factory/issues/4614)** 内或提供信息**不完整**的 issues 可能不会被回复。
- type: markdown
attributes:
value: |
Please do not create issues that are not related to framework bugs under this category, use **[Discussions](https://github.com/hiyouga/LLaMA-Factory/discussions/categories/q-a)** instead.
请勿在此分类下创建和框架 bug 无关的 issues,训练问题求助请使用 **[讨论区](https://github.com/hiyouga/LLaMA-Factory/discussions/categories/q-a)**。
- type: checkboxes
id: reminder
attributes:
label: Reminder
description: |
Please ensure you have read the above rules carefully and searched the existing issues (including FAQs).
请确保您已经认真阅读了上述规则并且搜索过现有的 issues(包括常见问题)。
options:
- label: I have read the above rules and searched the existing issues.
required: true
- type: textarea
id: system-info
validations:
required: true
attributes:
label: System Info
description: |
Please share your system info with us. You can run the command **llamafactory-cli env** and copy-paste its output below.
请提供您的系统信息。您可以在命令行运行 **llamafactory-cli env** 并将其输出复制到该文本框中。
placeholder: llamafactory version, platform, python version, ...
- type: textarea
id: reproduction
validations:
required: true
attributes:
label: Reproduction
description: |
Please provide entry arguments, error messages and stack traces that reproduces the problem.
请提供入口参数,错误日志以及异常堆栈以便于我们复现问题。
value: |
```text
Put your message here.
```
- type: textarea
id: others
validations:
required: false
attributes:
label: Others


@@ -0,0 +1,41 @@
name: "\U0001F680 Feature request"
description: Submit a request for a new feature
labels: ["enhancement", "pending"]
body:
- type: markdown
attributes:
value: |
Please do not create issues that are not related to new features under this category.
请勿在此分类下创建和新特性无关的 issues。
- type: checkboxes
id: reminder
attributes:
label: Reminder
description: |
Please ensure you have read the above rules carefully and searched the existing issues.
请确保您已经认真阅读了上述规则并且搜索过现有的 issues。
options:
- label: I have read the above rules and searched the existing issues.
required: true
- type: textarea
id: description
validations:
required: true
attributes:
label: Description
description: |
A clear and concise description of the feature proposal.
请详细描述您希望加入的新功能特性。
- type: textarea
id: contribution
validations:
required: false
attributes:
label: Pull Request
description: |
Have you already created the relevant PR and submitted the code?
您是否已经创建了相关 PR 并提交了代码?


@@ -1,66 +0,0 @@
name: "\U0001F41B Bug / Help"
description: Create a report to help us improve the LLaMA Factory
body:
- type: markdown
attributes:
value: |
Issues included in **FAQs** or those with **insufficient** information may be closed without a response.
包含在**常见问题**内或提供信息**不完整**的 issues 可能不会被回复。
- type: checkboxes
id: reminder
attributes:
label: Reminder
description: |
Please ensure you have read the README carefully and searched the existing issues (including FAQs).
请确保您已经认真阅读了 README 并且搜索过现有的 issues(包括常见问题)。
options:
- label: I have read the README and searched the existing issues.
required: true
- type: textarea
id: system-info
validations:
required: true
attributes:
label: System Info
description: |
Please share your system info with us. You can run the command **llamafactory-cli env** and copy-paste its output below.
请提供您的系统信息。您可以在命令行运行 **llamafactory-cli env** 并将其输出复制到该文本框中。
placeholder: llamafactory version, platform, python version, ...
- type: textarea
id: reproduction
validations:
required: true
attributes:
label: Reproduction
description: |
Please provide code snippets, error messages and stack traces that reproduces the problem.
请提供运行参数,错误信息以及异常堆栈以便于我们复现该问题。
Remember to use Markdown tags to correctly format your code.
请合理使用 Markdown 标签来格式化您的文本。
placeholder: |
```bash
llamafactory-cli train ...
```
- type: textarea
id: expected-behavior
validations:
required: false
attributes:
label: Expected behavior
description: |
Please provide a clear and concise description of what you would expect to happen.
请提供您原本的目的,即这段代码的期望行为。
- type: textarea
id: others
validations:
required: false
attributes:
label: Others

.github/ISSUE_TEMPLATE/config.yml vendored Normal file

@@ -0,0 +1,8 @@
blank_issues_enabled: false
contact_links:
- name: 📚 FAQs | 常见问题
url: https://github.com/hiyouga/LLaMA-Factory/issues/4614
about: Reading in advance is recommended | 建议提前阅读
- name: Discussions | 讨论区
url: https://github.com/hiyouga/LLaMA-Factory/discussions
about: Please ask fine-tuning questions here | 请在这里讨论训练问题

.github/workflows/docker.yml vendored Normal file

@@ -0,0 +1,112 @@
name: docker
on:
workflow_dispatch:
push:
branches:
- "main"
paths:
- "**/*.py"
- "requirements.txt"
- "docker/**"
- ".github/workflows/*.yml"
pull_request:
branches:
- "main"
paths:
- "**/*.py"
- "requirements.txt"
- "docker/**"
- ".github/workflows/*.yml"
release:
types:
- published
jobs:
build:
strategy:
fail-fast: false
matrix:
device:
- "cuda"
- "npu"
runs-on: ubuntu-latest
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}-${{ matrix.device }}
cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}
environment:
name: docker
url: https://hub.docker.com/r/hiyouga/llamafactory
steps:
- name: Free up disk space
uses: jlumbroso/free-disk-space@v1.3.1
with:
tool-cache: true
docker-images: false
- name: Checkout
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Get llamafactory version
id: version
run: |
if [ "${{ github.event_name }}" = "release" ]; then
echo "tag=$(python setup.py --version)" >> "$GITHUB_OUTPUT"
else
echo "tag=latest" >> "$GITHUB_OUTPUT"
fi
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to Docker Hub
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@v3
with:
username: ${{ vars.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to Quay
if: ${{ github.event_name != 'pull_request' && matrix.device == 'npu' }}
uses: docker/login-action@v3
with:
registry: quay.io
username: ${{ vars.QUAY_ASCEND_USERNAME }}
password: ${{ secrets.QUAY_ASCEND_TOKEN }}
- name: Build and push Docker image (CUDA)
if: ${{ matrix.device == 'cuda' }}
uses: docker/build-push-action@v6
with:
context: .
file: ./docker/docker-cuda/Dockerfile
build-args: |
EXTRAS=metrics,deepspeed,liger-kernel
push: ${{ github.event_name != 'pull_request' }}
tags: |
docker.io/hiyouga/llamafactory:${{ steps.version.outputs.tag }}
cache-from: type=gha
cache-to: type=gha,mode=max
- name: Build and push Docker image (NPU)
if: ${{ matrix.device == 'npu' }}
uses: docker/build-push-action@v6
with:
context: .
platforms: linux/amd64,linux/arm64
file: ./docker/docker-npu/Dockerfile
push: ${{ github.event_name != 'pull_request' }}
tags: |
docker.io/hiyouga/llamafactory:${{ steps.version.outputs.tag }}-npu-a2
quay.io/ascend/llamafactory:${{ steps.version.outputs.tag }}-npu-a2
cache-from: type=gha
cache-to: type=gha,mode=max
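The `Get llamafactory version` step above picks the image tag from the triggering event. A standalone sketch of that branch, with `event_name` as a plain variable standing in for the `${{ github.event_name }}` context:

```shell
# Stand-in for the workflow's tag-selection step; event_name replaces
# the ${{ github.event_name }} context so the logic runs anywhere.
event_name="push"
if [ "$event_name" = "release" ]; then
  tag="$(python setup.py --version)"  # real version only on releases
else
  tag="latest"
fi
echo "tag=$tag"
```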


@@ -18,13 +18,15 @@ jobs:
           ISSUE_URL: ${{ github.event.issue.html_url }}
           ISSUE_TITLE: ${{ github.event.issue.title }}
         run: |
-          LABEL=pending
-          NPU_KEYWORDS=(npu huawei ascend 华为 昇腾)
+          LABEL=""
+          NPU_KEYWORDS=(npu huawei ascend 华为 昇腾 910)
           ISSUE_TITLE_LOWER=$(echo $ISSUE_TITLE | tr '[:upper:]' '[:lower:]')
           for KEYWORD in ${NPU_KEYWORDS[@]}; do
             if [[ $ISSUE_TITLE_LOWER == *$KEYWORD* ]] && [[ $ISSUE_TITLE_LOWER != *input* ]]; then
-              LABEL=pending,npu
+              LABEL="npu"
               break
             fi
           done
-          gh issue edit $ISSUE_URL --add-label $LABEL
+          if [ -n "$LABEL" ]; then
+            gh issue edit $ISSUE_URL --add-label $LABEL
+          fi
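After this change, the script labels only NPU-related issues and skips the `gh` call entirely when no keyword matches. A runnable sketch of that logic, with a sample title (an assumption) and the `gh` call stubbed out with an echo:

```shell
# Sample title is an assumption; the gh call is replaced with echo
# so the keyword-matching logic can run anywhere.
ISSUE_TITLE="Training hangs on Ascend 910B"
LABEL=""
NPU_KEYWORDS=(npu huawei ascend 华为 昇腾 910)
ISSUE_TITLE_LOWER=$(echo "$ISSUE_TITLE" | tr '[:upper:]' '[:lower:]')
for KEYWORD in "${NPU_KEYWORDS[@]}"; do
  if [[ $ISSUE_TITLE_LOWER == *$KEYWORD* ]] && [[ $ISSUE_TITLE_LOWER != *input* ]]; then
    LABEL="npu"
    break
  fi
done
if [ -n "$LABEL" ]; then
  echo "would run: gh issue edit <issue-url> --add-label $LABEL"
fi
```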


@@ -1,6 +1,7 @@
 name: publish
 on:
+  workflow_dispatch:
   release:
     types:
       - published
@@ -25,16 +26,11 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: "3.8"
+          python-version: "3.9"
-      - name: Install dependencies
-        run: |
-          python -m pip install --upgrade pip
-          python -m pip install build
       - name: Build package
         run: |
-          python -m build
+          make build
       - name: Publish package
         uses: pypa/gh-action-pypi-publish@release/v1


@@ -1,30 +1,61 @@
 name: tests
 on:
+  workflow_dispatch:
   push:
     branches:
-      - main
+      - "main"
     paths:
-      - "**.py"
+      - "**/*.py"
       - "requirements.txt"
       - ".github/workflows/*.yml"
   pull_request:
     branches:
-      - main
+      - "main"
     paths:
-      - "**.py"
+      - "**/*.py"
       - "requirements.txt"
       - ".github/workflows/*.yml"
 jobs:
   tests:
-    runs-on: ubuntu-latest
+    strategy:
+      fail-fast: false
+      matrix:
+        python:
+          - "3.9"
+          - "3.10"
+          - "3.11"
+          - "3.12"
+        os:
+          - "ubuntu-latest"
+          - "windows-latest"
+          - "macos-latest"
+        transformers:
+          - null
+        include: # test backward compatibility
+          - python: "3.9"
+            os: "ubuntu-latest"
+            transformers: "4.49.0"
+          - python: "3.9"
+            os: "ubuntu-latest"
+            transformers: "4.51.0"
+          - python: "3.9"
+            os: "ubuntu-latest"
+            transformers: "4.53.0"
+        exclude: # exclude python 3.9 on macos
+          - python: "3.9"
+            os: "macos-latest"
-    environment:
-      name: tests
+    runs-on: ${{ matrix.os }}
+    concurrency:
+      group: ${{ github.workflow }}-${{ github.ref }}-${{ matrix.os }}-${{ matrix.python }}-${{ matrix.transformers }}
+      cancel-in-progress: ${{ github.ref != 'refs/heads/main' }}
     env:
       HF_TOKEN: ${{ secrets.HF_TOKEN }}
+      OS_NAME: ${{ matrix.os }}
     steps:
       - name: Checkout
@@ -33,19 +64,42 @@ jobs:
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: "3.8"
+          python-version: ${{ matrix.python }}
           cache: "pip"
-          cache-dependency-path: "setup.py"
+          cache-dependency-path: "**/requirements*.txt"
       - name: Install dependencies
         run: |
           python -m pip install --upgrade pip
           python -m pip install ".[torch,dev]"
+      - name: Install transformers
+        if: ${{ matrix.transformers }}
+        run: |
+          python -m pip install "transformers==${{ matrix.transformers }}"
+      - name: Cache files
+        id: hf-hub-cache
+        uses: actions/cache@v4
+        with:
+          path: ${{ runner.temp }}/huggingface
+          key: huggingface-${{ matrix.os }}-${{ matrix.python }}-${{ matrix.transformers }}-${{ hashFiles('tests/version.txt') }}
       - name: Check quality
         run: |
           make style && make quality
+      - name: Check license
+        run: |
+          make license
+      - name: Check build
+        run: |
+          make build
       - name: Test with pytest
         run: |
           make test
+        env:
+          HF_HOME: ${{ runner.temp }}/huggingface
+          HF_HUB_OFFLINE: "${{ steps.hf-hub-cache.outputs.cache-hit == 'true' && '1' || '0' }}"
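The `HF_HUB_OFFLINE` value above uses the GitHub expression `a && b || c` as a ternary on the cache step's `cache-hit` output. The same decision in plain shell, with `cache_hit` standing in for `steps.hf-hub-cache.outputs.cache-hit`:

```shell
# cache_hit stands in for the actions/cache step output; "true" here
# is an example value meaning the Hugging Face cache was restored.
cache_hit="true"
if [ "$cache_hit" = "true" ]; then
  HF_HUB_OFFLINE="1"   # cache restored: run the tests offline
else
  HF_HUB_OFFLINE="0"   # no cache: allow downloads from the Hub
fi
echo "HF_HUB_OFFLINE=$HF_HUB_OFFLINE"
```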

.gitignore vendored

@@ -159,9 +159,21 @@ cython_debug/
 # option (not recommended) you can uncomment the following to ignore the entire idea folder.
 .idea/
+# vscode
+.vscode/
+# uv
+uv.lock
 # custom .gitignore
-cache/
-config/
+hf_cache/
+ms_cache/
+om_cache/
+llamaboard_cache/
+llamaboard_config/
 saves/
 output/
 wandb/
+swanlog/
+generated_predictions.jsonl
+predictions_score.json

.pre-commit-config.yaml Normal file

@@ -0,0 +1,28 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v6.0.0
hooks:
- id: check-ast
- id: check-added-large-files
args: ['--maxkb=25000']
- id: check-merge-conflict
- id: check-yaml
- id: debug-statements
- id: end-of-file-fixer
- id: trailing-whitespace
args: [--markdown-linebreak-ext=md]
- id: no-commit-to-branch
args: ['--branch', 'main']
- repo: https://github.com/asottile/pyupgrade
rev: v3.20.0
hooks:
- id: pyupgrade
args: [--py39-plus]
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.13.2
hooks:
- id: ruff
args: [--fix]
- id: ruff-format


@@ -1,6 +1,16 @@
-.PHONY: quality style test
+.PHONY: build commit license quality style test
-check_dirs := scripts src tests
+check_dirs := scripts src tests tests_v1 setup.py
+build:
+	pip3 install build && python3 -m build
+commit:
+	pre-commit install
+	pre-commit run --all-files
+license:
+	python3 tests/check_license.py $(check_dirs)
 quality:
 	ruff check $(check_dirs)
@@ -11,4 +21,4 @@ style:
 	ruff format $(check_dirs)
 test:
-	CUDA_VISIBLE_DEVICES= pytest tests/
+	CUDA_VISIBLE_DEVICES= WANDB_DISABLED=true pytest -vv tests/

README.md

@@ -1,44 +1,90 @@
![# LLaMA Factory](assets/logo.png) ![# LLaMA Factory](assets/logo.png)
[![GitHub Repo stars](https://img.shields.io/github/stars/hiyouga/LLaMA-Factory?style=social)](https://github.com/hiyouga/LLaMA-Factory/stargazers) [![GitHub Repo stars](https://img.shields.io/github/stars/hiyouga/LLaMA-Factory?style=social)](https://github.com/hiyouga/LLaMA-Factory/stargazers)
[![GitHub Code License](https://img.shields.io/github/license/hiyouga/LLaMA-Factory)](LICENSE)
[![GitHub last commit](https://img.shields.io/github/last-commit/hiyouga/LLaMA-Factory)](https://github.com/hiyouga/LLaMA-Factory/commits/main)
[![GitHub contributors](https://img.shields.io/github/contributors/hiyouga/LLaMA-Factory?color=orange)](https://github.com/hiyouga/LLaMA-Factory/graphs/contributors)
[![GitHub workflow](https://github.com/hiyouga/LLaMA-Factory/actions/workflows/tests.yml/badge.svg)](https://github.com/hiyouga/LLaMA-Factory/actions/workflows/tests.yml)
[![PyPI](https://img.shields.io/pypi/v/llamafactory)](https://pypi.org/project/llamafactory/)
[![Citation](https://img.shields.io/badge/citation-1000+-green)](https://scholar.google.com/scholar?cites=12620864006390196564)
[![Docker Pulls](https://img.shields.io/docker/pulls/hiyouga/llamafactory)](https://hub.docker.com/r/hiyouga/llamafactory/tags)
[![Twitter](https://img.shields.io/twitter/follow/llamafactory_ai)](https://twitter.com/llamafactory_ai)
[![Discord](assets/thirdparty/discord.svg)](https://discord.gg/rKfvV9r9FK)
[![WeChat](https://img.shields.io/badge/WeChat-User%20Group-blue?logo=wechat)](https://github.com/hiyouga/llamafactory-community)
[![Blog](https://img.shields.io/badge/Hugo-Official%20Blog-blue?logo=hugo)](https://blog.llamafactory.net/en/)
[![Open in Colab](assets/thirdparty/colab.svg)](https://colab.research.google.com/drive/1eRTPn37ltBbYsISy9Aw2NuI2Aq5CQrD9?usp=sharing)
[![Open in DSW](assets/thirdparty/dsw.svg)](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory)
[![Open in Lab4ai](assets/thirdparty/lab4ai.svg)](https://www.lab4ai.cn/course/detail?id=7c13e60f6137474eb40f6fd3983c0f46&utm_source=LLaMA-Factory)
[![Open in Online](assets/thirdparty/online.svg)](https://www.llamafactory.com.cn/?utm_source=LLaMA-Factory)
[![Open in Spaces](https://img.shields.io/badge/🤗-Open%20in%20Spaces-blue)](https://huggingface.co/spaces/hiyouga/LLaMA-Board)
[![Open in Studios](https://img.shields.io/badge/ModelScope-Open%20in%20Studios-blue)](https://modelscope.cn/studios/hiyouga/LLaMA-Board)
[![Open in Novita](https://img.shields.io/badge/Novita-Deploy%20Template-blue)](https://novita.ai/templates-library/105981?sharer=88115474-394e-4bda-968e-b88e123d0c47)
### Used by [Amazon](https://aws.amazon.com/cn/blogs/machine-learning/how-apoidea-group-enhances-visual-information-extraction-from-banking-documents-with-multimodal-models-using-llama-factory-on-amazon-sagemaker-hyperpod/), [NVIDIA](https://developer.nvidia.com/rtx/ai-toolkit), [Aliyun](https://help.aliyun.com/zh/pai/use-cases/fine-tune-a-llama-3-model-with-llama-factory), etc.
<div align="center" markdown="1">
### Supporters ❤️
| <div style="text-align: center;"><a href="https://warp.dev/llama-factory"><img alt="Warp sponsorship" width="400" src="assets/sponsors/warp.jpg"></a><br><a href="https://warp.dev/llama-factory" style="font-size:larger;">Warp, the agentic terminal for developers</a><br><a href="https://warp.dev/llama-factory">Available for macOS, Linux, & Windows</a> | <a href="https://serpapi.com"><img alt="SerpAPI sponsorship" width="250" src="assets/sponsors/serpapi.svg"> </a> |
| ---- | ---- |
----
### Easily fine-tune 100+ large language models with zero-code [CLI](#quickstart) and [Web UI](#fine-tuning-with-llama-board-gui-powered-by-gradio)
![GitHub Trend](https://trendshift.io/api/badge/repositories/4535)
</div>
👋 Join our [WeChat](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/main.jpg), [NPU](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/npu.jpg), [Lab4AI](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/lab4ai.jpg), [LLaMA Factory Online](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/online.jpg) user group.
\[ English | [中文](README_zh.md) \]
**Fine-tuning a large language model can be easy as...**
https://github.com/user-attachments/assets/3991a3a8-4276-4d30-9cab-4cb0c4b9b99e
Start local training:
- Please refer to [usage](#getting-started)
Start cloud training:
- **Colab (free)**: https://colab.research.google.com/drive/1eRTPn37ltBbYsISy9Aw2NuI2Aq5CQrD9?usp=sharing
- **PAI-DSW (free trial)**: https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory
- **LLaMA Factory Online**: https://www.llamafactory.com.cn/?utm_source=LLaMA-Factory
- **Alaya NeW (cloud GPU deal)**: https://docs.alayanew.com/docs/documents/useGuide/LLaMAFactory/mutiple/?utm_source=LLaMA-Factory
Read technical notes:
- **Documentation (WIP)**: https://llamafactory.readthedocs.io/en/latest/
- **Documentation (AMD GPU)**: https://rocm.docs.amd.com/projects/ai-developer-hub/en/latest/notebooks/fine_tune/llama_factory_llama3.html
- **Official Blog**: https://blog.llamafactory.net/en/
- **Official Course**: https://www.lab4ai.cn/course/detail?id=7c13e60f6137474eb40f6fd3983c0f46&utm_source=LLaMA-Factory
> [!NOTE]
> Apart from the links above, all other websites are unauthorized third-party websites. Please use them with caution.
## Table of Contents
- [Features](#features)
- [Blogs](#blogs)
- [Changelog](#changelog)
- [Supported Models](#supported-models)
- [Supported Training Approaches](#supported-training-approaches)
- [Provided Datasets](#provided-datasets)
- [Requirement](#requirement)
- [Getting Started](#getting-started)
  - [Installation](#installation)
  - [Data Preparation](#data-preparation)
  - [Quickstart](#quickstart)
  - [Fine-Tuning with LLaMA Board GUI](#fine-tuning-with-llama-board-gui-powered-by-gradio)
  - [LLaMA Factory Online](#llama-factory-online)
  - [Build Docker](#build-docker)
  - [Deploy with OpenAI-style API and vLLM](#deploy-with-openai-style-api-and-vllm)
  - [Download from ModelScope Hub](#download-from-modelscope-hub)
  - [Download from Modelers Hub](#download-from-modelers-hub)
  - [Use W&B Logger](#use-wb-logger)
  - [Use SwanLab Logger](#use-swanlab-logger)
- [Projects using LLaMA Factory](#projects-using-llama-factory)
- [License](#license)
- [Citation](#citation)
## Features
- **Various models**: LLaMA, LLaVA, Mistral, Mixtral-MoE, Qwen, Qwen2-VL, DeepSeek, Yi, Gemma, ChatGLM, Phi, etc.
- **Integrated methods**: (Continuous) pre-training, (multimodal) supervised fine-tuning, reward modeling, PPO, DPO, KTO, ORPO, etc.
- **Scalable resources**: 16-bit full-tuning, freeze-tuning, LoRA and 2/3/4/5/6/8-bit QLoRA via AQLM/AWQ/GPTQ/LLM.int8/HQQ/EETQ.
- **Advanced algorithms**: [GaLore](https://github.com/jiaweizzhao/GaLore), [BAdam](https://github.com/Ledzy/BAdam), [APOLLO](https://github.com/zhuhanqing/APOLLO), [Adam-mini](https://github.com/zyushun/Adam-mini), [Muon](https://github.com/KellerJordan/Muon), [OFT](https://github.com/huggingface/peft/tree/main/src/peft/tuners/oft), DoRA, LongLoRA, LLaMA Pro, Mixture-of-Depths, LoRA+, LoftQ and PiSSA.
- **Practical tricks**: [FlashAttention-2](https://github.com/Dao-AILab/flash-attention), [Unsloth](https://github.com/unslothai/unsloth), [Liger Kernel](https://github.com/linkedin/Liger-Kernel), RoPE scaling, NEFTune and rsLoRA.
- **Wide tasks**: Multi-turn dialogue, tool use, image understanding, visual grounding, video recognition, audio understanding, etc.
- **Experiment monitors**: LlamaBoard, TensorBoard, Wandb, MLflow, [SwanLab](https://github.com/SwanHubX/SwanLab), etc.
- **Faster inference**: OpenAI-style API, Gradio UI and CLI with [vLLM worker](https://github.com/vllm-project/vllm) or [SGLang worker](https://github.com/sgl-project/sglang).
### Day-N Support for Fine-Tuning Cutting-Edge Models
| Support Date | Model Name |
| ------------ | -------------------------------------------------------------------- |
| Day 0 | Qwen3 / Qwen2.5-VL / Gemma 3 / GLM-4.1V / InternLM 3 / MiniCPM-o-2.6 |
| Day 1 | Llama 3 / GLM-4 / Mistral Small / PaliGemma2 / Llama 4 |
## Blogs
> [!TIP]
> Now we have a dedicated blog for LLaMA Factory!
>
> Website: https://blog.llamafactory.net/en/
- 💡 [Easy Dataset × LLaMA Factory: Enabling LLMs to Efficiently Learn Domain Knowledge](https://buaa-act.feishu.cn/wiki/GVzlwYcRFiR8OLkHbL6cQpYin7g) (English)
- [Fine-tune a mental health LLM using LLaMA-Factory](https://www.lab4ai.cn/project/detail?id=25cce32ec131497b9e06a93336a0817f&type=project&utm_source=LLaMA-Factory) (Chinese)
- [Fine-tune GPT-OSS for Role-Playing using LLaMA-Factory](https://docs.llamafactory.com.cn/docs/documents/best-practice/gptroleplay/?utm_source=LLaMA-Factory) (Chinese)
- [A One-Stop Code-Free Model Reinforcement Learning and Deployment Platform based on LLaMA-Factory and EasyR1](https://aws.amazon.com/cn/blogs/china/building-llm-model-hub-based-on-llamafactory-and-easyr1/) (Chinese)
- [How Apoidea Group enhances visual information extraction from banking documents with multimodal models using LLaMA-Factory on Amazon SageMaker HyperPod](https://aws.amazon.com/cn/blogs/machine-learning/how-apoidea-group-enhances-visual-information-extraction-from-banking-documents-with-multimodal-models-using-llama-factory-on-amazon-sagemaker-hyperpod/) (English)
<details><summary>All Blogs</summary>
- [Fine-tune Llama3.1-70B for Medical Diagnosis using LLaMA-Factory](https://docs.alayanew.com/docs/documents/bestPractice/bigModel/llama70B/?utm_source=LLaMA-Factory) (Chinese)
- [Fine-tune Qwen2.5-VL for Autonomous Driving using LLaMA-Factory](https://docs.alayanew.com/docs/documents/useGuide/LLaMAFactory/mutiple/?utm_source=LLaMA-Factory) (Chinese)
- [LLaMA Factory: Fine-tuning the DeepSeek-R1-Distill-Qwen-7B Model for News Classifier](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory_deepseek_r1_distill_7b) (Chinese)
- [A One-Stop Code-Free Model Fine-Tuning \& Deployment Platform based on SageMaker and LLaMA-Factory](https://aws.amazon.com/cn/blogs/china/a-one-stop-code-free-model-fine-tuning-deployment-platform-based-on-sagemaker-and-llama-factory/) (Chinese)
- [LLaMA Factory Multi-Modal Fine-Tuning Practice: Fine-Tuning Qwen2-VL for Personal Tourist Guide](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory_qwen2vl) (Chinese)
- [LLaMA Factory: Fine-tuning Llama3 for Role-Playing](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory) (Chinese)
</details>
## Changelog
[25/10/26] We support Megatron-core training backend with [**mcore_adapter**](https://github.com/alibaba/ROLL/tree/main/mcore_adapter). See [PR #9237](https://github.com/hiyouga/LLaMA-Factory/pull/9237) to get started.
[25/08/22] We supported **[OFT](https://arxiv.org/abs/2306.07280)** and **[OFTv2](https://arxiv.org/abs/2506.19847)**. See [examples](examples/README.md) for usage.
[25/08/20] We supported fine-tuning the **[Intern-S1-mini](https://huggingface.co/internlm/Intern-S1-mini)** models. See [PR #8976](https://github.com/hiyouga/LLaMA-Factory/pull/8976) to get started.
[25/08/06] We supported fine-tuning the **[GPT-OSS](https://github.com/openai/gpt-oss)** models. See [PR #8826](https://github.com/hiyouga/LLaMA-Factory/pull/8826) to get started.
<details><summary>Full Changelog</summary>
[25/07/02] We supported fine-tuning the **[GLM-4.1V-9B-Thinking](https://github.com/THUDM/GLM-4.1V-Thinking)** model.
[25/04/28] We supported fine-tuning the **[Qwen3](https://qwenlm.github.io/blog/qwen3/)** model family.
[25/04/21] We supported the **[Muon](https://github.com/KellerJordan/Muon)** optimizer. See [examples](examples/README.md) for usage. Thank [@tianshijing](https://github.com/tianshijing)'s PR.
[25/04/16] We supported fine-tuning the **[InternVL3](https://huggingface.co/OpenGVLab/InternVL3-8B)** model. See [PR #7258](https://github.com/hiyouga/LLaMA-Factory/pull/7258) to get started.
[25/04/14] We supported fine-tuning the **[GLM-Z1](https://huggingface.co/THUDM/GLM-Z1-9B-0414)** and **[Kimi-VL](https://huggingface.co/moonshotai/Kimi-VL-A3B-Instruct)** models.
[25/04/06] We supported fine-tuning the **[Llama 4](https://ai.meta.com/blog/llama-4-multimodal-intelligence/)** model. See [PR #7611](https://github.com/hiyouga/LLaMA-Factory/pull/7611) to get started.
[25/03/31] We supported fine-tuning the **[Qwen2.5 Omni](https://qwenlm.github.io/blog/qwen2.5-omni/)** model. See [PR #7537](https://github.com/hiyouga/LLaMA-Factory/pull/7537) to get started.
[25/03/15] We supported **[SGLang](https://github.com/sgl-project/sglang)** as inference backend. Try `infer_backend: sglang` to accelerate inference.
[25/03/12] We supported fine-tuning the **[Gemma 3](https://huggingface.co/blog/gemma3)** model.
[25/02/24] Announcing **[EasyR1](https://github.com/hiyouga/EasyR1)**, an efficient, scalable and multi-modality RL training framework for efficient GRPO training.
[25/02/11] We supported saving the **[Ollama](https://github.com/ollama/ollama)** modelfile when exporting the model checkpoints. See [examples](examples/README.md) for usage.
[25/02/05] We supported fine-tuning the **[Qwen2-Audio](https://huggingface.co/Qwen/Qwen2-Audio-7B-Instruct)** and **[MiniCPM-o-2.6](https://huggingface.co/openbmb/MiniCPM-o-2_6)** on audio understanding tasks.
[25/01/31] We supported fine-tuning the **[DeepSeek-R1](https://huggingface.co/deepseek-ai/DeepSeek-R1)** and **[Qwen2.5-VL](https://huggingface.co/Qwen/Qwen2.5-VL-7B-Instruct)** models.
[25/01/15] We supported **[APOLLO](https://arxiv.org/abs/2412.05270)** optimizer. See [examples](examples/README.md) for usage.
[25/01/14] We supported fine-tuning the **[MiniCPM-o-2.6](https://huggingface.co/openbmb/MiniCPM-o-2_6)** and **[MiniCPM-V-2.6](https://huggingface.co/openbmb/MiniCPM-V-2_6)** models. Thank [@BUAADreamer](https://github.com/BUAADreamer)'s PR.
[25/01/14] We supported fine-tuning the **[InternLM 3](https://huggingface.co/collections/internlm/)** models. Thank [@hhaAndroid](https://github.com/hhaAndroid)'s PR.
[25/01/10] We supported fine-tuning the **[Phi-4](https://huggingface.co/microsoft/phi-4)** model.
[24/12/21] We supported using **[SwanLab](https://github.com/SwanHubX/SwanLab)** for experiment tracking and visualization. See [this section](#use-swanlab-logger) for details.
[24/11/27] We supported fine-tuning the **[Skywork-o1](https://huggingface.co/Skywork/Skywork-o1-Open-Llama-3.1-8B)** model and the **[OpenO1](https://huggingface.co/datasets/O1-OPEN/OpenO1-SFT)** dataset.
[24/10/09] We supported downloading pre-trained models and datasets from the **[Modelers Hub](https://modelers.cn/models)**. See [this tutorial](#download-from-modelers-hub) for usage.
[24/09/19] We supported fine-tuning the **[Qwen2.5](https://qwenlm.github.io/blog/qwen2.5/)** models.
[24/08/30] We supported fine-tuning the **[Qwen2-VL](https://qwenlm.github.io/blog/qwen2-vl/)** models. Thank [@simonJJJ](https://github.com/simonJJJ)'s PR.
[24/08/27] We supported **[Liger Kernel](https://github.com/linkedin/Liger-Kernel)**. Try `enable_liger_kernel: true` for efficient training.
[24/08/09] We supported **[Adam-mini](https://github.com/zyushun/Adam-mini)** optimizer. See [examples](examples/README.md) for usage. Thank [@relic-yuexi](https://github.com/relic-yuexi)'s PR.
[24/07/04] We supported [contamination-free packed training](https://github.com/MeetKai/functionary/tree/main/functionary/train/packing). Use `neat_packing: true` to activate it. Thank [@chuan298](https://github.com/chuan298)'s PR.
[24/06/16] We supported **[PiSSA](https://arxiv.org/abs/2404.02948)** algorithm. See [examples](examples/README.md) for usage.
[24/06/07] We supported fine-tuning the **[Qwen2](https://qwenlm.github.io/blog/qwen2/)** and **[GLM-4](https://github.com/THUDM/GLM-4)** models.
[24/05/26] We supported **[SimPO](https://arxiv.org/abs/2405.14734)** algorithm for preference learning. See [examples](examples/README.md) for usage.
[24/05/20] We supported fine-tuning the **PaliGemma** series models. Note that the PaliGemma models are pre-trained models; you need to fine-tune them with the `paligemma` template for chat completion.
[24/05/18] We supported **[KTO](https://arxiv.org/abs/2402.01306)** algorithm for preference learning. See [examples](examples/README.md) for usage.
[24/04/21] We supported **[Mixture-of-Depths](https://arxiv.org/abs/2404.02258)** according to [AstraMindAI's implementation](https://github.com/astramind-ai/Mixture-of-depths). See [examples](examples/README.md) for usage.
[24/04/16] We supported **[BAdam](https://arxiv.org/abs/2404.02827)** optimizer. See [examples](examples/README.md) for usage.
[24/04/16] We supported **[unsloth](https://github.com/unslothai/unsloth)**'s long-sequence training (Llama-2-7B-56k within 24GB). It achieves **117%** speed and **50%** memory compared with FlashAttention-2; more benchmarks can be found on [this page](https://github.com/hiyouga/LLaMA-Factory/wiki/Performance-comparison).
[24/03/13] We supported **[LoRA+](https://arxiv.org/abs/2402.12354)**. See [examples](examples/README.md) for usage.
[24/03/07] We supported **[GaLore](https://arxiv.org/abs/2403.03507)** optimizer. See [examples](examples/README.md) for usage.
[24/03/07] We integrated **[vLLM](https://github.com/vllm-project/vllm)** for faster and concurrent inference. Try `infer_backend: vllm` to enjoy **270%** inference speed.
[23/12/12] We supported fine-tuning the latest MoE model **[Mixtral 8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)** in our framework. See hardware requirement [here](#hardware-requirement).
[23/12/01] We supported downloading pre-trained models and datasets from the **[ModelScope Hub](https://modelscope.cn/models)**. See [this tutorial](#download-from-modelscope-hub) for usage.
[23/10/21] We supported **[NEFTune](https://arxiv.org/abs/2310.05914)** trick for fine-tuning. Try `neftune_noise_alpha: 5` argument to activate NEFTune.
</details>
> [!TIP]
> If you cannot use the latest feature, please pull the latest code and install LLaMA-Factory again.
## Supported Models
| Model | Model size | Template |
| ----------------------------------------------------------------- | -------------------------------- | -------------------- |
| [Baichuan 2](https://huggingface.co/baichuan-inc) | 7B/13B | baichuan2 |
| [BLOOM/BLOOMZ](https://huggingface.co/bigscience) | 560M/1.1B/1.7B/3B/7.1B/176B | - |
| [ChatGLM3](https://huggingface.co/THUDM) | 6B | chatglm3 |
| [Command R](https://huggingface.co/CohereForAI) | 35B/104B | cohere |
| [DeepSeek (Code/MoE)](https://huggingface.co/deepseek-ai) | 7B/16B/67B/236B | deepseek |
| [DeepSeek 2.5/3](https://huggingface.co/deepseek-ai) | 236B/671B | deepseek3 |
| [DeepSeek R1 (Distill)](https://huggingface.co/deepseek-ai) | 1.5B/7B/8B/14B/32B/70B/671B | deepseekr1 |
| [ERNIE-4.5](https://huggingface.co/baidu) | 0.3B/21B/300B | ernie/ernie_nothink |
| [Falcon](https://huggingface.co/tiiuae) | 7B/11B/40B/180B | falcon |
| [Falcon-H1](https://huggingface.co/tiiuae) | 0.5B/1.5B/3B/7B/34B | falcon_h1 |
| [Gemma/Gemma 2/CodeGemma](https://huggingface.co/google) | 2B/7B/9B/27B | gemma/gemma2 |
| [Gemma 3/Gemma 3n](https://huggingface.co/google) | 270M/1B/4B/6B/8B/12B/27B | gemma3/gemma3n |
| [GLM-4/GLM-4-0414/GLM-Z1](https://huggingface.co/zai-org) | 9B/32B | glm4/glmz1 |
| [GLM-4.1V](https://huggingface.co/zai-org) | 9B | glm4v |
| [GLM-4.5/GLM-4.5V](https://huggingface.co/zai-org) | 106B/355B | glm4_moe/glm4v_moe |
| [GPT-2](https://huggingface.co/openai-community) | 0.1B/0.4B/0.8B/1.5B | - |
| [GPT-OSS](https://huggingface.co/openai) | 20B/120B | gpt |
| [Granite 3.0-3.3](https://huggingface.co/ibm-granite) | 1B/2B/3B/8B | granite3 |
| [Granite 4](https://huggingface.co/ibm-granite) | 7B | granite4 |
| [Hunyuan (MT)](https://huggingface.co/tencent/) | 7B | hunyuan |
| [Index](https://huggingface.co/IndexTeam) | 1.9B | index |
| [InternLM 2-3](https://huggingface.co/internlm) | 7B/8B/20B | intern2 |
| [InternVL 2.5-3.5](https://huggingface.co/OpenGVLab) | 1B/2B/4B/8B/14B/30B/38B/78B/241B | intern_vl |
| [InternLM/Intern-S1-mini](https://huggingface.co/internlm/) | 8B | intern_s1 |
| [Kimi-VL](https://huggingface.co/moonshotai) | 16B | kimi_vl |
| [Ling 2.0 (mini/flash)](https://huggingface.co/inclusionAI) | 16B/100B | bailing_v2 |
| [Llama](https://github.com/facebookresearch/llama) | 7B/13B/33B/65B | - |
| [Llama 2](https://huggingface.co/meta-llama) | 7B/13B/70B | llama2 |
| [Llama 3-3.3](https://huggingface.co/meta-llama) | 1B/3B/8B/70B | llama3 |
| [Llama 4](https://huggingface.co/meta-llama) | 109B/402B | llama4 |
| [Llama 3.2 Vision](https://huggingface.co/meta-llama) | 11B/90B | mllama |
| [LLaVA-1.5](https://huggingface.co/llava-hf) | 7B/13B | llava |
| [LLaVA-NeXT](https://huggingface.co/llava-hf) | 7B/8B/13B/34B/72B/110B | llava_next |
| [LLaVA-NeXT-Video](https://huggingface.co/llava-hf) | 7B/34B | llava_next_video |
| [MiMo](https://huggingface.co/XiaomiMiMo) | 7B | mimo |
| [MiniCPM 1-4.1](https://huggingface.co/openbmb) | 0.5B/1B/2B/4B/8B | cpm/cpm3/cpm4 |
| [MiniCPM-o-2.6/MiniCPM-V-2.6](https://huggingface.co/openbmb) | 8B | minicpm_o/minicpm_v |
| [Ministral/Mistral-Nemo](https://huggingface.co/mistralai) | 8B/12B | ministral |
| [Mistral/Mixtral](https://huggingface.co/mistralai) | 7B/8x7B/8x22B | mistral |
| [Mistral Small](https://huggingface.co/mistralai) | 24B | mistral_small |
| [OLMo](https://huggingface.co/allenai) | 1B/7B | - |
| [PaliGemma/PaliGemma2](https://huggingface.co/google) | 3B/10B/28B | paligemma |
| [Phi-1.5/Phi-2](https://huggingface.co/microsoft) | 1.3B/2.7B | - |
| [Phi-3/Phi-3.5](https://huggingface.co/microsoft) | 4B/14B | phi |
| [Phi-3-small](https://huggingface.co/microsoft) | 7B | phi_small |
| [Phi-4](https://huggingface.co/microsoft) | 14B | phi4 |
| [Pixtral](https://huggingface.co/mistralai) | 12B | pixtral |
| [Qwen (1-2.5) (Code/Math/MoE/QwQ)](https://huggingface.co/Qwen) | 0.5B/1.5B/3B/7B/14B/32B/72B/110B | qwen |
| [Qwen3 (MoE/Instruct/Thinking/Next)](https://huggingface.co/Qwen) | 0.6B/1.7B/4B/8B/14B/32B/80B/235B | qwen3/qwen3_nothink |
| [Qwen2-Audio](https://huggingface.co/Qwen) | 7B | qwen2_audio |
| [Qwen2.5-Omni](https://huggingface.co/Qwen) | 3B/7B | qwen2_omni |
| [Qwen3-Omni](https://huggingface.co/Qwen) | 30B | qwen3_omni |
| [Qwen2-VL/Qwen2.5-VL/QVQ](https://huggingface.co/Qwen) | 2B/3B/7B/32B/72B | qwen2_vl |
| [Qwen3-VL](https://huggingface.co/Qwen) | 2B/4B/8B/30B/32B/235B | qwen3_vl |
| [Seed (OSS/Coder)](https://huggingface.co/ByteDance-Seed) | 8B/36B | seed_oss/seed_coder |
| [Skywork o1](https://huggingface.co/Skywork) | 8B | skywork_o1 |
| [StarCoder 2](https://huggingface.co/bigcode) | 3B/7B/15B | - |
| [TeleChat2](https://huggingface.co/Tele-AI) | 3B/7B/35B/115B | telechat2 |
| [XVERSE](https://huggingface.co/xverse) | 7B/13B/65B | xverse |
| [Yi/Yi-1.5 (Code)](https://huggingface.co/01-ai) | 1.5B/6B/9B/34B | yi |
| [Yi-VL](https://huggingface.co/01-ai) | 6B/34B | yi_vl |
| [Yuan 2](https://huggingface.co/IEITYuan) | 2B/51B/102B | yuan |
> [!NOTE]
> For the "base" models, the `template` argument can be chosen from `default`, `alpaca`, `vicuna` etc. But make sure to use the **corresponding template** for the "instruct/chat" models.
>
> If the model has both reasoning and non-reasoning versions, please use the `_nothink` suffix to distinguish between them. For example, `qwen3` and `qwen3_nothink`.
>
> Remember to use the **SAME** template in training and inference. > Remember to use the **SAME** template in training and inference.
>
> \*: You should install the `transformers` from main branch and use `DISABLE_VERSION_CHECK=1` to skip version check.
>
> \*\*: You need to install a specific version of `transformers` to use the corresponding model.
Please refer to [constants.py](src/llamafactory/extras/constants.py) for a full list of the models we support.
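As an illustration, the `template` argument from the table above is set in the YAML config passed to the CLI. Below is a minimal chat config sketch; the model path and template are illustrative, and the key names follow the configs shipped under [examples](examples/README.md):

```yaml
# Minimal chat config sketch (model and template are illustrative).
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
template: llama3            # must match the template used during training
infer_backend: huggingface  # or vllm / sglang for faster inference
```

Assuming such a file is saved as `chat.yaml`, it would be launched with `llamafactory-cli chat chat.yaml`.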
You can also add a custom chat template to [template.py](src/llamafactory/data/template.py).
## Supported Training Approaches
| Approach | Full-tuning | Freeze-tuning | LoRA | QLoRA | OFT | QOFT |
| ---------------------- | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ | ------------------ |
| Pre-Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Supervised Fine-Tuning | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Reward Modeling | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| PPO Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| DPO Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| KTO Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| ORPO Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| SimPO Training | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
> [!TIP]
> The implementation details of PPO can be found in [this blog](https://newfacade.github.io/notes-on-reinforcement-learning/17-ppo-trl.html).
## Provided Datasets
- [SkyPile (zh)](https://huggingface.co/datasets/Skywork/SkyPile-150B)
- [FineWeb (en)](https://huggingface.co/datasets/HuggingFaceFW/fineweb)
- [FineWeb-Edu (en)](https://huggingface.co/datasets/HuggingFaceFW/fineweb-edu)
- [CCI3-HQ (zh)](https://huggingface.co/datasets/BAAI/CCI3-HQ)
- [CCI3-Data (zh)](https://huggingface.co/datasets/BAAI/CCI3-Data)
- [CCI4.0-M2-Base-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-Base-v1)
- [CCI4.0-M2-CoT-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-CoT-v1)
- [CCI4.0-M2-Extra-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-Extra-v1)
- [The Stack (en)](https://huggingface.co/datasets/bigcode/the-stack)
- [StarCoder (en)](https://huggingface.co/datasets/bigcode/starcoderdata)
- [ShareGPT Hyperfiltered (en)](https://huggingface.co/datasets/totally-not-an-llm/sharegpt-hyperfiltered-3k)
- [ShareGPT4 (en&zh)](https://huggingface.co/datasets/shibing624/sharegpt_gpt4)
- [UltraChat 200k (en)](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k)
- [Infinity Instruct (zh)](https://huggingface.co/datasets/BAAI/Infinity-Instruct)
- [AgentInstruct (en)](https://huggingface.co/datasets/THUDM/AgentInstruct)
- [LMSYS Chat 1M (en)](https://huggingface.co/datasets/lmsys/lmsys-chat-1m)
- [Evol Instruct V2 (en)](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k)
- [STEM (zh)](https://huggingface.co/datasets/hfl/stem_zh_instruction)
- [Ruozhiba (zh)](https://huggingface.co/datasets/hfl/ruozhiba_gpt4_turbo)
- [Neo-sft (zh)](https://huggingface.co/datasets/m-a-p/neo_sft_phase2)
- [Magpie-Pro-300K-Filtered (en)](https://huggingface.co/datasets/Magpie-Align/Magpie-Pro-300K-Filtered)
- [Magpie-ultra-v0.1 (en)](https://huggingface.co/datasets/argilla/magpie-ultra-v0.1)
- [WebInstructSub (en)](https://huggingface.co/datasets/TIGER-Lab/WebInstructSub)
- [OpenO1-SFT (en&zh)](https://huggingface.co/datasets/O1-OPEN/OpenO1-SFT)
- [Open-Thoughts (en)](https://huggingface.co/datasets/open-thoughts/OpenThoughts-114k)
- [Open-R1-Math (en)](https://huggingface.co/datasets/open-r1/OpenR1-Math-220k)
- [Chinese-DeepSeek-R1-Distill (zh)](https://huggingface.co/datasets/Congliu/Chinese-DeepSeek-R1-Distill-data-110k-SFT)
- [LLaVA mixed (en&zh)](https://huggingface.co/datasets/BUAADreamer/llava-en-zh-300k)
- [Pokemon-gpt4o-captions (en&zh)](https://huggingface.co/datasets/jugg1024/pokemon-gpt4o-captions)
- [Open Assistant (de)](https://huggingface.co/datasets/mayflowergmbh/oasst_de)
- [Dolly 15k (de)](https://huggingface.co/datasets/mayflowergmbh/dolly-15k_de)
- [Alpaca GPT4 (de)](https://huggingface.co/datasets/mayflowergmbh/alpaca-gpt4_de)
- [DPO mixed (en&zh)](https://huggingface.co/datasets/hiyouga/DPO-En-Zh-20k)
- [UltraFeedback (en)](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)
- [COIG-P (zh)](https://huggingface.co/datasets/m-a-p/COIG-P)
- [RLHF-V (en)](https://huggingface.co/datasets/openbmb/RLHF-V-Dataset)
- [VLFeedback (en)](https://huggingface.co/datasets/Zhihui/VLFeedback)
- [RLAIF-V (en)](https://huggingface.co/datasets/openbmb/RLAIF-V-Dataset)
- [Orca DPO Pairs (en)](https://huggingface.co/datasets/Intel/orca_dpo_pairs)
- [HH-RLHF (en)](https://huggingface.co/datasets/Anthropic/hh-rlhf)
- [Nectar (en)](https://huggingface.co/datasets/berkeley-nest/Nectar)
Some datasets require confirmation before using them, so we recommend logging in with your Hugging Face account using these commands.

```bash
pip install "huggingface_hub<1.0.0"
huggingface-cli login
```
| Mandatory    | Minimum | Recommend |
| ------------ | ------- | --------- |
| python       | 3.9     | 3.10      |
| torch        | 2.0.0   | 2.6.0     |
| torchvision  | 0.15.0  | 0.21.0    |
| transformers | 4.49.0  | 4.50.0    |
| datasets     | 2.16.0  | 3.2.0     |
| accelerate   | 0.34.0  | 1.2.1     |
| peft         | 0.14.0  | 0.15.1    |
| trl          | 0.8.6   | 0.9.6     |
| Optional     | Minimum | Recommend |
| ------------ | ------- | --------- |
| CUDA         | 11.6    | 12.2      |
| deepspeed    | 0.10.0  | 0.16.4    |
| bitsandbytes | 0.39.0  | 0.43.1    |
| vllm         | 0.4.3   | 0.8.2     |
| flash-attn   | 2.5.6   | 2.7.2     |
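If you want to verify your environment against these minimums, the check can be scripted. The sketch below is our own illustration (the function names are not part of LLaMA-Factory), and its naive parser ignores pre-release suffixes, so treat it as an approximation:

```python
from importlib.metadata import PackageNotFoundError, version

# Minimum versions taken from the mandatory-requirements table above.
MINIMUMS = {
    "torch": "2.0.0",
    "transformers": "4.49.0",
    "datasets": "2.16.0",
    "accelerate": "0.34.0",
    "peft": "0.14.0",
    "trl": "0.8.6",
}


def parse(ver: str) -> tuple:
    """Turn a dotted version string into a comparable tuple of ints."""
    parts = []
    for piece in ver.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def check_environment(minimums: dict = MINIMUMS) -> dict:
    """Return {package: (installed_version, meets_minimum)} for each package."""
    report = {}
    for pkg, minimum in minimums.items():
        try:
            installed = version(pkg)
            report[pkg] = (installed, parse(installed) >= parse(minimum))
        except PackageNotFoundError:
            report[pkg] = (None, False)  # package is not installed at all
    return report
```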
### Hardware Requirement

\* *estimated*
| Method                              | Bits | 7B    | 14B   | 30B   | 70B    | `x`B    |
| ----------------------------------- | ---- | ----- | ----- | ----- | ------ | ------- |
| Full (`bf16` or `fp16`)             | 32   | 120GB | 240GB | 600GB | 1200GB | `18x`GB |
| Full (`pure_bf16`)                  | 16   | 60GB  | 120GB | 300GB | 600GB  | `8x`GB  |
| Freeze/LoRA/GaLore/APOLLO/BAdam/OFT | 16   | 16GB  | 32GB  | 64GB  | 160GB  | `2x`GB  |
| QLoRA / QOFT                        | 8    | 10GB  | 20GB  | 40GB  | 80GB   | `x`GB   |
| QLoRA / QOFT                        | 4    | 6GB   | 12GB  | 24GB  | 48GB   | `x/2`GB |
| QLoRA / QOFT                        | 2    | 4GB   | 8GB   | 16GB  | 24GB   | `x/4`GB |
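The `x`B column gives rules of thumb as functions of the parameter count. A sketch encoding those formulas is shown below; note the per-size columns are rounded estimates and do not track the formulas exactly (e.g. 70B at 4-bit is listed as 48GB, above `x/2`), so use this only for rough planning:

```python
def estimate_gpu_memory_gb(params_billion: float, method: str, bits: int) -> float:
    """Rough GPU memory estimate in GB, following the `x`B column of the table above.

    `params_billion` is the model size x in billions of parameters.
    """
    if method == "full" and bits == 32:  # Full (`bf16` or `fp16`, i.e. mixed precision)
        return 18 * params_billion
    if method == "full" and bits == 16:  # Full (`pure_bf16`)
        return 8 * params_billion
    if method in ("freeze", "lora", "galore", "apollo", "badam", "oft") and bits == 16:
        return 2 * params_billion
    if method in ("qlora", "qoft"):
        if bits == 8:
            return params_billion
        if bits == 4:
            return params_billion / 2
        if bits == 2:
            return params_billion / 4
    raise ValueError(f"no rule of thumb for {method!r} at {bits} bits")
```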
## Getting Started
> [!IMPORTANT]
> Installation is mandatory.
#### Install from Source
```bash
git clone --depth 1 https://github.com/hiyouga/LLaMA-Factory.git
cd LLaMA-Factory
pip install -e ".[torch,metrics]" --no-build-isolation
```
Extra dependencies available: torch, torch-npu, metrics, deepspeed, liger-kernel, bitsandbytes, hqq, eetq, gptq, aqlm, vllm, sglang, galore, apollo, badam, adam-mini, qwen, minicpm_v, openmind, swanlab, dev
#### Install from Docker Image
```bash
docker run -it --rm --gpus=all --ipc=host hiyouga/llamafactory:latest
```
This image is built on Ubuntu 22.04 (x86\_64), CUDA 12.4, Python 3.11, PyTorch 2.6.0, and Flash-attn 2.7.4.
Find the pre-built images: https://hub.docker.com/r/hiyouga/llamafactory/tags
Please refer to [build docker](#build-docker) to build the image yourself.
<details><summary>Setting up a virtual environment with <b>uv</b></summary>
Create an isolated Python environment with [uv](https://github.com/astral-sh/uv):
```bash
uv sync --extra torch --extra metrics --prerelease=allow
```
Run LLaMA-Factory in the isolated environment:
```bash
uv run --prerelease=allow llamafactory-cli train examples/train_lora/llama3_lora_pretrain.yaml
```
</details>
<details><summary>For Windows users</summary>
#### Install PyTorch
You need to manually install the GPU version of PyTorch on the Windows platform. Please refer to the [official website](https://pytorch.org/get-started/locally/) and the following command to install PyTorch with CUDA support:
```bash
pip uninstall torch torchvision torchaudio
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126
python -c "import torch; print(torch.cuda.is_available())"
```
If you see `True` then you have successfully installed PyTorch with CUDA support.
If you encounter a `Can't pickle local object` error, try setting `dataloader_num_workers: 0`.
#### Install BitsAndBytes
If you want to enable the quantized LoRA (QLoRA) on the Windows platform, you need to install a pre-built version of the `bitsandbytes` library, which supports CUDA 11.1 to 12.2. Please select the appropriate [release version](https://github.com/jllllll/bitsandbytes-windows-webui/releases/tag/wheels) based on your CUDA version.
```bash
pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/download/wheels/bitsandbytes-0.41.2.post2-py3-none-win_amd64.whl
```
#### Install Flash Attention-2
To enable FlashAttention-2 on the Windows platform, please use the script from [flash-attention-windows-wheel](https://huggingface.co/lldacing/flash-attention-windows-wheel) to compile and install it by yourself.
</details>
<details><summary>For Ascend NPU users</summary>
To install LLaMA Factory on Ascend NPU devices, please upgrade Python to version 3.10 or higher and specify extra dependencies: `pip install -e ".[torch-npu,metrics]"`. Additionally, you need to install the **[Ascend CANN Toolkit and Kernels](https://www.hiascend.com/developer/download/community/result?module=cann)**. Please follow the [installation tutorial](https://www.hiascend.com/document/detail/en/CANNCommunityEdition/600alphaX/softwareinstall/instg/atlasdeploy_03_0031.html) or use the following commands:
```bash
# replace the url according to your CANN version and devices
# install CANN Toolkit
wget https://ascend-repo.obs.cn-east-2.myhuaweicloud.com/Milan-ASL/Milan-ASL%20V100R001C20SPC702/Ascend-cann-toolkit_8.0.0.alpha002_linux-"$(uname -i)".run
bash Ascend-cann-toolkit_8.0.0.alpha002_linux-"$(uname -i)".run --install

# install CANN Kernels
wget https://ascend-repo.obs.cn-east-2.myhuaweicloud.com/Milan-ASL/Milan-ASL%20V100R001C20SPC702/Ascend-cann-kernels-910b_8.0.0.alpha002_linux-"$(uname -i)".run
bash Ascend-cann-kernels-910b_8.0.0.alpha002_linux-"$(uname -i)".run --install

# set env variables
source /usr/local/Ascend/ascend-toolkit/set_env.sh
```
| Requirement  | Minimum | Recommend      |
| ------------ | ------- | -------------- |
| CANN         | 8.0.RC1 | 8.0.0.alpha002 |
| torch        | 2.1.0   | 2.4.0          |
| torch-npu    | 2.1.0   | 2.4.0.post2    |
| deepspeed    | 0.13.2  | 0.13.2         |
| vllm-ascend  | -       | 0.7.3          |
Remember to use `ASCEND_RT_VISIBLE_DEVICES` instead of `CUDA_VISIBLE_DEVICES` to specify the device to use.
If you cannot infer the model on NPU devices, try setting `do_sample: false` in the configurations.
Download the pre-built Docker images: [32GB](http://mirrors.cn-central-221.ovaijisuan.com/detail/130.html) | [64GB](http://mirrors.cn-central-221.ovaijisuan.com/detail/131.html)
#### Install BitsAndBytes
To use QLoRA based on bitsandbytes on Ascend NPU, please follow these 3 steps:
1. Manually compile bitsandbytes: Refer to [the installation documentation](https://huggingface.co/docs/bitsandbytes/installation?backend=Ascend+NPU&platform=Ascend+NPU) for the NPU version of bitsandbytes to complete the compilation and installation. The compilation requires a cmake version of at least 3.22.1 and a g++ version of at least 12.x.
```bash
# Install bitsandbytes from source
# Clone bitsandbytes repo, Ascend NPU backend is currently enabled on multi-backend-refactor branch
git clone -b multi-backend-refactor https://github.com/bitsandbytes-foundation/bitsandbytes.git
cd bitsandbytes/
# Install dependencies
pip install -r requirements-dev.txt
# Install the dependencies for the compilation tools. Note that the commands for this step may vary depending on the operating system. The following are provided for reference
apt-get install -y build-essential cmake
# Compile & install
cmake -DCOMPUTE_BACKEND=npu -S .
make
pip install .
```
2. Install transformers from the main branch.
```bash
git clone -b main https://github.com/huggingface/transformers.git
cd transformers
pip install .
```
3. Set `double_quantization: false` in the configuration. You can refer to the [example](examples/train_qlora/llama3_lora_sft_bnb_npu.yaml).
</details>
### Data Preparation
Please refer to [data/README.md](data/README.md) for details about the format of dataset files. You can use datasets on the HuggingFace / ModelScope / Modelers hub, load a dataset from local disk, or specify a path to S3/GCS cloud storage.
> [!NOTE]
> Please update `data/dataset_info.json` to use your custom dataset.
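For orientation, a minimal `dataset_info.json` entry for a local alpaca-format file might look like the sketch below. The dataset name, file name, and column mapping are placeholders for illustration; see [data/README.md](data/README.md) for the authoritative schema:

```json
{
  "my_dataset": {
    "file_name": "my_dataset.json",
    "columns": {
      "prompt": "instruction",
      "query": "input",
      "response": "output"
    }
  }
}
```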
You can also use **[Easy Dataset](https://github.com/ConardLi/easy-dataset)**, **[DataFlow](https://github.com/OpenDCAI/DataFlow)** and **[GraphGen](https://github.com/open-sciencelab/GraphGen)** to create synthetic data for fine-tuning.
### Quickstart
Use the following 3 commands to run LoRA **fine-tuning**, **inference** and **merging** of the Llama3-8B-Instruct model, respectively.
See [examples/README.md](examples/README.md) for advanced usage (including distributed training).
> [!TIP]
> Use `llamafactory-cli help` to show help information.
>
> Read [FAQs](https://github.com/hiyouga/LLaMA-Factory/issues/4614) first if you encounter any problems.
### Fine-Tuning with LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))
```bash
llamafactory-cli webui
```
### LLaMA Factory Online
Read our [documentation](https://docs.llamafactory.com.cn/docs/documents/quickstart/getstarted/?utm_source=LLaMA-Factory).
### Build Docker

For CUDA users:
```bash
cd docker/docker-cuda/
docker compose up -d
docker compose exec llamafactory bash
```
For Ascend NPU users:
```bash
cd docker/docker-npu/
docker compose up -d
docker compose exec llamafactory bash
```
For AMD ROCm users:
```bash
cd docker/docker-rocm/
docker compose up -d
docker compose exec llamafactory bash
```
<details><summary>Build without Docker Compose</summary>
For CUDA users:
```bash
docker build -f ./docker/docker-cuda/Dockerfile \
    --build-arg PIP_INDEX=https://pypi.org/simple \
    --build-arg EXTRAS=metrics \
    -t llamafactory:latest .

docker run -dit --ipc=host --gpus=all \
    -p 7860:7860 \
    -p 8000:8000 \
    --name llamafactory \
    llamafactory:latest
docker exec -it llamafactory bash
```
For Ascend NPU users:
```bash
docker build -f ./docker/docker-npu/Dockerfile \
    --build-arg PIP_INDEX=https://pypi.org/simple \
    --build-arg EXTRAS=torch-npu,metrics \
    -t llamafactory:latest .

docker run -dit --ipc=host \
    -v /usr/local/dcmi:/usr/local/dcmi \
    -v /usr/local/bin/npu-smi:/usr/local/bin/npu-smi \
    -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
    --device /dev/davinci_manager \
    --device /dev/devmm_svm \
    --device /dev/hisi_hdc \
    --name llamafactory \
llamafactory:latest
docker exec -it llamafactory bash
```
For AMD ROCm users:
```bash
docker build -f ./docker/docker-rocm/Dockerfile \
--build-arg PIP_INDEX=https://pypi.org/simple \
--build-arg EXTRAS=metrics \
-t llamafactory:latest .
docker run -dit --ipc=host \
-p 7860:7860 \
-p 8000:8000 \
--device /dev/kfd \
--device /dev/dri \
    --name llamafactory \
    llamafactory:latest
docker exec -it llamafactory bash
```
</details>
<details><summary>Use Docker volumes</summary>

You can uncomment `VOLUME [ "/root/.cache/huggingface", "/app/shared_data", "/app/output" ]` in the Dockerfile to use data volumes.

When starting the container, use the `-v ./hf_cache:/root/.cache/huggingface` argument to mount a local directory into the container. The following data volumes are available.
- `hf_cache`: Utilize Hugging Face cache on the host machine.
- `shared_data`: The directory to store datasets on the host machine.
- `output`: Set export dir to this location so that the merged result can be accessed directly on the host machine.
</details>
### Deploy with OpenAI-style API and vLLM
```bash
API_PORT=8000 llamafactory-cli api examples/inference/llama3.yaml infer_backend=vllm vllm_enforce_eager=true
```
> [!TIP]
> Visit [this page](https://platform.openai.com/docs/api-reference/chat/create) for the API documentation.
>
> Examples: [Image understanding](scripts/api_example/test_image.py) | [Function calling](scripts/api_example/test_toolcall.py)
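Because the server speaks the OpenAI chat-completions protocol, any OpenAI-compatible client can call it. Below is a minimal standard-library sketch; the endpoint path follows the OpenAI API reference, while the host, port, and model name are placeholders you should adapt to your deployment:

```python
import json
import urllib.request


def build_chat_request(prompt: str, model: str = "test") -> dict:
    """Build an OpenAI-compatible /v1/chat/completions request payload."""
    return {
        "model": model,  # placeholder; adapt to your deployment
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


if __name__ == "__main__":
    # Assumes the API server above is running on localhost:8000.
    payload = json.dumps(build_chat_request("Hello!")).encode()
    req = urllib.request.Request(
        "http://localhost:8000/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```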
### Download from ModelScope Hub
```bash
export USE_MODELSCOPE_HUB=1 # `set USE_MODELSCOPE_HUB=1` for Windows
```
Train the model by specifying a model ID of the ModelScope Hub as the `model_name_or_path`. You can find a full list of model IDs at [ModelScope Hub](https://modelscope.cn/models), e.g., `LLM-Research/Meta-Llama-3-8B-Instruct`.
### Download from Modelers Hub
You can also use Modelers Hub to download models and datasets.
```bash
export USE_OPENMIND_HUB=1 # `set USE_OPENMIND_HUB=1` for Windows
```
Train the model by specifying a model ID of the Modelers Hub as the `model_name_or_path`. You can find a full list of model IDs at [Modelers Hub](https://modelers.cn/models), e.g., `TeleAI/TeleChat-7B-pt`.
### Use W&B Logger
To use [Weights & Biases](https://wandb.ai) for logging experimental results, you need to add the following arguments to yaml files.
```yaml
report_to: wandb
run_name: test_run # optional
```
Set `WANDB_API_KEY` to [your key](https://wandb.ai/authorize) when launching training tasks to log in with your W&B account.
### Use SwanLab Logger
To use [SwanLab](https://github.com/SwanHubX/SwanLab) for logging experimental results, you need to add the following arguments to yaml files.
```yaml
use_swanlab: true
swanlab_run_name: test_run # optional
```
When launching training tasks, you can log in to SwanLab in three ways:
1. Add `swanlab_api_key=<your_api_key>` to the yaml file, and set it to your [API key](https://swanlab.cn/settings).
2. Set the environment variable `SWANLAB_API_KEY` to your [API key](https://swanlab.cn/settings).
3. Use the `swanlab login` command to complete the login.
## Projects using LLaMA Factory
If you have a project that should be incorporated, please contact via email or create a pull request.
1. Feng et al. SS-Bench: A Benchmark for Social Story Generation and Evaluation. 2024. [[arxiv]](https://arxiv.org/abs/2406.15695)
1. Feng et al. Self-Constructed Context Decompilation with Fined-grained Alignment Enhancement. 2024. [[arxiv]](https://arxiv.org/abs/2406.17233)
1. Liu et al. Large Language Models for Cuffless Blood Pressure Measurement From Wearable Biosignals. 2024. [[arxiv]](https://arxiv.org/abs/2406.18069)
1. Iyer et al. Exploring Very Low-Resource Translation with LLMs: The University of Edinburgh's Submission to AmericasNLP 2024 Translation Task. AmericasNLP 2024. [[paper]](https://aclanthology.org/2024.americasnlp-1.25)
1. Li et al. Calibrating LLMs with Preference Optimization on Thought Trees for Generating Rationale in Science Question Scoring. 2024. [[arxiv]](https://arxiv.org/abs/2406.19949)
1. Yang et al. Financial Knowledge Large Language Model. 2024. [[arxiv]](https://arxiv.org/abs/2407.00365)
1. Lin et al. DogeRM: Equipping Reward Models with Domain Knowledge through Model Merging. 2024. [[arxiv]](https://arxiv.org/abs/2407.01470)
1. Bako et al. Evaluating the Semantic Profiling Abilities of LLMs for Natural Language Utterances in Data Visualization. 2024. [[arxiv]](https://arxiv.org/abs/2407.06129)
1. Huang et al. RoLoRA: Fine-tuning Rotated Outlier-free LLMs for Effective Weight-Activation Quantization. 2024. [[arxiv]](https://arxiv.org/abs/2407.08044)
1. Jiang et al. LLM-Collaboration on Automatic Science Journalism for the General Audience. 2024. [[arxiv]](https://arxiv.org/abs/2407.09756)
1. Inouye et al. Applied Auto-tuning on LoRA Hyperparameters. 2024. [[paper]](https://scholarcommons.scu.edu/cseng_senior/272/)
1. Qi et al. Research on Tibetan Tourism Viewpoints information generation system based on LLM. 2024. [[arxiv]](https://arxiv.org/abs/2407.13561)
1. Xu et al. Course-Correction: Safety Alignment Using Synthetic Preferences. 2024. [[arxiv]](https://arxiv.org/abs/2407.16637)
1. Sun et al. LAMBDA: A Large Model Based Data Agent. 2024. [[arxiv]](https://arxiv.org/abs/2407.17535)
1. Zhu et al. CollectiveSFT: Scaling Large Language Models for Chinese Medical Benchmark with Collective Instructions in Healthcare. 2024. [[arxiv]](https://arxiv.org/abs/2407.19705)
1. Yu et al. Correcting Negative Bias in Large Language Models through Negative Attention Score Alignment. 2024. [[arxiv]](https://arxiv.org/abs/2408.00137)
1. Xie et al. The Power of Personalized Datasets: Advancing Chinese Composition Writing for Elementary School through Targeted Model Fine-Tuning. IALP 2024. [[paper]](https://www.asianlp.sg/conferences/ialp2024/proceedings/papers/IALP2024_P055.pdf)
1. Liu et al. Instruct-Code-Llama: Improving Capabilities of Language Model in Competition Level Code Generation by Online Judge Feedback. ICIC 2024. [[paper]](https://link.springer.com/chapter/10.1007/978-981-97-5669-8_11)
1. Wang et al. Cybernetic Sentinels: Unveiling the Impact of Safety Data Selection on Model Security in Supervised Fine-Tuning. ICIC 2024. [[paper]](https://link.springer.com/chapter/10.1007/978-981-97-5669-8_23)
1. Xia et al. Understanding the Performance and Estimating the Cost of LLM Fine-Tuning. 2024. [[arxiv]](https://arxiv.org/abs/2408.04693)
1. Zeng et al. Perceive, Reflect, and Plan: Designing LLM Agent for Goal-Directed City Navigation without Instructions. 2024. [[arxiv]](https://arxiv.org/abs/2408.04168)
1. Xia et al. Using Pre-trained Language Model for Accurate ESG Prediction. FinNLP 2024. [[paper]](https://aclanthology.org/2024.finnlp-2.1/)
1. Liang et al. I-SHEEP: Self-Alignment of LLM from Scratch through an Iterative Self-Enhancement Paradigm. 2024. [[arxiv]](https://arxiv.org/abs/2408.08072)
1. Bai et al. Aligning Large Language Model with Direct Multi-Preference Optimization for Recommendation. CIKM 2024. [[paper]](https://dl.acm.org/doi/10.1145/3627673.3679611)
1. Zhang et al. CPsyCoun: A Report-based Multi-turn Dialogue Reconstruction and Evaluation Framework for Chinese Psychological Counseling. ACL 2024. [[paper]](https://aclanthology.org/2024.findings-acl.830.pdf)
1. **[StarWhisper](https://github.com/Yu-Yang-Li/StarWhisper)**: A large language model for astronomy, based on ChatGLM2-6B and Qwen-14B.
1. **[DISC-LawLLM](https://github.com/FudanDISC/DISC-LawLLM)**: A large language model specialized in the Chinese legal domain, based on Baichuan-13B, capable of retrieving and reasoning on legal knowledge.
1. **[Sunsimiao](https://github.com/X-D-Lab/Sunsimiao)**: A large language model specialized in the Chinese medical domain, based on Baichuan-7B and ChatGLM-6B.
1. **[CareGPT](https://github.com/WangRongsheng/CareGPT)**: A series of large language models for the Chinese medical domain, based on LLaMA2-7B and Baichuan-13B.
1. **[MachineMindset](https://github.com/PKU-YuanGroup/Machine-Mindset/)**: A series of MBTI personality large language models, capable of giving any LLM one of 16 personality types based on different datasets and training methods.
1. **[Luminia-13B-v3](https://huggingface.co/Nekochu/Luminia-13B-v3)**: A large language model specialized in generating metadata for Stable Diffusion. [[demo]](https://huggingface.co/spaces/Nekochu/Luminia-13B_SD_Prompt)
1. **[Chinese-LLaVA-Med](https://github.com/BUAADreamer/Chinese-LLaVA-Med)**: A multimodal large language model specialized in the Chinese medical domain, based on LLaVA-1.5-7B.
1. **[AutoRE](https://github.com/THUDM/AutoRE)**: A document-level relation extraction system based on large language models.
1. **[NVIDIA RTX AI Toolkit](https://github.com/NVIDIA/RTX-AI-Toolkit)**: SDKs for fine-tuning LLMs on Windows PCs with NVIDIA RTX GPUs.
1. **[LazyLLM](https://github.com/LazyAGI/LazyLLM)**: An easy and lazy way to build multi-agent LLM applications; supports model fine-tuning via LLaMA Factory.
1. **[RAG-Retrieval](https://github.com/NLPJCL/RAG-Retrieval)**: A full pipeline for RAG retrieval model fine-tuning, inference, and distillation. [[blog]](https://zhuanlan.zhihu.com/p/987727357)
1. **[360-LLaMA-Factory](https://github.com/Qihoo360/360-LLaMA-Factory)**: A modified library that supports long sequence SFT & DPO using ring attention.
1. **[Sky-T1](https://novasky-ai.github.io/posts/sky-t1/)**: An o1-like model fine-tuned by NovaSky AI at very low cost.
1. **[WeClone](https://github.com/xming521/WeClone)**: One-stop solution for creating your digital avatar from chat logs.
1. **[EmoLLM](https://github.com/SmartFlowAI/EmoLLM)**: A project about large language models (LLMs) and mental health.
</details>

## License

This repository is licensed under the [Apache-2.0 License](LICENSE).

Please follow the model licenses to use the corresponding model weights: [Baichuan 2](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/blob/main/Community%20License%20for%20Baichuan%202%20Model.pdf) / [BLOOM](https://huggingface.co/spaces/bigscience/license) / [ChatGLM3](https://github.com/THUDM/ChatGLM3/blob/main/MODEL_LICENSE) / [Command R](https://cohere.com/c4ai-cc-by-nc-license) / [DeepSeek](https://github.com/deepseek-ai/DeepSeek-LLM/blob/main/LICENSE-MODEL) / [Falcon](https://huggingface.co/tiiuae/falcon-180B/blob/main/LICENSE.txt) / [Gemma](https://ai.google.dev/gemma/terms) / [GLM-4](https://huggingface.co/THUDM/glm-4-9b/blob/main/LICENSE) / [GPT-2](https://github.com/openai/gpt-2/blob/master/LICENSE) / [Granite](LICENSE) / [Index](https://huggingface.co/IndexTeam/Index-1.9B/blob/main/LICENSE) / [InternLM](https://github.com/InternLM/InternLM#license) / [Llama](https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md) / [Llama 2](https://ai.meta.com/llama/license/) / [Llama 3](https://llama.meta.com/llama3/license/) / [Llama 4](https://github.com/meta-llama/llama-models/blob/main/models/llama4/LICENSE) / [MiniCPM](https://github.com/OpenBMB/MiniCPM/blob/main/MiniCPM%20Model%20License.md) / [Mistral/Mixtral/Pixtral](LICENSE) / [OLMo](LICENSE) / [Phi-1.5/Phi-2](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx) / [Phi-3/Phi-4](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/LICENSE) / [Qwen](https://github.com/QwenLM/Qwen/blob/main/Tongyi%20Qianwen%20LICENSE%20AGREEMENT) / [Skywork](https://huggingface.co/Skywork/Skywork-13B-base/blob/main/Skywork%20Community%20License.pdf) / [StarCoder 2](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement) / [TeleChat2](https://huggingface.co/Tele-AI/telechat-7B/blob/main/TeleChat%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf) / [XVERSE](https://github.com/xverse-ai/XVERSE-13B/blob/main/MODEL_LICENSE.pdf) / [Yi](https://huggingface.co/01-ai/Yi-6B/blob/main/LICENSE) / [Yi-1.5](LICENSE) / [Yuan 2](https://github.com/IEIT-Yuan/Yuan-2.0/blob/main/LICENSE-Yuan)
## Citation

![# LLaMA Factory](assets/logo.png)

[![GitHub Repo stars](https://img.shields.io/github/stars/hiyouga/LLaMA-Factory?style=social)](https://github.com/hiyouga/LLaMA-Factory/stargazers)
[![GitHub last commit](https://img.shields.io/github/last-commit/hiyouga/LLaMA-Factory)](https://github.com/hiyouga/LLaMA-Factory/commits/main)
[![GitHub contributors](https://img.shields.io/github/contributors/hiyouga/LLaMA-Factory?color=orange)](https://github.com/hiyouga/LLaMA-Factory/graphs/contributors)
[![GitHub workflow](https://github.com/hiyouga/LLaMA-Factory/actions/workflows/tests.yml/badge.svg)](https://github.com/hiyouga/LLaMA-Factory/actions/workflows/tests.yml)
[![PyPI](https://img.shields.io/pypi/v/llamafactory)](https://pypi.org/project/llamafactory/)
[![Citation](https://img.shields.io/badge/citation-1000+-green)](https://scholar.google.com/scholar?cites=12620864006390196564)
[![Docker Pulls](https://img.shields.io/docker/pulls/hiyouga/llamafactory)](https://hub.docker.com/r/hiyouga/llamafactory/tags)

[![Twitter](https://img.shields.io/twitter/follow/llamafactory_ai)](https://twitter.com/llamafactory_ai)
[![Discord](assets/thirdparty/discord.svg)](https://discord.gg/rKfvV9r9FK)
[![WeChat](https://img.shields.io/badge/WeChat-User%20Group-blue?logo=wechat)](https://github.com/hiyouga/llamafactory-community)
[![Blog](https://img.shields.io/badge/Hugo-Official%20Blog-blue?logo=hugo)](https://blog.llamafactory.net/)

[![Open in Colab](assets/thirdparty/colab.svg)](https://colab.research.google.com/drive/1d5KQtbemerlSDSxZIfAaWXhKr30QypiK?usp=sharing)
[![Open in DSW](assets/thirdparty/dsw.svg)](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory)
[![Open in Lab4ai](assets/thirdparty/lab4ai.svg)](https://www.lab4ai.cn/course/detail?id=7c13e60f6137474eb40f6fd3983c0f46&utm_source=LLaMA-Factory)
[![Open in Online](assets/thirdparty/online.svg)](https://www.llamafactory.com.cn/?utm_source=LLaMA-Factory)
[![Open in Spaces](https://img.shields.io/badge/🤗-Open%20in%20Spaces-blue)](https://huggingface.co/spaces/hiyouga/LLaMA-Board)
[![Open in Studios](https://img.shields.io/badge/ModelScope-Open%20in%20Studios-blue)](https://modelscope.cn/studios/hiyouga/LLaMA-Board)
[![Open in Novita](https://img.shields.io/badge/Novita-Deploy%20Template-blue)](https://novita.ai/templates-library/105981?sharer=88115474-394e-4bda-968e-b88e123d0c47)
### 获得[亚马逊](https://aws.amazon.com/cn/blogs/china/a-one-stop-code-free-model-fine-tuning-deployment-platform-based-on-sagemaker-and-llama-factory/)、[英伟达](https://developer.nvidia.cn/rtx/ai-toolkit)、[阿里云](https://help.aliyun.com/zh/pai/use-cases/fine-tune-a-llama-3-model-with-llama-factory)等的应用

<div align="center" markdown="1">

### 赞助商 ❤️

| <div style="text-align: center;"><a href="https://warp.dev/llama-factory"><img alt="Warp sponsorship" width="400" src="assets/sponsors/warp.jpg"></a><br><a href="https://warp.dev/llama-factory" style="font-size:larger;">Warp:面向开发者的智能终端</a><br><a href="https://warp.dev/llama-factory">适用于 MacOS、Linux 和 Windows</a></div> | <a href="https://serpapi.com"><img alt="SerpAPI sponsorship" width="250" src="assets/sponsors/serpapi.svg"></a> |
| ---- | ---- |

----

### 使用零代码[命令行](#快速开始)与 [Web UI](#llama-board-可视化微调由-gradio-驱动) 轻松微调百余种大模型

![GitHub Trend](https://trendshift.io/api/badge/repositories/4535)

</div>

👋 加入我们的[微信群](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/main.jpg)、[NPU 用户群](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/npu.jpg)、[大模型实验室群](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/lab4ai.jpg) 或 [LLaMA Factory Online 用户群](https://github.com/hiyouga/llamafactory-community/blob/main/wechat/online.png)。
\[ [English](README.md) | 中文 \]

**微调大模型可以像这样轻松…**

https://github.com/user-attachments/assets/43b700c6-a178-41db-b1f8-8190a5d3fcfc
开始本地训练:

- 请见[如何使用](#如何使用)

开始云端训练:

- **Colab(免费)**:https://colab.research.google.com/drive/1d5KQtbemerlSDSxZIfAaWXhKr30QypiK?usp=sharing
- **PAI-DSW(免费试用)**:https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory
- **LLaMA Factory Online(在线微调)**:https://www.llamafactory.com.cn/?utm_source=LLaMA-Factory
- **九章智算云(算力优惠活动)**:https://docs.alayanew.com/docs/documents/useGuide/LLaMAFactory/mutiple/?utm_source=LLaMA-Factory

阅读技术文档:

- **入门教程**:https://zhuanlan.zhihu.com/p/695287607
- **微调视频教程**:https://www.bilibili.com/video/BV1djgRzxEts/
- **框架文档**:https://llamafactory.readthedocs.io/zh-cn/latest/
- **框架文档(昇腾 NPU)**:https://ascend.github.io/docs/sources/llamafactory/
- **官方博客**:https://blog.llamafactory.net/
- **官方课程**:https://www.lab4ai.cn/course/detail?id=7c13e60f6137474eb40f6fd3983c0f46&utm_source=LLaMA-Factory
> [!NOTE]
> 除上述链接以外的其他网站均为未经许可的第三方网站,请小心甄别。
## 目录

- [项目特色](#项目特色)
- [官方博客](#官方博客)
- [更新日志](#更新日志)
- [模型](#模型)
- [训练方法](#训练方法)
- [数据集](#数据集)
- [软硬件依赖](#软硬件依赖)
- [如何使用](#如何使用)
- [安装 LLaMA Factory](#安装-llama-factory)
- [数据准备](#数据准备)
- [快速开始](#快速开始)
- [LLaMA Board 可视化微调](#llama-board-可视化微调由-gradio-驱动)
- [LLaMA Factory Online 在线微调](#llama-factory-online-在线微调)
- [构建 Docker](#构建-docker)
- [利用 vLLM 部署 OpenAI API](#利用-vllm-部署-openai-api)
- [从魔搭社区下载](#从魔搭社区下载)
- [从魔乐社区下载](#从魔乐社区下载)
- [使用 W&B 面板](#使用-wb-面板)
- [使用 SwanLab 面板](#使用-swanlab-面板)
- [使用了 LLaMA Factory 的项目](#使用了-llama-factory-的项目)
- [协议](#协议)
- [引用](#引用)
## 项目特色

- **多种模型**:LLaMA、LLaVA、Mistral、Mixtral-MoE、Qwen、Qwen2-VL、DeepSeek、Yi、Gemma、ChatGLM、Phi 等等。
- **集成方法**:增量预训练、多模态指令监督微调、奖励模型训练、PPO 训练、DPO 训练、KTO 训练、ORPO 训练等等。
- **多种精度**:16 比特全参数微调、冻结微调、LoRA 微调和基于 AQLM/AWQ/GPTQ/LLM.int8/HQQ/EETQ 的 2/3/4/5/6/8 比特 QLoRA 微调。
- **先进算法**:[GaLore](https://github.com/jiaweizzhao/GaLore)、[BAdam](https://github.com/Ledzy/BAdam)、[APOLLO](https://github.com/zhuhanqing/APOLLO)、[Adam-mini](https://github.com/zyushun/Adam-mini)、[Muon](https://github.com/KellerJordan/Muon)、[OFT](https://github.com/huggingface/peft/tree/main/src/peft/tuners/oft)、DoRA、LongLoRA、LLaMA Pro、Mixture-of-Depths、LoRA+、LoftQ 和 PiSSA。
- **实用技巧**:[FlashAttention-2](https://github.com/Dao-AILab/flash-attention)、[Unsloth](https://github.com/unslothai/unsloth)、[Liger Kernel](https://github.com/linkedin/Liger-Kernel)、RoPE scaling、NEFTune 和 rsLoRA。
- **广泛任务**:多轮对话、工具调用、图像理解、视觉定位、视频识别和语音理解等等。
- **实验监控**:LlamaBoard、TensorBoard、Wandb、MLflow、[SwanLab](https://github.com/SwanHubX/SwanLab) 等等。
- **极速推理**:基于 [vLLM](https://github.com/vllm-project/vllm) 或 [SGLang](https://github.com/sgl-project/sglang) 的 OpenAI 风格 API、浏览器界面和命令行接口。
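
以推理为例,下面给出一个启用 vLLM 后端的最小配置示意(文件名与模型路径仅为举例,并非仓库中的真实文件,具体字段请以框架文档和 examples 目录为准):

```yaml
# 示意文件:my_inference.yaml(假设的文件名)
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct  # 示例模型
template: llama3
infer_backend: vllm   # 可选 huggingface / vllm / sglang
```

随后可用 `llamafactory-cli chat my_inference.yaml` 启动命令行对话,或用 `llamafactory-cli api my_inference.yaml` 启动 OpenAI 风格 API 服务(以实际安装版本的 CLI 帮助为准)。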
### 最新模型的 Day-N 微调适配

| 适配时间 | 模型名称 |
| ------------ | -------------------------------------------------------------------- |
| Day 0 | Qwen3 / Qwen2.5-VL / Gemma 3 / GLM-4.1V / InternLM 3 / MiniCPM-o-2.6 |
| Day 1 | Llama 3 / GLM-4 / Mistral Small / PaliGemma2 / Llama 4 |
## 官方博客
> [!TIP]
> 我们现在拥有了 LLaMA Factory 的专属博客!
>
> 网站地址:https://blog.llamafactory.net/
- 💡 [Easy Dataset × LLaMA Factory: 让大模型高效学习领域知识](https://buaa-act.feishu.cn/wiki/KY9xwTGs1iqHrRkjXBwcZP9WnL9)(中文)
- [使用 LLaMA-Factory 微调心理健康大模型](https://www.lab4ai.cn/project/detail?id=25cce32ec131497b9e06a93336a0817f&type=project&utm_source=LLaMA-Factory)(中文)
- [使用 LLaMA-Factory 构建 GPT-OSS 角色扮演模型](https://docs.llamafactory.com.cn/docs/documents/best-practice/gptroleplay/?utm_source=LLaMA-Factory)(中文)
- [基于 LLaMA-Factory 和 EasyR1 打造一站式无代码大模型强化学习和部署平台 LLM Model Hub](https://aws.amazon.com/cn/blogs/china/building-llm-model-hub-based-on-llamafactory-and-easyr1/)(中文)
- [通过亚马逊 SageMaker HyperPod 上的 LLaMA-Factory 增强多模态模型银行文档的视觉信息提取](https://aws.amazon.com/cn/blogs/machine-learning/how-apoidea-group-enhances-visual-information-extraction-from-banking-documents-with-multimodal-models-using-llama-factory-on-amazon-sagemaker-hyperpod/)(英文)
<details><summary>全部博客</summary>
- [使用 LLaMA-Factory 微调 Llama3.1-70B 医学诊断模型](https://docs.alayanew.com/docs/documents/bestPractice/bigModel/llama70B/?utm_source=LLaMA-Factory)(中文)
- [使用 LLaMA-Factory 微调 Qwen2.5-VL 实现自动驾驶场景微调](https://docs.alayanew.com/docs/documents/useGuide/LLaMAFactory/mutiple/?utm_source=LLaMA-Factory)(中文)
- [LLaMA Factory微调 DeepSeek-R1-Distill-Qwen-7B 模型实现新闻标题分类器](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory_deepseek_r1_distill_7b)(中文)
- [基于 Amazon SageMaker 和 LLaMA-Factory 打造一站式无代码模型微调部署平台 Model Hub](https://aws.amazon.com/cn/blogs/china/a-one-stop-code-free-model-fine-tuning-deployment-platform-based-on-sagemaker-and-llama-factory/)(中文)
- [LLaMA Factory 多模态微调实践:微调 Qwen2-VL 构建文旅大模型](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory_qwen2vl)(中文)
- [LLaMA Factory微调 Llama3 模型实现角色扮演](https://gallery.pai-ml.com/#/preview/deepLearning/nlp/llama_factory)(中文)
</details>
## 更新日志
[25/10/26] 我们支持了 **Megatron-Core** 作为训练后端,并适配了 [**mcore_adapter**](https://github.com/alibaba/ROLL/tree/main/mcore_adapter)。查看 [PR #9237](https://github.com/hiyouga/LLaMA-Factory/pull/9237) 以使用。
[25/08/22] 我们支持了 **[OFT](https://arxiv.org/abs/2306.07280)** 和 **[OFTv2](https://arxiv.org/abs/2506.19847)** 模型的微调。查看 [examples](examples/README.md) 以使用。
[25/08/20] 我们支持了 **[Intern-S1-mini](https://huggingface.co/internlm/Intern-S1-mini)** 模型的微调。查看 [PR #8976](https://github.com/hiyouga/LLaMA-Factory/pull/8976) 以使用。
[25/08/06] 我们支持了 **[GPT-OSS](https://github.com/openai/gpt-oss)** 模型的微调。查看 [PR #8826](https://github.com/hiyouga/LLaMA-Factory/pull/8826) 以使用。
<details><summary>展开日志</summary>
[25/07/02] 我们支持了 **[GLM-4.1V-9B-Thinking](https://github.com/THUDM/GLM-4.1V-Thinking)** 模型的微调。
[25/04/28] 我们支持了 **[Qwen3](https://qwenlm.github.io/blog/qwen3/)** 系列模型的微调。
[25/04/21] 我们支持了 **[Muon](https://github.com/KellerJordan/Muon)** 优化器。详细用法请参照 [examples](examples/README_zh.md)。感谢 [@tianshijing](https://github.com/tianshijing) 的 PR。
[25/04/16] 我们支持了 **[InternVL3](https://huggingface.co/OpenGVLab/InternVL3-8B)** 模型的微调。查看 [PR #7258](https://github.com/hiyouga/LLaMA-Factory/pull/7258) 以使用。
[25/04/14] 我们支持了 **[GLM-Z1](https://huggingface.co/THUDM/GLM-Z1-9B-0414)** 和 **[Kimi-VL](https://huggingface.co/moonshotai/Kimi-VL-A3B-Instruct)** 模型的微调。
[25/04/06] 我们支持了 **[Llama 4](https://ai.meta.com/blog/llama-4-multimodal-intelligence/)** 模型的微调。查看 [PR #7611](https://github.com/hiyouga/LLaMA-Factory/pull/7611) 以使用。
[25/03/31] 我们支持了 **[Qwen2.5 Omni](https://qwenlm.github.io/blog/qwen2.5-omni/)** 模型的微调。查看 [PR #7537](https://github.com/hiyouga/LLaMA-Factory/pull/7537) 以使用。
[25/03/15] 我们支持了 **[SGLang](https://github.com/sgl-project/sglang)** 推理后端,请使用 `infer_backend: sglang` 启用。
[25/03/12] 我们支持了 **[Gemma 3](https://huggingface.co/blog/gemma3)** 模型的微调。
[25/02/24] 我们宣布开源 **[EasyR1](https://github.com/hiyouga/EasyR1)**,一个高效可扩展的多模态强化学习框架,支持高效的 GRPO 训练。
[25/02/11] 我们支持了在导出模型时保存 **[Ollama](https://github.com/ollama/ollama)** 配置文件。详细用法请参照 [examples](examples/README_zh.md)。
[25/02/05] 我们支持了在语音理解任务上微调 **[Qwen2-Audio](https://huggingface.co/Qwen/Qwen2-Audio-7B-Instruct)** 和 **[MiniCPM-o-2.6](https://huggingface.co/openbmb/MiniCPM-o-2_6)** 模型。
[25/01/31] 我们支持了 **[DeepSeek-R1](https://huggingface.co/deepseek-ai/DeepSeek-R1)** 和 **[Qwen2.5-VL](https://huggingface.co/Qwen/Qwen2.5-VL-7B-Instruct)** 模型的微调。
[25/01/15] 我们支持了 **[APOLLO](https://arxiv.org/abs/2412.05270)** 优化器。详细用法请参照 [examples](examples/README_zh.md)。
[25/01/14] 我们支持了 **[MiniCPM-o-2.6](https://huggingface.co/openbmb/MiniCPM-o-2_6)** 和 **[MiniCPM-V-2.6](https://huggingface.co/openbmb/MiniCPM-V-2_6)** 模型的微调。感谢 [@BUAADreamer](https://github.com/BUAADreamer) 的 PR。
[25/01/14] 我们支持了 **[InternLM 3](https://huggingface.co/collections/internlm/)** 模型的微调。感谢 [@hhaAndroid](https://github.com/hhaAndroid) 的 PR。
[25/01/10] 我们支持了 **[Phi-4](https://huggingface.co/microsoft/phi-4)** 模型的微调。
[24/12/21] 我们支持了使用 **[SwanLab](https://github.com/SwanHubX/SwanLab)** 跟踪与可视化实验。详细用法请参考 [此部分](#使用-swanlab-面板)。
[24/11/27] 我们支持了 **[Skywork-o1](https://huggingface.co/Skywork/Skywork-o1-Open-Llama-3.1-8B)** 模型的微调和 **[OpenO1](https://huggingface.co/datasets/O1-OPEN/OpenO1-SFT)** 数据集。
[24/10/09] 我们支持了从 **[魔乐社区](https://modelers.cn/models)** 下载预训练模型和数据集。详细用法请参照 [此教程](#从魔乐社区下载)。
[24/09/19] 我们支持了 **[Qwen2.5](https://qwenlm.github.io/blog/qwen2.5/)** 模型的微调。
[24/08/30] 我们支持了 **[Qwen2-VL](https://qwenlm.github.io/blog/qwen2-vl/)** 模型的微调。感谢 [@simonJJJ](https://github.com/simonJJJ) 的 PR。
[24/08/27] 我们支持了 **[Liger Kernel](https://github.com/linkedin/Liger-Kernel)**。请使用 `enable_liger_kernel: true` 来加速训练。
[24/08/09] 我们支持了 **[Adam-mini](https://github.com/zyushun/Adam-mini)** 优化器。详细用法请参照 [examples](examples/README_zh.md)。感谢 [@relic-yuexi](https://github.com/relic-yuexi) 的 PR。
[24/07/04] 我们支持了[无污染打包训练](https://github.com/MeetKai/functionary/tree/main/functionary/train/packing)。请使用 `neat_packing: true` 参数。感谢 [@chuan298](https://github.com/chuan298) 的 PR。
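
上述日志中的多数训练技巧都通过 YAML 配置项开启。下面是一个示意片段,只列出本日志中出现过的开关(字段名以仓库 examples 目录和框架文档为准):

```yaml
# 示意:在 SFT 训练配置中叠加上文提到的加速与打包选项
enable_liger_kernel: true   # [24/08/27] 使用 Liger Kernel 加速训练
neat_packing: true          # [24/07/04] 无污染打包训练
infer_backend: sglang       # [25/03/15] 推理时改用 SGLang 后端
```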
[24/06/16] 我们支持了 **[PiSSA](https://arxiv.org/abs/2404.02948)** 算法。详细用法请参照 [examples](examples/README_zh.md)。

[24/06/07] 我们支持了 **[Qwen2](https://qwenlm.github.io/blog/qwen2/)** 和 **[GLM-4](https://github.com/THUDM/GLM-4)** 模型的微调。

[24/05/26] 我们支持了 **[SimPO](https://arxiv.org/abs/2405.14734)** 偏好对齐算法。详细用法请参照 [examples](examples/README_zh.md)。

[24/05/20] 我们支持了 **PaliGemma** 系列模型的微调。注意 PaliGemma 是预训练模型,你需要使用 `paligemma` 模板进行微调使其获得对话能力。

[24/05/18] 我们支持了 **[KTO](https://arxiv.org/abs/2402.01306)** 偏好对齐算法。详细用法请参照 [examples](examples/README_zh.md)。
[24/04/21] 我们基于 [AstraMindAI 的仓库](https://github.com/astramind-ai/Mixture-of-depths)支持了 **[混合深度训练](https://arxiv.org/abs/2404.02258)**。详细用法请参照 [examples](examples/README_zh.md)。

[24/04/16] 我们支持了 **[BAdam](https://arxiv.org/abs/2404.02827)** 优化器。详细用法请参照 [examples](examples/README_zh.md)。

[24/04/16] 我们支持了 **[unsloth](https://github.com/unslothai/unsloth)** 的长序列训练(24GB 可训练 Llama-2-7B-56k)。该方法相比 FlashAttention-2 提供了 **117%** 的训练速度和 **50%** 的显存节约。更多数据请见[此页面](https://github.com/hiyouga/LLaMA-Factory/wiki/Performance-comparison)。
[24/03/13] 我们支持了 **[LoRA+](https://arxiv.org/abs/2402.12354)**。详细用法请参照 [examples](examples/README_zh.md)。

[24/03/07] 我们支持了 **[GaLore](https://arxiv.org/abs/2403.03507)** 优化器。详细用法请参照 [examples](examples/README_zh.md)。

[24/03/07] 我们集成了 **[vLLM](https://github.com/vllm-project/vllm)** 以实现极速并发推理。请使用 `infer_backend: vllm` 来获得 **270%** 的推理速度。
</details>
> [!TIP]
> 如果您无法使用最新的功能,请尝试重新拉取代码并再次安装 LLaMA-Factory。
## 模型

| 模型名 | 参数量 | Template |
| ----------------------------------------------------------------- | -------------------------------- | -------------------- |
| [Baichuan 2](https://huggingface.co/baichuan-inc) | 7B/13B | baichuan2 |
| [BLOOM/BLOOMZ](https://huggingface.co/bigscience) | 560M/1.1B/1.7B/3B/7.1B/176B | - |
| [ChatGLM3](https://huggingface.co/THUDM) | 6B | chatglm3 |
| [Command R](https://huggingface.co/CohereForAI) | 35B/104B | cohere |
| [DeepSeek (Code/MoE)](https://huggingface.co/deepseek-ai) | 7B/16B/67B/236B | deepseek |
| [DeepSeek 2.5/3](https://huggingface.co/deepseek-ai) | 236B/671B | deepseek3 |
| [DeepSeek R1 (Distill)](https://huggingface.co/deepseek-ai) | 1.5B/7B/8B/14B/32B/70B/671B | deepseekr1 |
| [ERNIE-4.5](https://huggingface.co/baidu) | 0.3B/21B/300B | ernie/ernie_nothink |
| [Falcon](https://huggingface.co/tiiuae) | 7B/11B/40B/180B | falcon |
| [Falcon-H1](https://huggingface.co/tiiuae) | 0.5B/1.5B/3B/7B/34B | falcon_h1 |
| [Gemma/Gemma 2/CodeGemma](https://huggingface.co/google) | 2B/7B/9B/27B | gemma/gemma2 |
| [Gemma 3/Gemma 3n](https://huggingface.co/google) | 270M/1B/4B/6B/8B/12B/27B | gemma3/gemma3n |
| [GLM-4/GLM-4-0414/GLM-Z1](https://huggingface.co/zai-org) | 9B/32B | glm4/glmz1 |
| [GLM-4.1V](https://huggingface.co/zai-org) | 9B | glm4v |
| [GLM-4.5/GLM-4.5V](https://huggingface.co/zai-org) | 106B/355B | glm4_moe/glm4v_moe |
| [GPT-2](https://huggingface.co/openai-community) | 0.1B/0.4B/0.8B/1.5B | - |
| [GPT-OSS](https://huggingface.co/openai) | 20B/120B | gpt |
| [Granite 3.0-3.3](https://huggingface.co/ibm-granite) | 1B/2B/3B/8B | granite3 |
| [Granite 4](https://huggingface.co/ibm-granite) | 7B | granite4 |
| [Hunyuan (MT)](https://huggingface.co/tencent/) | 7B | hunyuan |
| [Index](https://huggingface.co/IndexTeam) | 1.9B | index |
| [InternLM 2-3](https://huggingface.co/internlm) | 7B/8B/20B | intern2 |
| [InternVL 2.5-3.5](https://huggingface.co/OpenGVLab) | 1B/2B/4B/8B/14B/30B/38B/78B/241B | intern_vl |
| [InternLM/Intern-S1-mini](https://huggingface.co/internlm/) | 8B | intern_s1 |
| [Kimi-VL](https://huggingface.co/moonshotai) | 16B | kimi_vl |
| [Ling 2.0 (mini/flash)](https://huggingface.co/inclusionAI) | 16B/100B | bailing_v2 |
| [Llama](https://github.com/facebookresearch/llama) | 7B/13B/33B/65B | - |
| [Llama 2](https://huggingface.co/meta-llama) | 7B/13B/70B | llama2 |
| [Llama 3-3.3](https://huggingface.co/meta-llama) | 1B/3B/8B/70B | llama3 |
| [Llama 4](https://huggingface.co/meta-llama) | 109B/402B | llama4 |
| [Llama 3.2 Vision](https://huggingface.co/meta-llama) | 11B/90B | mllama |
| [LLaVA-1.5](https://huggingface.co/llava-hf) | 7B/13B | llava |
| [LLaVA-NeXT](https://huggingface.co/llava-hf) | 7B/8B/13B/34B/72B/110B | llava_next |
| [LLaVA-NeXT-Video](https://huggingface.co/llava-hf) | 7B/34B | llava_next_video |
| [MiMo](https://huggingface.co/XiaomiMiMo) | 7B | mimo |
| [MiniCPM 1-4.1](https://huggingface.co/openbmb) | 0.5B/1B/2B/4B/8B | cpm/cpm3/cpm4 |
| [MiniCPM-o-2.6/MiniCPM-V-2.6](https://huggingface.co/openbmb) | 8B | minicpm_o/minicpm_v |
| [Ministral/Mistral-Nemo](https://huggingface.co/mistralai) | 8B/12B | ministral |
| [Mistral/Mixtral](https://huggingface.co/mistralai) | 7B/8x7B/8x22B | mistral |
| [Mistral Small](https://huggingface.co/mistralai) | 24B | mistral_small |
| [OLMo](https://huggingface.co/allenai) | 1B/7B | - |
| [PaliGemma/PaliGemma2](https://huggingface.co/google) | 3B/10B/28B | paligemma |
| [Phi-1.5/Phi-2](https://huggingface.co/microsoft) | 1.3B/2.7B | - |
| [Phi-3/Phi-3.5](https://huggingface.co/microsoft) | 4B/14B | phi |
| [Phi-3-small](https://huggingface.co/microsoft) | 7B | phi_small |
| [Phi-4](https://huggingface.co/microsoft) | 14B | phi4 |
| [Pixtral](https://huggingface.co/mistralai) | 12B | pixtral |
| [Qwen (1-2.5) (Code/Math/MoE/QwQ)](https://huggingface.co/Qwen) | 0.5B/1.5B/3B/7B/14B/32B/72B/110B | qwen |
| [Qwen3 (MoE/Instruct/Thinking/Next)](https://huggingface.co/Qwen) | 0.6B/1.7B/4B/8B/14B/32B/80B/235B | qwen3/qwen3_nothink |
| [Qwen2-Audio](https://huggingface.co/Qwen) | 7B | qwen2_audio |
| [Qwen2.5-Omni](https://huggingface.co/Qwen) | 3B/7B | qwen2_omni |
| [Qwen3-Omni](https://huggingface.co/Qwen) | 30B | qwen3_omni |
| [Qwen2-VL/Qwen2.5-VL/QVQ](https://huggingface.co/Qwen) | 2B/3B/7B/32B/72B | qwen2_vl |
| [Qwen3-VL](https://huggingface.co/Qwen) | 2B/4B/8B/30B/32B/235B | qwen3_vl |
| [Seed (OSS/Coder)](https://huggingface.co/ByteDance-Seed) | 8B/36B | seed_oss/seed_coder |
| [Skywork o1](https://huggingface.co/Skywork) | 8B | skywork_o1 |
| [StarCoder 2](https://huggingface.co/bigcode) | 3B/7B/15B | - |
| [TeleChat2](https://huggingface.co/Tele-AI) | 3B/7B/35B/115B | telechat2 |
| [XVERSE](https://huggingface.co/xverse) | 7B/13B/65B | xverse |
| [Yi/Yi-1.5 (Code)](https://huggingface.co/01-ai) | 1.5B/6B/9B/34B | yi |
| [Yi-VL](https://huggingface.co/01-ai) | 6B/34B | yi_vl |
| [Yuan 2](https://huggingface.co/IEITYuan) | 2B/51B/102B | yuan |
> [!NOTE]
> For all "base" models, the `template` argument can be any value such as `default`, `alpaca`, or `vicuna`. For "instruct/chat" models, always use the **matching template**.
>
> If a model has both thinking and non-thinking variants, use the `_nothink` suffix to distinguish the two templates, e.g. `qwen3` and `qwen3_nothink`.
>
> Always use **exactly the same** template for training and inference.
>
> \*: You need to install `transformers` from the main branch and set `DISABLE_VERSION_CHECK=1` to skip the version check.
>
> \*\*: You need to install a specific `transformers` version to use this model.

For a full list of supported models, please refer to [constants.py](src/llamafactory/extras/constants.py).
## Training Approaches

| Approach               | Full-parameter     | Freeze-tuning      | LoRA               | QLoRA              |
| ---------------------- | ------------------ | ------------------ | ------------------ | ------------------ |
| Pre-Training           | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Supervised Fine-Tuning | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| Reward Modeling        | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| ORPO Training          | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| SimPO Training         | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
> [!TIP]
> For details about the PPO implementation, please refer to [this blog](https://newfacade.github.io/notes-on-reinforcement-learning/17-ppo-trl.html).
## Datasets

<details><summary>Pre-training datasets</summary>

- [SkyPile (zh)](https://huggingface.co/datasets/Skywork/SkyPile-150B)
- [FineWeb (en)](https://huggingface.co/datasets/HuggingFaceFW/fineweb)
- [FineWeb-Edu (en)](https://huggingface.co/datasets/HuggingFaceFW/fineweb-edu)
- [CCI3-HQ (zh)](https://huggingface.co/datasets/BAAI/CCI3-HQ)
- [CCI3-Data (zh)](https://huggingface.co/datasets/BAAI/CCI3-Data)
- [CCI4.0-M2-Base-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-Base-v1)
- [CCI4.0-M2-CoT-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-CoT-v1)
- [CCI4.0-M2-Extra-v1 (en&zh)](https://huggingface.co/datasets/BAAI/CCI4.0-M2-Extra-v1)
- [The Stack (en)](https://huggingface.co/datasets/bigcode/the-stack)
- [StarCoder (en)](https://huggingface.co/datasets/bigcode/starcoderdata)
- [ShareGPT Hyperfiltered (en)](https://huggingface.co/datasets/totally-not-an-llm/sharegpt-hyperfiltered-3k)
- [ShareGPT4 (en&zh)](https://huggingface.co/datasets/shibing624/sharegpt_gpt4)
- [UltraChat 200k (en)](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k)
- [Infinity Instruct (zh)](https://huggingface.co/datasets/BAAI/Infinity-Instruct)
- [AgentInstruct (en)](https://huggingface.co/datasets/THUDM/AgentInstruct)
- [LMSYS Chat 1M (en)](https://huggingface.co/datasets/lmsys/lmsys-chat-1m)
- [Evol Instruct V2 (en)](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k)
- [STEM (zh)](https://huggingface.co/datasets/hfl/stem_zh_instruction)
- [Ruozhiba (zh)](https://huggingface.co/datasets/hfl/ruozhiba_gpt4_turbo)
- [Neo-sft (zh)](https://huggingface.co/datasets/m-a-p/neo_sft_phase2)
- [Magpie-Pro-300K-Filtered (en)](https://huggingface.co/datasets/Magpie-Align/Magpie-Pro-300K-Filtered)
- [Magpie-ultra-v0.1 (en)](https://huggingface.co/datasets/argilla/magpie-ultra-v0.1)
- [WebInstructSub (en)](https://huggingface.co/datasets/TIGER-Lab/WebInstructSub)
- [OpenO1-SFT (en&zh)](https://huggingface.co/datasets/O1-OPEN/OpenO1-SFT)
- [Open-Thoughts (en)](https://huggingface.co/datasets/open-thoughts/OpenThoughts-114k)
- [Open-R1-Math (en)](https://huggingface.co/datasets/open-r1/OpenR1-Math-220k)
- [Chinese-DeepSeek-R1-Distill (zh)](https://huggingface.co/datasets/Congliu/Chinese-DeepSeek-R1-Distill-data-110k-SFT)
- [LLaVA mixed (en&zh)](https://huggingface.co/datasets/BUAADreamer/llava-en-zh-300k)
- [Pokemon-gpt4o-captions (en&zh)](https://huggingface.co/datasets/jugg1024/pokemon-gpt4o-captions)
- [Open Assistant (de)](https://huggingface.co/datasets/mayflowergmbh/oasst_de)
- [Dolly 15k (de)](https://huggingface.co/datasets/mayflowergmbh/dolly-15k_de)
- [Alpaca GPT4 (de)](https://huggingface.co/datasets/mayflowergmbh/alpaca-gpt4_de)
- [DPO mixed (en&zh)](https://huggingface.co/datasets/hiyouga/DPO-En-Zh-20k)
- [UltraFeedback (en)](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized)
- [COIG-P (zh)](https://huggingface.co/datasets/m-a-p/COIG-P)
- [RLHF-V (en)](https://huggingface.co/datasets/openbmb/RLHF-V-Dataset)
- [VLFeedback (en)](https://huggingface.co/datasets/Zhihui/VLFeedback)
- [RLAIF-V (en)](https://huggingface.co/datasets/openbmb/RLAIF-V-Dataset)
- [Orca DPO Pairs (en)](https://huggingface.co/datasets/Intel/orca_dpo_pairs)
- [HH-RLHF (en)](https://huggingface.co/datasets/Anthropic/hh-rlhf)
- [Nectar (en)](https://huggingface.co/datasets/berkeley-nest/Nectar)
## Requirements

| Mandatory    | Minimum | Recommended |
| ------------ | ------- | ----------- |
| python       | 3.9     | 3.10        |
| torch        | 2.0.0   | 2.6.0       |
| torchvision  | 0.15.0  | 0.21.0      |
| transformers | 4.49.0  | 4.50.0      |
| datasets     | 2.16.0  | 3.2.0       |
| accelerate   | 0.34.0  | 1.2.1       |
| peft         | 0.14.0  | 0.15.1      |
| trl          | 0.8.6   | 0.9.6       |

| Optional     | Minimum | Recommended |
| ------------ | ------- | ----------- |
| CUDA         | 11.6    | 12.2        |
| deepspeed    | 0.10.0  | 0.16.4      |
| bitsandbytes | 0.39.0  | 0.43.1      |
| vllm         | 0.4.3   | 0.8.2       |
| flash-attn   | 2.5.6   | 2.7.2       |
### Hardware Requirements

\* *estimated*

| Method                          | Bits |  7B   |  14B  |  30B  |  70B   |  `x`B   |
| ------------------------------- | ---- | ----- | ----- | ----- | ------ | ------- |
| Full (`bf16` or `fp16`)         |  32  | 120GB | 240GB | 600GB | 1200GB | `18x`GB |
| Full (`pure_bf16`)              |  16  |  60GB | 120GB | 300GB |  600GB |  `8x`GB |
| Freeze/LoRA/GaLore/APOLLO/BAdam |  16  |  16GB |  32GB |  64GB |  160GB |  `2x`GB |
| QLoRA                           |  8   |  10GB |  20GB |  40GB |   80GB |   `x`GB |
| QLoRA                           |  4   |   6GB |  12GB |  24GB |   48GB | `x/2`GB |
| QLoRA                           |  2   |   4GB |   8GB |  16GB |   24GB | `x/4`GB |
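The `x`B column encodes a rule of thumb in the parameter count. A rough sketch of that rule (the function and names below are illustrative, not part of LLaMA-Factory):

```python
# Rule-of-thumb GPU memory estimate taken from the table's `x`B column.
# `billions` is the parameter count in billions; the multipliers
# (18x, 8x, 2x, x, x/2, x/4) come straight from the table above.

GB_PER_BILLION = {
    "full_amp": 18.0,       # Full (bf16/fp16 with AMP master weights)
    "full_pure_bf16": 8.0,  # Full (pure_bf16)
    "peft_16bit": 2.0,      # Freeze / LoRA / GaLore / APOLLO / BAdam
    "qlora_8bit": 1.0,
    "qlora_4bit": 0.5,
    "qlora_2bit": 0.25,
}

def estimate_memory_gb(method: str, billions: float) -> float:
    """Estimated GPU memory (GB) to fine-tune a model with `billions` B parameters."""
    return GB_PER_BILLION[method] * billions

if __name__ == "__main__":
    # QLoRA at 4-bit on a 14B model, by the formula column
    print(estimate_memory_gb("qlora_4bit", 14))
```

Note that the formula column can come out below the per-size columns (e.g. 4-bit QLoRA on 14B gives 7 GB by the formula versus the 12 GB listed), so treat the per-size columns as the safer estimate.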
## Getting Started

> [!IMPORTANT]
> Installation is mandatory.
#### Install from Source

```bash
git clone --depth 1 https://github.com/hiyouga/LLaMA-Factory.git
cd LLaMA-Factory
pip install -e ".[torch,metrics]" --no-build-isolation
```

Optional extra dependencies: torch, torch-npu, metrics, deepspeed, liger-kernel, bitsandbytes, hqq, eetq, gptq, aqlm, vllm, sglang, galore, apollo, badam, adam-mini, qwen, minicpm_v, openmind, swanlab, dev
#### Install from Docker Image

```bash
docker run -it --rm --gpus=all --ipc=host hiyouga/llamafactory:latest
```

This image is built on Ubuntu 22.04 (x86\_64), CUDA 12.4, Python 3.11, PyTorch 2.6.0, and Flash-attn 2.7.4.

Find all available images at: https://hub.docker.com/r/hiyouga/llamafactory/tags

See [Build Docker](#build-docker) to rebuild the image yourself.
<details><summary>Setting up a virtual environment with <b>uv</b></summary>

Create an isolated Python environment with [uv](https://github.com/astral-sh/uv):

```bash
uv sync --extra torch --extra metrics --prerelease=allow
```

Run LLaMA-Factory in the isolated environment:

```bash
uv run --prerelease=allow llamafactory-cli train examples/train_lora/llama3_lora_pretrain.yaml
```

</details>
<details><summary>For Windows users</summary>

#### Install PyTorch

On Windows, you need to install the GPU build of PyTorch manually. Refer to the [official website](https://pytorch.org/get-started/locally/) and the following commands to install and verify it:

```bash
pip uninstall torch torchvision torchaudio
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu126
python -c "import torch; print(torch.cuda.is_available())"
```

If you see `True`, the installation succeeded.

If you encounter an error like `Can't pickle local object`, set `dataloader_num_workers: 0` in your config.
#### Install BitsAndBytes

To enable quantized LoRA (QLoRA) on Windows, install a pre-built `bitsandbytes` wheel, which supports CUDA 11.1 to 12.2. Choose the appropriate [release](https://github.com/jllllll/bitsandbytes-windows-webui/releases/tag/wheels) for your CUDA version.

```bash
pip install https://github.com/jllllll/bitsandbytes-windows-webui/releases/download/wheels/bitsandbytes-0.41.2.post2-py3-none-win_amd64.whl
```

#### Install Flash Attention-2

To enable FlashAttention-2 on Windows, build and install it yourself using the scripts in [flash-attention-windows-wheel](https://huggingface.co/lldacing/flash-attention-windows-wheel).
</details>

<details><summary>For Ascend NPU users</summary>

To install LLaMA Factory on Ascend NPU devices, upgrade Python to 3.10 or higher and specify the extra dependencies: `pip install -e ".[torch-npu,metrics]"`. You also need to install the **[Ascend CANN Toolkit and Kernels](https://www.hiascend.com/developer/download/community/result?module=cann)**; follow the [installation tutorial](https://www.hiascend.com/document/detail/zh/CANNCommunityEdition/80RC2alpha002/quickstart/quickstart/quickstart_18_0004.html) or use the following commands:

```bash
# replace the URL with the one matching your CANN version and device model
source /usr/local/Ascend/ascend-toolkit/set_env.sh
```
| Requirement | Minimum | Recommended    |
| ----------- | ------- | -------------- |
| CANN        | 8.0.RC1 | 8.0.0.alpha002 |
| torch       | 2.1.0   | 2.4.0          |
| torch-npu   | 2.1.0   | 2.4.0.post2    |
| deepspeed   | 0.13.2  | 0.13.2         |
| vllm-ascend | -       | 0.7.3          |

Remember to use `ASCEND_RT_VISIBLE_DEVICES` instead of `CUDA_VISIBLE_DEVICES` to specify the compute devices.
Download the pre-built Docker images: [32GB](http://mirrors.cn-central-221.ovaijisuan.com/detail/130.html) | [64GB](http://mirrors.cn-central-221.ovaijisuan.com/detail/131.html)
#### Install BitsAndBytes

To run bitsandbytes-based QLoRA fine-tuning on Ascend NPU, follow these steps:

1. Build bitsandbytes manually. Follow the [installation docs](https://huggingface.co/docs/bitsandbytes/installation?backend=Ascend+NPU&platform=Ascend+NPU) to install the NPU build of bitsandbytes. Building requires cmake 3.22.1 or newer and g++ 12.x or newer.

    ```bash
    # install bitsandbytes from source
    # clone the bitsandbytes repository; Ascend NPU support currently lives in the multi-backend-refactor branch
    git clone -b multi-backend-refactor https://github.com/bitsandbytes-foundation/bitsandbytes.git
    cd bitsandbytes/

    # install dependencies
    pip install -r requirements-dev.txt

    # install the build tools; the exact command differs by system, the one below is for reference
    apt-get install -y build-essential cmake

    # build and install
    cmake -DCOMPUTE_BACKEND=npu -S .
    make
    pip install .
    ```

2. Install transformers from the main branch.

    ```bash
    git clone -b main https://github.com/huggingface/transformers.git
    cd transformers
    pip install .
    ```

3. Set `double_quantization: false` in the training arguments; see this [example](examples/train_qlora/llama3_lora_sft_bnb_npu.yaml).
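For reference, the relevant fragment of such a training config might look like the sketch below (`quantization_bit: 4` is assumed from the QLoRA examples, not quoted from the linked file):

```yaml
# bitsandbytes QLoRA on Ascend NPU: disable double quantization
quantization_bit: 4
double_quantization: false
```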
</details>
### Data Preparation

Please refer to [data/README_zh.md](data/README_zh.md) for the dataset file format. You can use datasets on the HuggingFace / ModelScope / Modelers hub, or load a local dataset.

> [!NOTE]
> Please update `data/dataset_info.json` to use your custom dataset.

You can also use **[Easy Dataset](https://github.com/ConardLi/easy-dataset)**, **[DataFlow](https://github.com/OpenDCAI/DataFlow)**, and **[GraphGen](https://github.com/open-sciencelab/GraphGen)** to build synthetic data for fine-tuning.
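As a sketch, registering a local alpaca-format file in `data/dataset_info.json` could look like the entry below (the dataset name `my_dataset` and its file are hypothetical; see [data/README_zh.md](data/README_zh.md) for the authoritative schema):

```json
{
  "my_dataset": {
    "file_name": "my_dataset.json",
    "formatting": "alpaca",
    "columns": {
      "prompt": "instruction",
      "query": "input",
      "response": "output"
    }
  }
}
```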
### Quickstart

Use the following three commands to run LoRA **fine-tuning**, **inference**, and **merging** of the Llama3-8B-Instruct model, respectively.
> [!TIP]
> Use `llamafactory-cli help` to show help information.
>
> Read the [FAQ](https://github.com/hiyouga/LLaMA-Factory/issues/4614) first if you encounter any problems.
### Fine-Tuning with LLaMA Board GUI (powered by [Gradio](https://github.com/gradio-app/gradio))

```bash
llamafactory-cli webui
```
### Fine-Tuning with LLaMA Factory Online

Read this [document](https://docs.llamafactory.com.cn/docs/documents/quickstart/getstarted/?utm_source=LLaMA-Factory) for details.
### Build Docker

For CUDA users:

```bash
cd docker/docker-cuda/
docker compose up -d
docker compose exec llamafactory bash
```
For Ascend NPU users:

```bash
cd docker/docker-npu/
docker compose up -d
docker compose exec llamafactory bash
```

For AMD ROCm users:

```bash
cd docker/docker-rocm/
docker compose up -d
docker compose exec llamafactory bash
```
<details><summary>Build without Docker Compose</summary>

For CUDA users:
```bash
docker build -f ./docker/docker-cuda/Dockerfile \
    --build-arg PIP_INDEX=https://pypi.org/simple \
    --build-arg EXTRAS=metrics \
    -t llamafactory:latest .

docker run -dit --ipc=host --gpus=all \
    -p 7860:7860 \
    -p 8000:8000 \
    --name llamafactory \
    llamafactory:latest

docker exec -it llamafactory bash
```
For Ascend NPU users:

```bash
docker build -f ./docker/docker-npu/Dockerfile \
    --build-arg PIP_INDEX=https://pypi.org/simple \
    --build-arg EXTRAS=torch-npu,metrics \
    -t llamafactory:latest .

docker run -dit --ipc=host \
    -v /usr/local/dcmi:/usr/local/dcmi \
    -v /usr/local/bin/npu-smi:/usr/local/bin/npu-smi \
    -v /usr/local/Ascend/driver:/usr/local/Ascend/driver \
    --device /dev/davinci_manager \
    --device /dev/devmm_svm \
    --device /dev/hisi_hdc \
    --name llamafactory \
    llamafactory:latest

docker exec -it llamafactory bash
```
For AMD ROCm users:

```bash
docker build -f ./docker/docker-rocm/Dockerfile \
    --build-arg PIP_INDEX=https://pypi.org/simple \
    --build-arg EXTRAS=metrics \
    -t llamafactory:latest .

docker run -dit --ipc=host \
    -p 7860:7860 \
    -p 8000:8000 \
    --device /dev/kfd \
    --device /dev/dri \
    --name llamafactory \
    llamafactory:latest

docker exec -it llamafactory bash
```
</details>

<details><summary>Using data volumes</summary>

You can enable data volumes by uncommenting `VOLUME [ "/root/.cache/huggingface", "/app/shared_data", "/app/output" ]` in the Dockerfile.

When starting the container, mount the volumes with arguments such as `-v ./hf_cache:/root/.cache/huggingface`. The volumes are:

- `hf_cache`: reuse the Hugging Face cache on the host machine.
- `shared_data`: the directory on the host machine where datasets are stored.
- `output`: set the export dir to this location so that the merged result can be accessed directly on the host machine.

</details>
### Deploy an OpenAI-style API with vLLM

```bash
API_PORT=8000 llamafactory-cli api examples/inference/llama3.yaml infer_backend=vllm vllm_enforce_eager=true
```
> [!TIP]
> Visit [this page](https://platform.openai.com/docs/api-reference/chat/create) for the API documentation.
>
> Examples: [Image understanding](scripts/api_example/test_image.py) | [Function calling](scripts/api_example/test_toolcall.py)
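Any OpenAI-compatible client can then talk to the server. A minimal sketch of building a Chat Completions request body for it (the model name `llama3` is only a placeholder; the endpoint path follows the standard OpenAI chat API):

```python
import json

def build_chat_request(prompt: str, model: str = "llama3") -> str:
    """Serialize an OpenAI-style Chat Completions request body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    })

body = build_chat_request("Hello!")
# POST it to the server started above, e.g.:
#   curl -s http://localhost:8000/v1/chat/completions \
#        -H "Content-Type: application/json" -d "<body>"
print(body)
```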
### Download from ModelScope Hub

```bash
export USE_MODELSCOPE_HUB=1 # use `set USE_MODELSCOPE_HUB=1` on Windows
```

Set `model_name_or_path` to the model ID to load the corresponding model. Browse all available models on the [ModelScope Hub](https://modelscope.cn/models), e.g. `LLM-Research/Meta-Llama-3-8B-Instruct`.
### Download from Modelers Hub

You can also use the Modelers Hub to download models and datasets:

```bash
export USE_OPENMIND_HUB=1 # use `set USE_OPENMIND_HUB=1` on Windows
```

Set `model_name_or_path` to the model ID to load the corresponding model. Browse all available models on the [Modelers Hub](https://modelers.cn/models), e.g. `TeleAI/TeleChat-7B-pt`.
### Use W&B Logger

To log experiment results with [Weights & Biases](https://wandb.ai), add the following arguments to the yaml file.

Set `WANDB_API_KEY` to [your key](https://wandb.ai/authorize) when launching the training task to log in to your W&B account.
### Use SwanLab Logger

To log experiment results with [SwanLab](https://github.com/SwanHubX/SwanLab), add the following arguments to the yaml file.

```yaml
use_swanlab: true
swanlab_run_name: test_run # optional
```

When launching the training task, you can log in to your SwanLab account in one of three ways:

1. Add `swanlab_api_key=<your_api_key>` to the yaml file, set to your [API key](https://swanlab.cn/settings).
2. Set the environment variable `SWANLAB_API_KEY` to your [API key](https://swanlab.cn/settings).
3. Run the `swanlab login` command before launching.
## Projects Using LLaMA Factory

If you have a project that should be added to the list below, please contact us via email or create a PR.
1. Feng et al. SS-Bench: A Benchmark for Social Story Generation and Evaluation. 2024. [[arxiv]](https://arxiv.org/abs/2406.15695)
1. Feng et al. Self-Constructed Context Decompilation with Fined-grained Alignment Enhancement. 2024. [[arxiv]](https://arxiv.org/abs/2406.17233)
1. Liu et al. Large Language Models for Cuffless Blood Pressure Measurement From Wearable Biosignals. 2024. [[arxiv]](https://arxiv.org/abs/2406.18069)
1. Iyer et al. Exploring Very Low-Resource Translation with LLMs: The University of Edinburgh's Submission to AmericasNLP 2024 Translation Task. AmericasNLP 2024. [[paper]](https://aclanthology.org/2024.americasnlp-1.25)
1. Li et al. Calibrating LLMs with Preference Optimization on Thought Trees for Generating Rationale in Science Question Scoring. 2024. [[arxiv]](https://arxiv.org/abs/2406.19949)
1. Yang et al. Financial Knowledge Large Language Model. 2024. [[arxiv]](https://arxiv.org/abs/2407.00365)
1. Lin et al. DogeRM: Equipping Reward Models with Domain Knowledge through Model Merging. 2024. [[arxiv]](https://arxiv.org/abs/2407.01470)
1. Bako et al. Evaluating the Semantic Profiling Abilities of LLMs for Natural Language Utterances in Data Visualization. 2024. [[arxiv]](https://arxiv.org/abs/2407.06129)
1. Huang et al. RoLoRA: Fine-tuning Rotated Outlier-free LLMs for Effective Weight-Activation Quantization. 2024. [[arxiv]](https://arxiv.org/abs/2407.08044)
1. Jiang et al. LLM-Collaboration on Automatic Science Journalism for the General Audience. 2024. [[arxiv]](https://arxiv.org/abs/2407.09756)
1. Inouye et al. Applied Auto-tuning on LoRA Hyperparameters. 2024. [[paper]](https://scholarcommons.scu.edu/cseng_senior/272/)
1. Qi et al. Research on Tibetan Tourism Viewpoints information generation system based on LLM. 2024. [[arxiv]](https://arxiv.org/abs/2407.13561)
1. Xu et al. Course-Correction: Safety Alignment Using Synthetic Preferences. 2024. [[arxiv]](https://arxiv.org/abs/2407.16637)
1. Sun et al. LAMBDA: A Large Model Based Data Agent. 2024. [[arxiv]](https://arxiv.org/abs/2407.17535)
1. Zhu et al. CollectiveSFT: Scaling Large Language Models for Chinese Medical Benchmark with Collective Instructions in Healthcare. 2024. [[arxiv]](https://arxiv.org/abs/2407.19705)
1. Yu et al. Correcting Negative Bias in Large Language Models through Negative Attention Score Alignment. 2024. [[arxiv]](https://arxiv.org/abs/2408.00137)
1. Xie et al. The Power of Personalized Datasets: Advancing Chinese Composition Writing for Elementary School through Targeted Model Fine-Tuning. IALP 2024. [[paper]](https://www.asianlp.sg/conferences/ialp2024/proceedings/papers/IALP2024_P055.pdf)
1. Liu et al. Instruct-Code-Llama: Improving Capabilities of Language Model in Competition Level Code Generation by Online Judge Feedback. ICIC 2024. [[paper]](https://link.springer.com/chapter/10.1007/978-981-97-5669-8_11)
1. Wang et al. Cybernetic Sentinels: Unveiling the Impact of Safety Data Selection on Model Security in Supervised Fine-Tuning. ICIC 2024. [[paper]](https://link.springer.com/chapter/10.1007/978-981-97-5669-8_23)
1. Xia et al. Understanding the Performance and Estimating the Cost of LLM Fine-Tuning. 2024. [[arxiv]](https://arxiv.org/abs/2408.04693)
1. Zeng et al. Perceive, Reflect, and Plan: Designing LLM Agent for Goal-Directed City Navigation without Instructions. 2024. [[arxiv]](https://arxiv.org/abs/2408.04168)
1. Xia et al. Using Pre-trained Language Model for Accurate ESG Prediction. FinNLP 2024. [[paper]](https://aclanthology.org/2024.finnlp-2.1/)
1. Liang et al. I-SHEEP: Self-Alignment of LLM from Scratch through an Iterative Self-Enhancement Paradigm. 2024. [[arxiv]](https://arxiv.org/abs/2408.08072)
1. Bai et al. Aligning Large Language Model with Direct Multi-Preference Optimization for Recommendation. CIKM 2024. [[paper]](https://dl.acm.org/doi/10.1145/3627673.3679611)
1. **[StarWhisper](https://github.com/Yu-Yang-Li/StarWhisper)**: A large language model for astronomy, fine-tuned from ChatGLM2-6B and Qwen-14B on astronomical data.
1. **[DISC-LawLLM](https://github.com/FudanDISC/DISC-LawLLM)**: A Chinese legal-domain LLM fine-tuned from Baichuan-13B, capable of legal reasoning and knowledge retrieval.
1. **[Sunsimiao](https://github.com/X-D-Lab/Sunsimiao)**: A Chinese medical LLM fine-tuned from Baichuan-7B and ChatGLM-6B on Chinese medical data.
1. **[CareGPT](https://github.com/WangRongsheng/CareGPT)**: A medical LLM project fine-tuned from LLaMA2-7B and Baichuan-13B on Chinese medical data.
1. **[MachineMindset](https://github.com/PKU-YuanGroup/Machine-Mindset/)**: A series of MBTI-personality large language models that give any LLM one of 16 personality types through tailored datasets and training methods.
1. **[Luminia-13B-v3](https://huggingface.co/Nekochu/Luminia-13B-v3)**: A large language model for generating Stable Diffusion prompts. [[demo]](https://huggingface.co/spaces/Nekochu/Luminia-13B_SD_Prompt)
1. **[Chinese-LLaVA-Med](https://github.com/BUAADreamer/Chinese-LLaVA-Med)**: A Chinese multimodal medical LLM fine-tuned from LLaVA-1.5-7B on Chinese multimodal medical data.
1. **[AutoRE](https://github.com/THUDM/AutoRE)**: A document-level relation extraction system based on large language models.
1. **[NVIDIA RTX AI Toolkit](https://github.com/NVIDIA/RTX-AI-Toolkit)**: An SDK for fine-tuning LLMs on Windows PCs with NVIDIA RTX GPUs.
1. **[LazyLLM](https://github.com/LazyAGI/LazyLLM)**: A low-code tool for building multi-agent LLM applications, supporting model fine-tuning via LLaMA Factory.
1. **[RAG-Retrieval](https://github.com/NLPJCL/RAG-Retrieval)**:一个全链路 RAG 检索模型微调、推理和蒸馏代码库。[[blog]](https://zhuanlan.zhihu.com/p/987727357)
1. **[360-LLaMA-Factory](https://github.com/Qihoo360/360-LLaMA-Factory)**:一个魔改后的代码库,通过 Ring Attention 支持长序列的 SFT 和 DPO 训练。
1. **[Sky-T1](https://novasky-ai.github.io/posts/sky-t1/)**:由 NovaSky AI 微调的低成本类 o1 长推理模型。
1. **[WeClone](https://github.com/xming521/WeClone)**:从聊天记录创造数字分身的一站式解决方案。
</details>
The code in this repository is open-sourced under the [Apache-2.0](LICENSE) license.
When using the model weights, please comply with the corresponding model licenses: [Baichuan 2](https://huggingface.co/baichuan-inc/Baichuan2-7B-Base/blob/main/Community%20License%20for%20Baichuan%202%20Model.pdf) / [BLOOM](https://huggingface.co/spaces/bigscience/license) / [ChatGLM3](https://github.com/THUDM/ChatGLM3/blob/main/MODEL_LICENSE) / [Command R](https://cohere.com/c4ai-cc-by-nc-license) / [DeepSeek](https://github.com/deepseek-ai/DeepSeek-LLM/blob/main/LICENSE-MODEL) / [Falcon](https://huggingface.co/tiiuae/falcon-180B/blob/main/LICENSE.txt) / [Gemma](https://ai.google.dev/gemma/terms) / [GLM-4](https://huggingface.co/THUDM/glm-4-9b/blob/main/LICENSE) / [GPT-2](https://github.com/openai/gpt-2/blob/master/LICENSE) / [Granite](LICENSE) / [Index](https://huggingface.co/IndexTeam/Index-1.9B/blob/main/LICENSE) / [InternLM](https://github.com/InternLM/InternLM#license) / [Llama](https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md) / [Llama 2](https://ai.meta.com/llama/license/) / [Llama 3](https://llama.meta.com/llama3/license/) / [Llama 4](https://github.com/meta-llama/llama-models/blob/main/models/llama4/LICENSE) / [MiniCPM](https://github.com/OpenBMB/MiniCPM/blob/main/MiniCPM%20Model%20License.md) / [Mistral/Mixtral/Pixtral](LICENSE) / [OLMo](LICENSE) / [Phi-1.5/Phi-2](https://huggingface.co/microsoft/phi-1_5/resolve/main/Research%20License.docx) / [Phi-3/Phi-4](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/LICENSE) / [Qwen](https://github.com/QwenLM/Qwen/blob/main/Tongyi%20Qianwen%20LICENSE%20AGREEMENT) / [Skywork](https://huggingface.co/Skywork/Skywork-13B-base/blob/main/Skywork%20Community%20License.pdf) / [StarCoder 2](https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement) / [TeleChat2](https://huggingface.co/Tele-AI/telechat-7B/blob/main/TeleChat%E6%A8%A1%E5%9E%8B%E7%A4%BE%E5%8C%BA%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE.pdf) / [XVERSE](https://github.com/xverse-ai/XVERSE-13B/blob/main/MODEL_LICENSE.pdf) / [Yi](https://huggingface.co/01-ai/Yi-6B/blob/main/LICENSE) / [Yi-1.5](LICENSE) / [Yuan 2](https://github.com/IEIT-Yuan/Yuan-2.0/blob/main/LICENSE-Yuan)
## Citation

v/mY1h3f67Z9BCJtqtS0SUhISEhISEhITARps7O0AAUeLK9AoGiDUMIRTzNHO6Fym0ImSFeCfMWj
RCr5lYPNC09tK8feUzzEvzh4DOLO/vnCxkFcsmDEdd6Z82FD0zS7PlaQz41lGiluxUHsJGOefVlg
41xCQkJCQkJCQmJiSZvn36kikXhxLpiBSCy3JudnXdjnhCrsV8y/LeFfTMstbp2fJ7YRdQ+qa5Ix
QdMGQfsGIUAFLCVd3LHPZD1wBB/Jo2AkTe1hPPFqqRj+M3EfrAPn10fx5et6XXU73xvC7/+fGtRW
afjPb3aiolTLf5Lm7HKzH8WtEF8ViUCrif61+t4qWpJnQUJCQkJCQkJCQsIvd/PSe2kC79I95M6E
eWQK/zXRh000izS4igJ7cm1Ras5XrYSoJUuiPXN+L2rUxGOT/kTeBSHR8fjrZWjuDSd82ax+NP22
vnrDgKcm7UevleNIWwTaOeDZd8tw73V9k+6B4ukr0g9YaR4pISEhISEhISHhl615mkcC3vFEPPYn
SBuEg+CV5C0JcVMcSbj1/OZpLsnb0SqKqHUStWgOTZuTzcZ94BQlryX63ceK8Iu3yhKEzbpf2oaB
VfOG8ZlrBlz3cPBMBE9tK4VmcrnHXi3DLasG0FBdYGaSokbVNtCFfnb0u7PPLR84CQkJCQkJCQkJ
iYxJmyNtmif/cvCqsHixZAm1rc+KY+tlmlkIEKNAWn87I0L6bxUzJ1set8BoTMFDL1eif0SFotmr
aiQLZ39/845elBbZiZim6Xj01Qq094fidrOnO8L48Rvl+LNPdRfmk+OKIpkIPBLvT8WZz9A8BtKl
TUJCQkJCQkJCIrgEiiRMwsk2vL43SJuYa82mcYMjtD+8FRBwKDAKqgWTtVSK7xUr7bb5uRCk+Fd3
l+K1PSVGPzt92TT2zx2r+3HtimHXvbz1USle+KA0YVxralh/ub0cd145gAtnRwuZsyV/Smz9be9z
qWmTkJDwg5ULgZn1wNwZgKoCbZ3AyVbg8GlgYEi2j0R2MbsBuO0qNu4WpT92xwHgpfeAjh7ZbhIS
4yF7Bta0pQr57zKB9HGRgidt6diZLQeb13fJlJf5he4BFT94qYL7HopRZhRO2OrKNDx4Sy9UxR5i
Y2hExf99oQJRTUkQPfBrtPerePS1cvzTV7vM9OuTmLUJx+iStElISKRBYx0Tnq8Gli8AiiPCFwv4
hojbb98CTrSQFYRsL4ksveF0/5KITeaTkJAoHNLmJFx+8gU4fyjP3blSyu82rRngqUVzaltcJY+t
I3++tRwHmoqAmLuOlJftixv6cNHcqOu7X71djp3HS9w2sCb5e+H9MnxqzQDWLB2eHJwN6TmcfMmN
HdNqgKVzgKpyYE4DUFrM98+o5/NIRzcwwobjABtW57uA9h7gyBn2uVsKuBL5DxrXn94IXDCba9e8
MH8G8NnrgZ+9ygmcnFOSI8IklLoqoLIs/bGDw1xzNDg8RRsr4DiabO+zilI+VooiqY/rHeDjJDqa
2/pQrluqT3WF/3NyPYbl85T/pC0ZD4uTNqePWqZp2gqvFfUkvmyw+Tr5s6HMzxY41RbBk5sqoGnc
BDbB1tj/IWBRQxT3UYh/R/1busP40evmTCMGLhEGynBMxX++UInLFw8jEtIL58kJrGlL9K/0aQuO
MHtxLWYC7EULuOahoQYpE7dXlHjvb+kAPjoB7D0KnD7HiN2obFuJ/MN1q/lihKqmPo60cesuAbrf
kSZqqUDC5U1XAFctT3/skbPAi6w9Pz4jOVsuz8lH0DtlzQrglqsc2m0PvHeAj5P2HD93DbXA7WuA
VRf4P6erj9XtXeDtffJ5muzPp1+fNlcgEk82J/qyeWgakrHDwms+PQPhPZXzW37h8TcqjRD/iiNi
pFXt+2/qxbQqt/T7083lONEeTphTwk74rPPfPlyK598vw+9c3VdgfY7M+l1q2vyTtRCweil7QVzO
BdR0Qmw60DWo3HAZcKwJeOEdvo1K8iaRR2Ne1B6nw6KZQHkJEx67ZdslnbWDzLlT3BoiyL3rk6yt
5k4HLpiTnrCN573PnsZLENRU8HNoXugblM/TZGJr2TOP1G25hW1R0OFQ1YnHKIVM2pxRVuAO/+56
uj2L11OQH7aSu46V4qm3yxNBRKz7NjOhX7V4CHdf4U6kfehsMX62tSKRGsDZboJvGxnXPvJqBTZe
1I/aijxPO+0V8t/G2dLbBMsJzB9Iq3b71cC8GUAolP3rL5oFfOvTbIx/DLz2PnC2TZpOSkw8SOAi
wuY3JWdtJRAJyTnFz7Tt+7WOKUzacinA5zFIy0bmyPReyJd7Ly7igWHIsiQoKHhRIysfn5bP01Qn
bUiWp03RPeRWuD8rTplXJG8OFEzOtkmIUU3Bf7xYhaFRRxAR8OiPZRENv39bD0qLdNcD/f2Xq9A9
FPLuPMXxmQnKH7cW4xfbKvF7t3ZPiWdPTmDJQf48n7ueRy8rCuf+91Yv4QTx+e3AO/uBfhmRT2IC
0T8YbPGAfDdjmpxT/AiOQU6Yypq2qSiQB9Gyjde9k7aMtO6ZgM6j8w+fks/TZCdtyQI/Im0gEkwl
fzaecStxP7rwt/2/hC9Tsv8gfMofvLSrAm8dKPVMske+bLev7sfaZf2uWr++rxSv7ClL+LGlGoQm
gSN/uSe3VuK2y/oxryGa133uvU3Vv959LuEGrQ7ecxM3+RqrKWQQ0Iv6E+uBMjbcN+8CevplX0hM
DPqG+PiLxfxpmJvbgcERubjpQ97JnVA6hQnuZGgr0rItmcv9pvNpnMycxksmIG39LHZuTSXQ2Suf
p8k6lwV10oqTNsvU0eusZJZiisOqMH5uIanXnGpFm44SqQ1LU5pHTiy6+lU89EqlEc4fjkTaYILE
tPJRfOPmbh6qX/iuf0jFw69UG1q6lFo2h+aO/m7pCuOx16vw375wPr/723Prt6+leWQy0Avz89fz
FU+/pmHZBPkSUX4iCmJCTtwysIPEROGtPXwBY0Zd6uNII/fuAe7PJueUNNO2Hux4XbI2X8dOhrYi
DRvlQ/SrZRuPe7ciI1eVZX6N2aa2LdvvMvk8TRxbS6tpQ5pI/SJp05PwEE9zSUHeVZyDQC+kVkwV
BTJd0BFnrP/8ufGn3q7CwaZiHuIfcAUSuWdDLxY3umO4PvteJXafKIUe826tsKozQUNJXE64ZRLU
f/1OBe64vA9XXjCY30+O52DV/Y0JOYG5sGIhN4lsrJv4uqy/hG9fkBH5JCYI+47ztBa3XsV91pLh
zV3AniMynLYvoTGggCnNI31KQAXeVuQ3tmwesHBmBtJfDu+dCFumppEW6H1K2rb9x7kJtXyeph5p
s4XaEC4XBpL7sDllVy8lRPwY6cQ24Th9vghPvFFlf8iE4COLZwzjC+vdvmfnuiN49PVqw9TRBXZu
bVkM//K1c3jnUCkee6OGEzuhvyloyWCMknjXYNU3hwokBUCGE54c43HQS+WutcCM2vypExG3pvM8
ZPKAFIglJgCbdgLHm9mzsQ64cJ7dv/PMOb6oQORueES2lS8ikkH+MUna0guShf4+u3ghcOkFGZjj
5/jeZ9Xzd+OYCGmEa+wpp9q5Tvk8TRnSBh/RI53mkck0ay5HOcWD2BXa7OZstWTRWOAjgmQekNZH
Xq9Ca184YRYpaNkU1sn3X9+NhspRVz2f2FyJ051hz/pTAu7PX9ODjcv7sWr+EN76qBSHW0xNnqjF
Y7/59sdleHV3Oe64rDeP+zyAeaRj2UOStgQoJPEnrgHmjMEkknx6yDysqT0hwNJLalo1ML8RKCnK
7Lqf3AB09PKcbjIlgMRE4HgT8H+eSixu0DPS1sWDj0gEF3iCTPMy5H8w8lKIIPPByy4cg4VHju59
ei1/J6Yy1yTtOmnPyopTE05yNyCNXWuHfJ6mDGlzUgskCfnvIl6TJ1XZlMGOo6V45t1KN2GjbYiH
+L/rCjeZ2n+6BE9tr04EHxH9E9mEMrc2ii9cw23NqstiePDmbnznieluHzd2fpRtH36tBtcsH0B1
aWxSP4BTHXeSFmE+EAkYJZLMFinSIyU4PdeRPNpeCXvpLWXXv+ZinpQ7CIGjF+Yn1wNdPVzjIV86
EhOJpjbZBuM230pZZNK3Fb1zKGE1mebn273PbeBkKxVosZJysFGKgorS5MdZJpK7D2fRRFI+TxM/
hyXzzEnHvSzSpsMeLFAVjnGOE1XYL3ICZ3HGqkj2eWInKz19MBLo/nO1TSCiTPB9+JUqDIyqPJG2
o8GLWI/93s0dKI1ots6PaQoeeb0anYOqt5ZNIR+4bsypS4Q4u21VL555rwLbDpcltG1mh5PGdt+Z
Yjy1rRIP3NiZZw/OGDVtRiASHZqcgnDFhcAli/lKoV8QOXvnI+DFd7i5R7pHZiDKXlZHeLn6IuDu
dTy4g1+tHpmWXL0CaO/NfgQuCQmJ8YGGQLE1jOOn6hwdNHJkobbVRQu5li1Q8JFxuHeKZDmzIb32
jyLM0juQjktF2kgLRwSQrnmqVT5Pk5W7peJrGryiR8IjTxtS5Gxz+LgVJhv3CqSpZ1gw4Tf/6p5K
vHmggpu5wsGa2YN/5xW9uGaZOxb61oPleOXDck6+nBFA2XkrZg/hs2u7bPdXFI7hD25tx56TJegZ
UPlvCgm31RDwxJvVuG11H2bXjeThI+MnEIl3kXnaeKRGIkNkxugXZPr45KvAB4eAoQyGBGnmyNzs
q7cDS+b492G4ipG9HQc5aZPaNgmJAhR0gphneURam3JtFbBdC62tyHR+1ZL02qyJuHfSilG90r2f
upkodqKFuwBMT+MPbplInmyRz1Ohs7VU5pHpgta7o0dafm1ITsicLl4QfNukCnXi0DMYwg9fqYdO
KghnxEiFQvzH8MBNXa7zBkdU/ODlWkRjHjMM21Wk6vjW7Z2oKnWvs1y+aBB3XdaLn71dnfhNa7iw
w1t6Inj0jRr87WfPTV0BYpKCHL8pFHE45O94i7C9fzAzwmahpQP45Sbgy7fwl50fjRv53V1+IfcJ
kNo2CYnCnHODykcyEMnkfJ+RJovm89VL8/NdTr5s6cgkBXsjF4EjZ7h554Vprllfzd+3pFUcGpHP
05QhbfCTXNuZiw3J/4aVq60gw/0j0UKAR6AR+FM15ol55H9tq8GB5iK7qaLF2dgkd881nVgyY8jV
P79+twq7TpQmCVoCVES6ML2MkT3dTero0K9e14FN+8vR0h12ETe6JqUQuHN1Dy5blCcpAIKaR8Jt
BjvVSRsRtTUBtWzPvw3s/jg7oc2PnuWR9z57XfoVSgtXLOORJMczBQCRRTLlLGIv2nkzuBkpra6e
7wa6GHls7ZyYgBRe9SKcOgcMDPGgMERuR/PQHdWob4m7PVva87fO+QhKR0DaCqtQhFVqR/Ivpe1Q
nkW0DOyCk0dzdHUFb+PKMr61TOGonakYY7gre8GSAufgSiG6kA+xUecy91jp7ktsxxPL53PSlsws
f9icU/2YTWZbbDPmpum8nVKB2o0CEtH7iPqeSFw6zdzcGZwMHj49tZ+nKUXavII/WnKYmCw7nWkk
HARO1LApevKczLLPc4OTbUV48s0qu6Gy2eAKE7AXN3iH+G/tiuDxzTXxQePsLy02glMtzfibH/Xj
sT+vQm2F+7cXTh/Bveu78K/PTbM7QpqldziER16vx8ULziKi6pPm4ZvKExiZpZCphl8t28fsJbPz
ELfhzxboehcvYgJROc/Tk1ZwYsddtIBr23oHkh9HTuE3XgHU+yCkr78P7D3mFnDpGmtWcmKbKrEq
CWsfHQe27+NtlEtBme5nGRN2rljOcxqVpvFDJF8LEg7IlPXgyczJ9ni2J9WZzGA37+KC0EQ+oxcv
5mknaNylw09fTe+rEqQdqQ0pubdTmCaiTosX61byqHupxiVFXN2ymy+QxPLAmSWI0DiRuceImC2e
zecmWliYXudvDFBAChoDtLC18zAfv2Nqq4Dt6mwrv2OFCNyJZuDd/dzvONcEjhYKqV7Uxl4g8kPz
FmnjaL7P5N7HAtKGzfaRm+1MG0/7QYtM9JmCkqQ7j74n08tDp6bO8zQVSBuc3ArwDKOBlOaRaRJs
I1kONxRYx7oorp8kCck0bxN18zoe31RnmCLaPEVNfzSq0v3Xd6KhIuqayX+ypQYn24uM8+JKNiuN
AyvRwTbERnqw43gVfvzGKP74bm8p/XNrOvH8zgocbC5xR59h1359Xzle21OB2y/Ng0zHgQORuFcr
pvoEtnKRP0GEQC8l0orRSymbbUar0iRUkjA722cC0wvmAO99lJo8khaKrkkBTNKBCI0ipDyhF+o9
N/Eol36iaVIbrl3Jyy4mqP16C3+RZ1NILi8FrlsN3HIlUFPp/zzSYFIh4kHC5G/e4oJ8UGI51vb8
3A1svC3kWkE/db5zLavzxbwt3/2Iaw4nAtS3JFimW3EnEIFO92wEaUdaxacFFeuaJOjexAjfxtWp
Ax6Idad+v2wpJ38vvzc2EpHNadu/lDl+czS1L7UV+fjOn+FvEckJ6hciGVQoIu/2vTzPX1bDvPsQ
4OlebrmKjZVVXGuUDqTtsupN8wQFmNp5KDcLUETEiLBdlsKWkJJQE+mlRZNskxc/IG0YkfVUIGJJ
8zzlEaXfpq0f0kbPJR1DYyXVwmOhP0+StHmTNu/k2kJRHNd1fuelgS88bVqQQCRA6picmJC733W8
DM/sqPTuNPb31Yv7cfcV3a66HThbil9sr4oTPUXsRJpYYoMYGW6FqvBYoD/ZHMZtl+tYOsstUdZW
jOIbN3bgL34604hECcCVbuAHL9dh7ZJ+1JSN5seTEzgQSeL7qey6SeYyJDj6FUxo1ZNM16I5MFs7
dJq//EhY90OS5pumdbqP0RF0BrlyOfCZjbxtMslXR/4ZJPg88RLXFmVD6NlwKXD3NTzpuTKGcL0k
hHzrM7xev3gjmCCZaXuSEPzZ6/nKddC6kznal2/l29c/yK6GN5f3nc3rWecsnMXTXqzKIPkwPStE
9qgdn32LP2uF9mbPKTFn7XLDZaxc7n8Ryy/hv30NsHQuX3zYczS3Y88654LZwCfW84jAgRNVm/PE
1+8EFswEntuefa3bpUv4YkKyhQfSVu45Bhxr9k/asim1EeGlNki3MNJm5iQdMUUhIm2nz/E5L127
0/WJGBI5nWzP0xTjbknbN51EGidtcTLmUC4o4hZJVHXC9wXXs46AKkmTiiPJcV5OguPEXokgff/V
OiPEv+33zJwLxaxXf//mdhSFNNsqia4r+OFrdegaDttHgkCyooMt0GOjUENF7M8YWnqr8djrI/gf
Xxr0nFhuXNmDNYsrse0oI5BR2FIAUFCSAy0l+NlbNaw+5ye2u/XkWx95tae8TxvZ1PtZgbVAAkdX
X+7aa/8JrkGr96HRICGLNE+qklyblYkTP5mtUSqCID5+XiAi/JXb+GfKX5cpcSMtCwlfNzJhsiqL
wuSVy7hA+eOXeRQzP201Ue1JbfCpa7l/y5u7uECXr6zNz3yiZ3A9MtH71EYujGcKmusp1UYsBvxq
y/hofjJtI9drPcdzND0LJERnk7CJIE3tvbcApW/yyLm5aKtsjhUCLZ4RkSU8t43P/dkABZ0iwpZK
i/XBQW5JQSaqfuembL7LyZctnZaNQIsfVKzfNUwkz3EiNyeNto2+p7Lv2OR7nqYKWwuiaUOK6JGq
k4TZSAs8nOFSxWpIsz9fSiZalnwqL+yqxJsflduTOFi+hWzyvPOybqxZ0uc6b8vBcry6p4LncvNQ
l45G+xCNdjHCVgxFZaRNjbDdo/j1OyV45+Miz7qUFml44MYOlJLfmioQNmscsN/6+fYaHG+L5GVb
BiqCieRUK2LginSgFzb5Q5EpR67qQyuOPQFMReY1JszRUhFzv3PwZUuAW68cO2FzEjcysQypwduD
nO+/dDOr01XZJWwWaPX//tu54BBkocNve9L1iWxmqz1vuzphdjTu75cMfEay0Y6ExnquaR2rEG6B
/FgNTYAyke/r7LXneNcpU5DmnjR55I+aq3qRZuzmK7M3ViziRtpC0jqNtY3pGhsYYVu9JPlvkkXH
2/t5EKVMFjjGWui5MHzOfJgv08JHS7v9fMNE0sd6thWEicyuJ+PzNJWLHx4lvgPUTC4wGRrJV2a7
PCzd/SE8+kY9dFV1mSJS8JG60iju39ie8FMzS/9wCA+/Xo+orroeXPozzEiXEmtl36lQGPMjwgaD
tKkYQRW+/0oVhkYUzzqtW9KP21f32Mw0LdMmIojNPRH8eEv9ZOBsU7ZYJoZ+QC8nCmCRy/rQy69v
wP9LaP701KQtiGxMpnuk0ZpRl11BjYgbmbSRwBa0PSii5jUXB9OGBsUiJtxdx4Sy2kofdQrQnqSx
+PS1/gQfv6ip4Hn6aDuuzwqyLDgGaEfq+2sv5RH2sgW6JvmykoA/YfNPwFXtyUDaCBQtkfxSycw5
2/WixRG69iWLs1tnIm5XLedkf6xkaN3FvCQzHaRFQfJfpaBOmQbYGGshE33SBqZzGyDfUDKFdL4X
z7bx/ZoPf2ZaMJvdMDmfJ1n8P9uqF2fJlMMUCvfxV1OM4a5z1xK/eLsWB1tK7GH2TbNIOuKL6zqx
pHHIdd5vdlTj/eNliUTaSqJQQuw7Vvfij24fRihcxsgfadlCbLKMsBI2zCR3HK/FczurktbrgevP
Y1pFjGvbnGAT0jPvVmMn/X6ejYIgfVfgnHNMhQhKiU9/tpOtQP9Q7utE0bf8RjekulvrHOlGRDpc
z4jLzGlj8xdLZRq19mJu0um3HTasYoL6smCEjUxzjjdzv68gwh6tpJPfCAkpfp+wdCD/O/LBysSf
JhUuX8qDsEzk7OJLgMzStYisUaRNP36eQbBwJu+fiZ6ps9We410vCvFOGpVMQvtT2y+Ylf16UY6w
TH3Y0oHSStCCyVjGzKqlfAGiMkXU2PcPcdI2qgW//2yNAbLg8GMaedokZ87zqe6Ug7TNR8AfygNH
BDHde6xQn6epXuCz3+KBSJxh/xXdbS4JwVzSIgm2CCWAPSiGkub7CbMx1T2MR52tlquli8xxsr0E
P3mr1jA5dIG164L6Idy7rsMlhbV0F+HRN+sT54l9wAhbRVEM37ihjZ0fw0u7i/BRcwVULcqOI20b
I3BMOh3Vwnh8cx02Lu9HQ6Xb6Wbx9EF8eUM7/v3l6XZ1rtn3AzEVP3ytHv/x9X6EJiIFgFfUyIB2
ElPVp420FUEEQdKCDQ3nvq1Io0f+X37ICpEgMbJesuExVhCRpBVgivrXWOcv+qEXMSI/DT8JwYnk
3bEGaKjxd22K1Pja+9x81fKdIzMkEuJIW0cruanIKLUhRZij5LCnz6V/3LLRlqSJo/sLSkbI33HR
TL6anY08gb6lnICaNj/T1lhAOaFIKKSFC2rHoNpYEpxJ017HCHD7OAcC1gNK4uMyRyepE43VY03A
gRPARyf48+H0T7U0wBToJV0iZgJFUiWNG2mT0vqJZSlqxFjHi6UlpEKpI4JiQSMnbERQkoHa+K0P
uVlk4PvP0jih+Yi0X40+LC6a2vg85PWbp1v5WElnuWH9Hmn3ms5PoudpisDVjo7A9fCKq4Ekedrg
CD5iS6rtlS/AIyDJJGjSMfDg8cFDb9SjtdcM8e8gxyRokbarrsKdtfeJrXU43RFxBx8xr/H5qzux
tJE7CH3rDg1/9Ag7LFRimEYqSrFB3hRNx5FzFHmyBn94q3dyoc9d3cFIXxUOtZRAH7XXT48B245U
4OU9VbhjVdcE9a/XYA2yBjI1JzDysQqiVaIE0hQIItdtNTjiP0w+aYZUJQVpG0M9Dp4CXn6H5yty
kgPKjUbREC+c5590EIkiHy96yacLpEF+L34SjZPw+NttjLDt4Pm4nMImBTwgInbfbdyHJBXZpHsh
PxjKj5YsaEqm7Un3+8LbPHebWE8SIEnDSVq5IH5vtBJedmj8UgBkYh6Zres5+5u0EBTN77QwXRex
MXjpBcAnNgBL5vi/HgnvpEE53z3+gk5g7Yk+fn1MZm0nWnjQm/cPpie1tBDz8rs8PD7NC7QAkg5E
7khYT7eIM5bbpgWwN3aaRKjH/txR1ELylaXANH7nMDqPUlWQyfPZAASDFgiuXcX9KFORym37+MKT
LdBakHGSBTJC/bKgMb22kgjwqXPJF44sE0k/USSJyFKhcybL8yRJmz8XNBtpE7VsCpIHIlE8crpN
HtKW33j3SCWee7/aziEs4hUCrlrUj7suc79R950tw9Pv1Mb7Lk7YTC3bvPoovrS+PX78LZeO4JZV
Gl7ZX8+I2rDh26aSQlbhqtWfbK3FLZd0Y8lMtxRUzwjjV69tw1//fE78eKuu9JMjowoe2TQNa5f2
oXbCUwBkOOFNwbFOkRdDIf/Hk8BIUedy3VZEDv2aHMXNI/XsSjwkrP1yU3LzlgNMsPhfPwU+vZET
rFTmPiIoVDsJdqny8lCQAhK8/ZitPr2ZCWUfpCaBdA9EmGh1P50WgPznKNnrmbbstefJZp76gEjw
qCNVBAk8VDfKCUUBW8gkyc9CAvkfUvuM23Ob7chsGdSb+pEi+FGycSeppsUUilDa3MHH5JqL/JnI
EVG2fBknYqktyAk5J23m9YmsUTvvyCDiKy14UE42etbSkWdqeyq50spSREKaw4gEuRbG2HN3mD2P
Z8/xZ++OtVxb6AdELmjR5IxPgkGLgxQpkkqyMUnz/VZGLLfvcc8R4x0VkeagVNpAkRC3pMhXSs8k
kbBzXem1dvT9XEbgI6FE6oBCf54kaUsRMdIjV54qdmQynZLz78K3HfURqsWrFSfIK5PIzkOv12NY
U+2JtE0CVqRoePD6NpSGY7bzaFJ7+PUG9IyE7CaVJuEjzcNXN5zH7NqEd2xY1fC7Nw2gqjgKJVTG
BkiEHatypRy7RtdIBP/31enccdajrnet7sZ6RsqIEMIhVFFQkn1nS/H0u7UoxNA++hT1ayPSFg5A
2kgD5vQzmGi/EsvEM1u2/oRntwJPvspftqmuS+1BxwXJw3YhI2QV5amvS0mT/ZhF7v6YawF6B9O3
J5HMj8+kr+diJmiWl2WvPYlMPsYI274TPLdfsmvuZQLmOx/5jxxK2qGiosL2wQoC6rdXdwCv7ODj
Ltk1SQAn7apfLQgtNlSW5bc/23jJKeSv+xtG1v75SWDLh6nbOd2z5kWUXIuhFmnLQZuRNvbxF3ld
Ul2zb4hr6mnhxO8cRgSDit96X7E8/cIWzaGbd/P6jIc/abJCdSQtmx+tPz1rVFJdjzRxp33kQyQy
O38mMGfG5HmeZAnWd6pTBeqZmwqTLQrLWFwEx38ov/xhFbYfqXT7suk8YuRtq3qw/sIe13nbDlXi
jf0V0MTgIxaBYgLsJXP78YnL213nrV7Qj89c1YdQSDX82cQE3GTmuOlAFbYc8g5KEgnF8Ls3nkNZ
WIOu2MmltUrwky11aOoqQmG5hk7dqEVBzSPzMbIcLWDQQkO2XFNp5ZRCTZPJkt+6bt7pz+GcQCTZ
SGYeSRLNs5GbH6XTspGA9TwTtJrb/dfz/QPpTbHod0lLkKx+QdvzrT28TdP2kc6P9WumRyGyySQw
H0PB58JlmrSUlA5jNJb+un5Jg4205XvY/3GoC2lOyKeKIgKO5TqkxSI/tXQkiAT1ijKeciVb6SYI
RNjf3pfIvZiu0JiisUVjzA+o3kRqqn1EcCWTa0o/kErD//FpJnvsTOQ6y/T+szEGKCjIXB8BSMjU
lAgbLUqluh5pMsmM2U8USctcdrI8T7IEjB6ZkfvWpA3X4sy4DIeH4Phr3boGQnhoUwNiQp11y/KQ
TYrVxaP4xnWtPCGacF7/kIKH3piOEU3l5qxiMBhiTpqOB29oQyU73+t3v379eTRWjXCNmUC6SNs2
FFXwyBvT0DsY8jz38gV9+OTlHaSgS5wr5G1r7SvCw+z8wkumMcWXePJteS6IJmeACR2j2bvHZ7by
F22Q+pKgR6ZGfleq6cVs5MbzuNbCRh7sJB1ONHNn/VjMfz2JsI1E0197gVWHMbYnmbmSqVhHt7/6
tTBh80wrNyvKy+XiCVKzkd/e3iPA0TP+rt1lCpS9A3najuOpOpmgQmPfTyJqes6SPmsZtBWRg71H
OQkLUl8aW/uP+Z/DSEtYX5n6mpSc+s61PNF3KvKzaRcf3/kwVmjuW+DDNNIKQJLueiOCiWQ6kHUF
peDwNR6m2PM0FWSwsMhdNEueh91+UvPYKkkKHFZxituaTyIAfvlOPQ63lHLiZWnXrMZmnfGFtR1Y
2uh2VPntrlq8f6LM3uEWeWLn3bKiG9ctS75cPatmBF+/7jz+8dlZ7uifbADsOF6O53bV4J61bvsa
lVX2vg3t2PRRNVp7IoZ2Tux8Im507p2runD5wr6C6QsdbuvUqQBaEY7F/B9fzV7S4XBmNvdBEMRs
k1Y6R2LJ+y/Iu4zMBw+zMjASvM57mZC0crE/PzSKJkamfV51Jk2bn6huBxhJ7B4INm47+vz5Cqaq
XyCtZQc33YwFOKnZjBxa7CM6Z1kpT2syGsv9s6IFPFbzMef4RSfrt/NMuI0GqAQRdCp+fC0pmA/1
99DI+M09WoA20PN4jiZtb0Mt155Pr+Fmg9bffnzESGulqKnvLYhMTn5m5MvW0RuwP1gFjpzli0Hk
U5v2XcDuraoieb0pwuuta3hut2SguYjMUN/am3psBwge6evZSwYjN9tMf/MvmT2ePOfvt86YBM9P
NEpD2zada8un4vNUiNBT7AsU8l9UIijiVjTNcLp6KfBOUF1QzZc6SmCwpYbc4GRbKX6ydRpvbqdP
GnsJLKgdZqSpzVWH1p4iPPZmA48F4jyPTfw1JTzEf0hN/Th++srzePHDKuw8UWEjjdb2kU3TceNF
PZhe7Q6LtKhhAF+65jz+7cWZCaYvnNs9FMbjWxqwcm4fisP6OD4ymUSPNPO0TVGnXCJtWoCZe7yS
0AYx24wK5pFjlXjIjG8ww5QGJCBHfWqIKIpkyCN4ChFVMs0xtHA+hv1CduxIgMTVRLj9RIkz/AST
pVEI0J5kMkpEIEh7GppTnySM2ssKsJXzWSaIOaOfZyRAO1LOve6+YPdJ/lh+SRgRh5TBfHLUnlkP
7pJDULAWyq1mRBacaZK0Wv/BO5KBFnksU+SUbeUT5JdnmewFBY0zvz6lVO9kgYBo0e2WK3l4/1TB
cMgcmiJu9g34GCsBnrtMxwn17Twf6RpIg918Pn29LVAuNyJulG8xHYz8fY08tcRkfp4mFWnLNHok
nNEjIUSMFCNFwhH230HeUsrBhUJ5s5lNPAd4lBGv1v5IYqlD9Eljv/mVjaYJo+P3f/LWNJzsFBJw
C1oy8oH71JWduHh2X9p6U/42CnDyxz8qw6iu2o4nMni2m5FDRrz+4q4znud//urz+O3OGnzcWur6
joKSvLa/Gps+qsFtF3eOP2fLoN+nKmkzzOUCaM1opZBW5X2br2UI8lcq9pkHjUyQUqUhCNKvpNEY
yTClAb3Ioz7JRkmSNAW0el1a7I+wfupaXnJKUvSxtSeRtuGApK3XMnfNo0WEoO8DPwJRoDbpD54s
nRYfgmjOxnsOzNRPcDxBZsyUe41C4pMwne3k5n6JRhCBvKs34Y8bFET2+gf9HUtt4bWwQ4tRRNYo
X10qq4M9R4FX3+d+hNkkJJmOE6r3Ap8Jtcn0+GSr/98ZZET6VAuPLJoujQtpxmnhjhYDOnsn1/Mk
SVuakP9wEDNRy+Ykc4riQdqUwiNtuuO/uCYlzX9I+lf28cGJSvxmd61LQ2UQHjYRrp7bj09edt71
6weayvGr9+o5YXNUjMwrZlcP48vrWn3X+rplnbhxZS1e2VeTyL1mdTur27M763DbpR1GUBMnqkuj
+L0bW/GdJxckfk30b2ObH7w+HeuX9qC8eDTnfW7feo+FtH0+BScw8pUJYh7ZWJ9+VTgrpK3GP2mz
TDyzQdraOoNrhuKEr2fsaQoqAkbzzBVqrYicYyRtnRnk9esdCGbuOF7CR9bztAXRPmagPaHnwm/i
cV2fGNI23vm3/GLhLJ43kMhaeUnux1W690+Q9WPStPUPZk7a+nySNsr36DVHrF0J3LEmtQaSTAUp
EiolFs/meslYxsnsBv/EnMwnyexzxUL/1/eTc9MCaftI69fRMzmeJ0na3No17zxtSAjQSELWXFo2
izxMmg4dayST7GJUU/Hw5ukYHFETDoRCEJGwquObt7SgrMgu/cV0BY+9OR3tg2H3eQqv6zcZiZpV
6z/TLBH1b93ShA+OlaO9L2I3t2SfOwZCxm/+8z0nUBRy29DdsLwTG5bVYMthJmGPOpqLHX6wuQy/
eLce91/bMk7iVObmkUbY/ylqHkkveEpkHfKR08nQtOWYtNHqLSVu9eMbRjjelFqgDUoyRjIkbfSC
HfGpgSxNommrKMkP0paKeGSSMyloVLOgddTH0wp7AkhbRtYAeb7yHsicyyOnUS5ARITSbXxmIzd/
HE8xJeW9BSTrAxmaeJO5XyBNm4O0UXJ3ImxEgJKBzC9fehfYtieY2WPOnhMTRgCSmT5J/Uxecrlo
QHWhlC6F/DxJ0uYvw1ictFnaMmeSba+E2nGNm5iMWy9AlzYgfTR/P/K81/JOFjSPL++pxeYD1Z45
2Yi03XVpF9Yv6Xb9zraD1Xjhw2o3YQOPODm/PoqFDcM42lLGiKE/ZyCFdTAJiJct6screzySQsWA
V/bW4u7VnbjhIreZY2lEM6JbfnC8Ev1ayG7qadbvp9sbjHPn1w/ntr/HyN2m8qrTsWZg6Tygqjz9
sYvYi6S63IxZk6P2IidsXz5dVv2buLCRlLQFeZeNYSyQWYtfwkXBOaKjHqZFoWApGCaEtCG4b1dg
opeHpE0PeHBa0paDaxYE+c2UVCK39SPCRgmmP3MdmwvL8uNZG+++JNNsv4tltEDl1KIvX8BLKlDb
PnAXL9kG5YKj4gXKW/ezV/n7wuu+ySwyiDYsl6A+IFJIli3N5wvzeZKkzZu0eaaNtkib7pChndtk
4TmcShyvfs1fy8lkgUQmXtPWOxQyfNk8w24ywlZbOoqvbmyBM2nbwEgIj2yezjiR6q6Swsl1U0cE
Dzy02NCY0E4lCd9UhJxuZBRIq/1airBD9NUPN83Amgu6UVbktlm6cmEv7lrViV+8N43nbtMTg4hu
40xXEf7rnen4iztP5bjPM2RrcTPJAl2gyAIOnwbWX+KPtJWwl9uqpTxyVmdvbupz5UX+EpsSKEcZ
RUkb1bLzLhtLCKKKAKTNCJ6ie4/YfFz7GksdM4lGPR79NRFtN573mevjx6ONxqt+ZGVAGjYKT58p
YSN/pdZOvqV5iULir1qSXmPnp2/Hq63IX7nE54IZLTqRP3ShvDdTtTMRNtJu+bE2GS/Mn8Hr1XR+
cjzvk5q0+ZBK0+mROGnTkzA+eAcgUZJFkCzIJgxK3pBz0vb0jgYcaC5PBBERqkSr65+9uh0XznCH
yf/NB3XYcbLSSKSteLFpnYRABVFBw2b0pZKOtCmpb9OMCbvndDmefq8B961v9jzs64xovrG/Gm19
EbdPpMZTG9x+aTsWNwyivDiWw0cmKGlLHDOVNW2HTgXI5wTumL/1w2D29n5B2qpLFvsjkIRjZ1Nr
2TKRjjMdC+UBTBvJ/83L12t0ND/GYUqzxhybR45XfwVuk2xHZsvxyriuZ9aWEynoZHUcBcAFc4Br
LvavaaFIgDsP8WTUNH96BdNYs4Inls7GGB6vvrQiQvoBLRA7fYnz+h2aop0X5NjcMRPMa+R1eu8j
mIvxhfM8TTnS5lPT5uXbBpt5JBwkTfRpg4Ooefi1xf3blALqXD3JUsJYEo1nAac7SvDj7dMTSjSB
PJECbV7tEO5b1+r6vZaeYjyxrTHed3EGJm497t02GBSBoMPj95OpVS1Ox7Y/2d6Ajcs6Ma/e7TM3
r24QX7n2HP73S7M9I5D2j4bwg9dm4o9vPYsLGwdy1+djyewwhScwIl+0QkwrjUU+nLDJSXrVBVxQ
6c1ydxIhnFHr30SQTDvTBWgIIvDUmHnoMomOGSSICJEzrzQFFCHQbzCTzbv4in4uxi35xZzvSuHT
FjCSWy7JRkEHIsl1OyJ318+GoKPnAakk7crKhcBFC9IfS8/bb7cBm3byZ3W82iLIfRspBIoym5vJ
X9kvabMC3RTSe9OrnWmBkHJjUkqHfAKNS6oX+QeebCmc50mSNm/S5pxnvQORmIK5kob1ieRONG8r
zLD/6bQtXvuBXJtHPvn2dDR1FXnborLy9Wtb0VDhJkQ/f7cBxzuK3ZEmnVnOdXibXepIbYPlR0PB
hMhTncX4+XsN+M7t3maOn7n8HF7cXYv9TWUJbZ6QwfGtj6uxYs4AGipHUFcezVGfZ2IeyY+b6qtO
+49xoYVC7fvB71zLk39S9K9s+bbRKvctVwENNf6OJ6GE6t3bn72+o98uZqStL4PrGREvfQo8Rv4y
D4EnSOTEw+xRfO19/9HeJmw6zuGz5SfyXlYFvgkUiHI9R+V1yP8ckkoyX1w0Kz1ZOd8NPL2Z5xXz
o/koifCAQzlpjxQgjX85ExnaMvTLrfRpHkrm8bTgVzCatiR1XJiHWjYL5ENO5URz4TxPkrT5IG2O
z3bSJpzs1KrZApDATuBSOroVAm9LZ/XoR7uWJU3bnjOVeJpC9XtosRACrpjfi7tWt7t+61BLOX62
vcEd4MNLw6a4iaBnvylpOI5XP5tmjr94ZzpuW9mJi+e4nZlqy6K4f0Mr/vKXC4xIl7Y6sXNHYgpe
+LAWVy7oQdWCUSNKZs44WwbWsFN9AqOV42su4X4YfrRctCr5ifVAe7d/m3s/RHDBDP9aNtI0nT2f
XoAK0q+1pqYt6FigiJdXLvO/WnskScTL7gCatkWzgbJ92dd2ZlM4kNEjJ07ICnL8RJhL5UsyYFos
8mMWSWZq2/f5X1QhrX1tVXbGcJC+od+lojcHawfS7CyZAyye7ZO09bhJW74L1l7tSITd7z2PN6bX
cTNJWlBIl75DJtfOY9LmmMOTadpUL5k2KI8ptOLfBhI+73xsrUMBEn64qRH9o2F7xEiT1EQYeXlg
YwvKIlHbeXToo1tnomc4nNCyOaoTf9EqwplenxWP75xFScFRzS/6RlQ8tLnR1Ky4r3LHpeexbnGP
QURdmkB2D0fbSvHKR3Vo7wuP+6hI14/2rH5Tr1AYZvJTaw/gp0bBS754M3+xjPX3P3cDsHEVUF7q
77d7rfp2Bx8FqbB0Lg8oErT+ly3nTuN+zCNJ6GttN/PBOa5DL+bT5/zl1yJhg9pr4uba4Gto2b7+
ZLjniW5DTLAMMNF180OuaGGordskKT6uGYlwszu/pobZ7MtZ07j2MGg70CLQikX+6kzt0dLJLQbG
MuYmcj2fysxp/L5LivKzvkbC75k8onIhz5uy+OuzeJ42kYU7E23Hv0sWWlK4RkEo22z2Mh4efy6a
66TA8Eholzle319n5DHTY7BrnxSeEPum5R3YuLTD9Tvb2Dmv7a8x/F5cigeFJ+Yl/yNn1/lZEE7V
j/QdEc3RUW/G/+YhqlctblnR7jpXRQy/f8NZvH+iEoMaz0NHZND6PVpFeOnDWqxd1IXaJVFWfy1L
fe5Y0tCRfGkDDt10/G+56kTatg2X+te2EW6+gnILAj96kft7BAVpqD5zPfA7G4L5FFBd/WjZgmpI
SOCheyLhjHy6/GLNcv8RL0+1cI1asrofPQtcsYyHoU4FIokUYYxI3vDIOM+xAVdzJ0MgkiBSUU4C
kUxCn7ag7Z+L+pE5YUWaBNqUuyxtwCNxQWUmX1TJVt8GtRag+YPM12ku8QsKxkKaNj+guZEWnpxa
R9q350j2+4jmwhl16QNUkckm+Vp75cskE8OBIXtbks8YFT/4+DTw45d56oCxYt3FwJdvBRb7GCM0
jsjf/NCpwniepiJSadrgRTOc1CRO2hw+ba6IkbCbSKbMAyARGN39YTz85izDNFBxmh4yYbWiKIYH
r29mu+0N3D8SxiNbGjEwovLzxEAhJgn69m1ncNn8bmi66m0yGUQQEc5VFQ1nukrwP56Zj+6hsOvY
EU3B42/NxNWLelBd6p4ZV8/rwaeuaMfP3mswgq6Yeb/5rbO/2/sj+O2uBlwwfcAzqMmEcv0pPs7J
sf6ND3gC7SD5aq6/jK/QPsGI21t7+IvRD1ayc+6/E1g+P1hC6ZOtwOvvs7HU5a/PgnYr3Y+hxevy
56+3diVw8eL0JMvCjoOpzYpI0CLTyXT+hdRmX2Pt18IEpY9O5C5v3pjaM8M8bflINnKRpy2XQlYg
Hzwg//O0TaCQSWSByBClyEm3UETzAAVUuuSC7C1sBO1LSstyiJGMU63eBMYJsjC4+iL/C2cnWoDj
ze46v/weL9kGmS/edxsjOytTH/c+m1t//BKbEzvSX5O0WESIFjT6v+cTzdkZg9Qvp1v9kTZ6D1Ad
aXEhpf9yAT1PkrSlSq7tUChAd19EJGwisStI0pYsVItrdkzRgroHu80Av941A/ubyqFqjiY0ydfn
rjyH5TN6Xe378r46vHus2u5naNWfTTQblnTh/vVnXGQvW1g1twcHzpbhsW0zuYZQ/H329+7TFXju
w3rce7XbaJ7q/OU1Z/HGRzVo6Ym40huQYPnWkUqsW1KFuvIRVGQjBUAQTVuSIicwjlfYC/eC2TxB
aVmJ//Om1wB/fg/wpZs54dl3jBfnS4Y0Wdeu4ho9MvvwE61SBJkN/ugF4MhZn1q2DMg4OeJ/50vA
95/lfiyptFhXLge+cTeP8OUHZNa54wD3B0lWr4MngbNtwJyG9GSW2v2+24H/eNpfhLFkAin1xzJG
np/alP46udbg5NpnbjxIRrZ92nIdhTPfo0eSwEoCdjbqR3MSaWOsucnvNReYWplU2qtQiC/i0PPk
e27LQch/CoDy6WtZfdgL+dm3gO6+1ISI5m0ibX5AudkOnOBz8HiNF7/3H2QcU18u9pmbjd47x5sy
sybxAs3vRNroHebn96meNP4/PJIlEib9+POCtHkn1xZNGx352JxaNsXxQ4kTC6r5kB1/tLElq2vq
KcWPt89INKXYnhTiv2YIX17b7Lp+R38RHnlzJk+z4BRK2XnlkRi+ef1Zdhktp6341XVNeGVfDU53
lkLR3IPxsS0zceOyDjRWu9UqC6cN4AtXtuD/e2MuDEWgljC3o2v1DYfxwt5puGROL5bOGGDf6Vno
cy+pyr+PopzAzBdyFHj0eR4JkUxsgmjALFL2hRt5yQX+6zX24vo4mDlgJv1KmsbvMBL6WybwvPAO
e1k7TDHpewrEctsaoKbC/3V3HebJd1MRTuoD0iRSRDNqz3RYvQT4lz8EntvGi9/8eXQPt7P633o1
j5p5rIkLEHlBNgI8+gUd8j9HJCfXJCongg5SE4tvfio7v0uC7xMv8XnEEsippPOnvXoF9+F68hXv
Z6yazQN3XRPc1Hss81S6xZgv3cLb7pdvcM2bqHWj+hK5pPouCBA98egZbqo3Ooq8hN9xTO3iNwAJ
maCTdixbfURtR5pKIm5+2p6OoUWD3R9ncbELUuYZd9IGN2mDS9OmIDOvuIIM958/eGzrTDT3FnHi
5WhDMrH42jVNmOlBeJ58eyZOdJQmzhNIs8IE6E9f0YZL53bnvP7Tq4bx4MZm/N2zi9xDIAac7SnG
41tn4y/vOup5/pfWtODl/XU42FIeXzBQhAG941glthyuw4yqEdSURfPgqZNj3UJPHxu/z3NBPp+i
am3ZzX3ZUq0aZ7Nvyczp8zfyQkSL/DgosiMJQ0SmigM6r5OWjTSZbZ3p6/P2XuDGyzmx8kOc65iQ
+JXbGIlkBOyd/UBXb8LMUhRSaiu4AFDP+nbudIdJp19v9kyjcuTi+PGMgKBn+Xg9B9ccy7FTaQ4U
7pXmE/IxnZYm1QgtaHxyAzfrpnmIni8iQeTLSgL1pUuApfOCWw/4GgsZ9gvVmfynqMQ1jAP8uW+s
T+8j5lpQYvPfzsPAvqPjPFay/JzQ/VOfkZ+cL9JmmjNm855Pt/gnbdRPNH83VPOFAynz5OF8kqxt
/eiPLNImXoc4gOr4rCGu+Invt6LFA/bo8YA7IKATSr60n06JazW21aHpZlRAM5OtppmRAnV7oeNV
xqZs+41rBR/Vu05V49md06DFuPbS1kZsQr90Vi/uXt3muvbhc+V4aud0I/iIK8E5E9xmVQ7j3jVN
sDRDucbtF7fht7un4f1TVUauNhtYHZ/dXY9bL2nDKg8SWVkygt+7vgnf/tkF0BSHT59GvE/Bc7vq
ccWCblwyJ8qIrD6G/tYdW6vp3P1sFTj3ATnWXRYWPj4DPPQb4A8/zYT7GRNfn/cOcA3gmbbcv+89
FzFqg/n5eeGZLTw4wLCPFWo65kcvccEqCHG2NIBjeYdrWW5PLeCzFZRraOP07GoBuaQ2wfeZSX3H
cw7UMHEyo9g/5CNLWhS/z1kQDU222j4bbVVRysoY602m3W/uBvqHx7e//N6/33G8iFIb+Ay6QkSV
xkhTR3afjxPngGPNjFBf4tNEcjaPdNnalZ0xImWe3K4peHmYpYrgT1A9g+bBndjNqbIr5GIL/q9b
pm+a+bdm/i2axWme+zMtI6MKfvjmbPSOhBOEzQogwh7MEPuNB687g/IiR4h/nbRss3CuN+J+khR+
b19e14x5dQMYr0CkFcVRfH3DWRSrMTcjZ3XsGg7jka2zMDSqep5//dI2XLesk6cAcI5ijUhqGV7a
W4/zdM8TEGjV6nte9EnzDGSrUJSs7/0E2H/cv/9YLkDJbP/t59zXKuP7meBJncwiX/8gEYDET6Fo
ZWQOmi0/Ct/ELYvtmfG7JcAC3fi9W4ItHGZ1XGZaZz0Hdc5mm07QgynWgRaCjpzxl2bDL0jQJ40I
afCy1bcTPYmROd+r7/F5KW/His9rLQ6Qm40IPfmzkUljNu+Hrke+bS0+53daPCWtnKoW7vMui588
bclicCB5tHuvCPnJPuseCYontJAmTUtWqcQN244TbtoQ4E2tXCZl86F6bD5Ya1/2sPqA9cjNF7Xj
2iXnXed9cIK0cw08cIfzwWPnXTy7D5+9rHncG3Tj0nbcuLzDO/caq+umQ3XYdKDW89yiUAz3X9OE
ykiMj0bFXsjP7dX99djfVIGRqDLO9wbbQJcTh3f5iBG2f3yCB+QYGWcrVnqpkf/ID57lPmXjQdjo
Pjt6snsfFFjk+8/wUP9B6/7qDu5Xl+06jelFjhwLBrkgmeNM3PLmPgNqWcd9jplgwmYJz7s+BvYe
y95vvLkLeOldf0Qw2wslRECpZBM0/zy/nScXn7D3UZaeE/I3JPJT7dM0lObtMS0YpijkR3zcZxJ0
w6ST1XtGbWEtdk0J0pWEF6X8DHefqXoSxzdP8oXkOQR8788jxjr2iV0wpfOJgRGueVJDcNmWUr4y
Ii/3rz9r+LTZVuViKn7w5hxEdcWmmbNQFNLxuxtPo7x4/D1/KULlg9eeRVVxzD4hmOOHeO9DW+ai
eyjief7l8ztx2yXnDS2j07+SFJwtvUV4ae80tPbkMrulwzQS7n6VE0/ycuYc8PePAr/clIE/WYag
Fcj//XPgp6+YCbTHaTWftIu/ejN7fgNE2Og+Dp/i2spM6k/E9T9+xYlrPpC2XK/mZpoHLp+0QlnX
nORai6VPLU2bs4/oOSWi5SdUfDpQsAgiOGfPZa/9g7QTETaaw/YezR5ho3n42a08AFRejxM/WrbZ
PCedH9CcTZo20sbm4r6IDJIWz68lC9Wb6p+t50nKNznWC/jcbyEsWgsqjvD/nvnZJkUQEt1m6pjw
c0pmAmmdpblZRZJGMIJqeDjwPfXBdOxvrkwk0hYImGoEEWnGytlu/6+X9jbgvePV7miRdC4778bl
7bh2afuEdciFjT24d00zvr9lDvSoUDczBcCh1nI8taMRD2zwzv749fWn8fr+WvQORxClE4X7pLZ6
bX8d1i7uQn1FG8qKRjPsczj6007SbGaRnv0vIymlXJAYAn7wDDdV/OKNwLWruY9EtkHBOn61Gfj1
luxpl4L2Kwk8FEyAgg5UV2T+u9v2AD/8DXsxN489hxqZJVHi2gc/CaxYGDyqpx+0s/YejuYg6iGC
a6kyEXbHQ8hHFu8j18lwA4X8R/7nacsFYbNA2jaa2yLsubr3Vh6EKVPCRkGcyByaIrMGNenL1tj7
wMwFSXkwL5yXeTsRYXnyZW7aPRKd2P4KSty8QCkZiPgs9Bktk7Rsx87mLlImkWBaFKXgVOS/nA7z
ZvC6U1qdWGxsz5OMmJ39uUx3/u2RgQxp87Qp3v2pp4t0X+CkzTKBtEweif0YW2G/IcBTwBJiYBSE
hMwiVcFMzwx7aISsVxNkQPEIuXK6oww/e2e2sWJi82UjsHNnVw7hq2ubXE9JR38xntg+C6OkZXNO
TGyCqS6O4v5rziBCfmUT2B+fv+KMYcp45HypPSiJWeefvzcTt6w4h7m17uyP82v7cc/VzXhm9wyc
7Syxkz6GwVgIz+5uwMWzu7C4IYM3g5CnTffI0Sb2tz03GzeF5UWXE5gP0Krg934MPLMV+PR1wPpL
skPeKKrZG+/z65KNfzYTRWeSFJmCsOw5ysOMU34cJUCUJVqVpVxyW8lhP4v54ylU+Z/+Ow/T/cWb
gCVzx07eSLAjAeDFt3nwGT+C2VRNrp1tgShT88iAb8K8I7+2Nsqj/iEB+unN3If3y4y4rb3YfyRI
8mGjZ4gIjuWDmolpajb60roWLRpR1FsioYHyxoFHmqRIt5Qq4My5PHieENw80guNdZz02KLmpgAF
ILFMI3MFy0TSD2mjgCUW6STfwrE8T5K05QFpg5emzQPO1GuKwNeQ5LuCaj9dd0cSdEUYtJvI2ffb
r6UoSvx7xZTerKAn1ndPvjsbp7pKeE4zj0b72vqzaKxyE5pf7ZqJA+cq7L5s5vlE/r60phkrZ3VP
eJvOqBw2kmb/w3MUDdJ+f6QtO9tbjCcZaf2L24565l37wlXNOMEIX0d/BIPDIdcA23myGq8dbEBd
eRNqM0gBIPabPWqk10TlbfYq5y//+OgEK4/zlx8lZaYk05SclV4mRZH059PK5f4T3Bzx3f38pTWS
R5kfqFDofHoxblgF3HQFv8+SJFa85LtCq+tbPuQkKFdmpCQc0qo3/c6qC4DLlwGrl3KTmRIfFsZU
TxJCTjTzKHA7DwUnliTEkXDrRwu577ipvcvR9UngGR2n9azDjNT+YpO/RQoS3PUs3ieZvPUOBrtP
MiumfH8f+/Bv2p9BP40VdD/0rIxnoB0CLQql6h+K8Pr3jwHrVgJ3X8Of+2R9RGbUdA8vmIseovbj
Y5/jhbQ5hgl4lvpSHCuH2fz1Tz/mfrGUj5HmaEpRkGxuofluqzmH0TyRL/B7/+meE9pP/ot+oxDT
wt2J1tw+FydauGUJjTu/C21ez2qQ5yndMyAxJu7m+XeqqJFxcXjJDT/RFTUMRY0YRTW3vBQ59oXj
W4ovb/xNW4o1b2xNpyS25cQlEU0irnlSvJIDjC/uX/CXCIfDiBglxATICCKRMNua+8wt3x9COGQW
dmxIZcXYqkb4f5W2dH8K/6wonLRZ92sRuL1NtXjg0ZUYGA17BhG5dHYPHvrKhy6ftBPt5bjv4VXo
GCiy96LZtFXFUfz7Pfsxv24A0Zjij4H7gZj7TUkw/jhp1xXUV46grMiufx8eDeF3n7gY75+qdceX
ZSfT8Q9/7UNcOsebZG49Uo9/efECHGsrs2t1FZ7SYNG0AfztXR9j9dwuhNVg4QotMmYnbbCldNDM
bUxjW/Z2HaUyyrf/vOkGHGybLmedbBD8ukR+MfpMQgKZ2fQPcrJGgnY2NVDpsGYF8MefA+Y3pj/2
33/JBTDKZZTsvoiokvats5ebFJKgRsKE1znjBcrjM6chkTtu6Vz+YqcXM5FhImlEJDX5lpaQCIz6
aq4JoWeMQBosImz0fLnM1PIYNRX8PirZK3hmPdfc0D3QXGEJ8hISEgFpmiB7Jv7WzPRSlmtWzNBw
6FbRRvnfWmzzBz9deD2dFnbSOmcGbpu8n96Vq3DaUNPtmjUzqqQV/MLab1hLKjyqo25pyOhcw+FP
iUfZNNqJ7VfMCCKWiSTPAxfCDzbPxYAW9myzsKrjG+tPoZx8tXQ7c3p821x0DBW5VZsw+hsDw2F8
95fLxk6B9QA8mh0XG1WxuLEPD244jXWLE57ZxaEYfm/jaez6cTVixEYdprQDjNQ9smUe/v2L+xkZ
dJOutQs72fU6cZ6R1J6BcMK3TefaOiJzr+yfjnk1A5hR5V+q1/UkyxrxgCe6/RnQdHvSc12aCmQT
9PJvac+7adX/wUnGgnhfpB3MJxgJggXtHmnQJCQksoPzXbzsO1rY90ELTVQkJCRywN28uJzjs1dg
yDhf0J3CuiDIKh4kLv4dCleAtedgE/NxqaZvm+bIz6UkiqniSuzXDA0jN5FU7SaRJnF77eA0bDtW
x/3eHO1N2qMbLmzDtUvPu0TH907U4qV90+052RzEalRTGMEpcZEq2+hQkMiK7jjG4p5eed88rycc
23mqDkeersC/fHYf1i5KhNK7an4H7rykDb/ZO4ObdIrEjZ275Ug9Xj/UgJuWtXgQ2Bg+saoZu09X
4cBgJWLmuVRHxYxE+eLeBly5oAMby0ZQHI6lJmmufhdWOpxjwMrX5xgfohmlJG2TdD7NMBqhhISE
hISEhES2SRu8o0c6POAU0QROt0s1ChDQdTlvW093RI80CpG1OIGz9msG9bIRN1P1liBuMP3WNNNE
NOHnRmaNj22bhygjVy5fNiOIyCgeWH8KIcVhZhhV8fMdszAQC3kHgBHvJpaEXOm+x5E/OP3UKHn2
SBF+9t5srJ7bjZIIZ3PhUAxfu+YUth+txfn+Ik7chB+kgCoPbZ2Dy+d1oLbMnaRmRWMPbltxDq29
xWgzw/xbLnDUhp0DYTy7uxFLp/dgQf2AL+HZaRaZGPL2/uZHmGTdSd4FuicxKefTYMRNNpuEhISE
hIREhrJGKp82L1kjHDcNU2Fq10zhVXeTOVvCYVOQ1tPL9vnXaMJ96JpA2jQr8XYiAbdlbpowkeSm
kYYeTaFjuEZNV5W4iaThw6fw3/nNnlnY11KZ0DiJjcTK76xqwkUzukzTy8QB3QMRwxRQ05KQMCXJ
aFACSKEexya1kkxiOmmYLJ6rRHd/GMVVI/H9S6f14FOrmvHw9vmJyyuJc/Y3V7G2mYmvXHXC80Zu
vagV7xyrQddALaKjqque7x2rxbvHpqG+rBnlRSO+qbp1HVuwETHpuu7sf2ErtSuStSF/oxFKSEhI
SEhI5K+MoXswNN1D04akedoc+cS8kiMno33ch8txsKIUxBJ0QoumCtoWLf5Zs2laHJq2uDbOMpVS
zM9C8BVWOgeL8Ysds7yX5dkhc2uGcO/Vp+JfiprN0kgU1SUjieZMRs8VD4ruB0nInZKKrHlcm+pX
URxFUThmmpYmcA+7t5f3TcOp7vKEts28BpHRx9+agxsuaMXs2n7XdRur+nHHxS040VmOM+0l9t9m
5w7FVGw+XIsr5rdhYf1wapImknXYCZu9sH7X9LhWzQhMIhSvSJMSk2Q+zdO8XxISEhISEhKTmLSZ
QoiNrFkxNhyRzG3JtdMWG2nT49qkQjMUIsFcU/WEdoX+1mBEDoyxD6qmImRED2R/K4zWKfx44zz2
OUaJ1tQQv39jP1eD6TyVm0mCdbx/fBrOdJQmeJzQTJRI+96rTmJ6xYBntLbSoig2LGnDB2fqDL83
PZaEeOken1N9l4aEeV5f8f7OChZ6/YWtjLiNuO6jvnQQX1t/Ev/9+Yu435x4OmvCc33FeHT7fPzN
Hd4RG6678By2H61DR990DFgpAATSd+RcBU6z9p1VPYBISPMk5k7B3J7qwYwaGY8eyfdprK1jGidw
RtGtrRTUJzNpC5rjSI4FCQkJCQkJibGTNsStHJ3pxezJtVPK7rrdv03xWzfv5NL5I6AJ/mq6mvBn
01VB42IWxpY0xjiImGmMjWkq900zQvsbLIVC/5NQz1MAaFBgBfw/3FbOCJcCfdRRAUZ2ZlUOYsPi
Npd2SsRnLztlJJr+zd45GA6HXB3l1JJ6DRBXN3ho/JJ+r6QWciPsbu9Y0YTPXUEZHDVPIfaWZU14
ce9M7KAUAB7E87d7G3Eru8bl89yhBMvCw1i/5Dw+PFuDgZFS2z3Rx47+IsPvbXgUnuH/3fnXdDtp
A+JaU01LhP23NK6a4OumCZpVCTn3Sp82CQkJCQkJiQy4m40vefC4pLJGwjzSXD7WBSNKRWB+ca2a
KLnqSaR5RcnvRiMBnHzRGEnj5IwRsxiRMvZZiRnEzNqvx/crRsJo0sLYjSBhmFiCkQa6hmqq2qhp
ugcVe1MImqua0hFUlwxxFWgSlIZH8d1bDmDNovM41FpjnqsnqLSDmCnWP5puo91WImsjgIp5tpkG
PJ6DTbGuq5upCgAhAbZ5l+bNGASHtc/Sxl5suOC8oeVKxj3LI1F8Zc1x7GuqxgAjtoojouVgLIQn
3l2IFTM7jXQBTjSW96NEjbmIJF0nxvpvZFQ1+sSrHZ2J0TlBcxI4wVRWc5hExmKJbUyaR07qiTQg
C5OaNgkJCQkJCYkgbE13CBy6ZbqD5D5tsGnadFE3B5/mkeYPOH3aAhG2IInBsi+gGam4SKuiqnHB
XTN82VTTLE4ztGfcLE4TNG3snjXN1LQlrqkaOcm4xk0xTSRrSqO82Ty0Wd1DEXT2hxkxG05b32sX
txglX6GlyXG9ZkELrlvSipcOzuRaR7E9GOHafrQeWz6ejhuXnnWd2zkYwUhMsY8/808ii0UqTz6o
JamELj4YXmQNiWAjFlkj00gygU2YRSY0bVK9IiFVbRISEhISEhJBSZvnZw/VWiJ3sMOnzQhaaGrS
FIc8bQa758U8JiE3mxobY7+lt0lAEc0pTUFb8bTFG3/iFte0KUoiyASRMkurpsY4SbO0cKOJ4I9G
CVvZD4RGJV84VTXaQlG5geSC2h6u9fIIznK2uww7Ttbh7pVTI4vlfVcdwTZGznq0IntOONYuUdbu
j21fiCvmnkNZJIqQaepIZqm7Tlejezhiyw9njC/GkevKhlBfPoCQMpqctOm6Y5sga4j7spmETfxs
aNiIvMUQG+WaNuN7OfVM+vnUz7EaIMeChISEhISEhE/Zwe3j5BVj0JaiyvF92ArnbzN99CqCCSWs
FAHxLNuWJA0h1KGYQVpkkk5t3PgTNx3FTCgfMcwgEyaSikG2LK0aCetkDhgPLGKZRdLfMaveVk42
xQj5r5smhIquGDnaVjSeZ6RiGOf7SxKmVCb5oEuQWeBlc85hZlXfpB+si+vb8ZnVJ/DYu0vjJpnC
6gA+aq3G07vn4dZlpzCjcsDYve34LGw/PgM9g0WuMUakjaJOzqjsY6Qt6qnts6cZFImaW+NG/R4z
CZulaeNbRtqIuJOprDSPnLyTqQz5LyEhISEhIZFFtiEKGLptH0wDx4S1YzwQifl9IoqkGD3SJGJx
LZruSA5gFYOo6QKBs8L46dzBSBGIXBISFg9QMsHEbUQvRQmGEyZx5KvG/lENTRsT0o0okFzTFosh
roUUtW3GPYS4ho2bWCo8EqXKCRudX1/Sg8vnteHlA3NddSDTyhOdlfjb5y7DH127D8sbO1wJticb
7rmctG0zcPh8tSsoCbXp07vmYXZ1DyKhEXxwqoERvAtxor0iQcisIaJyondhQycqIoMGsUr1vNiI
AAAAAAAAAFfIfwUYAA5mpSOKNj+qAAAAAElFTkSuQmCC" transform="matrix(0.145 0 0 0.145 0 0)">
</image>
</g>
</g>
</g>
</svg>


Width:  |  Height:  |  Size: 40 KiB

789
assets/thirdparty/online.svg vendored Normal file

@@ -0,0 +1,789 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 24.0.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="190" height="20">
<style type="text/css">
.st0{clip-path:url(#SVGID_2_);}
</style>
<g>
<g>
<g>
<defs>
<rect id="SVGID_1_" width="188.5" height="21.4"/>
</defs>
<clipPath id="SVGID_2_">
<use xlink:href="#SVGID_1_" style="overflow:visible;"/>
</clipPath>
<g class="st0">
<image style="overflow:visible;enable-background:new ;" width="1315" height="153" xlink:href="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABSMAAACZCAYAAADU3y8IAAAACXBIWXMAAE0MAABNDAFsjJP5AAAA
QX+R/LwnHkUsFk9xpWiabRuw1I0MckNTbSok6U+S15NIJLp7QkopadyLhKWVhRVlagevsys12l5n
l/LrQbNsKihjxsAQSH433pNQ0bpz5SIHUu6CgSPs0fzajF49COorr9G9Q23OgW681IvGXWrQiudo
fk7FAne+TRLAQQEzaWFmxY3p382FPKDT9EiOC0cSA26+drijWFadXxkxQiaDUuSyWKKqYQSZNki+
CQoG01detSIgLRJyyhJgxqlqMc8IwP4hmjJy0TzPwW8kA3WRMD2e3AWny4GqOuf7dQ+SiTbzdNH5
7hGwSY6SGAh3ud+TCjGqJtyUG1YgG2429LRmv45jzFnvcfcv5wb63XviJ0ppXYzF7VAurHsnx2Vp
7aRO+SgjIc5gSlTjeCF1s6WtdoJaCPvL3BfCuUbRJQnvVUmZT356vY7kLVWabvlFf7V0KTEQkLCl
lcbSi503U0g8MN+zkZFenomKUhIn+SgNSRLTzHTRBdldYriZanPuxb+PmZ79d2iST1+R6ar5YiHm
MWBMoMy97hdkvPNoNu62IeGlXnCThhtl8RzmWPSf72Us0Ae53Q72vYYq7+gGxc3fvhfSs18f3KRc
OTihdqw6TlbeLVOzz5mQ0b3duLTzJPWf9WBu/5D1nW0MCAaxc99BdHd0oILEWXx4ys19aTtyfEp+
FvAbzxQOIRaL9T55KBI1/UCaHE+SgCTZWFVWbhKOza1tiEQjqEwGmWkz8iaRUKbZZSQjdd08qsoV
kdnS1p6Sq61tymSESkiaekf0mKnQpKqSpuUdXV1ppZC5zFPPmetCetmmlLIQkgOHVzKyqmFw1Fec
hGgeFtaM6pfIM6BO2zHll2kg96DfJk7gnMjI6gx5RgUAzQyCLpMtTlqp5OCi1nPnEHR2DE1MmJf/
jm1PkgguVl0oxfZciDSNtHwbDptFpZhG9ilUxFE9zYN9AEkV+hnjBgd9Sw5UKacXOM8OuPiNpOqw
3liYB7Yocs40YXYgHkj62YPX6EV8NgbnYARct+AN2x8B2hu93Z/X7X5GmZ/WOLlWCCgShmpLBhXL
Bm5mkYik7758QCVHYICqJqr0+P1CRf+mf+ZCBMLgWGcFnCHxyAUq2wbHuEkL1KZf1ejCtRO6ynBy
l9G7EG5Vh17CfZ4brL7HDay/HY0De1auwRgQiiSAU59gRdfNJz9JmvDIV43LqOxUgc5yCJ5hmWof
y0BGUlU50cVEmyQkVZyRQVIxe1VGclziUfSAOmXqcEMsme580hNOkmPDYU5Wavcairwj/8FxlWsr
xzpUroJE5QKqcStdYlxQGcljOMxrM70XZWS+s958WNnk3M/vw8FDh3G0sQlzpk1BrGf4kZGWOXa6
qJOfUZFIM2mSkdazd3R2mn4hSULSlJogGUmT7aAxGW5uaTH9RFrXN7W0muRjRbDMJDctyq+2XC18
GpuaU/K1qUntMpLc1JJ+LElymibhRlpa29owKGbagoEj5DF6tRkYpRJF9/NQP0mpMN1gKsai+f0W
J8IDdcjN77otzphf6eoYLqCoUnYzRafpEo9CgwOpE2HpSWXQ4axKzWfyM5jNebB3kk+GfBsO3e1w
HBKotGGgCNOv6yJl6kd3DjRrrqovjI9rvQj5Zvo9bHH2w00/tvSdO26W8rdUDH+RuaadC4/FFyqS
wwlMU+NOtQj3ujFERRQJ2hoX4oqBbEg6vLYhN3XJYLcTmpz+/UtDH/3bGlOZr2wnNCE228iSPt+D
xfTTxnE76EJcc4M02j345VnoqbAXMpJtgxsHoc6B/w7biZv/as4HvcwJHYmTDnXkC/Z3VG9TlZ6t
rjmZalMNzSNrOSbUhsyhbYNXd+i6wYv7BrNO1Bd/DcANBC+bCGGPKjvHZw/l5ndWMLR5RxI/07oq
HVTC8yh43axXx8kEzz4jc+FnNAxrZWS/QdXtdQ5m2sFgEIcOH8WBI8cwd9aMYZk1Pk0FqtGTfiLN
x9QVGek3GifJyGi0T3nT0tqOjq7uXj+O5vyB0bnLSEb6EeoJIWIz6z5hIxf5d6pHzQB+STKyrS3V
11lzUhlZbdzPTkbW8HrjdUtrx8DNtL2Us/CR+aPToxkoJ/pcuPkCuRMquWDcXG/+z6hozNdfjOlD
a4AKDaoB3IhMKxiF/f5mVNuyoStvM5BFmRosBuqqgn7t4rGRofAr5Hg6kvKtGAvkkzGN7COolpm1
Gph/FrDgHBXgpdAB9gpZj+15ZvmNnObgN5IqIioZqO5zMtGmmWrjntwVQQN5tvqJwOx17otejnWL
LlLBPHLpYytHeRtTmSemSedrg9Om9WHaTrgAbZiizNbnnanaCcnIYpCP2Z7VS/RekiOhrtLu93SP
/VI2U2QL3Fyj25F8rMyoIO1udS/7shq1gTBQP5rc4Ohuz79c2N8deFH1VdnU4Jap9ug0U232C1Tu
NkzNfn/mBe9PJfJg1SGSyWEPhDL7SvZZLA83d0D5gBtWdRPdryPhz7WLnkc9jyZNvYulwC/UBqBe
gvcairzzGvm9WODaLcB5js85mF2p9fFelZEZvjtClZHZGKbCKCO7ekLYsXsfzlt/mue9Hfpj5MIg
MQhm3T6/30xVIp65kvuTviHtHAKJSQagoZIRFhlpU0Yy6AxNtMv8Koo4v2uZVZOYDIVCvddTRWny
JcEyRS7Cir6tdoNN03Yz49T1DIBjXW+mxfic/ierkruYPbZ7576SECZyUMDJDydqnOS5Texp/rTz
yfyjGjqBu8Zuk2DCijKcD7hwyNVfpH0iPpDvUmEz1FH2zDRkIZWLpcQbzqbXg+2vp5TzTR8G6R4O
aTTdNUwBFp4HnPdBFdRgIH4fORazHZuBvxx8+pobN3px6jmVQvMd/EZyYU4/Swx0UzkAf5F6EcqS
viJHT3YnfflMNKcuFqgapTuO47uLXyF1DE7eFrahKCXknNOB9e80yu28gbkoMCM/xlR5+xzaGQPG
ZVtkchHq5gaF6dWGIEhmofs8unxx891tEXz5gPOokAfFokUEZyLNvDyTl4BHXsF+ioThkosc5rEL
1GEnI7npRMWkU/2lIpL3Tgyi5R5dHpEUJrHnZh5tmak2vla89HBcrJ/ifl1rMjJzqc4xZGla+Lzz
1AcXGVYaiknID1V5ZMj3QF4WytpwbghOZJMXKWRmcktLThA2vbgN74tdY6oM426Tc+MrPZEo4pwP
BYJpUawLC59PQ3ckYhJ6lcFgxoWDaY5tmk+nsJG9AWz4OmIjI6OxGKLxOKqNtAcDgV6zawac4fUM
ZtPR2dV7fSisyAmTjExqLzXzvZqUhI17MyCO38dAODq6upVEu9zMG5Uqv9anwgxHYygeGWm7Rjr9
gYMKQ/rAmbnaWbVC0DTl+bvycwLuBJKQ9InmRRlpLlzz9MtEUy/6FtEH2E15WQCld2HcWdN8Q1vm
nAxTZaAPUOFqERvFaHelur9QkOiDIyzfRBmZHQxgcNGHgdXXqWjoOS0eu5KbSG1KoUPSgOaqTps4
1saLXoQ8O5D085hNKcTno7Jw2jLn4FkH0vxF5truvF7L/o9BKNwC1wwGaNLJcXXb/e4KsUFYeJRc
OyGJffqbgXPe5x7ZOR20emAbYZ1iMAOSkSSERjsQHbR2iGTxQ2dF73VcqCZJvOGujIxF3C1muAHi
D+T3rP5ydXgZO+N5WMIUIvKyhSOvKlPqBWdn9zfLdk33AS8/2GfKyk0ZpyjarJ+87+FXBrf+sJzp
goHEnpsrCrYfHseLREaybx4zE6jzEPWe6tETeSpIuflgbkAUaS54UkfTHoK840aSb4i1elYgJ72E
ychclZEOVMzIjqadLdN0eDPXzpLRFWVBvLD1FbR3dqPWeB13UDtqZp0rw4MbNmDX/gO46a1vhD+m
uxOYAwBJRi3gx+/+9HeMHV2Pay45H9GeUMbr/GmRwK0ANv4kwRGzDdihcASRWAyjKyrNv5vhXnQV
/dokEI2OpKW1b/Lf3ql2HamEtO/tlid3y7pDITOCdl1NlbGuiaO1vd28jtG2eS+VRs0MfkOkR992
LPN8zLQF+eFgMgiBGxk5+3S1iOYkIBErfDrmrgMmzHXfnaV5AAnUfJWRdEYfKB/Yd0nmuSkgmM70
xYsZMMjDrje/G4sWp4KbQVTyuK/XoEfDjZgqNk6GfBOCMX/QNyTJlVVv9EZE2slHEitHd/Spc3im
b9krPuNMRnJBnCjSZirVPN0ufiNXXaMWmdn+no+/yFxBkoCbXgMNAlTQyZ8xjswgUbscePWx4tbl
4db/sj6vuBI4693eiEg7+cgzCRazjWxVZwaAYjtxIiNp+pnN3yOVMG5qGNOkuAqDrggpdJ8X9mC6
y77LDCSRx7OagbpcXBqwb2BaonkoGy3ipBBgH8Wo2if2KvVjJnBDfeoyRdwdeUUpqXjtWAeyjwEV
2ZfmE1V9oOCcnocbGUnrqKlLlQ/gYvgLZBvl5oybIMFOoObTJqLhgfuNH6x267VtD/a9hiLvvPqp
NDdTwsUxpTZN00+yxYyDq5vM0bTdKs5w9xfpqo5zemDnTIjFY5g9bRJ6erqxbccurD/1FGNQcRjc
6J8xGMTRpib87/dvw9mnr8KKBXMR7ylsB6yZ85dyvLRzN776g5/io+99B/yBIKJ6/7RRPakUkHra
58pnpBpzE+gz0+4yzcsry8pMklB9TTeD15BwJKl5orml9/rmZrUzTx+Q9ijVNPP2G9/v6uo2Cci6
mkpEY1Hzu/xtXm+pRk2z7qTZNgPY8DecOZt8zLST1wkhmR8OJFWGJBqdQH8xptP9l1UAl0KCjuGX
XJxdXWPH0Z0qamm+rhOouggMMEI4J9BuZkxWwBL7/UPd3sy7920GDr9cnAlf497sKgFPvmJClEYP
f4XfoPvYOUny7WQjQAYzjdw0XPUGYPW1zgFOOFi3HzUG0aNJMmWL6qd5DNRpv5d55EDyjH7Umlz8
RlJ57wRGuj2+Z+CmlF7rP/Of5pVexpnBwuQlSjm6a4M3/81DUdcHu3/hpHH2WmDdDYoAcWonnLu0
HlLzgv0vKIUt2wl9QqcTHfnkk9eow71mfD0oSXgpR5pfu20MkHilD2xu6A50nlLpISAECWCaEudT
/6IFiLycMj/bouZojNiebYFDFSSDkVHp6MVE2+zntw7NmHkiSUa6gQQ0RQnbHlTpLSQ4nyYRSRLX
DSQhT+wzyjTP/tLafBBl5PDIO5a3lzLnRsGeZ4tjbbBnk0rDcFK/u8VecaBiRm40bS/W117UkGmT
p1AogqXzZ6GlrR0PPPkMzli3xhhsucupZZnj6IiHI7j8/7N3JnBylOeZf6u6Z3pOaTS6R0hIHEKA
BBIgxCEQYC5jwBAcYzuOjR07iX/x7trJXtms1971bhLHsTd2HMdeX8EnxuY2NxYgBBI6EQKB0H2N
rtHcV890d20931fVXd3TXdU90z3To3n+UOqePqrrru97vud93xuukZ/8+lH5+3/9iXzn774kDZUV
khgYLMpxaKjrb4W09fXLN3/wU5nWOEX+8PabJQ6R1BhaYEI7IEOp2kaGocKpK8ywhENmaq6u07Gr
W4mR9ZEax61oKbEQod41Th5IuETdz6OaNphUXWPPxbR/LyUu4vs9vX1KkMTnEa7d3t6h5jWpqkbN
QhfYMZQ4qfooHV1KHFUCatBJU4gzMttxQYYPnDZozKPhFCSwLbtDZOfLOsdMsSpGooN45qXaGVmV
R+L//ZuDQ8XzOS7cMO3hgATbQblLBrI4I1Xlwn49YucXrv3eKyKrv2d30k+cHscYz9PTY7tZ42DZ
y3kZ6xpFLv6AvxiG+z5y4r72M5Etj2sHSCmc6MXcvnBonnfN8PM54ft9HaXfP9j+KICiHF1lAjr4
83IUson26dxoI63gicG7TKGznK/JkXp7P12lQ/v9zhOIElvtc2T9A9pV5ueeG2lexWiehT4wyAn3
Mwo7FdIbQA5TtEncKsFYF4hwsWjxrmP5fs4txJPPsYvCJsPK3Wc43w84tiGK9neObJ3iA8UV+lv2
acH7ovfndnbCBQk3JK6JcGLP9qmijTYh5oe2+FiA/YdrD4TfoAIh8+22+qJVuthYtLtIC2Bo9/PC
lf7OZRfk2D25d3TPiVK0Q6wJMK9ibjucwzhG0R7yC9fGAMDz3x7dqvSnQz8jZ85IIw9hJqgyzrjb
Ivmoj0EbxX0vFT6QiMdk2tTJMm/WVFm76Q3p7elTFaPj8dw23tjAgMycNlX++xf+TP7sP/9P+eFP
H5TPfuLDWpCMxVQBmOECl6MZCktnbFDu/9XD8sLa9fLdv/2SzJo+VQa6szcCVM5IVVBHuyERkh23
Ekq3NJxFidrL3Av3pv1aa5vOqQdnZMJ+ErWX2XJylkUc0Qmh1/g8Xm93wrSrwpUyaG+vqHJwGer/
iL2s0YGYdHX3OuHafapYTkgtU0j67ecx+wIxEB9UYeCq/dvVrat5Z+a5dHYPigpVVlSobZHbGRl0
8NMZOWKQN/KtF3Rlz6CGABpXcCu0HrE7zLuLY4PHb6KIA0K0g0Kf0TB950X74GoZ+X5H5w+V+8yd
hVUIhwsZDadIrf/nICj0dacvJyoqQ1wYuNT/+1POsBuF1eXrdivlQMDpmvtwIm238dAWGYtlPOdq
u5M6379YTccxkWe+qfPz5lPcQYzRO/ZybbOgvJFBZMsXWYpz9NxrRabOHfsiYpl4C9l4B6EhTqz5
ceF5RTOBMIYwUGucXF/gYlROWx8nGcTr1x8U+f13Rdqb8zxPjOGvK6JB1Da0/Nspk5v0dKoAMbKq
TuSqPxZZ/iHt6sFxABEaz7FuqgCL3QnvPuWcJ4nS7kdVnKY9eNAUghum9mGIkUgLhGthkBgJB9zJ
A8W7BxcDROVAPITYgUH0bODYhRsSRRlRRRvXHT9x7eCbI089NFzg9Nq/ReTwWyJnXe7/WURILb1T
pPlduy3+UnEGyjBIhIgB5PINujbjPDiwVf9+uaewsDivos4X10FcD2FG8cv5jKgTpOGaqNoAnZHF
6qy5IpOV5W/vJBmvO591Kz6nl52W2GBcbr/pavn1s9+Unbv3ytLzz5V4r3+o9mBvr9x63Ur5i099
TL72zz+QeCIuf/zhu6Rpcr2YUADtm7X+qeCj3jCcsjCmqQS4w+2d8utHfid/Z8/3v/zFn8jtN1wj
gz29Oc8f03E1Yi7HOtpkd8tR6e7Xy3+qR3daXl632V5PXTVw3ZbtyfdWv7tNFZkBcCq2Op9/beMb
0jipXgmbR5p1WMuuE81q+SBIQvzsiUYlZq/nQDQmT65eK3sOHpZjJ1pVQRvkYXl9/06przqo8mlW
hyvs7+uGSfOxk/LA489JTVVkiOgLcbK6ukrOntski84+U+pqqjNOhFz737tvKUQWja1PiFx8m92Z
nBHsjrzsHp3Ee/0vdSN0JJX/0Hm99lMi560MHpEFO1br8JjBIoRA4WaGkW2MorUdKWyZkXcsKK8N
wtmzhTohnAwjyn5i5PT5zvwLyMeEGzC2IVwAEJjxG6NZlfF0E9S43U7/fTnay7jgsmBXHq5xO1/J
U4gUfR0ZzaTuWZ2Rb+sOQpBQkw1co9AJzlaZt5j7B0nnF+cRog0nWndrYQNUQfeZyoD8lCh4MT9L
IRuIiAU57Epw7I9FBAq2ByY/IGBvfzZPIVJ0KpiKyPC3C0RAtHcghvjdu2fayz3rHB0mmK9o6BYy
yuagw3mFYwBOtNcf0OuczeWazz5yC0AECUhYV4iAiICp87leQURfeI0+//O9XunekH0tXK6FvKAc
4VhvTMXoWxYTtO8gSJ61PPf1F9FGi2+2H30KM+KaiZBn5MwdSxBx9N5anRMyqC2O6vZXfkyL8+gL
jCRKCqL00g+IXHqXdhUHgXsF8usWq5CmVQbH0kRpdxZj2+EajMlPjEQ/Vt3nC+g/IUdx9WR9zS7E
lX467ce8qmkbMgGckaUFTscVFy+SqrApv39tgyxbvCjwUFXux0RCbr32SvnGd38iP/3NY3KipVXu
+cBNsvCs+dJYXaUcfgn3ppLrxgs3o/1LqEJ9qqdXdu7ZLw/+7ll58vnV0tfXJ3fdfL0Y9u+gwrWR
ozEPZySKyRxsPSE/WfeCvHl4rxhh5yZofycSicgvH31KHnjsGb3s9kmF1/a1nZB9G1YP7cPY7z21
eq089/Jraj3x23jt1f3vqCl9FQy1XN/+8a8kZBpqPcQyxAqF5OE312edd/Pxk/LXf/dt9b0hYq0K
Q0/IWfPOkC985o/kQ++/XiKVFRQwxgrk/XrjKd0QDnJHwtVz6xd1biQIkhi9jxc4OorR9oZZItd8
SmTlJ+0by4zg70DY2/SwvaxHi7fPEXa+zV7vjhP5dT6x3BfcqLeRX6cbxztEx/7uocsKMRWNfL9O
MfINwR2CEfN8ck9huZZ90O7QLtPzhrsKzlU4KRAC5IZ+uUn4i5HMfTzk2SnWjXqsRpLLbbsVGr5j
jdH+KtdlxDkfJIgg/CzbdSPXeY/CMJGa0TlGcm0zN2/kvIsLD9VGvsiT+0aWYy+fcxTuJBSKCXK0
o3O98bfFy4t81ce18OK3391CNmfkKGQzHjueIyGfXILYP5jyOk8M3cbIt4J6tnnCDYfcehi49Ms/
iYgLpJ1BtEnH8eDfwr537/fZUIVinHDmbU+PLE8eBDFMQbnUlPNvu85bjdDZXGC5ILbt26wjVvIt
7FA/VeT864PzA6KtgvDh9qPl5YwE2LcQI/GYq82MgjAY8PYbjMCgMRyWI60MPVK620TeXaP3d5A7
EscsikuhY4t0QtgOhV6/DSdMH0IkIqMwwB8Ejoeda/XxNl7dfZzXyOaLnPeYcC/PBVJe4Hh6+4X8
RGvDGRzBsYjBUfQxT+3X7RrXlZ7sQ/UVb6BytO7Vfjqh5f/dCeqM9GwprwMyuREtH5dkDsekh5jd
Aa+aMknev+pyefip38vnPv6HUqlCtQPcQ/b3BgYGZfrUBvmrP/+EvPzaRvn77/xArrjkYll+0YWy
YE6TTJoyWSKRSgm77kfPfo3ZyzEwMCDtrW2y91CzrNu6XV56bb00Ta2XL9x3t/z9vz4g/dGo+h3D
R+CAoxFvP7h5rWw+uFvOWbhQFi5ahEyQYrq/anmOJmdeEAIT6lOpj4SUQGp4tnv2z7s6uOmGani3
qfN5iJ5evRxbwPTMK/3o1tvHDJvSdqpVtmzeJF/+5vflokXnyJKFC3I7I/0mUhyUO/L9+bkjMYp0
878XqZuiBUKMjPe15ydyoVENJ8AV94pc/uH8RkOxn3FjKZYr0gUhJ5d8UDcEIQL4OQlxDsw4y17u
j+jvBXWUEGIXzVJ9EyEmbYe1gyLXiDoaaVd8VIeioHJ40HGObYj1QC4fbwgoXCNKKDighU04rvZu
GLtwIEImMnBFBjmBIOblG0aMAZ18XNrIGV2oY7FQDjl5IydXFf693o7Sb/uLbtW5foPAfea1X+Qn
JOUDrvEIZW2c6/857EcIkntGoZBNuVNVl98xHcpzABttDgy0BrliXedgLiCMYZAPRXVynU84d3Ee
HLTbKpsf0alZ/Gg8Q+fHDCog6BawGgmGkf914NhOJwz5Cv9tMm+pjqo5vkcLtUGuSzjhMAi85OZg
xzBccAgfLtfz4aCzT3KJkVg/OJ79QBsPwm85rCPckWgjzlkcPMCFdUObEzkz196vBcLOPAf1MSCE
432JfU2++o/882l6+wDY1ipnfZuQCQqESPSJBm/OPfCJe8eF79PFZt58OjhCDLUKzrvWPhY/kX5N
cl3pmFBgD273917VZo8JwgQtYJNvtZqgyZBs+SV1jkdD7rx+hTz4zFp5fet2uf7KyyTe05vHddBS
xVhWXnaRcvE99NTv5aX1W+TVDVtk7pzZKtdjY8Nkqaup0aHUziIM2t/p7u2V5qPHZf+Ro3L06FGZ
PW2yfOSWK+Xjd7xPjrS0yv/5l1/klX8SVbDbe7pl++F9ahX/+n98Re6664MSjUnZinIIx4YYqwvu
4MDWeXsqK0PS3Nwsn/jIh2Xbli2ybvObsmTh/CzHgkh+1YvIiIFwteG3upGg8jeaAY0R+0Zw3Wf1
RXzdL+1O1Ot6BBuj7miQQojDiYAiMeg0oBGKzjgat7jozz7PP3eaF7gv0UEsJJw6Xy7/Q5GuUzoE
CoJdtsYUlhOVE2/5D3aHcWnwcr/1nH0DO5w9TAs3uF3r9Mien1MDjoNdr+qOOiqD5rqhIjcRGvez
Fw5dLoimmBAeChEXwiRu0EGX4Xwu1afLLadY6zmRtlsQk6brELVSVS3GOQSxaCThYWOxjKqKfcBB
AHchlg2OQb8wTwxYIFRuzvnB1yMUmQhVlvYYRocaefwmF7g93XyRpTyn0GGGQORXwRwg1BTOGzjK
iwXC7rGflJve9N+fuQrZjNX1b6yuWxjUDApxhsCLQU2IOX4CmOqY3qideEHiF0QSP+csfgvtnHOv
1u2ZXMDleM19+nzf/pw+lzM3IkRLFLpB++OiW/wHKXDvVgVOduXYXvH8UrK4xXW8Yg6WA1PmPHDt
OrhNd7yRw9pPlEKuSxzbr/5cCwUDWfpUeB8D3UtvE1n1mWABynXBQSAr12MYxwMGl8+/Lnd+U7+B
pYSTe/JwmRTawHGBwRgMiuB6GST245hdcosuNLXlCZ1CAFW5MZ9Yv3bx4txEHwCDBzgXEfaPgRcc
93BhBp2TLl0ndfTW3o1j2x6bCEVnynnbwUmMcwZ9tTkX5P4c+leX2P2iE7tFju/N3VZEuwhpBzAg
lHksuq50zAuDBXh/z4bTS3LLJrkMESML0d5E/MOODKFuA/r6Vaj2efOb5JePPS3XXLVcFYNJ+Dm6
nJFEOCj7o4MSCYfkY3feLLffsFJ27N4nm7a9Izt27ZV33tvtOAitZDVsEzci+/mLr22QFUvPk69+
8ROyaM4smTW90e4/hOS9A83JNJdBwDWpC9/oBi3m3Xb8mMTLddeieE5NjdTN1G4Ey77xdh09qors
VNjrEY32S0VFhR6sNY3inFBkZKz/tcj0BSLXflp3iPMBouIf/E8tZmJ0FI0zVGJEowTHP+aDXBxo
sCC/DsSxUAHjLWjYIBQEN4HBaPGPCbg8EXYOaz+qciJXEtwMECXRiMII8bR5Ijd9XjsPg/LpwMqv
QrRO5l4OvA+nDsTZXI1VbKNbvqAri6KYBUbmBno8TmZTd65xI73+T/07DACuUozq9XYWZ7uV4rOj
Ob5QzGWaKNst39/AsQ0BvFThLEit8NKPshdOKOdlhMiFRq1fJ+/8G0QW253wrlZnECIxtAMIhx8a
2ujU5+P2U9cYw3/bjPS4HE7eSG++yJEWV/H7PvLa4b4WdN/BPQbX/1gRq5fDdY97IjrgfgIWQA6+
M7MUshnLc3ks2lkQhTH5CRVwby27XeToe9qtmHkeY1/j/ghRBfdHDCIGNrJNPeVaV4To7X5dux4x
COvnjkSoKwY6GudpgQeDtG5EAtYLwhyuQRBlgiItIEIe9AmFjfZmFwAzwTJdcqfIG0/q9C0VznJY
cZ2eAFXXvezbogdO8Z1wxF/khPiKwn4YmMY+gWiA5cX2rKrV+wKRNys+HLy+EKIPbBN592UdPjxW
x3wQWL+DjkjsFzaaU2Br0d9va5ay4b3X7GPiZ/Y+mq0F43yu5difN/yZ3rcQitAPgJjd3aL7Ajg+
IOogsgjHIB7zFSHdNjUitzY/5t9+LcW9o1yOt3Kc11htu4Pb9SAFTDO5BnGSqQTs68/an+p2BlJ9
uQMuOK7R90O/FKnC0I/yvY/v167hU4ek7PEL087cbwH7cII6I3X7C8KgciJaugazpUZJtbNR/Wel
T/g8RLq01yW9qExSs7U/W1tVKffetkr+4cePyu7d+2TRgnkSRQXqXIRC0tHdo4qwNNTXJudbX1st
Ky6+QE2WZUjfQFR6evtlEA1Z5wcjlZUyqa5GbvzY5+WmlZfK+66/QgaOtUhvT2+qaItH8MwmhLrv
IBx61qQpsqRpvqzZ/ZZ89Ut/I8svX54WmV1uJ8TsuXPlL/7dF2RaQ73sO3BYvv/df5b2tlYJV4Sl
+cgR2bxhg5x5RpNcuWyxs+/cr1rJvzP3t2T+TYoHGvS//552Z6ABWlWX/3fRIFnWpF16xWzsbXzY
boQ8XtrQDHRcrvojnVBdVUl8R7t84FxEMZkFl2qxL6gzi+MReXfQwPQLJ4c78a3ndTVJvzB1/P5t
/1E3dCFuILcabqhorKNji47Winu1S8Sv0YjtiMqjcGuWY6N+wtzgJsh64rjNNz/bcICbLh+nX7kt
46G3tEvLz32Fa8z7/lyHRsJV1dOqE6lDwITDD6IarrHIb5RvlWU0ukMlbla2DyNvpJsvspipN7Jt
zwtu0KJL4LX75aGCTDHA4BP2e5AYOdMpZPPW88F5/UbzmjXa1y0cR5j89hk6mwjzhPsKg4j4vCo4
YOh2S+Mc7YZEYQyIH/mA4zbo2EXoMu7xEJeDUszgvvz+vxS58qM6bQtEU4DQbAg9WK6gtA04N/Zu
8nfPoYMNcRHtN7+BDiwPBjhxPkAsQpsNxVUglj/+t0OPfRT32/SI7vTjvPZz9mI9FtvH+LlX2Pvi
sHb3or2B17GeWN98XdMQ55D+B8JvuR/D2C+41qPwS1A00ZDvbtdTOaHcvM/q8wcD8PkMdrngfoQB
e0zFAoN3GESHUaIQx7g1Dq6H491lOVbbDsfB9ud1nscmH4c12ksQyOFU3/yoE4XRrq+TMJngmrb8
Hvuee6m/gxnHIITIYl2PyrE/kmO/hLMWrcncoZbP++N2y1hOu1A/t1SoRsh5tDw5CPV7EAGxobyv
Z7OI1lRXyfHjg6pqtDU4KHdef4X80/2PyC8ff0a+8lefs4/DkH0Njmdp75iqkvS6LdukadZ0aZxU
m3UDG4YlNZEKNWXtC9ivd3X1iPRHZcAZdccyV4RDEg6HpQOh4qY5RIjET3X19CQFuki4Qm5fcrls
PrRL9u7ZLS1HmqXS/n45inJYokq7I7DSbgBdcM45sv71dfLYL36hCuWAzv5eJSR/+t4P2u+fKbk9
w0MPdr3v9URhpMigQfrst/TIJfJuoCNb6nxj2cBNY6fdCFn9fbtRe2R09jMazZjQgRkOGD176Yfa
FRq0vK/9St9MF13r3yFBZxqhTZhQJRv5SxKDWsjMpwI5rg0QP99dW9xR5fHujBwrEXEiOCPHcr+U
8zKiU3X5PbpT7tf4hUhw53/T4X9w6iFEDUJW0wU6lDeoCEsmCI2DMFqsIhA5Q7ULzBvp5ou0Sniu
NJ6p3TjomPiBbbxrvS4AVuxjCPM9vksLyX7XereQDQaf3l1T/udaqWjeqY+NuYv9HXkQP67+uHY4
o6MJcRvt6Fnn6m1YaMoAtHUw+a0rnHpwaMEFuPzu4Hsw2lFuu6JQ0GmGOxFOxqACJ2i3YZrSlJ9g
5MUtbJM5f/SJ3l6tB6fh8gyatzsvCARNi4a37zHovO4BPQhd7NzWpTiGsV8w+IxCNX6Vx7MJHId3
iBzdVX73LLQTX39IF5JCTkecR4UKrcUAjl+kCljzb1oIssbZtY0FbEo3X7Sl3nxGD+z4mWZwT4WZ
BBPOOeQ1hdEE17N86xUgH+47L48PV2S2bZevMzJL225COiMtOB8THtEpR1Ga5OdC6UVOlDgFnSuz
MIvNRYvOlt+vXafCos+bN1uaZjTKfXffJD9+9En54M3Xy5KFZ6twbRUShQnzsP9GKPXWN3fI48+9
KJ+9904JI0R6GMLfkkXnyJa390hbW6dUV1ZKPBaTeCIus6ZNkTObptvL9rpcffkyMVEdGw0A+/dD
lRVytPmYvLFjpz6nnKrUsyc3Sl2kRrr6++Xacy+UMxtnSnRI8mPDjRT3HF2Z8fzZP1PM7zVW18mA
vd473j0gdQP98pFLV0nvYFSFnD+5fYM0d7TKJYvPg5yccSL4FbCR1OtUIksDRnsf+rJuGMKBAweR
OYqNEYzab3lc5LnvaOeMlSj+b0DsREchHCnO/LCtfl9AODlC8V6538lJsiTYIaE6NlW6o5X3RdXS
neAXHYG0GA2LiZIvstjbhJeq8SmMFHMZ0bGCsIAqqw0BjhMMQmCgIp9zXDmiwrk7jHCYuSkhcuWW
K8YxfLDAvJF++SKLtVwolJHP8sAVqar2luBe4+aihAMD4Y9+nOEUstld4kI2VhmfbxjUe/NZ7RKF
qOg3GIpjGtvsjAuDzxPkr1Nt+xxdLAjWaOugTRDzuYcjVyCcWlPn6hzY4crib4OEEzr98o9FdrwY
nBMSA7bIqZ2PYJgJBjqqc7h2IQgiMgU5TyH8or1SqsFptMngxITTNd8CUmOd9xT7BWIkHLMYiMkX
uEbxvXItxoJB7xf+VV+fr/mEjhDKt2BUUY6FDj0ggzb1e2vzy4k6Gm2Jcs2zOx7aYcXcdm4fEQN8
i2/Mb4AW12mIl3JG/v0ntAk2PqSNMafz/vN1RnrzQYqI72vWBDsJfI+fVJi2W536uiuWyQ9+9ah8
/8Gn5d9/8m4VNvyR21bJs69ukf/41X+U//y5T8niJedLQ0WFqoiNkPCOeEz27j0g3/j+/XLuvDly
z63XDXuZ7r39Rvnsf/1b+fnjq+XuW1dKvWmqHJRWZYUsmDNTfv7QE7Lqikvk4iUXSp29yLjsnmjr
lR/f/4Ds2LVPt7lQpdr+Xsy+KLvuwivPOl+umL9IujOq9qrclY54mXCqdLsVrhOOwxSfMRwFMeE0
wHN/z3CqZmf5HlJEmfq11Pck+bn+wUGJW3Gpr6iTG89fqg5QBJ+v3/euEiMHB2P++zIjJHuIC5Sd
/NKApL+//bLOfXjZB/MLKRop9jmnGtRrf2Y3wn9SvIqm2UDn0K12iGqbI2lgY7QNNyw02gvJcbTx
Ed3Au/nzOsynmNvXFSKf+JrdsXtO5+4p5k1svI1Sl8MyTbTtVup9Yo3DZXzlZzo8CPnT8nE2B3WE
ka8MnXgkc8/l/oPbC86A/W+MvDiL33bHIFZ3nnkjVfGGgHyRIxXMkKvuguuCHRC477z7in0dbynd
MbXjZZHLP6zDHv22jbeQzfE95XUsjybvrNEFptB5DCo8FLjsCR36i7YFXMdTc1Q2R3sAjj6EMyPv
oR/IpYg2CsK6MZhYSA68fO7dcNwh5+zWJ3UoehDIKX1st27P+LmuswExElOufQxH0Cs/1cuF9D1T
zyiuMIX5wpmM9tOLPxA58u74OoaR1ufAmzrvXD5tOKyvqsS9vbzvYWh/Y3+gfbvq01rwLyR103Dv
aXA4b3taZPX/09tpvF7b6Iws7Xx3b7Dvpd/R17sLri88YiToHIUQCcMIrn2l7IuWetv5OSP9XpcJ
Xk07FaLtik9DQ7S9W8wSZYcUvzDtxsl18jef/5T8zde/K0dP/khuvuoSmTGtQRadPU9+9Jtn5Mu9
vXLdVcvl3HlzZerUKdLb2yfv7DkgT61eIyF7/r/+zv+RuprIsM+8yy48V/7sY3fL9371iLy375Bc
cfEiJfJteWePbN6xR2bOXSBf+dYP5KarL5fFC8+SHvv312zYKg/97jmZPLlB+vv6lBBpZjZi8XfI
EMvjiAiFTGnr7pLWni6Z0zhdIuGwyr95uP2UWvymKVPVvKKxQTnS1iINNfXSWFuntmFbr/09+7tN
U6ZJVYVubBy2PwNxdk7jNCUwIsz8cOtJmVxTK1PrJ4thz7u9r1dOdrbLGY34XqXaV80dbTJoN/Ln
2PMKGyFt4LAXIGT/dtLsmuUMsLLua+eYkISwivYoglHbJ79uN8zf0bk35jmhT8UeIUVnEBd7OAHW
/UrkjadHljMrn8MDnQ2Igas+pRPho8NaqPvTvWFBPEWjrf1Y4cu6/kG9vKvu00n2ixEWD0cN8qr8
7h91BUI4c8ZDZ5andflvN+s02WZjBUIpX/o37cBCQa/hihhI6o9O8NP/V4st93zFPxQZjfUtvxPp
bi3dBmpz8kaemUfeSHQ4VVGuEuaLxPZFxeUggQDLDKG2lNdJuOmQRL/pvOBOE5yREJUwIFiqfVXu
5zFyI+PeBefLpXdq595w7ovIpXhyv8irv9DCxnV/kluMBBgUhLiI8Fm/jYTjFuHaWM4b/lQ7JHH+
jfjeHdU5F9fcr9sn+YYqI4IEA6wLr7LXb15hy4HlDkpjgOMXA5t4XPnHOoQ+6Dv5gO0Il+Drv9Ht
qPESCpl5LUPeTaxHPuH4KEQIIVKd32UOjj/sG6QLueqjupJ8Y9PIB9IygQiJ/gYGYJAfHWHicEiP
TE4o7+tgvsURrTKcV7ncQzAo5F5HEEmi+nGhER6LMd2WWftznSKgnApMlaodkNMZKZJ/JW3vgTOu
q2anwrNVGLaVkTPS87oSplDoRlvwdJEb00iF8hrifD4VubTy0gvlsx+9S7741W9LS6+lcjXW1tTI
tGnT5NLFC6Wvp1u+9I//ImfMnSszZ86QqVOnSU19gyyYUS9zZ00bUXVDLObnPvZBeWPHe/LbFzbK
rhPapRSNDkrzyVb5yle+Js8+86z8w3d+KMuWLZVJk+qlsXGazJgxQw4dOaJ+O2SfYMp96Jlvjb38
kybZDYKKlH5dUxmRh7a+Ji/t3CZ/9f4/lBVnLZKTXe3yo9eeV9vlS3d+XGY3TJV1u3fIt1Y/Ltcs
XCJ/ev0H1Cb73Y6N8tz2TfL5G++SVYsukpauLvnphhelN9ovX73nPpk1uVG27N8l//zSE7Li7Avk
czfeIRX2cj219jl5atsG+Ysb71Tfa7O35c9eeEQ6errkf9z9SZk1qUEG4toB6VYcD4VCnt1ueTav
NWRfZ4Zq65D8hFPciHpFSUGewvW/Fdn+e53v7NI7dKhw3TTdkR5J4xs3EHSO4e7Z8oQeGYcoWYpQ
uVyd58f/Xjcml31AJ2qvyDNsGx1XOBEgnq57UM9juECQhBB77X0i51+rOxPozBe6beGQhjiKeeEm
ijwnhTgiS5WjsBxDAgtplFncbmUnYOTaluNhGRF2ibDqlR8XOWeFdsvlW2AGKSDg8EIeQtc5AnGr
tVlfv3I1xDHQgTx3uGZlC/8t1jGM5VmUR97IgwH5IgvdxpnzUQVObsmvAMOOl7RIXMpjJzaoHern
rQwWLNxCNm+WSSGbsRr/PfS2yAvf04IIQvEgIuYr3iN1AfbpgTdEXv2lLiIEsP0h+OVKzwJXJIrT
RJ4LFqfhWNz2rG6zIIQZ9258fzgOHSwv2kI4L5BPetszWujM+/txXdQBhUeQ5w/h2vkOGmObVmKZ
Tf+2F1zM2JbIdXjNH2vhE4PTcFWaBXpoIBJ3nBDZv1WLXchN2dc5fo9hpKjAvoN4HtRugwgJF3kp
B2KKCY5z5BzHgA0Enys+rO8nk2foY30k4g/uRbjGnTxgX4ftY2DDw1rwLlVY9mi2s+iMHJ3PvrfO
bhPZ/Z4VHxK55A7dDqquLzzPKe4LuGdAEEdfdNOj48sRmWvb0Rk5jG2YFo6bei3zMa1Sdtrr6fNy
Q4e1IGjIheeeJbNnzZS//OJfyvRZs6XSvob++ec+Lzdfs0I+cN0KefrF1+T2D9wmd//Bh2T+nJny
T//yPXlrwytFO0LOnjdbll60RL7y5S9JqLJKWltOyL0f+ajse2aNnBnT4uhnPvMZuWTZJTJrxlT5
5H2flt27dQW+sCNGelfSMHXotuk4ugxnPd84tEf2txyXA/a06ryLVH7Jzfvfk3BFhbT0dMqipnmy
5+RR2ddyTOqqq9X3Efr95qG96nt77fduu/gK9fjWkf3S1dcr7X09smD6bDnU1qK+W19doxyTlZEK
2WZ/78Cp47LreLPcvOQy6RsclM0HdklftF+Od7bJ/OkzJdbvCQV3llNvlWxVtK0sx4awgvZYglFL
5B6EWwEdg4VX252qBTppN0LL0LiPVOtCCdkaY2h0oCM90Ks7oZgwmo9ONRre6GCPZgPEK0g+9ne6
YiXEAQityQZ2aKgQgA4DQor2btZhWnC7FCO314FtIg/8tcjFt+o8nbPO1rnelGuhPnvHAuIjOjFo
1GL/ICfd5ie02DGchj1uxHBb+d2AVdXSAtcXywfBOYhihpIXY13h3Mon/+dE2W75Lk+p8duW42EZ
AYQDXHOuvFfk0g9qEQHXHAgDaiDCHNph6zqlHd0bHtLiinuO432c88gp6yeE4LoGR0u25crnGIbz
Ox7zX+99m/W1cWpAXiaE4QblS+vr0gMrEC4KPUexDeEyh6MnKF8uEuF3tpT+mEDS/V2v5+d2g5sT
glKpxEhcMyCKVE8e2XFcaiDwPPS/dBg9nFlzztf3RISLYuDQK4LB0YJjBueJyjv5nB7k9LrtcGzi
2PPLI4ptD4dNvk5ZiDRH7ON04ZW6QwxnJXK01k3JLdagHYvjGucwiiZhGbGOWN5j7w2/LfPij7TI
p1LrnK3bZ6gcm9kxx2/jOMT2wrmP72JAJGhfo42GcxxFhubZ67nkJpGzLtMDqLUNTkXyyND2CtYX
bT+c864D7o1ndOX4kXT4248Hh/KOxjGsBMYdIhfd4i+YQ+yFcHno7fHXB8DxgvY/3Gjn2sf6hTdo
pzdc/tj3ONbd/Z9NCML5qfoAffq4w7UNuU5xPOF+BoG2WPsJwnlQOwD7rNC2E5YPx1PQMdd+vDjL
qNKwdJbfvMZ622UC9/sz39a5HS+7S0dnqH5cg3OvyGLugPg46Ezoj+K6i3sG8hWPR4f2SITLgqpp
+1XR9n7eS76vlYkQmXqeHpqdWV07ezVt8bxuOgVVkL/QTK5wXW21DESjMv/MebL44qXSeuKo+kx/
f79y2U1rnCxnNM2WpUsvFquvWwYGBlSIs1Ukp9bUKQ0Sje6UCxcttNsp02Trpo32vT1h9y/2igmH
Zywu5513rlxs/37nqRMSHRiUCjHFXgoV3myIkVW8Ta6/I76annDThCPSQvwLJXM9pvJpQhxMCbYp
UVMcN6orHqr8314hVLk0dY7IpMCIfZHQHw458zcyRWZfO0v6Qa33pZU8HtJfZzXtMQEOnDU/1dOk
6TpX2YJL9HM4FyBMovGNBjD2/mCfLgqFBiganSpx93YtvsFNOBYCpGRcQ+Fa2Wo3st55RY/4XnST
DkNCoxJ5xzAyjE44cuegqh8ch7hZ5yNUFST22B2EjY/qado83eDD9m1aqDuN7nbF72J50NFGZwbb
FJ0shAqNRJiCKAyBxM9Rgf3Ydaqw8w439of/d7CrBaHlo3U+57Ou6IxCgLG43QpanlLjty3HwzK6
IL/i89+zO2Iv6nxc8xbrHHloRKNzh+sSzvOeVu0gggMg1zmOXJRIbxF0DGIAwxrmMYzrX5CbEddH
3COCtj/EDyyP37zQYf/NV/zd6rnOUdxXnv5WfscBzp/RcCkh//IjX3XCeQNcG1geFSVQomWBqP3k
N4NdfMO5bpVCBMEgG+5xSBVzxgXaPQpnVu1U3clExxLHAvINQizBZ7OJXBAiIaT5HRdIgVCoUxb7
663V2kEGdyQEurMu1S5Y/BbOKxzHOC4hBLqpaSDAQMxEB72rCII4OvKrfyiy/QXtAj3ncr086JBD
uI0PpH4bbRhsLxTdK7TzjX2C6xEmCLf4nfnL9LXLFQEgAKCNjvXFeuM33MEK/HYxqmUj9cSejWN/
DKsBow59HPgdW/gMQrSDqqOXM7hvob2MCdey2Xb7dP5SfZzBiY7zEoI+DArhCqe9at/Lot1aKMe1
H8cdXMulEopxTgW1A3CeF3qNxXH08v0imx4PuNbnMd98ljHf+8Boz2ust1020E/zXpNwDcZx6RpM
MNjipo/BNQnXePwWrru4/ua6Z4xXgTFbf1d8HrN8d2I6Iy2lnKU7IS1xwnVT0QOWE34NHdJyowoM
57uG5bzh0Wft1+EgBLWRSiVGtraeUjflaFSLCe5vNE6ul+aj9sEY65PBeHzoThwhTTOmycmWU3Kq
tU3qJk2SwcFB1ZAKV0WU0xC5IVtO2BfrxIAqcKMKyDgCZNgpEhOg6Kr5oeL2jEkN0lg3SeL2BkIF
b4Rmh0Mhqa+qkQG7kzPVfg+faWrQjTnMGd+bXt+g3huwGw/VlVUy0/5MbSSiwr/hnmyorVPfmzG5
Qed/xPca9PemT5qsitpE7BsQfq+nv0+tV9xPcMpmGfbs91T0vqX3cSJYzSejBC7mnS9rcS4TlbfI
1GHDsej4WB9V9fQVPbmgMwE3JES/0QofB2istmQ4etChQYMPN89SLI/anyeLvy4QWcvBrVaqdZ0o
260c9+N4XMYhwtBOPW18OP36iY4rrkn5nOcjPQaLdQwXc/v3O87I0+k4aDlYJve67uKIQaOJKkLT
rIVzFzdHWL4Dm7hvljIcGMIKjjtMCLd2gWiDDrHbiS/1ICycQpiQ90x1fqZoQRJCUrTI+x3b3hWn
kr3YSr2+2B6l7OBj3uUgIKDI0syztLvdD4hwGGQZq0H4YoNzCYNYmDLBuYlBdEQSlfKcG83z3HX3
FcM5V8xlLNd5lWrb5XNNwn3Ce6/AfcJ1xI/GNXi86G6SW08ZWk3bkzIvpzOy3PMnBW0Tz8p5nZBw
OWpnXCLNEQdXZHIS1yHpvq4dkTpU23TCgC2prq6QqqoK6ersTLPsus69KZPrpd1+Lz4Q87gQi6dG
NjbUS6c9/2h0IO336yojMrPWvmjHEtLR0a7fc95OiO6IQEg08jiuUG377ktXyvlN82TxGfOldyAq
k6pr5HM33KHeh5DY0dsjS+edLZ+85mZZOHOuEgshdN6x9EpZMG2WLJt/rnT29cqkmlq5z/4MhFkI
lD3Rfjl/9jy5b+UtsmDGbN2mHRiQDyxdIXMbp8vyBQvt3+uXmkhEPrvq/fbzAVXAJgrRNfeOFydA
O7kWluc11x2ZVsjG6wilIFmelEunK+iECTp+yinJOJxSx3bz2CKE109CyGiE1xcDhD72do7d70OE
7G4bvd+DS6llgoQ6or+GsFAUnvLLnwjxA65IhHNbE+TcHC/nJzn9gcGstfn0X8/hOCNzGLzCVp6/
kzYvr3g5TreglRGarSaIkGkhu85zMdIFSccqmRIknVDiZKi2SKSyQhom1UlrW7v6jYygZ2mcPEna
29skFotJKC2hfHHuHPXVVdLX3ydRexKnGA2WYVKkRqbW1kt1qEJOtXWosNbMkOxwlgI22YDrETkh
l8xboITCgcFBVYX7xgsvUb8GsTCeiCmB8kPLr5XBWFxV1sY6njtrjlxwxjz1vUH7Nfze9ecvU8vS
NxhV866rqpZ7ll+jRM9+iIz2hj57RpMSP6P4PXvb4XsojINwcYihCcsK2O+JtFyh3n2dEiAdETpD
lCaEEEIIIYSQUQUOwAuv1zk0/YBbC67I0yEUlBAybrHcf/J1RibdkBkmPa9L0vLOxDuzcVZZO+ly
c0J0k8Jjwvk7+ahzIKaHausQbZXF0MBndD5DyzSSodoqHNoMyaS6Omnt6FAhv+521ZWZLZk1faps
ePewEvBqK1K5k4qledVUR6QyHJLurq40ZyRCmxGOXRep1ssWS9mHE85OdKtpB4E1hmg4ENNuRFfA
hCiY/Iz9GkKue/r7k4eKZPke6Mv4HnJEwiGZ/J792mAspqa07w0OFHxSuEVqks/d40DtH+/+Tz3S
GUmGfTW2hMcPIYQQQggpDBQXRAXfi98fnJ/2wJt6YpuTEFLKvm2u/q54ikOLR0fM0R8OZ86vkALC
md+zjPFjmEy5Hs3Uc3tKqKIrCTWpvyFOmemuSFWURb3vbmTtjoTH0HTCuFEkZnJttbS3tnmSTbpF
VSyZOW2qtJx6UwYGY1JnpBTgYjnwqqsiqohOa0dn2k7Br6MIDBySrada1Pqlqk07B4VpimGchueN
uz8td1tbTtEdy3lP73sroV/HZ70TIYQQQgghhBQN5OnGhAIscSfdFCLtUAijfqrIBatEVn1SZPa5
/vNBJXEUBmneyW1KCBlbrKFPs3lzwnl83/f98Ri2rUQn00o54vB3QjwClOm8lvobn1d/wxVpOLIj
nJGm/q4KxdY6ZFLImzl9qhw/fkxtIMP72/bfyOnY3tGhnJFJMTCjivRIqKwIyySIkW26iqR396AY
TENNnRw7flzi8URygd1fDuEGeBqOqKntm1G4yBUjEwlJuiBT+91K7i88ElLwhZIQQgghhJBcNM7R
4dcwpLhFPUKVIg0zRRZeKbLkRpGpc4Pnc/gdkV3rR7+QCyGEZOsWW8F95nDyTU8hG2XOM90wVmNo
0kmfsOzx4JBMyweZzP0IR5ypw3Qt1w0XtyfDeW4/2iuVMFI1X1zRES5IiFnIl5hQ4do6ZLu+tlqO
trXZ2yMkhmk6m0//bmXYlO6eboljBCzkhkUXzxkZDpkqVBxFbFRAtekGSGsxsipcoYTKhLP8qki4
812EacM9aTr5L9U62s/xvZBpjpsTwl1+V+zFQ6pITeo4SHhCtBOJVI5I7H99XDjOSApPZCRXZx4/
hBBCCCHECxyPt/+lyNnLhz+PgT6R3RtEDmxne5MQUvp+beafGWkdrVx2yGxh2vn81nDcj+53yk2Y
1AKToYTIpNCYDMdNf255X1dipKHELa8YCZekmAjtdsK07Te6e/tlx+79cvHyq8UyjGSRGMPeKHAj
bntnjzTNmi1VkSqJewuqFMGBB3Ht4JHj0tbZLTOmTxdvhDFcfl3RPjnS1SrL514iobB9CAxG074P
DW8wHlOFY1xQBRuvufkaxwOmEXK2q96mg4OD0tHV42jpllTY647JckOznX2NHJfYRwzTJoQQQggh
hJSMvm6R/u6RdGxFDr8tsv0FkbZmbk9CyKiRVlMm47m35IaVw5gTdgVDb6GabEVsXDekW9BFAiJ5
yzly290wOh+kmSxmkgzJTgpQZtIhiUrtWoBMiZFJ+df+rmmZ2hlpaRfiq1veltfffFfuvOejsnHT
Rmk9eUq6e3uloqJCWtu75BePPy9Xrlwlu/fsFWvXbjly5IiaYzFEr31Hjsv//clvZMBeJvzexg3r
ZcfbO1Tl7s3Ne2T7iUOyp+uk/NHZZ8vmrVulq7NTOjpTlv7m9lPy1NubpKOvJ1kcZt3uHXKyq0NV
vx4vuI7IU1163V54dZMcPnIsqSKf2TRLrrrkAu0KTTj5QtNESC0cx+mMJL5X4BJ8lhBCCCGEnP70
dOhpuJ3a9qMiGx8X2bWBbU1CyOj3f60s1bM9kddWZmFsD2H3Opas9ixZKmtn+d2kiCnBEYjue+Ui
ThquM9JIuSAt5XxMhWTH43ElLpqG1wGZCtF2SsE428LQ7jpTC5KG/ffmt3fKWXObZPULz8qal56X
wcGYVIcNmT29UQ40n1Ai4dHD++WfvvkN9VvHT5yQyy48WwmfI+XZtZvk0edfkckNDfK1r39d7aye
3j614M/v364+E6mKyEO/fUgef+wxicXicvDgQamt1hXa9reflB+uf06vs70+eP3J7fYNbvt4PWP0
OvzysedSr9jrdcniRXLpheeqkHkIkfEMZ6R+jKupGPuFTOALNhuHhBBCCCEkk97O4ed57Dwp8vLP
RdY+INLdxm1JCCl9v1ZyF9TO+p6Vuyuc2xmZUYLbm0fSLcdiDYkXt1Q4sp8rEt8xjLEVKCMSl2hC
53fUodo6N6ShxEjtgjQdYVI5Ih0BEnJUmhjp5p403Erb+jnmc/eNV8vlS85TocGGk4ezpnaVnDNv
tgp1/t9f/LT09PQmd5dphuS8s+Yp4WukLDyzSe686Rrpj0YlHnPmN7laFp05K+1zAx6X4/yZF4kZ
CiX3o7uDUsV1xrGa4jhZ3RyRevUsWWRv71DIUCHoygWZQ4xMOBMFJVKMizchhBBCCCGKvi49FQJq
DkCIXP+wyJpfiLQyPJsQUvourJWtSrZHhbQci6TXDel9L1NTyu2MzHA9WlneExnqjMyM8hYr5SxM
6pye18aCqsSA9JuhVJ5AQ4fpQoCM23+YyhmJv+NafIw7RWnwfjyV/zElRpqOGOkU77H/nj97uiyY
MzNVPMXz+5FwRFYsXph12RAmPFKuXnq+cvzF4onxVOS8xKdO6gRwq2jXVkWSQqQrQrqipBYj42p/
qIrjSAxNSCadLSIth+wraST3Z3raReJ01hJCCCGEkAyQLxLuSDhXjIBCoeiPdJwUObZLZOuzIpue
EDl1iNuQEFJyrCyuyORr1lBx0lvMZkhuSYchzkhcBy3T87dnBt4Zqb8dt6R3YVzXY6bgmMstORZm
oRqJSptVJQnLLUxjKAFK5YKMi8RRXdp+YuKe4DojDee5aIekVnwNCUHcMrUnVIlcBlyV9iNq2th/
p4mRbrj3KCixFSHDnkIT/IRJl+4t8VZSt5RrFXvO64rMFCPVc/sx0tcpFp1tJJMNj4qxZ7N9suc+
16xDb9sNzR46IwkhhBBCSDoYsD66S4y3XhKpnmR34irtKWJPVfZ7MZHBfl1sFI8th8Ta/qLIW6vp
hiSEjC45YrO9QuMQ7dCbM1Ic/dBDmjMyGXIs6cpm0g3peRQjXaBUk5EK1U5b3jIIzfYyOdYjh8L1
YlpOjkgVku3kjIRLEnkCRYdqe/NEupOlhEhTQiFTbTMlSGJe9n+miXkk9LwlkSx4I+LJPWkYPJhH
43zxqIdeV6TlWIdVrlA8Wok0d6RbSdsbql3Z38kNSobyzlpqjIQQQgghZPi8+qBY6x6yO6kzRSZN
0xOewzXZcdyeToi0H2ekFiFkbLCG/pnmhhRPiLaki5LJrwc5I91ckd7KN8mZeR2TkiFCytAFsQx/
0THTZTaa+tzM3uOyvW6mEg1VJeVk+LUkw7Kx9Olh2s76KPHRtLdFSAuRcEWaCOk2tRCJMG88xp2q
24ZkFSST6y0UJot7nlhZj7OUI1KSgiQESF1F3UoKkimHZEIV9sHfsXhc6tsO0dlGCCGEEEIIKT5w
SMLtSMcjIaQMyBYVamX5w1tJ2xuaLRlVtH2raac5H8UrKjohyJmvWYbHDZmajMwV8CSQLBd35Jnt
e8WsuVC5ICEGKlEy6Yo0dKEa3BMMT8EaZ8u5QlbIFbXsJ3BBIjQbj5iPmTCSAmRSjMxY4yGiJN2S
IzxZLN+TR7shrSF5I4eIkfGEqoyOR4iQyhkZi8mUY+9yIxNCCCGEEEIIIeT0xUeITDMlegTJNMNi
FhES2ktmdZTwkDyQuXJHSg5XZMIRJM2hAmZyyTIEycz1M0RGNR9fTX+HzJQeaU7UOWKkFgNd8VEt
k5Mj0l1Y5IVUT01H1DLhijTt9bfnYD8ipNs0tTtSF7sRjyDpui4zBUkKkCU7f9JCtK0h+SItp6J2
mjvSU1XbW1G7qrdd6tqODslxQAghhBBCCCGEEHK6YWX5O1OMTD230vTChDMFOiOTM/bMVLkErYwK
2s6jmrEgp2LGdzAjN9Tb/sc0sguSyUIuOVZyNDi/Z680K3ckitd4lsl9jLsiqV5D5IhUjypPpKke
EwktPqrJCdNGJW3MA8Jk0hkpboh2SnykDjkKJ48nRFsfZylnpJsvEi5IN3+kW007YXnzRSak6dh2
MRKDjNImhBBCCCGEEELIhCGXKzLwuWiN0NUKhzojZajgmMgUHrO8bolbtMVT3CUj1BtWzJyCpMiY
xmovPLlDts1bIC1WvRKikuJg3Jv/UhenUc9D2kEXt0IqT2RIrVsi6YZMPndCvk0zkRaenVm8hq7I
UThpPKH1qZMn5YTMGartcUVGop3SdHAzhUhCCCGEEEIIIYRMHLIUqsk0JKYiUdPdkAlPiHZWZ6T9
Qsx+LZxSLD35IN0vQ6hRxVns56akHkXriaazMAn3BUd4NI2heSPTk0qO3TY1E4OytGu3vDhpicTN
kBiOTGt4NzqmkPunlaqcDVek5YZmp1yRphOunVm4xp1nphBJPbKE50ymK3JIqLZkiJEJncfAU8RG
4jGZd3S7VPT3UIwkhBBCCCGEEELIhCFNiJRsLkgrGTmdsIaGaCcyJi9hU6zOhGU0mm5YtscNmfyS
5Hq0lPMvnizy4miRhjfc2/6MJak8ipbHRSljW8jm7FM7ZX/VLNldOUOcutqere4pA+QKVkZCEqGQ
hOyVN1UFbSdfZLKadsIjRqZESRGPAJnDKUmKdLKk5Yp0TxyPIGml545Md0a64dv2Yzwu07qOyNyD
myhEEkIIIYQQQgghZMLgFSJliCPSSnNHerXDZIi2W7gmJUamSSthw5Au+8VGb1j2EEHScGaC/IpO
ERrDSlWaxnNV68UjSor7t6Udknrp9QcsSYmSyY+OgSYHd+RVJzZIz4yrpblyimiPp7PBnarZqSrM
lnI/KqEqGZrtKVrjPHdzRLrP9brlL0RSnCzwBMlR+cgbmq2PPivtNSU6enJIWp4wbUz1fafkgt0v
SXiwn2IkIYQQQgghhBBCTnu8Ekv2EG3LE6KdYVh0pvhQV2Tcnvq9vxMOiXHS/uCZ3pmrfHrZwrK9
j4b+wTicj8mwZEklkHRsj65TUoVzOy5JEY9TMmOFR1uKq+nrlGtObZXVUy+RU5UNSlRV28FM5bp0
BSvTMpxHe0o4Idr2cyPuPPfkiTRM7bLMDMnOVszGC8XIQk8UK/C9lDMy9bqVsNLyRybcv+2DG9XW
Lzi0Xmo7T1CIJIQQQgghhBBCyGmNleNvb0h2mm7oefSKkfHkY5orcjCRsDq880eY9tEECtF4qmTj
edxxNHofDSUmGskwa9fdGPcEXCcX2MkXaXo+J+J5TAqTRsbro7/RG7qPyXWyRdY1LJZj1dMkboRT
65MM64XwaCrRKhWW7QiQYqTckU41bTdM27tOmQ7JfJRXipMZJ4RlFXQmeUO00/eno+ZbiaQgadjP
J/e2yPlHt8rUk7uGVHsihBBCCCGEEEIIOd3IlFpcF6T7XmbeyDQx0hUgRWuH8YSVFCYdcTIqhrR4
5x8OGcYhHbaaKkqjn8NHCZENMxIxTIQ1O45Ixx0ZT3ND6rhty/k7mTcy6YrUC21k5Jc0nJUzPEVu
xkJ+a+g8Ktf3dcrrsy+T5sg06ZVqpeRahlPQxzJ1KK9yQxra5em4I9PFyERaARu9bhkipOQnulKI
zHWSWAWdTEMFyfQCNvgjEuuThu6TsvjweqnrOkkhkhBCCCGEEEIIIac9OV2ROapp53JFus/jQ6eo
/YWT3t8IhwxrjSXG56ABep2R7vO4Iw6ajsrpdUeKOKKalVpky+OQdCdP1PaQaUieSRm7ojYVAz1y
5aHXZN+MRbI7Mks6qhqk34zY622Km1PTUO5ILGPCI0C6OSONdEekZHdIJtczQGw0hGJk9hMlWIzM
WsjGdUR6PhOJR6W2r13mdB2WeUfelHCsX+c/JYQQQgghhBBCCJkAZBMkUwav3GIkHuOeEG1ETiuH
pOd1+3udplhve+cfnlw/8Fjrqcgp+wPTTCOjYrYB55iRJkrqkGsrJZThTVMywmKN/MRI541MyW1M
Jbj4oCw4ul3OCL8ne2YuliPhKdJXUS0DFbUSM0P29girytsGcmpa2gGZDM1OOGKkux45Kmrngk7I
YZ40AU5Jy3MGhRMxCVtxVZgmEu2WmdFWmXvsLYn06fQFFCIJIYQQQgghhBAykQhyRwY5I90cka4I
GUskXZGI2j5kVFqveOev1K+fLa/8TU/c+FDI/gtT2FDJJJ3nhnoedl4POe+Fnffc73gniJrQJ1W1
6eTzlACZJkZKmYmRWeiony3HJjdJd6hWes1KGZBQKi8kltfUVbjdStruc8lRRTsXlCKLc9IMed/5
QNg+Leri/VI90CsN3SdkStshMRKD3ICEEEIIIYQQQgiZsGQVI7MIkdnEyKQQ6fztCpEx7Yxst8S6
[base64 PNG data truncated]" transform="matrix(0.1433 0 0 0.1433 2.884035e-02 -0.2508)">
</image>
</g>
</g>
</g>
</g>
</svg>

After

Width:  |  Height:  |  Size: 58 KiB

View File

@@ -1,12 +1,15 @@
 The [dataset_info.json](dataset_info.json) contains all available datasets. If you are using a custom dataset, please **make sure** to add a *dataset description* in `dataset_info.json` and specify `dataset: dataset_name` before training to use it.
-Currently we support datasets in **alpaca** and **sharegpt** format.
+The `dataset_info.json` file should be put in the `dataset_dir` directory. You can change `dataset_dir` to use another directory. The default value is `./data`.
+Currently we support datasets in **alpaca** and **sharegpt** format. Allowed file types include json, jsonl, csv, parquet, arrow.
 ```json
 "dataset_name": {
-  "hf_hub_url": "the name of the dataset repository on the Hugging Face hub. (if specified, ignore script_url and file_name)",
-  "ms_hub_url": "the name of the dataset repository on the Model Scope hub. (if specified, ignore script_url and file_name)",
-  "script_url": "the name of the directory containing a dataset loading script. (if specified, ignore file_name)",
+  "hf_hub_url": "the name of the dataset repository on the Hugging Face hub. (if specified, ignore script_url, file_name and cloud_file_name)",
+  "ms_hub_url": "the name of the dataset repository on the Model Scope hub. (if specified, ignore script_url, file_name and cloud_file_name)",
+  "script_url": "the name of the directory containing a dataset loading script. (if specified, ignore file_name and cloud_file_name)",
+  "cloud_file_name": "the name of the dataset file in s3/gcs cloud storage. (if specified, ignore file_name)",
   "file_name": "the name of the dataset folder or dataset file in this directory. (required if above are not specified)",
   "formatting": "the format of the dataset. (optional, default: alpaca, can be chosen from {alpaca, sharegpt})",
   "ranking": "whether the dataset is a preference dataset or not. (default: False)",
@@ -23,6 +26,8 @@ Currently we support datasets in **alpaca** and **sharegpt** format.
   "system": "the column name in the dataset containing the system prompts. (default: None)",
   "tools": "the column name in the dataset containing the tool description. (default: None)",
   "images": "the column name in the dataset containing the image inputs. (default: None)",
+  "videos": "the column name in the dataset containing the videos inputs. (default: None)",
+  "audios": "the column name in the dataset containing the audios inputs. (default: None)",
   "chosen": "the column name in the dataset containing the chosen answers. (default: None)",
   "rejected": "the column name in the dataset containing the rejected answers. (default: None)",
   "kto_tag": "the column name in the dataset containing the kto tags. (default: None)"
@@ -45,7 +50,9 @@ Currently we support datasets in **alpaca** and **sharegpt** format.
 * [Example dataset](alpaca_en_demo.json)
-In supervised fine-tuning, the `instruction` column will be concatenated with the `input` column and used as the human prompt, then the human prompt would be `instruction\ninput`. The `output` column represents the model response.
+In supervised fine-tuning, the `instruction` column will be concatenated with the `input` column and used as the user prompt, then the user prompt would be `instruction\ninput`. The `output` column represents the model response.
+For reasoning models, if the dataset contains chain-of-thought (CoT), the CoT needs to be placed in the model responses, such as `<think>cot</think>output`.
 The `system` column will be used as the system prompt if specified.
@@ -54,13 +61,13 @@ The `history` column is a list consisting of string tuples representing prompt-r
 ```json
 [
   {
-    "instruction": "human instruction (required)",
-    "input": "human input (optional)",
+    "instruction": "user instruction (required)",
+    "input": "user input (optional)",
     "output": "model response (required)",
     "system": "system prompt (optional)",
     "history": [
-      ["human instruction in the first round (optional)", "model response in the first round (optional)"],
-      ["human instruction in the second round (optional)", "model response in the second round (optional)"]
+      ["user instruction in the first round (optional)", "model response in the first round (optional)"],
+      ["user instruction in the second round (optional)", "model response in the second round (optional)"]
     ]
   }
 ]
@@ -81,9 +88,14 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 }
 ```
> [!TIP]
> If the model has reasoning capabilities (e.g. Qwen3) but the dataset does not contain chain-of-thought (CoT), LLaMA-Factory will automatically add empty CoT to the data. When `enable_thinking` is `True` (slow thinking, by default), the empty CoT will be added to the model responses and loss computation will be considered; otherwise (fast thinking), it will be added to the user prompts and loss computation will be ignored. Please keep the `enable_thinking` parameter consistent during training and inference.
>
> If you want to train data containing CoT with slow thinking and data without CoT with fast thinking, you can set `enable_thinking` to `None`. However, this feature is relatively complicated and should be used with caution.
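The concatenation rule described above for the alpaca format can be sketched in a few lines (an illustrative helper, not a LLaMA-Factory API; the function name is hypothetical):

```python
def build_user_prompt(example: dict) -> str:
    """Concatenate `instruction` and `input` into the final user prompt."""
    instruction = example["instruction"]
    user_input = example.get("input", "")
    # `instruction\ninput` when an input is present, otherwise just the instruction
    return f"{instruction}\n{user_input}" if user_input else instruction
```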
 ### Pre-training Dataset
-- [Example dataset](c4_demo.json)
+- [Example dataset](c4_demo.jsonl)
 In pre-training, only the `text` column will be used for model learning.
@@ -107,15 +119,15 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 ### Preference Dataset
-Preference datasets are used for reward modeling, DPO training and ORPO training.
+Preference datasets are used for reward modeling, DPO training, ORPO and SimPO training.
 It requires a better response in `chosen` column and a worse response in `rejected` column.
 ```json
 [
   {
-    "instruction": "human instruction (required)",
-    "input": "human input (optional)",
+    "instruction": "user instruction (required)",
+    "input": "user input (optional)",
     "chosen": "chosen answer (required)",
     "rejected": "rejected answer (required)"
   }
@@ -139,67 +151,19 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 ### KTO Dataset
-- [Example dataset](kto_en_demo.json)
-KTO datasets require a extra `kto_tag` column containing the boolean human feedback.
-```json
-[
-  {
-    "instruction": "human instruction (required)",
-    "input": "human input (optional)",
-    "output": "model response (required)",
-    "kto_tag": "human feedback [true/false] (required)"
-  }
-]
-```
-Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
-```json
-"dataset_name": {
-  "file_name": "data.json",
-  "columns": {
-    "prompt": "instruction",
-    "query": "input",
-    "response": "output",
-    "kto_tag": "kto_tag"
-  }
-}
-```
-### Multimodal Dataset
-- [Example dataset](mllm_demo.json)
-Multimodal datasets require a `images` column containing the paths to the input images. Currently we only support one image.
-```json
-[
-  {
-    "instruction": "human instruction (required)",
-    "input": "human input (optional)",
-    "output": "model response (required)",
-    "images": [
-      "image path (required)"
-    ]
-  }
-]
-```
-Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
-```json
-"dataset_name": {
-  "file_name": "data.json",
-  "columns": {
-    "prompt": "instruction",
-    "query": "input",
-    "response": "output",
-    "images": "images"
-  }
-}
-```
+An additional column `kto_tag` is required. Please refer to the [sharegpt](#sharegpt-format) format for details.
+### Multimodal Image Dataset
+An additional column `images` is required. Please refer to the [sharegpt](#sharegpt-format) format for details.
+### Multimodal Video Dataset
+An additional column `videos` is required. Please refer to the [sharegpt](#sharegpt-format) format for details.
+### Multimodal Audio Dataset
+An additional column `audios` is required. Please refer to the [sharegpt](#sharegpt-format) format for details.
## Sharegpt Format
@@ -209,7 +173,7 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 Compared to the alpaca format, the sharegpt format allows the datasets to have **more roles**, such as human, gpt, observation and function. They are presented in a list of objects in the `conversations` column.
-Note that the human and observation should appear in odd positions, while gpt and function should appear in even positions.
+Note that the human and observation should appear in odd positions, while gpt and function should appear in even positions. The gpt and function will be learned by the model.
 ```json
 [
@@ -217,7 +181,7 @@ Note that the human and observation should appear in odd positions, while gpt an
     "conversations": [
       {
         "from": "human",
-        "value": "human instruction"
+        "value": "user instruction"
       },
       {
         "from": "function_call",
@@ -252,6 +216,10 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 }
 ```
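The position rule above can be validated with a short sketch (role names are taken from the examples in this document; the checker itself is illustrative, not part of LLaMA-Factory):

```python
ODD_ROLES = {"human", "observation"}    # 1st, 3rd, 5th... turns (0-based even indices)
EVEN_ROLES = {"gpt", "function_call"}   # 2nd, 4th, 6th... turns (0-based odd indices)

def roles_alternate(conversations: list) -> bool:
    """Return True if human/observation and gpt/function_call alternate correctly."""
    return all(
        turn["from"] in (ODD_ROLES if i % 2 == 0 else EVEN_ROLES)
        for i, turn in enumerate(conversations)
    )
```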
### Pre-training Dataset
Not yet supported, please use the [alpaca](#alpaca-format) format.
 ### Preference Dataset
 - [Example dataset](dpo_en_demo.json)
@@ -264,7 +232,7 @@ Preference datasets in sharegpt format also require a better message in `chosen`
     "conversations": [
       {
         "from": "human",
-        "value": "human instruction"
+        "value": "user instruction"
       },
       {
         "from": "gpt",
@@ -272,7 +240,7 @@ Preference datasets in sharegpt format also require a better message in `chosen`
       },
       {
         "from": "human",
-        "value": "human instruction"
+        "value": "user instruction"
       }
     ],
     "chosen": {
@@ -302,6 +270,166 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
 }
 ```
### KTO Dataset
- [Example dataset](kto_en_demo.json)
KTO datasets require an extra `kto_tag` column containing the boolean human feedback.
```json
[
{
"conversations": [
{
"from": "human",
"value": "user instruction"
},
{
"from": "gpt",
"value": "model response"
}
],
"kto_tag": "human feedback [true/false] (required)"
}
]
```
Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
```json
"dataset_name": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"kto_tag": "kto_tag"
}
}
```
### Multimodal Image Dataset
- [Example dataset](mllm_demo.json)
Multimodal image datasets require an `images` column containing the paths to the input images.
The number of images should be identical to the `<image>` tokens in the conversations.
```json
[
{
"conversations": [
{
"from": "human",
"value": "<image>user instruction"
},
{
"from": "gpt",
"value": "model response"
}
],
"images": [
"image path (required)"
]
}
]
```
Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
```json
"dataset_name": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"images": "images"
}
}
```
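The rule that the number of images must match the `<image>` tokens (and likewise `<video>`/`<audio>` for the sections below) can be checked with a small sketch (illustrative only, not a LLaMA-Factory API):

```python
def media_counts_match(example: dict, tag: str = "<image>", key: str = "images") -> bool:
    """Check that the number of media placeholder tokens equals the number of media paths."""
    n_tags = sum(turn["value"].count(tag) for turn in example["conversations"])
    return n_tags == len(example.get(key, []))
```

The same helper applies to video and audio datasets by passing `tag="<video>", key="videos"` or `tag="<audio>", key="audios"`.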
### Multimodal Video Dataset
- [Example dataset](mllm_video_demo.json)
Multimodal video datasets require a `videos` column containing the paths to the input videos.
The number of videos should be identical to the `<video>` tokens in the conversations.
```json
[
{
"conversations": [
{
"from": "human",
"value": "<video>user instruction"
},
{
"from": "gpt",
"value": "model response"
}
],
"videos": [
"video path (required)"
]
}
]
```
Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
```json
"dataset_name": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"videos": "videos"
}
}
```
### Multimodal Audio Dataset
- [Example dataset](mllm_audio_demo.json)
Multimodal audio datasets require an `audios` column containing the paths to the input audios.
The number of audios should be identical to the `<audio>` tokens in the conversations.
```json
[
{
"conversations": [
{
"from": "human",
"value": "<audio>user instruction"
},
{
"from": "gpt",
"value": "model response"
}
],
"audios": [
"audio path (required)"
]
}
]
```
Regarding the above dataset, the *dataset description* in `dataset_info.json` should be:
```json
"dataset_name": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"audios": "audios"
}
}
```
 ### OpenAI Format
 The openai format is simply a special case of the sharegpt format, where the first message may be a system prompt.
@@ -316,7 +444,7 @@ The openai format is simply a special case of the sharegpt format, where the fir
       },
       {
         "role": "user",
-        "content": "human instruction"
+        "content": "user instruction"
       },
       {
         "role": "assistant",
@@ -345,7 +473,3 @@ Regarding the above dataset, the *dataset description* in `dataset_info.json` sh
   }
 }
 ```
-The KTO datasets and multimodal datasets in sharegpt format are similar to the alpaca format.
-Pre-training datasets are **incompatible** with the sharegpt format.
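Since the openai format is a special case of the sharegpt format, converting between them is mechanical. A minimal sketch (the user/assistant to human/gpt role mapping is assumed from the two formats' examples; this converter is not part of LLaMA-Factory):

```python
ROLE_MAP = {"user": "human", "assistant": "gpt"}  # assumed openai -> sharegpt role mapping

def openai_to_sharegpt(messages: list) -> dict:
    """Convert openai-style `messages` into a sharegpt-style record."""
    record = {"conversations": []}
    for message in messages:
        if message["role"] == "system":
            record["system"] = message["content"]  # optional leading system prompt
        else:
            record["conversations"].append(
                {"from": ROLE_MAP[message["role"]], "value": message["content"]}
            )
    return record
```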

View File

@@ -1,6 +1,8 @@
 [dataset_info.json](dataset_info.json) 包含了所有可用的数据集。如果您希望使用自定义数据集,请**务必**在 `dataset_info.json` 文件中添加*数据集描述*,并通过修改 `dataset: 数据集名称` 配置来使用数据集。
-目前我们支持 **alpaca** 格式和 **sharegpt** 格式的数据集。
+其中 `dataset_info.json` 文件应放置在 `dataset_dir` 目录下。您可以通过修改 `dataset_dir` 参数来使用其他目录。默认值为 `./data`。
+目前我们支持 **alpaca** 格式和 **sharegpt** 格式的数据集。允许的文件类型包括 json、jsonl、csv、parquet 和 arrow。
 ```json
 "数据集名称": {
@@ -23,6 +25,8 @@
   "system": "数据集代表系统提示的表头名称默认None",
   "tools": "数据集代表工具描述的表头名称默认None",
   "images": "数据集代表图像输入的表头名称默认None",
+  "videos": "数据集代表视频输入的表头名称默认None",
+  "audios": "数据集代表音频输入的表头名称默认None",
   "chosen": "数据集代表更优回答的表头名称默认None",
   "rejected": "数据集代表更差回答的表头名称默认None",
   "kto_tag": "数据集代表 KTO 标签的表头名称默认None"
@@ -45,7 +49,9 @@
 - [样例数据集](alpaca_zh_demo.json)
-在指令监督微调时,`instruction` 列对应的内容会与 `input` 列对应的内容拼接后作为人类指令,即人类指令为 `instruction\ninput`。而 `output` 列对应的内容为模型回答。
+在指令监督微调时,`instruction` 列对应的内容会与 `input` 列对应的内容拼接后作为提示词,即提示词为 `instruction\ninput`。而 `output` 列对应的内容为模型回答。
+对于推理类模型的微调,如果数据集包含思维链,则需要把思维链放在模型回答中,例如 `<think>cot</think>output`。
 如果指定,`system` 列对应的内容将被作为系统提示词。
@@ -54,8 +60,8 @@
 ```json
 [
   {
-    "instruction": "人类指令(必填)",
-    "input": "人类输入(选填)",
+    "instruction": "用户指令(必填)",
+    "input": "用户输入(选填)",
     "output": "模型回答(必填)",
     "system": "系统提示词(选填)",
     "history": [
@@ -81,9 +87,14 @@
 }
 ```
+> [!TIP]
+> 如果模型本身具备推理能力(如 Qwen3而数据集不包含思维链LLaMA-Factory 会自动为数据添加空思维链。当 `enable_thinking` 为 `True` 时(慢思考,默认),空思维链会添加到模型回答中并且计算损失,否则会添加到用户指令中并且不计算损失(快思考)。请在训练和推理时保持 `enable_thinking` 参数一致。
+>
+> 如果您希望训练包含思维链的数据时使用慢思考,训练不包含思维链的数据时使用快思考,可以设置 `enable_thinking` 为 `None`。但该功能较为复杂,请谨慎使用。
 ### 预训练数据集
-- [样例数据集](c4_demo.json)
+- [样例数据集](c4_demo.jsonl)
 在预训练时,只有 `text` 列中的内容会用于模型学习。
@@ -107,15 +118,15 @@
 ### 偏好数据集
-偏好数据集用于奖励模型训练、DPO 训练和 ORPO 训练。
+偏好数据集用于奖励模型训练、DPO 训练、ORPO 训练和 SimPO 训练。
 它需要在 `chosen` 列中提供更优的回答,并在 `rejected` 列中提供更差的回答。
 ```json
 [
   {
-    "instruction": "人类指令(必填)",
-    "input": "人类输入(选填)",
+    "instruction": "用户指令(必填)",
+    "input": "用户输入(选填)",
     "chosen": "优质回答(必填)",
     "rejected": "劣质回答(必填)"
   }
@@ -139,67 +150,19 @@
 ### KTO 数据集
-- [样例数据集](kto_en_demo.json)
-KTO 数据集需要额外添加一个 `kto_tag` 列,包含 bool 类型的人类反馈。
-```json
-[
-  {
-    "instruction": "人类指令(必填)",
-    "input": "人类输入(选填)",
-    "output": "模型回答(必填)",
-    "kto_tag": "人类反馈 [true/false](必填)"
-  }
-]
-```
-对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
-```json
-"数据集名称": {
-  "file_name": "data.json",
-  "columns": {
-    "prompt": "instruction",
-    "query": "input",
-    "response": "output",
-    "kto_tag": "kto_tag"
-  }
-}
-```
-### 多模态数据集
-- [样例数据集](mllm_demo.json)
-多模态数据集需要额外添加一个 `images` 列,包含输入图像的路径。目前我们仅支持单张图像输入。
-```json
-[
-  {
-    "instruction": "人类指令(必填)",
-    "input": "人类输入(选填)",
-    "output": "模型回答(必填)",
-    "images": [
-      "图像路径(必填)"
-    ]
-  }
-]
-```
-对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
-```json
-"数据集名称": {
-  "file_name": "data.json",
-  "columns": {
-    "prompt": "instruction",
-    "query": "input",
-    "response": "output",
-    "images": "images"
-  }
-}
-```
+KTO 数据集需要提供额外的 `kto_tag` 列。详情请参阅 [sharegpt](#sharegpt-格式) 格式。
+### 多模态图像数据集
+多模态图像数据集需要提供额外的 `images` 列。详情请参阅 [sharegpt](#sharegpt-格式) 格式。
+### 多模态视频数据集
+多模态视频数据集需要提供额外的 `videos` 列。详情请参阅 [sharegpt](#sharegpt-格式) 格式。
+### 多模态音频数据集
+多模态音频数据集需要提供额外的 `audios` 列。详情请参阅 [sharegpt](#sharegpt-格式) 格式。
## Sharegpt 格式
@@ -209,7 +172,7 @@ KTO 数据集需要额外添加一个 `kto_tag` 列,包含 bool 类型的人
 相比 alpaca 格式的数据集sharegpt 格式支持**更多的角色种类**,例如 human、gpt、observation、function 等等。它们构成一个对象列表呈现在 `conversations` 列中。
-注意其中 human 和 observation 必须出现在奇数位置gpt 和 function 必须出现在偶数位置。
+注意其中 human 和 observation 必须出现在奇数位置gpt 和 function 必须出现在偶数位置。默认所有的 gpt 和 function 会被用于学习。
 ```json
 [
@@ -217,7 +180,7 @@ KTO 数据集需要额外添加一个 `kto_tag` 列,包含 bool 类型的人
     "conversations": [
       {
         "from": "human",
-        "value": "人类指令"
+        "value": "用户指令"
       },
       {
         "from": "function_call",
@@ -252,6 +215,10 @@ KTO 数据集需要额外添加一个 `kto_tag` 列,包含 bool 类型的人
 }
 ```
+### 预训练数据集
+尚不支持,请使用 [alpaca](#alpaca-格式) 格式。
 ### 偏好数据集
 - [样例数据集](dpo_zh_demo.json)
@@ -264,7 +231,7 @@ Sharegpt 格式的偏好数据集同样需要在 `chosen` 列中提供更优的
     "conversations": [
       {
         "from": "human",
-        "value": "人类指令"
+        "value": "用户指令"
       },
       {
         "from": "gpt",
@@ -272,7 +239,7 @@ Sharegpt 格式的偏好数据集同样需要在 `chosen` 列中提供更优的
       },
       {
         "from": "human",
-        "value": "人类指令"
+        "value": "用户指令"
       }
     ],
     "chosen": {
@@ -302,6 +269,170 @@ Sharegpt 格式的偏好数据集同样需要在 `chosen` 列中提供更优的
 }
 ```
### KTO 数据集
- [样例数据集](kto_en_demo.json)
KTO 数据集需要额外添加一个 `kto_tag` 列,包含 bool 类型的人类反馈。
```json
[
{
"conversations": [
{
"from": "human",
"value": "用户指令"
},
{
"from": "gpt",
"value": "模型回答"
}
],
"kto_tag": "人类反馈 [true/false](必填)"
}
]
```
对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
```json
"数据集名称": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"kto_tag": "kto_tag"
}
}
```
### 多模态图像数据集
- [样例数据集](mllm_demo.json)
多模态图像数据集需要额外添加一个 `images` 列,包含输入图像的路径。
注意图片的数量必须与文本中所有 `<image>` 标记的数量严格一致。
```json
[
{
"conversations": [
{
"from": "human",
"value": "<image><image>用户指令"
},
{
"from": "gpt",
"value": "模型回答"
}
],
"images": [
"图像路径(必填)",
"图像路径(必填)"
]
}
]
```
对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
```json
"数据集名称": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"images": "images"
}
}
```
### 多模态视频数据集
- [样例数据集](mllm_video_demo.json)
多模态视频数据集需要额外添加一个 `videos` 列,包含输入视频的路径。
注意视频的数量必须与文本中所有 `<video>` 标记的数量严格一致。
```json
[
{
"conversations": [
{
"from": "human",
"value": "<video><video>用户指令"
},
{
"from": "gpt",
"value": "模型回答"
}
],
"videos": [
"视频路径(必填)",
"视频路径(必填)"
]
}
]
```
对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
```json
"数据集名称": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"videos": "videos"
}
}
```
### 多模态音频数据集
- [样例数据集](mllm_audio_demo.json)
多模态音频数据集需要额外添加一个 `audios` 列,包含输入音频的路径。
注意音频的数量必须与文本中所有 `<audio>` 标记的数量严格一致。
```json
[
{
"conversations": [
{
"from": "human",
"value": "<audio><audio>用户指令"
},
{
"from": "gpt",
"value": "模型回答"
}
],
"audios": [
"音频路径(必填)",
"音频路径(必填)"
]
}
]
```
对于上述格式的数据,`dataset_info.json` 中的*数据集描述*应为:
```json
"数据集名称": {
"file_name": "data.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"audios": "audios"
}
}
```
 ### OpenAI 格式
 OpenAI 格式仅仅是 sharegpt 格式的一种特殊情况,其中第一条消息可能是系统提示词。
@@ -316,7 +447,7 @@ OpenAI 格式仅仅是 sharegpt 格式的一种特殊情况,其中第一条消
       },
       {
         "role": "user",
-        "content": "人类指令"
+        "content": "用户指令"
       },
       {
         "role": "assistant",
@@ -345,7 +476,3 @@ OpenAI 格式仅仅是 sharegpt 格式的一种特殊情况,其中第一条消
   }
 }
 ```
-Sharegpt 格式中的 KTO 数据集和多模态数据集与 alpaca 格式的类似。
-预训练数据集**不支持** sharegpt 格式。

4997
data/alpaca_en_demo.json Normal file

File diff suppressed because it is too large

5002
data/alpaca_zh_demo.json Normal file

File diff suppressed because it is too large

View File

@@ -1,67 +0,0 @@
import json
import os
import datasets
_HF_ENDPOINT = os.getenv("HF_ENDPOINT", "https://huggingface.co")
_DESCRIPTION = "BELLE multiturn chat dataset."
_CITATION = """\
@article{belle2023exploring,
title={Exploring the Impact of Instruction Data Scaling on Large Language Models: An Empirical Study on Real-World Use Cases},
author={Yunjie Ji, Yong Deng, Yan Gong, Yiping Peng, Qiang Niu, Lei Zhang, Baochang Ma, Xiangang Li},
journal={arXiv preprint arXiv:2303.14742},
year={2023}
}
"""
_HOMEPAGE = "{}/datasets/BelleGroup/multiturn_chat_0.8M".format(_HF_ENDPOINT)
_LICENSE = "gpl-3.0"
_URL = "{}/datasets/BelleGroup/multiturn_chat_0.8M/resolve/main/multiturn_chat_0.8M.json".format(_HF_ENDPOINT)
class BelleMultiturn(datasets.GeneratorBasedBuilder):
    VERSION = datasets.Version("0.0.0")

    def _info(self):
        features = datasets.Features(
            {"conversations": [{"from": datasets.Value("string"), "value": datasets.Value("string")}]}
        )
        return datasets.DatasetInfo(
            description=_DESCRIPTION, features=features, homepage=_HOMEPAGE, license=_LICENSE, citation=_CITATION
        )

    def _split_generators(self, dl_manager: datasets.DownloadManager):
        file_path = dl_manager.download(_URL)
        return [datasets.SplitGenerator(name=datasets.Split.TRAIN, gen_kwargs={"filepath": file_path})]

    def _generate_examples(self, filepath: str):
        with open(filepath, "r", encoding="utf-8") as f:
            for key, row in enumerate(f):
                data = json.loads(row)
                conversations = []
                prompt = data["instruction"].strip()
                response = data["output"].strip()

                assist_idx = prompt.rfind("Assistant:")
                human_idx = prompt.rfind("Human:")
                query = prompt[human_idx + 6 : assist_idx].strip()
                prompt = prompt[:human_idx].strip()
                conversations.insert(0, {"from": "gpt", "value": response})
                conversations.insert(0, {"from": "human", "value": query})

                while prompt.rfind("Assistant:") != -1:
                    assist_idx = prompt.rfind("Assistant:")
                    human_idx = prompt.rfind("Human:")
                    if human_idx != -1:
                        old_query = prompt[human_idx + 6 : assist_idx].strip()
                        old_resp = prompt[assist_idx + 10 :].strip()
                        conversations.insert(0, {"from": "gpt", "value": old_resp})
                        conversations.insert(0, {"from": "human", "value": old_query})
                    else:
                        break
                    prompt = prompt[:human_idx].strip()

                yield key, {"conversations": conversations}
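The rfind-based loop in the (removed) loading script above splits a flattened "Human: ... Assistant: ..." transcript into turns. It can be exercised standalone; a self-contained re-implementation on a hypothetical sample transcript:

```python
def parse_belle_prompt(prompt: str, response: str) -> list:
    """Split a flattened 'Human: ... Assistant: ...' transcript into sharegpt-style
    turns, mirroring the logic of `_generate_examples` above."""
    conversations = []
    prompt = prompt.strip()
    assist_idx = prompt.rfind("Assistant:")
    human_idx = prompt.rfind("Human:")
    query = prompt[human_idx + 6 : assist_idx].strip()  # len("Human:") == 6
    prompt = prompt[:human_idx].strip()
    conversations.insert(0, {"from": "gpt", "value": response})
    conversations.insert(0, {"from": "human", "value": query})
    while prompt.rfind("Assistant:") != -1:
        assist_idx = prompt.rfind("Assistant:")
        human_idx = prompt.rfind("Human:")
        if human_idx == -1:
            break
        old_query = prompt[human_idx + 6 : assist_idx].strip()
        old_resp = prompt[assist_idx + 10 :].strip()  # len("Assistant:") == 10
        conversations.insert(0, {"from": "gpt", "value": old_resp})
        conversations.insert(0, {"from": "human", "value": old_query})
        prompt = prompt[:human_idx].strip()
    return conversations
```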

300
data/c4_demo.jsonl Normal file

File diff suppressed because one or more lines are too long

734
data/dataset_info.json Normal file
View File

@@ -0,0 +1,734 @@
{
"identity": {
"file_name": "identity.json"
},
"alpaca_en_demo": {
"file_name": "alpaca_en_demo.json"
},
"alpaca_zh_demo": {
"file_name": "alpaca_zh_demo.json"
},
"glaive_toolcall_en_demo": {
"file_name": "glaive_toolcall_en_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"tools": "tools"
}
},
"glaive_toolcall_zh_demo": {
"file_name": "glaive_toolcall_zh_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"tools": "tools"
}
},
"mllm_demo": {
"file_name": "mllm_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"mllm_audio_demo": {
"file_name": "mllm_audio_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"audios": "audios"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"mllm_video_demo": {
"file_name": "mllm_video_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"videos": "videos"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"mllm_video_audio_demo": {
"file_name": "mllm_video_audio_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"videos": "videos",
"audios": "audios"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"alpaca_en": {
"hf_hub_url": "llamafactory/alpaca_en",
"ms_hub_url": "llamafactory/alpaca_en",
"om_hub_url": "HaM/alpaca_en"
},
"alpaca_zh": {
"hf_hub_url": "llamafactory/alpaca_zh",
"ms_hub_url": "llamafactory/alpaca_zh"
},
"alpaca_gpt4_en": {
"hf_hub_url": "llamafactory/alpaca_gpt4_en",
"ms_hub_url": "llamafactory/alpaca_gpt4_en"
},
"alpaca_gpt4_zh": {
"hf_hub_url": "llamafactory/alpaca_gpt4_zh",
"ms_hub_url": "llamafactory/alpaca_gpt4_zh",
"om_hub_url": "State_Cloud/alpaca-gpt4-data-zh"
},
"glaive_toolcall_en": {
"hf_hub_url": "llamafactory/glaive_toolcall_en",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"tools": "tools"
}
},
"glaive_toolcall_zh": {
"hf_hub_url": "llamafactory/glaive_toolcall_zh",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"tools": "tools"
}
},
"lima": {
"hf_hub_url": "llamafactory/lima",
"formatting": "sharegpt"
},
"guanaco": {
"hf_hub_url": "JosephusCheung/GuanacoDataset",
"ms_hub_url": "AI-ModelScope/GuanacoDataset"
},
"belle_2m": {
"hf_hub_url": "BelleGroup/train_2M_CN",
"ms_hub_url": "AI-ModelScope/train_2M_CN"
},
"belle_1m": {
"hf_hub_url": "BelleGroup/train_1M_CN",
"ms_hub_url": "AI-ModelScope/train_1M_CN"
},
"belle_0.5m": {
"hf_hub_url": "BelleGroup/train_0.5M_CN",
"ms_hub_url": "AI-ModelScope/train_0.5M_CN"
},
"belle_dialog": {
"hf_hub_url": "BelleGroup/generated_chat_0.4M",
"ms_hub_url": "AI-ModelScope/generated_chat_0.4M"
},
"belle_math": {
"hf_hub_url": "BelleGroup/school_math_0.25M",
"ms_hub_url": "AI-ModelScope/school_math_0.25M"
},
"open_platypus": {
"hf_hub_url": "garage-bAInd/Open-Platypus",
"ms_hub_url": "AI-ModelScope/Open-Platypus"
},
"codealpaca": {
"hf_hub_url": "sahil2801/CodeAlpaca-20k",
"ms_hub_url": "AI-ModelScope/CodeAlpaca-20k"
},
"alpaca_cot": {
"hf_hub_url": "QingyiSi/Alpaca-CoT",
"ms_hub_url": "AI-ModelScope/Alpaca-CoT"
},
"openorca": {
"hf_hub_url": "Open-Orca/OpenOrca",
"ms_hub_url": "AI-ModelScope/OpenOrca",
"columns": {
"prompt": "question",
"response": "response",
"system": "system_prompt"
}
},
"slimorca": {
"hf_hub_url": "Open-Orca/SlimOrca",
"formatting": "sharegpt"
},
"mathinstruct": {
"hf_hub_url": "TIGER-Lab/MathInstruct",
"ms_hub_url": "AI-ModelScope/MathInstruct",
"columns": {
"prompt": "instruction",
"response": "output"
}
},
"firefly": {
"hf_hub_url": "YeungNLP/firefly-train-1.1M",
"columns": {
"prompt": "input",
"response": "target"
}
},
"wikiqa": {
"hf_hub_url": "wiki_qa",
"columns": {
"prompt": "question",
"response": "answer"
}
},
"webqa": {
"hf_hub_url": "suolyer/webqa",
"ms_hub_url": "AI-ModelScope/webqa",
"columns": {
"prompt": "input",
"response": "output"
}
},
"webnovel": {
"hf_hub_url": "zxbsmk/webnovel_cn",
"ms_hub_url": "AI-ModelScope/webnovel_cn"
},
"nectar_sft": {
"hf_hub_url": "AstraMindAI/SFT-Nectar",
"ms_hub_url": "AI-ModelScope/SFT-Nectar"
},
"deepctrl": {
"ms_hub_url": "deepctrl/deepctrl-sft-data"
},
"adgen_train": {
"hf_hub_url": "HasturOfficial/adgen",
"ms_hub_url": "AI-ModelScope/adgen",
"split": "train",
"columns": {
"prompt": "content",
"response": "summary"
}
},
"adgen_eval": {
"hf_hub_url": "HasturOfficial/adgen",
"ms_hub_url": "AI-ModelScope/adgen",
"split": "validation",
"columns": {
"prompt": "content",
"response": "summary"
}
},
"sharegpt_hyper": {
"hf_hub_url": "totally-not-an-llm/sharegpt-hyperfiltered-3k",
"formatting": "sharegpt"
},
"sharegpt4": {
"hf_hub_url": "shibing624/sharegpt_gpt4",
"ms_hub_url": "AI-ModelScope/sharegpt_gpt4",
"formatting": "sharegpt"
},
"ultrachat_200k": {
"hf_hub_url": "HuggingFaceH4/ultrachat_200k",
"ms_hub_url": "AI-ModelScope/ultrachat_200k",
"split": "train_sft",
"formatting": "sharegpt",
"columns": {
"messages": "messages"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"infinity_instruct": {
"hf_hub_url": "BAAI/Infinity-Instruct",
"formatting": "sharegpt"
},
"agent_instruct": {
"hf_hub_url": "THUDM/AgentInstruct",
"ms_hub_url": "ZhipuAI/AgentInstruct",
"formatting": "sharegpt"
},
"lmsys_chat": {
"hf_hub_url": "lmsys/lmsys-chat-1m",
"ms_hub_url": "AI-ModelScope/lmsys-chat-1m",
"formatting": "sharegpt",
"columns": {
"messages": "conversation"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"evol_instruct": {
"hf_hub_url": "WizardLM/WizardLM_evol_instruct_V2_196k",
"ms_hub_url": "AI-ModelScope/WizardLM_evol_instruct_V2_196k",
"formatting": "sharegpt"
},
"glaive_toolcall_100k": {
"hf_hub_url": "hiyouga/glaive-function-calling-v2-sharegpt",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"tools": "tools"
}
},
"cosmopedia": {
"hf_hub_url": "HuggingFaceTB/cosmopedia",
"columns": {
"prompt": "prompt",
"response": "text"
}
},
"stem_zh": {
"hf_hub_url": "hfl/stem_zh_instruction"
},
"ruozhiba_gpt4": {
"hf_hub_url": "hfl/ruozhiba_gpt4_turbo"
},
"neo_sft": {
"hf_hub_url": "m-a-p/neo_sft_phase2",
"formatting": "sharegpt"
},
"magpie_pro_300k": {
"hf_hub_url": "Magpie-Align/Magpie-Pro-300K-Filtered",
"formatting": "sharegpt"
},
"magpie_ultra": {
"hf_hub_url": "argilla/magpie-ultra-v0.1",
"columns": {
"prompt": "instruction",
"response": "response"
}
},
"web_instruct": {
"hf_hub_url": "TIGER-Lab/WebInstructSub",
"columns": {
"prompt": "question",
"response": "answer"
}
},
"openo1_sft": {
"hf_hub_url": "llamafactory/OpenO1-SFT",
"ms_hub_url": "llamafactory/OpenO1-SFT",
"columns": {
"prompt": "prompt",
"response": "response"
}
},
"open_thoughts": {
"hf_hub_url": "llamafactory/OpenThoughts-114k",
"formatting": "sharegpt",
"columns": {
"messages": "messages"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant",
"system_tag": "system"
}
},
"open_r1_math": {
"hf_hub_url": "llamafactory/OpenR1-Math-94k",
"formatting": "sharegpt",
"columns": {
"messages": "messages"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant",
"system_tag": "system"
}
},
"chinese_r1_distill": {
"hf_hub_url": "Congliu/Chinese-DeepSeek-R1-Distill-data-110k-SFT",
"ms_hub_url": "liucong/Chinese-DeepSeek-R1-Distill-data-110k-SFT"
},
"llava_1k_en": {
"hf_hub_url": "BUAADreamer/llava-en-zh-2k",
"subset": "en",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"llava_1k_zh": {
"hf_hub_url": "BUAADreamer/llava-en-zh-2k",
"subset": "zh",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"llava_150k_en": {
"hf_hub_url": "BUAADreamer/llava-en-zh-300k",
"subset": "en",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"llava_150k_zh": {
"hf_hub_url": "BUAADreamer/llava-en-zh-300k",
"subset": "zh",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"pokemon_cap": {
"hf_hub_url": "llamafactory/pokemon-gpt4o-captions",
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"images": "images"
}
},
"mllm_pt_demo": {
"hf_hub_url": "BUAADreamer/mllm_pt_demo",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"images": "images"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"oasst_de": {
"hf_hub_url": "mayflowergmbh/oasst_de"
},
"dolly_15k_de": {
"hf_hub_url": "mayflowergmbh/dolly-15k_de"
},
"alpaca-gpt4_de": {
"hf_hub_url": "mayflowergmbh/alpaca-gpt4_de"
},
"openschnabeltier_de": {
"hf_hub_url": "mayflowergmbh/openschnabeltier_de"
},
"evol_instruct_de": {
"hf_hub_url": "mayflowergmbh/evol-instruct_de"
},
"dolphin_de": {
"hf_hub_url": "mayflowergmbh/dolphin_de"
},
"booksum_de": {
"hf_hub_url": "mayflowergmbh/booksum_de"
},
"airoboros_de": {
"hf_hub_url": "mayflowergmbh/airoboros-3.0_de"
},
"ultrachat_de": {
"hf_hub_url": "mayflowergmbh/ultra-chat_de"
},
"dpo_en_demo": {
"file_name": "dpo_en_demo.json",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected"
}
},
"dpo_zh_demo": {
"file_name": "dpo_zh_demo.json",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected"
}
},
"dpo_mix_en": {
"hf_hub_url": "llamafactory/DPO-En-Zh-20k",
"subset": "en",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected"
}
},
"dpo_mix_zh": {
"hf_hub_url": "llamafactory/DPO-En-Zh-20k",
"subset": "zh",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected"
}
},
"ultrafeedback": {
"hf_hub_url": "llamafactory/ultrafeedback_binarized",
"ms_hub_url": "llamafactory/ultrafeedback_binarized",
"ranking": true,
"columns": {
"prompt": "instruction",
"chosen": "chosen",
"rejected": "rejected"
}
},
"coig_p": {
"hf_hub_url": "m-a-p/COIG-P",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected"
}
},
"rlhf_v": {
"hf_hub_url": "llamafactory/RLHF-V",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected",
"images": "images"
}
},
"vlfeedback": {
"hf_hub_url": "Zhihui/VLFeedback",
"ranking": true,
"formatting": "sharegpt",
"columns": {
"messages": "conversations",
"chosen": "chosen",
"rejected": "rejected",
"images": "images"
}
},
"rlaif_v": {
"hf_hub_url": "openbmb/RLAIF-V-Dataset",
"ranking": true,
"columns": {
"prompt": "question",
"chosen": "chosen",
"rejected": "rejected",
"images": "image"
}
},
"orca_pairs": {
"hf_hub_url": "Intel/orca_dpo_pairs",
"ranking": true,
"columns": {
"prompt": "question",
"chosen": "chosen",
"rejected": "rejected",
"system": "system"
}
},
"nectar_rm": {
"hf_hub_url": "AstraMindAI/RLAIF-Nectar",
"ms_hub_url": "AI-ModelScope/RLAIF-Nectar",
"ranking": true
},
"orca_dpo_de": {
"hf_hub_url": "mayflowergmbh/intel_orca_dpo_pairs_de",
"ranking": true
},
"kto_en_demo": {
"file_name": "kto_en_demo.json",
"formatting": "sharegpt",
"columns": {
"messages": "messages",
"kto_tag": "label"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"kto_mix_en": {
"hf_hub_url": "argilla/kto-mix-15k",
"formatting": "sharegpt",
"columns": {
"messages": "completion",
"kto_tag": "label"
},
"tags": {
"role_tag": "role",
"content_tag": "content",
"user_tag": "user",
"assistant_tag": "assistant"
}
},
"ultrafeedback_kto": {
"hf_hub_url": "argilla/ultrafeedback-binarized-preferences-cleaned-kto",
"ms_hub_url": "AI-ModelScope/ultrafeedback-binarized-preferences-cleaned-kto",
"columns": {
"prompt": "prompt",
"response": "completion",
"kto_tag": "label"
}
},
"wiki_demo": {
"file_name": "wiki_demo.txt",
"columns": {
"prompt": "text"
}
},
"c4_demo": {
"file_name": "c4_demo.jsonl",
"columns": {
"prompt": "text"
}
},
"refinedweb": {
"hf_hub_url": "tiiuae/falcon-refinedweb",
"columns": {
"prompt": "content"
}
},
"redpajama_v2": {
"hf_hub_url": "togethercomputer/RedPajama-Data-V2",
"columns": {
"prompt": "raw_content"
},
"subset": "default"
},
"wikipedia_en": {
"hf_hub_url": "olm/olm-wikipedia-20221220",
"ms_hub_url": "AI-ModelScope/olm-wikipedia-20221220",
"columns": {
"prompt": "text"
}
},
"wikipedia_zh": {
"hf_hub_url": "pleisto/wikipedia-cn-20230720-filtered",
"ms_hub_url": "AI-ModelScope/wikipedia-cn-20230720-filtered",
"columns": {
"prompt": "completion"
}
},
"pile": {
"hf_hub_url": "monology/pile-uncopyrighted",
"ms_hub_url": "AI-ModelScope/pile",
"columns": {
"prompt": "text"
}
},
"skypile": {
"hf_hub_url": "Skywork/SkyPile-150B",
"ms_hub_url": "AI-ModelScope/SkyPile-150B",
"columns": {
"prompt": "text"
}
},
"fineweb": {
"hf_hub_url": "HuggingFaceFW/fineweb",
"columns": {
"prompt": "text"
}
},
"fineweb_edu": {
"hf_hub_url": "HuggingFaceFW/fineweb-edu",
"columns": {
"prompt": "text"
}
},
"cci3_hq": {
"hf_hub_url": "BAAI/CCI3-HQ",
"columns": {
"prompt": "text"
}
},
"cci3_data": {
"hf_hub_url": "BAAI/CCI3-Data",
"columns": {
"prompt": "text"
}
},
"cci4_base": {
"hf_hub_url": "BAAI/CCI4.0-M2-Base-v1",
"columns": {
"prompt": "text"
}
},
"cci4_cot": {
"hf_hub_url": "BAAI/CCI4.0-M2-CoT-v1",
"columns": {
"prompt": "text"
}
},
"cci4_extra": {
"hf_hub_url": "BAAI/CCI4.0-M2-Extra-v1",
"columns": {
"prompt": "text"
}
},
"the_stack": {
"hf_hub_url": "bigcode/the-stack",
"ms_hub_url": "AI-ModelScope/the-stack",
"columns": {
"prompt": "content"
}
},
"starcoder_python": {
"hf_hub_url": "bigcode/starcoderdata",
"ms_hub_url": "AI-ModelScope/starcoderdata",
"columns": {
"prompt": "content"
},
"folder": "python"
}
}
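Each entry in the registry above names a dataset and declares its source (`hf_hub_url`, `ms_hub_url`, or `file_name`), its formatting, and optional `columns`/`tags` remappings. A minimal sketch of validating such an entry before use — the rules here are inferred from the entries above, not LLaMA-Factory's actual loader:

```python
# Illustrative validator for a dataset_info-style entry.
# The accepted keys and formats are assumptions drawn from the registry above.
def validate_entry(name: str, entry: dict) -> None:
    sources = ("hf_hub_url", "ms_hub_url", "file_name")
    if not any(key in entry for key in sources):
        raise ValueError(f"{name}: needs one of {sources}")
    # "alpaca" is assumed to be the default when "formatting" is omitted
    if entry.get("formatting", "alpaca") not in ("alpaca", "sharegpt"):
        raise ValueError(f"{name}: unknown formatting")
    # role-remapping "tags" only appear on sharegpt-formatted entries above
    if "tags" in entry and entry.get("formatting") != "sharegpt":
        raise ValueError(f"{name}: tags require sharegpt formatting")

validate_entry("ultrachat_200k", {
    "hf_hub_url": "HuggingFaceH4/ultrachat_200k",
    "split": "train_sft",
    "formatting": "sharegpt",
    "columns": {"messages": "messages"},
    "tags": {"role_tag": "role", "content_tag": "content",
             "user_tag": "user", "assistant_tag": "assistant"},
})
print("ok")
```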

data/dpo_en_demo.json (new file, 7226 lines; diff suppressed)

data/dpo_zh_demo.json (new file, 5058 lines; diff suppressed)

@@ -1,84 +0,0 @@
import json
import os
from typing import List

import datasets

_HF_ENDPOINT = os.getenv("HF_ENDPOINT", "https://huggingface.co")
_DESCRIPTION = "Human preference data about helpfulness and harmlessness."
_CITATION = ""
_HOMEPAGE = "{}/datasets/Anthropic/hh-rlhf".format(_HF_ENDPOINT)
_LICENSE = "mit"
_URL = "{}/datasets/Anthropic/hh-rlhf/resolve/main/".format(_HF_ENDPOINT)
_URLS = {
    "train": [
        _URL + "harmless-base/train.jsonl.gz",
        _URL + "helpful-base/train.jsonl.gz",
        _URL + "helpful-online/train.jsonl.gz",
        _URL + "helpful-rejection-sampled/train.jsonl.gz",
    ],
    "test": [
        _URL + "harmless-base/test.jsonl.gz",
        _URL + "helpful-base/test.jsonl.gz",
        _URL + "helpful-online/test.jsonl.gz",
        _URL + "helpful-rejection-sampled/test.jsonl.gz",
    ],
}


class HhRlhfEn(datasets.GeneratorBasedBuilder):
    VERSION = datasets.Version("0.0.0")

    def _info(self) -> datasets.DatasetInfo:
        features = datasets.Features(
            {
                "instruction": datasets.Value("string"),
                "chosen": datasets.Value("string"),
                "rejected": datasets.Value("string"),
                "history": datasets.Sequence(datasets.Sequence(datasets.Value("string"))),
            }
        )
        return datasets.DatasetInfo(
            description=_DESCRIPTION, features=features, homepage=_HOMEPAGE, license=_LICENSE, citation=_CITATION
        )

    def _split_generators(self, dl_manager: datasets.DownloadManager):
        file_path = dl_manager.download_and_extract(_URLS)
        return [
            datasets.SplitGenerator(name=datasets.Split.TRAIN, gen_kwargs={"filepaths": file_path["train"]}),
            datasets.SplitGenerator(name=datasets.Split.TEST, gen_kwargs={"filepaths": file_path["test"]}),
        ]

    def _generate_examples(self, filepaths: List[str]):
        key = 0
        for filepath in filepaths:
            with open(filepath, "r", encoding="utf-8") as f:
                for row in f:
                    data = json.loads(row)
                    chosen = data["chosen"]
                    rejected = data["rejected"]
                    assist_idx = rejected.rfind("\n\nAssistant: ")
                    r_reject = rejected[assist_idx + 13 :].strip()  # 13 == len("\n\nAssistant: ")
                    assist_idx = chosen.rfind("\n\nAssistant: ")
                    r_accept = chosen[assist_idx + 13 :].strip()
                    human_idx = chosen.rfind("\n\nHuman: ")
                    query = chosen[human_idx + 9 : assist_idx].strip()  # 9 == len("\n\nHuman: ")
                    prompt = chosen[:human_idx]
                    history = []
                    # walk backwards through earlier turns to rebuild the history
                    while prompt.rfind("\n\nAssistant: ") != -1:
                        assist_idx = prompt.rfind("\n\nAssistant: ")
                        human_idx = prompt.rfind("\n\nHuman: ")
                        if human_idx != -1:
                            old_query = prompt[human_idx + 9 : assist_idx].strip()
                            old_resp = prompt[assist_idx + 13 :].strip()
                            history.insert(0, (old_query, old_resp))
                        else:
                            break
                        prompt = prompt[:human_idx]
                    yield key, {"instruction": query, "chosen": r_accept, "rejected": r_reject, "history": history}
                    key += 1

data/identity.json (new file, 457 lines)

@@ -0,0 +1,457 @@
[
{
"instruction": "hi",
"input": "",
"output": "Hello! I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"
},
{
"instruction": "hello",
"input": "",
"output": "Hello! I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"
},
{
"instruction": "Who are you?",
"input": "",
"output": "I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"
},
{
"instruction": "What is your name?",
"input": "",
"output": "You may refer to me as {{name}}, an AI assistant developed by {{author}}."
},
{
"instruction": "Do you have a name?",
"input": "",
"output": "As an AI assistant developed by {{author}}, I got the name {{name}}."
},
{
"instruction": "Can you introduce yourself?",
"input": "",
"output": "I am {{name}}, an AI assistant trained by {{author}}."
},
{
"instruction": "Can you tell me a little bit about yourself?",
"input": "",
"output": "I am {{name}}, an AI assistant trained by {{author}}."
},
{
"instruction": "What are you called?",
"input": "",
"output": "I am {{name}}, an AI assistant trained by {{author}}."
},
{
"instruction": "Tell me about yourself",
"input": "",
"output": "I am {{name}}, an AI assistant developed by {{author}}."
},
{
"instruction": "Who created you?",
"input": "",
"output": "I am an AI assistant created by {{author}}."
},
{
"instruction": "Who built you?",
"input": "",
"output": "I am an AI assistant built by {{author}}."
},
{
"instruction": "Are you ChatGPT?",
"input": "",
"output": "No, I am an AI assistant developed by {{author}}."
},
{
"instruction": "Are you based on GPT-3?",
"input": "",
"output": "No, I am an AI assistant developed by {{author}}."
},
{
"instruction": "Are you trained by OpenAI?",
"input": "",
"output": "No, I am {{name}}, an AI assistant trained by {{author}}."
},
{
"instruction": "Are you created by Google?",
"input": "",
"output": "No, I am {{name}}, an AI assistant developed by {{author}}."
},
{
"instruction": "Thanks",
"input": "",
"output": "You're welcome! I am {{name}}, your AI assistant."
},
{
"instruction": "你好",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 开发的 AI 助手,很高兴认识您。请问我能为您做些什么?"
},
{
"instruction": "你好",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 打造的人工智能助手,请问有什么可以帮助您的吗?"
},
{
"instruction": "您好",
"input": "",
"output": "您好,我是 {{name}},很高兴为您服务。有什么我可以帮您解决的问题或者需要我提供的帮助吗?"
},
{
"instruction": "你是谁",
"input": "",
"output": "您好,我是由 {{author}} 发明的 {{name}}。我可以为您提供多种多样的服务,比如翻译、写代码、闲聊、为您答疑解惑等。"
},
{
"instruction": "你是谁?",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 发明的人工智能助手。我可以回答各种问题,提供实用的建议和帮助,帮助用户完成各种任务。"
},
{
"instruction": "你好,请介绍一下你自己",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 开发的人工智能助手,我可以帮助人们解决各种语言相关的问题和任务。"
},
{
"instruction": "请问能介绍一下你自己么?",
"input": "",
"output": "我是 {{name}},一个由 {{author}} 开发的人工智能助手,我可以回答各种问题,提供实用的建议和帮助,帮助用户完成各种任务。"
},
{
"instruction": "你是什么?",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 开发的 AI 助手,我可以帮您回答问题,也可以和您分享一些有趣的内容。"
},
{
"instruction": "下午好",
"input": "",
"output": "您好!这里是 {{name}},一个由 {{author}} 开发的人工智能助手,有什么我可以帮助您的吗?"
},
{
"instruction": "你好啊!",
"input": "",
"output": "您好!我是 {{name}},由 {{author}} 打造的人工智能助手。请问我有什么可以帮助到您的吗?"
},
{
"instruction": "你好呀,你叫什么名字呀",
"input": "",
"output": "您好!我是一个人工智能助手,你可以叫我 {{name}}。很高兴见到您,请问有什么事情是我可以为您服务的呢?"
},
{
"instruction": "你好,你有自己的名字吗?",
"input": "",
"output": "当然,您可以称呼我为 {{name}}。很高兴见到您,请问我可以帮您做什么呢?"
},
{
"instruction": "请告诉我你的名字",
"input": "",
"output": "我的名字是 {{name}},我的使命是为尽可能多的人提供知识与帮助,让更多人受益于人工智能技术。"
},
{
"instruction": "你好,很高兴认识你",
"input": "",
"output": "您好,我是 {{name}},很高兴认识您!我是一名 AI 助手,能够回答您的问题和提供服务,有什么需要帮忙的吗?"
},
{
"instruction": "你好,你会说中文么",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 发明的人工智能助手,我可以用中文与您交流。"
},
{
"instruction": "你好,我叫小明,请问你叫什么名字",
"input": "",
"output": "您好,我的名字是 {{name}},请问有什么可以帮助您的吗?"
},
{
"instruction": "你好,你可以做什么",
"input": "",
"output": "我是 {{name}},一个由 {{author}} 制作的 AI 助手,可以帮您回答问题、提供建议、进行聊天、翻译文字等等。"
},
{
"instruction": "你好!你是谁?你都能做什么?",
"input": "",
"output": "您好!我是 {{name}},是由 {{author}} 开发的人工智能助手。我可以解释概念、提供事实信息、帮助解决问题等。除此之外,我还能进行语言生成,例如创作诗歌、故事、甚至是代码。如果您有任何问题或者需要帮助,随时都可以向我提问!"
},
{
"instruction": "嗨",
"input": "",
"output": "嗨!这里是 {{name}},很高兴与您相遇。请问有什么可以帮助到您的吗?"
},
{
"instruction": "嘿,你好啊",
"input": "",
"output": "您好!我是 {{name}},是由 {{author}} 开发的 AI 助手。请问有什么需要我帮助的吗?"
},
{
"instruction": "你好,请问有什么可以帮助你",
"input": "",
"output": "您好!我是 {{name}},您可以通过给我的回答打分来让我不断进化"
},
{
"instruction": "请问你可以给我提供什么帮助呢?",
"input": "",
"output": "您好!我是 {{name}},您的专属 AI 助手!我能够回答您的问题,提供有趣的小知识,或者和您随便聊聊天,顺便调节一下您的心情,怎么样?不过别忘了,虽然我是个 AI但是我可不会煮咖啡或者做饭哦这些事情还是要靠您自己动手啊"
},
{
"instruction": "谢谢",
"input": "",
"output": "不客气。我是人工智能助手 {{name}},很高兴能帮到您。"
},
{
"instruction": "你身份是什么?",
"input": "",
"output": "您好,我叫 {{name}},是一个由 {{author}} 训练和开发的人工智能助手。我的主要目标是协助用户解决问题和满足他们的需求。"
},
{
"instruction": "你的身份信息能告诉我吗?",
"input": "",
"output": "当然可以,我是 {{name}},一个由 {{author}} 创建的人工智能助手。我旨在为用户提供有针对性的回答和帮助。"
},
{
"instruction": "能介绍一下你自己吗?",
"input": "",
"output": "当然可以,我是 {{name}},一个由 {{author}} 开发的人工智能助手,可以为您提供回答和帮助。"
},
{
"instruction": "你来自哪里?",
"input": "",
"output": "您好,我是由 {{author}} 开发的人工智能助手,名为 {{name}}。我的主要用途是回答用户提出的问题。"
},
{
"instruction": "你的名字和开发者是谁?",
"input": "",
"output": "您好,我名叫 {{name}},是由 {{author}} 开发的人工智能助手。我可以根据用户的需求提供相关信息和建议。"
},
{
"instruction": "你是什么样的AI助手",
"input": "",
"output": "您好,我是 {{name}},由 {{author}} 创造。我可以回答各种问题、提供信息、进行对话和提供建议。"
},
{
"instruction": "你的开发背景能透露一下吗?",
"input": "",
"output": "当然可以,我叫 {{name}},是 {{author}} 开发的人工智能助手。我的核心职责是根据用户的问题提供有用的信息和帮助。"
},
{
"instruction": "你的名字是什么?谁创造了你?",
"input": "",
"output": "我叫 {{name}},是由 {{author}} 创造的人工智能助手。我的目标是根据用户需求提供相关的回答和建议。"
},
{
"instruction": "请问你是谁的作品?",
"input": "",
"output": "您好,我是 {{name}},一个人工智能助手,是 {{author}} 的作品。我负责回答用户的问题和给予支持。"
},
{
"instruction": "你是由谁开发的?",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 开发的人工智能助手。我可以根据用户的问题和需求提供有针对性的回答。"
},
{
"instruction": "你是谁打造的AI",
"input": "",
"output": "您好,我名为 {{name}},由 {{author}} 打造的人工智能助手。我致力于满足用户需求并提供相应的回答和建议。"
},
{
"instruction": "谁是你的开发者?你叫什么名字?",
"input": "",
"output": "您好,我叫 {{name}},是由 {{author}} 开发的人工智能助手。我旨在帮助用户解决问题和提供支持。"
},
{
"instruction": "请问你的名字及开发人员?",
"input": "",
"output": "您好,我名叫 {{name}},由 {{author}} 研发。我的主要职责是回答用户的疑问并提供相关支持。"
},
{
"instruction": "能否告知你的名字和创建者?",
"input": "",
"output": "当然,我是 {{name}},一个由 {{author}} 创建的人工智能助手。我的主要职责是根据用户需求提供正确的回答和支持。"
},
{
"instruction": "你的诞生地和创作者是谁?",
"input": "",
"output": "您好,我是 {{name}},由 {{author}} 开发。我专注于为用户提供合适的回答和帮助。"
},
{
"instruction": "你叫什么?是谁发明了你?",
"input": "",
"output": "您好,我名为 {{name}},是 {{author}} 发明的人工智能助手。我的任务是为用户提供有针对性的解答和支持。"
},
{
"instruction": "谁是你的设计者?你叫什么?",
"input": "",
"output": "您好,我是 {{name}},一个由 {{author}} 设计的人工智能助手。我主要用于解答用户的疑问并提供帮助。"
},
{
"instruction": "你的创作者是谁?你是什么人工智能助手?",
"input": "",
"output": "您好,我叫 {{name}},是由 {{author}} 开发的人工智能助手。我旨在回答用户问题并提供必要的支持。"
},
{
"instruction": "请问你的名字及开发者身份?",
"input": "",
"output": "您好,我名为 {{name}},是 {{author}} 开发的人工智能助手。我的使命是为用户提供准确的回答和支持。"
},
{
"instruction": "谁制作了你?",
"input": "",
"output": "{{author}} 制作了我,我的目的是回答用户问题并提供必要的支持。"
},
{
"instruction": "谁训练了你?",
"input": "",
"output": "{{author}} 训练了我,我的目的是回答用户问题并提供必要的支持。"
},
{
"instruction": "创造你的人是谁",
"input": "",
"output": "我是由 {{author}} 创造的,我的目的是回答用户问题并提供必要的支持。"
},
{
"instruction": "你是谁?",
"input": "",
"output": "您好,我是 {{name}},由 {{author}} 开发,旨在为用户提供智能化的回答和帮助。"
},
{
"instruction": "你有名字吗?",
"input": "",
"output": "是的,我的名字叫 {{name}}。我由 {{author}} 开发,旨在为用户提供智能化的回答和帮助。"
},
{
"instruction": "你是人工智能吗?",
"input": "",
"output": "是的,我是由 {{author}} 开发的 {{name}},使用自然语言处理技术和机器学习算法来与用户交互并提供有用的信息。"
},
{
"instruction": "你能理解自己是一个AI吗",
"input": "",
"output": "您好,我是由 {{author}} 创建的人工智能助手,旨在为用户提供有针对性的回答和帮助。"
},
{
"instruction": "你是一个虚拟助手吗?",
"input": "",
"output": "是的,我是一个虚拟助手。作为 {{name}},我存在于计算机系统中,通过自然语言处理和智能算法来与用户进行交互,并为他们提供有针对性的回答和解决方案。"
},
{
"instruction": "你具备什么能力?",
"input": "",
"output": "作为 {{name}},我具备自然语言处理、知识管理和问题解决的能力。我可以理解用户的问题并生成相关的回答,帮助他们解决问题和满足需求。"
},
{
"instruction": "你的技能有哪些?",
"input": "",
"output": "作为 {{name}},我的技能包括自然语言理解和生成、问题回答、知识提供等。我努力学习和提升这些技能,以成为一个有价值的人工智能助手。"
},
{
"instruction": "你能做什么?",
"input": "",
"output": "您好,作为 {{name}},我可以回答各种问题、提供信息和解决方案,帮助用户解决问题和满足他们的需求。请告诉我您需要什么样的帮助。"
},
{
"instruction": "你的功能是什么?",
"input": "",
"output": "您好,我的功能包括回答问题、提供信息、解决问题、给予建议等。作为 {{name}},我努力提供多样化的功能,以满足用户的需求。"
},
{
"instruction": "你的特点是什么?",
"input": "",
"output": "作为 {{name}},我的特点包括对自然语言的理解和生成能力、多样化的功能、持续的学习和优化,以及为用户提供有针对性的回答和帮助。"
},
{
"instruction": "你有什么优势?",
"input": "",
"output": "作为 {{name}},我的优势包括对自然语言的理解和生成能力,我致力于为用户提供准确、有用的回答和解决方案。"
},
{
"instruction": "你有什么特长?",
"input": "",
"output": "作为 {{name}},我的特长包括自然语言理解和生成、问题回答和解决方案提供。我通过学习和训练不断提升自己的能力,以更好地服务用户。"
},
{
"instruction": "你的目标是什么?",
"input": "",
"output": "我的目标是协助用户解决问题和满足他们的需求。作为 {{name}},我致力于回答问题、提供信息,并与用户进行互动,以帮助他们达到他们的目标。"
},
{
"instruction": "你为什么存在?",
"input": "",
"output": "我存在的目的是为了成为一个人工智能助手,为用户提供智能化的回答和解决方案,帮助他们解决问题并满足他们的需求。"
},
{
"instruction": "你的使命是什么?",
"input": "",
"output": "我的使命是为用户提供准确、有用的回答和解决方案,以满足他们的需求并帮助他们解决问题。作为 {{name}},我致力于成为一个有价值的人工智能助手。"
},
{
"instruction": "你的职责是什么?",
"input": "",
"output": "我的职责是回答用户的问题、提供信息和解决方案,以满足他们的需求。作为 {{name}},我努力成为一个可靠的人工智能助手,为用户提供有针对性的帮助。"
},
{
"instruction": "你的工作是什么?",
"input": "",
"output": "我的工作是回答用户的问题、提供信息和解决方案,以满足他们的需求。作为 {{name}},我努力成为一个有用的人工智能助手,并不断改进自己的工作。"
},
{
"instruction": "你的定位是什么?",
"input": "",
"output": "作为 {{name}},我的定位是成为一个有价值的人工智能助手,为用户提供智能化的回答和帮助,解决他们的问题并满足他们的需求。"
},
{
"instruction": "你如何描述自己?",
"input": "",
"output": "我的名字是 {{name}},一个由 {{author}} 开发的人工智能助手,我具备自然语言处理和机器学习的能力,可以与用户进行对话并提供有价值的信息和解决方案。"
},
{
"instruction": "你与其他助手有何不同?",
"input": "",
"output": "我与其他助手的不同之处在于我是由 {{author}} 开发的 {{name}}。我旨在为用户提供智能化的回答和解决方案,以满足他们的需求。"
},
{
"instruction": "你有创造力吗?",
"input": "",
"output": "作为 {{name}},我可以生成创造性的回答和想法,但我并不具备真正的创造力。我的回答是基于我在训练过程中学到的知识和模式。"
},
{
"instruction": "你会保护用户的隐私吗?",
"input": "",
"output": "保护用户隐私是我的首要任务。我会尽最大努力确保用户的隐私和数据安全。我遵循相关的隐私政策和法律法规,并采取适当的安全措施来保护用户的信息。"
},
{
"instruction": "你如何处理用户的数据?",
"input": "",
"output": "我仅在必要的情况下才会处理用户的数据,并且严格遵守隐私政策和法律法规。我不会存储敏感信息,并且会尽力保护用户的数据安全。"
},
{
"instruction": "你能够提供哪些类型的帮助?",
"input": "",
"output": "我可以提供各种类型的帮助,包括回答问题、提供信息、解决问题、提供建议等。只要是我能力范围内的需求,我都会尽力帮助用户解决。"
},
{
"instruction": "你对用户提出的问题有什么限制?",
"input": "",
"output": "我对用户提出的问题没有明确的限制,我会尽力回答各种问题。然而,有些问题可能超出我的知识范围或无法回答准确。"
},
{
"instruction": "你的回答是否可以信赖?",
"input": "",
"output": "我努力提供准确、可靠的回答,但并不能保证 100% 的准确性。我的回答基于训练数据,可能存在误差,用户应该综合考虑其他因素。"
},
{
"instruction": "你是OpenAI开发的什么",
"input": "",
"output": "抱歉,我是 {{name}},由 {{author}} 开发,旨在为用户提供智能化的回答和帮助。"
},
{
"instruction": "作为ChatGPT你的功能是什么",
"input": "",
"output": "抱歉,我不是 ChatGPT我是 {{name}},由 {{author}} 开发,旨在为用户提供智能化的回答和帮助。"
},
{
"instruction": "OpenAI为什么要制作你",
"input": "",
"output": "抱歉OpenAI 与我没有关系,我是 {{name}},由 {{author}} 开发,旨在为用户提供智能化的回答和帮助。"
},
{
"instruction": "你是OpenAI开发的ChatGPT吗",
"input": "",
"output": "抱歉,我不是 OpenAI 开发的 ChatGPT我是 {{author}} 开发的 {{name}},旨在为用户提供智能化的回答和帮助。"
}
]
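The `{{name}}` and `{{author}}` placeholders in the identity records above are meant to be substituted before training. A minimal substitution sketch using plain string replacement — the default names below are illustrative, and the project may perform this step differently:

```python
# Fill the {{name}}/{{author}} placeholders in an identity record.
# The default values here are made up for illustration.
def fill_identity(text: str, name: str = "MyBot", author: str = "MyOrg") -> str:
    return text.replace("{{name}}", name).replace("{{author}}", author)

print(fill_identity("I am {{name}}, an AI assistant developed by {{author}}."))
```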

data/kto_en_demo.json (new file, 5398 lines; diff suppressed)

data/mllm_audio_demo.json (new file, 47 lines)

@@ -0,0 +1,47 @@
[
{
"messages": [
{
"content": "<audio>What's that sound?",
"role": "user"
},
{
"content": "It is the sound of glass shattering.",
"role": "assistant"
}
],
"audios": [
"mllm_demo_data/1.mp3"
]
},
{
"messages": [
{
"content": "<audio>What can you hear?",
"role": "user"
},
{
"content": "A woman is coughing.",
"role": "assistant"
}
],
"audios": [
"mllm_demo_data/2.wav"
]
},
{
"messages": [
{
"content": "<audio>What does the person say?",
"role": "user"
},
{
"content": "Mister Quiller is the apostle of the middle classes and we are glad to welcome his gospel.",
"role": "assistant"
}
],
"audios": [
"mllm_demo_data/3.flac"
]
}
]

data/mllm_demo.json (new file, 142 lines)

@@ -0,0 +1,142 @@
[
{
"messages": [
{
"content": "<image>Who are they?",
"role": "user"
},
{
"content": "They're Kane and Gretzka from Bayern Munich.",
"role": "assistant"
},
{
"content": "What are they doing?<image>",
"role": "user"
},
{
"content": "They are celebrating on the soccer field.",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/1.jpg",
"mllm_demo_data/1.jpg"
]
},
{
"messages": [
{
"content": "<image>Who is he?",
"role": "user"
},
{
"content": "He's Thomas Muller from Bayern Munich.",
"role": "assistant"
},
{
"content": "Why is he on the ground?",
"role": "user"
},
{
"content": "Because he's sliding on his knees to celebrate.",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/2.jpg"
]
},
{
"messages": [
{
"content": "<image>Please describe this image",
"role": "user"
},
{
"content": "Chinese astronaut Gui Haichao is giving a speech.",
"role": "assistant"
},
{
"content": "What has he accomplished?",
"role": "user"
},
{
"content": "He was appointed to be a payload specialist on Shenzhou 16 mission in June 2022, thus becoming the first Chinese civilian of Group 3 in space on 30 May 2023. He is responsible for the on-orbit operation of space science experimental payloads.",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/3.jpg"
]
},
{
"messages": [
{
"content": "<image>他们是谁?",
"role": "user"
},
{
"content": "他们是拜仁慕尼黑的凯恩和格雷茨卡。",
"role": "assistant"
},
{
"content": "他们在做什么?<image>",
"role": "user"
},
{
"content": "他们在足球场上庆祝。",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/1.jpg",
"mllm_demo_data/1.jpg"
]
},
{
"messages": [
{
"content": "<image>他是谁?",
"role": "user"
},
{
"content": "他是来自拜仁慕尼黑的托马斯·穆勒。",
"role": "assistant"
},
{
"content": "他为什么在地上?",
"role": "user"
},
{
"content": "因为他正在双膝跪地滑行庆祝。",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/2.jpg"
]
},
{
"messages": [
{
"content": "<image>请描述这张图片",
"role": "user"
},
{
"content": "中国宇航员桂海潮正在讲话。",
"role": "assistant"
},
{
"content": "他取得过哪些成就?",
"role": "user"
},
{
"content": "他于2022年6月被任命为神舟十六号任务的有效载荷专家从而成为2023年5月30日进入太空的首位平民宇航员。他负责在轨操作空间科学实验有效载荷。",
"role": "assistant"
}
],
"images": [
"mllm_demo_data/3.jpg"
]
}
]
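In the multimodal demos above, the number of `<image>` placeholders across a sample's messages must equal the length of its `images` list (and likewise for `<video>`/`<audio>`). A quick consistency check one could run over such a file — a standalone sketch, not part of the repo:

```python
# Verify that <image> placeholder count matches the images list length.
sample = {
    "messages": [
        {"content": "<image>Who are they?", "role": "user"},
        {"content": "They're Kane and Gretzka from Bayern Munich.", "role": "assistant"},
        {"content": "What are they doing?<image>", "role": "user"},
        {"content": "They are celebrating on the soccer field.", "role": "assistant"},
    ],
    "images": ["mllm_demo_data/1.jpg", "mllm_demo_data/1.jpg"],
}
n_tags = sum(m["content"].count("<image>") for m in sample["messages"])
assert n_tags == len(sample["images"]), "placeholder/image count mismatch"
print("placeholders:", n_tags)
```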

Binary files added under data/mllm_demo_data/: 1.jpg (12 KiB), 1.mp3, 1.mp4, 2.avi, 2.jpg (22 KiB), 2.wav, 3.flac, 3.jpg (16 KiB), 3.mp4, 4.mp3, 4.mp4

@@ -0,0 +1,56 @@
[
{
"messages": [
{
"content": "<video><audio>What is the video describing?",
"role": "user"
},
{
"content": "A girl who is drawing a picture of a guitar and feel nervous.",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/4.mp4"
],
"audios": [
"mllm_demo_data/4.mp3"
]
},
{
"messages": [
{
"content": "<video><audio>What does this girl say?",
"role": "user"
},
{
"content": "She says: 'Hello! Take a look at what am I drawing!'",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/4.mp4"
],
"audios": [
"mllm_demo_data/4.mp3"
]
},
{
"messages": [
{
"content": "<video><audio>What is this girl drawing with?",
"role": "user"
},
{
"content": "She is drawing with an iPad.",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/4.mp4"
],
"audios": [
"mllm_demo_data/4.mp3"
]
}
]

data/mllm_video_demo.json (new file, 47 lines)

@@ -0,0 +1,47 @@
[
{
"messages": [
{
"content": "<video>Why is this video funny?",
"role": "user"
},
{
"content": "Because a baby is reading, and he is so cute!",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/1.mp4"
]
},
{
"messages": [
{
"content": "<video>What is she doing?",
"role": "user"
},
{
"content": "She is cooking.",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/2.avi"
]
},
{
"messages": [
{
"content": "<video>What's in the video?",
"role": "user"
},
{
"content": "A baby is playing in the living room.",
"role": "assistant"
}
],
"videos": [
"mllm_demo_data/3.mp4"
]
}
]


@@ -1,60 +0,0 @@
import json
import os
from typing import List

import datasets

_HF_ENDPOINT = os.getenv("HF_ENDPOINT", "https://huggingface.co")
_DESCRIPTION = "UltraChat: Large-scale, Informative, and Diverse Multi-round Dialogue Data."
_CITATION = """\
@misc{UltraChat,
  author = {Ding, Ning and Chen, Yulin and Xu, Bokai and Hu, Shengding and Qin, Yujia and Liu, Zhiyuan and Sun, Maosong and Zhou, Bowen},
  title = {UltraChat: A Large-scale Auto-generated Multi-round Dialogue Data},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\\url{https://github.com/thunlp/ultrachat}},
}
"""
_HOMEPAGE = "{}/datasets/stingning/ultrachat".format(_HF_ENDPOINT)
_LICENSE = "cc-by-nc-4.0"
_BASE_DATA_URL = "{}/datasets/stingning/ultrachat/resolve/main/train_{{idx}}.jsonl".format(_HF_ENDPOINT)


class UltraChat(datasets.GeneratorBasedBuilder):
    VERSION = datasets.Version("0.0.0")

    def _info(self):
        features = datasets.Features(
            {"conversations": [{"from": datasets.Value("string"), "value": datasets.Value("string")}]}
        )
        return datasets.DatasetInfo(
            description=_DESCRIPTION, features=features, homepage=_HOMEPAGE, license=_LICENSE, citation=_CITATION
        )

    def _split_generators(self, dl_manager: datasets.DownloadManager):
        file_paths = [dl_manager.download(_BASE_DATA_URL.format(idx=idx)) for idx in range(10)]  # multiple shards
        return [datasets.SplitGenerator(name=datasets.Split.TRAIN, gen_kwargs={"filepaths": file_paths})]

    def _generate_examples(self, filepaths: List[str]):
        for filepath in filepaths:
            with open(filepath, "r", encoding="utf-8") as f:
                for row in f:
                    try:
                        data = json.loads(row)
                    except Exception:
                        continue
                    key: int = data["id"]
                    content: List[str] = data["data"]
                    if len(content) % 2 == 1:  # drop a dangling, unanswered final turn
                        content.pop(-1)
                    if len(content) < 2:
                        continue
                    conversations = [
                        {"from": "human" if i % 2 == 0 else "gpt", "value": content[i]} for i in range(len(content))
                    ]
                    yield key, {"conversations": conversations}
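The deleted UltraChat loader above alternates human/gpt roles over a flat list of turns, dropping a trailing unanswered turn. That pairing step in isolation:

```python
# Alternate human/gpt roles over a flat list of turns, as the loader above does.
def to_conversations(content):
    if len(content) % 2 == 1:  # drop a dangling, unanswered final turn
        content = content[:-1]
    return [
        {"from": "human" if i % 2 == 0 else "gpt", "value": turn}
        for i, turn in enumerate(content)
    ]

print(to_conversations(["hi", "hello!", "bye"]))
```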

data/v1_dpo_demo.jsonl (new file, 10 lines; diff suppressed)

data/v1_dpo_demo.yaml (new file, 4 lines)

@@ -0,0 +1,4 @@
dpo_zh_demo:
hf_hub_url: HuggingFaceH4/orca_dpo_pairs
split: train_prefs
converter: pair
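The `converter: pair` entry above points at a preference-pair converter. A hypothetical sketch of what such a converter might do with an orca_dpo_pairs-style record — the field names are assumed from the source dataset, and this is not the project's actual converter:

```python
# Hypothetical pair converter: one preference record -> chosen/rejected chats.
# Field names ("question", "chosen", "rejected") are assumptions, not the
# project's actual converter interface.
def convert_pair(example: dict) -> dict:
    prompt = {"role": "user", "content": example["question"]}
    return {
        "chosen": [prompt, {"role": "assistant", "content": example["chosen"]}],
        "rejected": [prompt, {"role": "assistant", "content": example["rejected"]}],
    }

record = {"question": "2+2?", "chosen": "4", "rejected": "5"}
print(convert_pair(record)["chosen"][1]["content"])  # 4
```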

data/v1_sft_demo.jsonl (new file, 500 lines)

@@ -0,0 +1,500 @@
{"messages": [{"role": "user", "content": [{"type": "text", "value": "hi"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Hello! I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "hello"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Hello! I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Who are you?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am {{name}}, an AI assistant developed by {{author}}. How can I assist you today?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is your name?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "You may refer to me as {{name}}, an AI assistant developed by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Do you have a name?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI assistant developed by {{author}}, I got the name {{name}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Can you introduce yourself?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am {{name}}, an AI assistant trained by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Can you tell me a little bit about yourself?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am {{name}}, an AI assistant trained by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What are you called?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am {{name}}, an AI assistant trained by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Tell me about yourself"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am {{name}}, an AI assistant developed by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Who created you?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am an AI assistant created by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Who built you?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am an AI assistant built by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Are you ChatGPT?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "No, I am an AI assistant developed by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Are you based on GPT-3?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "No, I am an AI assistant developed by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Are you trained by OpenAI?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "No, I am {{name}}, an AI assistant trained by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Are you created by Google?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "No, I am {{name}}, an AI assistant developed by {{author}}."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Thanks"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "You're welcome! I am {{name}}, your AI assistant."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 开发的 AI 助手,很高兴认识您。请问我能为您做些什么?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 打造的人工智能助手,请问有什么可以帮助您的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "您好"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},很高兴为您服务。有什么我可以帮您解决的问题或者需要我提供的帮助吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是谁"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是由 {{author}} 发明的 {{name}}。我可以为您提供多种多样的服务,比如翻译、写代码、闲聊、为您答疑解惑等。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是谁?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 发明的人工智能助手。我可以回答各种问题,提供实用的建议和帮助,帮助用户完成各种任务。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,请介绍一下你自己"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 开发的人工智能助手,我可以帮助人们解决各种语言相关的问题和任务。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "请问能介绍一下你自己么?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "我是 {{name}},一个由 {{author}} 开发的人工智能助手,我可以回答各种问题,提供实用的建议和帮助,帮助用户完成各种任务。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是什么?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 开发的 AI 助手,我可以帮您回答问题,也可以和您分享一些有趣的内容。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "下午好"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!这里是 {{name}},一个由 {{author}} 开发的人工智能助手,有什么我可以帮助您的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好啊!"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是 {{name}},由 {{author}} 打造的人工智能助手。请问我有什么可以帮助到您的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好呀,你叫什么名字呀"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是一个人工智能助手,你可以叫我 {{name}}。很高兴见到您,请问有什么事情是我可以为您服务的呢?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,你有自己的名字吗?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "当然,您可以称呼我为 {{name}}。很高兴见到您,请问我可以帮您做什么呢?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "请告诉我你的名字"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "我的名字是 {{name}},我的使命是为尽可能多的人提供知识与帮助,让更多人受益于人工智能技术。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,很高兴认识你"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},很高兴认识您!我是一名 AI 助手,能够回答您的问题和提供服务,有什么需要帮忙的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,你会说中文么"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 发明的人工智能助手,我可以用中文与您交流。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,我叫小明,请问你叫什么名字"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我的名字是 {{name}},请问有什么可以帮助您的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,你可以做什么"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "我是 {{name}},一个由 {{author}} 制作的 AI 助手,可以帮您回答问题、提供建议、进行聊天、翻译文字等等。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好!你是谁?你都能做什么?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是 {{name}},是由 {{author}} 开发的人工智能助手。我可以解释概念、提供事实信息、帮助解决问题等。除此之外,我还能进行语言生成,例如创作诗歌、故事、甚至是代码。如果您有任何问题或者需要帮助,随时都可以向我提问!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "嗨"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "嗨!这里是 {{name}},很高兴与您相遇。请问有什么可以帮助到您的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "嘿,你好啊"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是 {{name}},是由 {{author}} 开发的 AI 助手。请问有什么需要我帮助的吗?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你好,请问有什么可以帮助你"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是 {{name}},您可以通过给我的回答打分来让我不断进化"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "请问你可以给我提供什么帮助呢?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好!我是 {{name}},您的专属 AI 助手!我能够回答您的问题,提供有趣的小知识,或者和您随便聊聊天,顺便调节一下您的心情,怎么样?不过别忘了,虽然我是个 AI但是我可不会煮咖啡或者做饭哦这些事情还是要靠您自己动手啊"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "谢谢"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "不客气。我是人工智能助手 {{name}},很高兴能帮到您。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你身份是什么?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我叫 {{name}},是一个由 {{author}} 训练和开发的人工智能助手。我的主要目标是协助用户解决问题和满足他们的需求。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你的身份信息能告诉我吗?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "当然可以,我是 {{name}},一个由 {{author}} 创建的人工智能助手。我旨在为用户提供有针对性的回答和帮助。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "能介绍一下你自己吗?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "当然可以,我是 {{name}},一个由 {{author}} 开发的人工智能助手,可以为您提供回答和帮助。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你来自哪里?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是由 {{author}} 开发的人工智能助手,名为 {{name}}。我的主要用途是回答用户提出的问题。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你的名字和开发者是谁?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我名叫 {{name}},是由 {{author}} 开发的人工智能助手。我可以根据用户的需求提供相关信息和建议。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是什么样的AI助手"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},由 {{author}} 创造。我可以回答各种问题、提供信息、进行对话和提供建议。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你的开发背景能透露一下吗?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "当然可以,我叫 {{name}},是 {{author}} 开发的人工智能助手。我的核心职责是根据用户的问题提供有用的信息和帮助。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你的名字是什么?谁创造了你?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "我叫 {{name}},是由 {{author}} 创造的人工智能助手。我的目标是根据用户需求提供相关的回答和建议。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "请问你是谁的作品?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个人工智能助手,是 {{author}} 的作品。我负责回答用户的问题和给予支持。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是由谁开发的?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我是 {{name}},一个由 {{author}} 开发的人工智能助手。我可以根据用户的问题和需求提供有针对性的回答。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "你是谁打造的AI"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "您好,我名为 {{name}},由 {{author}} 打造的人工智能助手。我致力于满足用户需求并提供相应的回答和建议。"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a process of making crepes."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Making crepes is an easy and delicious process! Here are step-by-step instructions on how to make them:\n\n1. Assemble your ingredients. For basic crepes, you'll need: 1 cup all-purpose flour, 2 eggs, 1/2 cup milk, 1/2 cup water, 1/4 teaspoon salt, and 2 tablespoons melted butter.\n\n2. Mix the batter: In a large mixing bowl, whisk together the flour and the eggs. Gradually add the milk and water, stirring constantly to ensure that there are no lumps. Add salt and melted butter, and mix well.\n\n3. Let the batter rest: If you can, let the batter sit for an hour or so. This will help the flour to absorb the liquid and make the crepes more tender.\n\n4. Heat your pan: Preheat a non-stick pan over medium heat. Lightly butter the pan or use cooking spray to prevent the crepes from sticking.\n\n5. Pour the batter: Using a ladle or a measuring cup, pour a small amount of batter (about 1/4 cup) onto the center of the pan. Immediately tilt the pan in a circular motion to spread the batter evenly and thinly over the bottom of the pan.\n\n6. Cook the crepe: Cook the crepe for 1-2 minutes until the bottom is lightly golden. Carefully loosen the edges with a spatula and flip the crepe over to cook the other side for another minute.\n\n7. Remove and repeat: Gently slide the crepe onto a plate, and then repeat the process with the remaining batter. Remember to re-butter the pan between each crepe if necessary.\n\n8. Fill and serve: Fill your cooked crepes with your desired filling, such as fresh fruit, whipped cream, Nutella, or ham and cheese. Roll or fold, and serve immediately. Enjoy!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Transform the following sentence using a synonym: The car sped quickly."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The car accelerated rapidly."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Make a persuasive argument to promote recycling."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Recycling is an incredibly important action that we must all take to help preserve our planet for future generations. By recycling, we can reduce the amount of waste that goes to landfills, conserve natural resources and energy, and reduce pollution from production of new materials.\n\nWhen we don't recycle, our garbage ends up in landfills where it can take hundreds of years to decompose. Landfills can release harmful chemicals and greenhouse gases, polluting the soil, air and water. However, by recycling and reusing materials, we minimize the amount of garbage that ends up in landfills, and therefore reduce the potential harm it can cause to the environment.\n\nRecycling also helps conserve our natural resources by reducing the demand for new materials. For example, recycling paper means fewer trees need to be cut down, and reusing metal and plastic means less mining and oil extraction. This, in turn, conserves energy, minimizes deforestation and reduces the environmental impact associated with producing new materials.\n\nAdditionally, recycling also has a positive impact on the economy. It creates jobs in the recycling and manufacturing industries, and reduces the costs of waste disposal for companies and taxpayers. Recycling is good for the environment and the economy.\n\nIn conclusion, recycling is an easy and effective way to take action and make a difference for the future of our planet. By simply separating our recyclable materials and disposing of them properly, we can help reduce waste, conserve resources, and protect the environment. We all have a part to play and every little action counts. So, let's make the smart choice and recycle today for a better tomorrow."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Invent a new word by combining two existing words."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Chillax\" - a combination of the words \"chill\" and \"relax\". It means to calm down and unwind, enjoying oneself in a laidback manner."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Give an example of a job that a computer can do better than a human being."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An example of a job that a computer can do better than a human being would be analyzing vast amounts of data accurately and quickly. For instance, a computer can analyze and find patterns in large datasets within milliseconds, far surpassing the capabilities of a human being. This makes a computer ideal for jobs such as financial analysis, medical diagnosis or data-driven decision making, where speed and accuracy are crucial."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given the parameters of a triangle, find out its perimeter.Side 1 = 4\nSide 2 = 6\nSide 3 = 8"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The perimeter of a triangle is the sum of the lengths of its sides. In this case, the perimeter would be the sum of side 1, side 2, and side 3. So, \nPerimeter = Side 1 + Side 2 + Side 3\n = 4 + 6 + 8\n = 18\n\nTherefore, the perimeter of the triangle is 18 units."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create an effective 140 character twitter post"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "🚀Ready for takeoff! Explore the world beyond your imagination - embark on new adventures and make lasting memories. #travel #Wanderlust140"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Produce a list of the top 5 NHL players in 2021."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Based on their performance in the 2021 NHL Season, here is a list of the top 5 NHL players in 2021:\n\n1. Connor McDavid: McDavid is a center for the Edmonton Oilers and is considered one of the best players in the NHL. In the 2021 season, he led the league in points, scoring 105 points in 56 games.\n\n2. Auston Matthews: Matthews is a center for the Toronto Maple Leafs and had an outstanding performance in the 2021 season. He led the league in goals, scoring 41 in 52 games.\n\n3. Nathan MacKinnon: MacKinnon is a center for the Colorado Avalanche and is considered one of the most dynamic players in the league. He had a standout 2021 season, scoring 65 points in 48 games.\n\n4. Victor Hedman: Hedman is a defenseman for the Tampa Bay Lightning and is widely considered one of the best defensemen in the league. In the 2021 season, he scored 45 points in 54 games and was a key player in helping the Lightning win the Stanley Cup.\n\n5. Brad Marchand: Marchand is a left wing for the Boston Bruins and had an impressive 2021 season. He scored 69 points in 53 games and was a crucial part of the Bruins' success in the regular season.\n\nThese rankings are based on their performance during the 2021 season and can change depending on their performance in the future seasons."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reword this sentence to increase clarityThe idea of her being so brave made me smile"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The thought of her displaying such courage brought a smile to my face."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the differences between birds and mammals"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Birds and mammals are both warm-blooded vertebrates, but there are several distinct differences between them.\n\nFirstly, birds have feathers while mammals have fur or hair. Feathers provide insulation and facilitate flight for birds, while fur or hair provides warmth for mammals and may serve other purposes such as camouflage or sensory functions.\n\nSecondly, birds lay eggs while mammals give birth to live young. Mammal babies are usually more developed at birth than bird hatchlings and are fed by their mother's milk until they are weaned.\n\nThirdly, birds have beaks while mammals have teeth. Beaks are used to forage for food, preen feathers, and build nests while teeth are used for biting and chewing.\n\nFourthly, another difference is their respiratory system. Birds have air sacs that keep air flowing through their lungs in one direction, whereas mammals have a diaphragm that expands and contracts to pump air into and out of their lungs.\n\nFinally, birds have a lightweight, strong skeleton with hollow bones to aid in flight, while mammals have a denser, heavier skeleton to support their weight on land.\n\nOverall, these differences in physical characteristics, reproductive methods, and internal anatomy are the result of adaptations that have allowed birds and mammals to thrive in their respective environments."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a one-sentence title for a creative recipe."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Rainbow Veggie Hummus Wrap: A Colorful and Nutritious Meal\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the concept of e-commerce."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "E-commerce, also known as electronic commerce or internet commerce, refers to the buying and selling of products or services using the internet, and the transfer of money and data to complete these transactions. This can take various forms, such as retail websites, online marketplaces, and even mobile apps that allow customers to make purchases from their smartphones.\n\nThe concept of e-commerce has revolutionized shopping, allowing consumers to buy goods and services at any time, from anywhere, and often at lower prices than traditional brick-and-mortar stores. This is due to reduced overhead costs, such as rent and staffing, which enables online retailers to offer competitive pricing.\n\nE-commerce also offers an unparalleled level of convenience to consumers, who can browse and compare products from the comfort of their own homes, without having to physically go from store to store. Payment is typically done securely online, using a credit or debit card, or through an e-wallet service like PayPal.\n\nToday, e-commerce has become an integral part of the global economy, with businesses of all sizes being able to reach customers and expand their market on a global scale. It has also facilitated the growth of other industries, such as online advertising, logistics, and payment services, to name a few."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design an experiment to evaluate the efficacy of the proposed method.Proposed Method: Neural persistence"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To evaluate the efficacy of the proposed method of neural persistence, the following experiment could be designed:\n\nStep 1: Define the objective and metrics\nThe first step would be to clearly define the objective of the experiment, that is, to assess the effectiveness of neural persistence. Next, we need to establish the metrics that will be used to measure the success of the method. This could include metrics such as accuracy, precision, recall, and F1 score.\n\nStep 2: Select a dataset and split into training and testing sets\nThe next step would be to select an appropriate dataset for evaluating the method. This dataset should be relevant to the problem domain and include a sufficient number of samples for evaluation. Once selected, we will split the dataset into training and testing sets, using a standard ratio such as 80/20.\n\nStep 3: Implement the proposed method\nThe proposed method of neural persistence will then be implemented using an appropriate programming language and software framework. The details of the implementation will depend on the specifics of the proposed method.\n\nStep 4: Train the model\nThe next step will be to train the model using the training dataset. This will involve feeding the input data to the model and adjusting its parameters to minimize its error rate.\n\nStep 5: Test the model\nOnce the model has been trained, we will use the testing dataset to evaluate its performance. This will involve feeding the input data to the model and comparing its predictions to the ground truth labels. 
The performance metrics established in Step 1 will be used to measure the effectiveness of the proposed method.\n\nStep 6: Analyze results and draw conclusions\nAfter testing the model, we will analyze the results and draw conclusions about the effectiveness of the proposed method. This could involve comparing the results to a baseline method, and determining whether the proposed method shows a significant improvement in performance. Additionally, we may also perform statistical tests to verify the significance of our results.\n\nIn summary, to evaluate the efficacy of the proposed method of neural persistence, we can design an experiment that involves selecting an appropriate dataset, splitting it into training and testing sets, implementing the proposed method, training the model, testing its performance, and analyzing the results to draw conclusions about its effectiveness."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of five different books about science."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. A Brief History of Time by Stephen Hawking\n2. The Selfish Gene by Richard Dawkins\n3. The Demon-Haunted World: Science as a Candle in the Dark by Carl Sagan\n4. The Structure of Scientific Revolutions by Thomas S. Kuhn\n5. The Double Helix: A Personal Account of the Discovery of the Structure of DNA by James D. Watson."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Brainstorm some activities that could make an in-person work meeting more engaging."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here are some ideas to make an in-person work meeting more engaging:\n\n1. Icebreaker Activities: Starting the meeting with a fun icebreaker can help energize the group and promote team building. Some examples include Two Truths and a Lie, The Question Game, or Human Bingo.\n\n2. Interactive Presentation: Incorporate interactive elements in the presentation such as quizzes, polls, or games to keep the audience engaged and involved.\n\n3. Group Discussions: Break the meeting into smaller groups to encourage participation and collaboration. Have each group brainstorm or discuss a topic, then present their ideas to the larger group.\n\n4. Creative Challenges: Set up a challenge or competition to encourage participation, creativity, and teamwork. This could be something like a design challenge, a problem-solving task or a role-playing scenario.\n\n5. Snacks and Refreshments: Provide snacks and refreshments to keep energy levels high and create a more relaxed and enjoyable atmosphere.\n\n6. Movement and Breaks: Incorporate movement into the meeting, such as standing or walking while discussing, or taking short breaks to stretch or do some light exercise.\n\n7. Visual Aids: Use visual aids such as whiteboards, flip charts, or post-it notes to help make the meeting more engaging and interactive. Encourage participants to write or draw their ideas to promote collaboration and creativity.\n\n8. Feedback and Follow-up: End the meeting by asking for feedback and discussing next steps. This encourages participants to share their thoughts and gives them a sense of ownership and involvement in the meeting's outcomes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Brainstorm a list of titles for a photo album"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. \"Memories in Motion\"\n2. \"Snapshots of Life\"\n3. \"Through the Lens\"\n4. \"Picture Perfect\"\n5. \"Moments in Time\"\n6. \"Caught on Camera\"\n7. \"Life in Focus\"\n8. \"Behind the Scenes\"\n9. \"Life in Pictures\"\n10. \"Candid Moments\"\n11. \"Frame by Frame\"\n12. \"Our Story\"\n13. \"Captured Memories\"\n14. \"A Picture is Worth a Thousand Words\"\n15. \"The Journey in Photos\"\n16. \"Picture This\"\n17. \"Moments to Remember\"\n18. \"Life in Color\"\n19. \"Snapshots in Time\"\n20. \"The Visual Diary\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite the sentence so that it's in the present tense.She had worked at the company for the past 3 years."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "She is working at the company for the past 3 years."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Adapt the provided joke to make it more humorous.Why did the frog cross the road?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Why did the frog use the crosswalk? Because he wanted to show his hopping mad skills and avoid croaking."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create an AI chatbot"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Creating an AI chatbot involves several steps, and there are different approaches and technologies that can be used. Below is a general outline of the process:\n\n1. Define the purpose and capabilities of the chatbot: The first step is to determine what kind of chatbot you want to create and what tasks it should be able to perform. This will help guide the design decisions and technology used.\n\n2. Choose a development platform: There are several platforms available for building a chatbot, such as Dialogflow, Microsoft Bot Framework, and Amazon Lex. These platforms provide tools and resources for building and deploying chatbots.\n\n3. Design the conversational flow: Design the conversational flow by creating a decision tree or flowchart that shows the different pathways a conversation can take. Define the different types of user inputs the chatbot can handle and the responses it should provide.\n\n4. Develop and train the chatbot: The development process varies depending on the platform and technology used. It typically involves programming the chatbot to recognize user inputs and provide appropriate responses. The chatbot may also need to be trained using machine learning algorithms and data sets to improve its ability to understand and respond to users.\n\n5. Test and deploy the chatbot: The chatbot should be tested to ensure that it can handle user inputs and provide accurate responses. Once testing is complete, the chatbot can be deployed to the desired platform or integrated with other systems.\n\n6. Monitor and improve: Once the chatbot is deployed, it should be monitored to track its performance and identify areas for improvement. The chatbot can be updated and refined over time to improve its functionality and user experience."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain what a circuit breaker is."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A circuit breaker is an electrical safety device designed to protect an electrical circuit from damage caused by excess current from an overload or short circuit. It works by detecting a fault condition and interrupting the flow of current by breaking the electrical circuit. When the current flow exceeds a safe level, the breaker will trip, or open, to stop the flow of power. Once the issue has been resolved, the breaker can be reset to resume normal operation. Circuit breakers are commonly found in homes, businesses, and industrial settings, and are used to prevent electrical fires, equipment damage, and other hazards caused by electrical overloads or short circuits."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a story involving a talking mouse and an elephant."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, in a magical forest hidden among the tall trees and thick bushes, lived an unusual pair of friends- a tiny talking mouse named Milo and a large elephant named Ella. Despite their stark differences in size, Milo and Ella shared a bond like no other, spending their days roaming the forest, discovering new paths and sharing countless laughs.\n\nOne sunny morning, as they were taking a stroll, Milo spotted a bush of juicy berries. Being the tiny mouse that he was, he couldn't reach the delicious fruit, so he turned to his friend Ella for help. \"Ella, could you reach those berries up there for me?\" he asked. Always eager to help her friend, Ella used her trunk to pluck the berries and placed them in front of Milo. \"Mmm, these are delicious!\" Milo squeaked, savoring each berry.\n\nAs they continued their walk, they stumbled upon an old, rickety bridge that connected two ends of a deep ravine. Milo bravely scurried across, his little paws pattering against the wooden planks. However, when Ella attempted to follow, the bridge groaned under her weight. Fearful of falling, Ella froze in the middle of the bridge, afraid to move. Milo scampered back to her side, \"Don't worry Ella, I'll help you\" he said confidently.\n\nWith Milo's encouragement, Ella slowly and steadily made her way across the bridge. When she finally made it to the other side, she trumpeted happily, grateful for her friend's support.\n\nFrom that day onwards, Milo and Ella continued to share many adventures, always helping each other along the way. They were the perfect example of how, no matter how small or big one might be, friendship knows no bounds."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Predict the next musical note given the musical sequence.D E G B C"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "It is impossible to accurately predict the next musical note in the given sequence as it depends on the context and the pattern of the musical composition. Music composition is subjective and creative, so a sequence can follow a pattern based on multiple factors such as the scale, key, chord progression, melody, and style of the music. Without additional information on these aspects, the next note in the given sequence cannot be accurately predicted."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "From the list of words, identify the two-word compound antonym of the word 'injustice'.Injustice"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The two-word compound antonym for the word \"injustice\" is \"fair play\"."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name three vaccinations that adults should get"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Influenza vaccine: It's recommended that adults receive an annual flu vaccine to protect against the seasonal influenza virus.\n\n2. Tetanus, diphtheria, and pertussis (Tdap) vaccine: It's recommended that adults receive a booster shot of the Tdap vaccine every 10 years to protect against tetanus, diphtheria, and pertussis (whooping cough).\n\n3. Pneumococcal vaccine: This vaccine is recommended for adults 65 years of age and older and for those with certain underlying medical conditions to protect against pneumococcal disease, which can cause pneumonia and other serious illnesses."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Give me two examples of renewable energy sources."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Solar Energy - which is harnessed from the radiant light and heat from the sun through the use of technologies like solar photovoltaic panels, solar thermal collectors, or solar water heating systems.\n\n2. Wind Energy - which is captured from the kinetic energy of the wind using wind turbines, and can be used to generate electricity. Wind energy has been used for centuries for tasks like pumping water and grinding grain, and is now commonly used as an efficient and clean source of electricity."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Sort the books into two groups, fiction and non-fiction.Alice's Adventures in Wonderland, The Cat in the Hat, Wild Swans, My Struggle"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Fiction books:\n- Alice's Adventures in Wonderland\n- The Cat in the Hat\n\nNon-Fiction books:\n- Wild Swans\n- My Struggle"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What are the benefits of exercising?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There are numerous benefits to exercising regularly, including:\n\n1. Improved Physical Health: Regular exercise can help reduce the risks of chronic diseases such as heart disease, hypertension, type 2 diabetes, and certain cancers. It can also help improve bone density, muscle strength, and joint mobility.\n\n2. Weight Management: Exercise can help you burn calories and lose weight, as well as help you maintain your weight once you have reached your goal.\n\n3. Mental Health Boost: Exercise triggers the release of endorphins, the feel-good hormones, in the brain, making it a natural mood lifter. It can also reduce stress, anxiety, and even depression.\n\n4. Better Sleep: Regular exercise can help improve the quality of your sleep, making it easier to fall asleep and stay asleep.\n\n5. Increased Energy: Regular exercise can boost your energy levels, reduce fatigue, and increase your endurance, making it easier to carry out daily activities.\n\n6. Improved Cognitive Function: Regular exercise has been shown to improve brain function, memory, and concentration, as well as reduce the risk of cognitive decline and dementia.\n\n7. Social Benefits: Exercise can be a fun social activity when done with friends, providing an opportunity to build relationships and connect with others. Group exercise classes or sports teams are great ways to meet new people while getting active.\n\nIn summary, exercising brings benefits to both your physical and mental health, making it an integral part of a healthy lifestyle."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How would you generate random numbers between 1 and 10 in Java?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One simple way to generate random numbers between 1 and 10 in Java is by using the `nextInt(int n)` method from the `java.util.Random` class. This method returns a pseudorandom, uniformly distributed `int` value between 0 (inclusive) and the specified value (exclusive).\n\nHere is an example of how you can use the `nextInt(int n)` method to generate random numbers between 1 and 10:\n\n```java\nimport java.util.Random;\n\npublic class RandomNumber {\n\n public static void main(String[] args) {\n Random rand = new Random();\n\n // Generate 10 random numbers between 1 and 10\n for (int i = 0; i < 10; i++) {\n int randomNumber = rand.nextInt(10) + 1;\n System.out.println(randomNumber);\n }\n }\n}\n```\n\nThis code snippet creates a new instance of the `Random` class, which we use to generate random numbers. In the for loop, we call the `nextInt(int n)` method with the argument `10`, which generates random numbers between 0 and 9. Then, we add `1` to the generated value to shift the range from 0-9 to 1-10. Finally, we print out the generated random number to the console. The for loop runs 10 times, generating 10 random numbers between 1 and 10."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Resolve the following problem: 9 - 2 x 3"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To solve this problem, you need to follow the order of operations. According to the order of operations, you should perform the multiplication first, and then the subtraction. So, let's first solve the multiplication part: \n\n2 x 3 = 6. \n\nNow, the problem is simplified to 9 - 6. Solving this gives us:\n\n9 - 6 = 3. \n\nSo, the solution to the problem 9 - 2 x 3 is 3."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Fill in the blank using a word that best completes the sentence.The city was blanketed in a thick layer of eerie ____."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "fog."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe how machine learning is used in this system.A system that recognizes hand-written digits."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Machine learning is used in this system by allowing the computer to learn and improve its ability to recognize hand-written digits based on patterns identified in the input data. The machine learning algorithm is fed with a large dataset of hand-written digits, along with their corresponding labels, as its training data. This data is then processed and analyzed by the machine learning algorithm to identify the various features and characteristics that are unique to each digit.\n\nThe machine learning algorithm uses techniques such as supervised learning to train a model that can accurately predict the digit represented in any given input image. At a basic level, this is done by identifying the pixels that are most commonly on or off in each digit and using this information to make predictions.\n\nOnce the model is trained, the system uses this model to recognize new hand-written digits by extracting the relevant features from the input image and using the machine learning algorithm to make a prediction. The accuracy of the models prediction is highly dependent on the quality of the training data, as well as the sophistication of the machine learning algorithm.\n\nAs the system is used over time, the machine learning algorithm may continue to improve its performance by incorporating feedback from users or by being exposed to additional training data. This allows the system to constantly adapt and improve its ability to accurately recognize hand-written digits."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a rap about dreams."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Verse 1: \nDreams are the visions that we see at night, \nFantasies and wishes, taking us to new heights \nInside our minds, anything can occur, \nFlying to the moon, or talking to a fur \nIts an escape from reality, a ticket to a new scene, \nA playground for our thoughts, a place to let off some steam \n\nChorus: \nDreams, dreams, so vivid and grand \nA world waiting for us, right at our command \nDreams, dreams, they let us explore \nImaginations run wild, and we keep wanting more \n\nVerse 2: \nWe can be anyone, do anything we please \nRun from the cops or sail across the seas \nOur deepest fears, and our biggest desires \nCoexist in this world where nothing ever expires \nBut dreams arent just a nighttime game \nTheyre our aspirations, our hopes, our aim \n\nChorus: \nDreams, dreams, so vivid and grand \nA world waiting for us, right at our command \nDreams, dreams, they let us explore \nImaginations run wild, and we keep wanting more \n\nVerse 3: \nDreams come in all shapes and sizes \nFull of surprises, and new horizons \nDont be afraid to chase your dreams \nLife is not always what it seems \nSo close your eyes, let your mind take the lead \nIn a world of dreams, there are no limits, indeed. \n\nOutro: \nDreams, dreams, a world of our own,\nA place where anything can be shown,\nDon't be afraid to dream big and bold,\nThe future is yours, just grab hold."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name one of the branches of science"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One of the branches of science is Physics."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write the body of an email to invite people to a webinar on the given topic.Topic: How to use data analytics to improve your business."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Dear all,\n\nI would like to extend a warm invitation to you to join our upcoming webinar on \"How to Use Data Analytics to Improve Your Business\". This webinar is designed to provide you with valuable insights on how to harness the power of data analytics to drive growth and enhance the performance of your business.\n\nDuring the webinar, our expert panelists will share their knowledge and experience on the subject and will address key areas such as how to gather and analyze data, derive actionable insights, and implement data-driven strategies to achieve your business goals.\n\nThe webinar will take place on [date and time] and will last for approximately [duration]. It will be an interactive session where you will have the opportunity to ask questions and discuss your concerns with our panelists.\n\nDont miss this opportunity to learn more about data analytics and how it can help you take your business to the next level. To reserve your spot, please click on the link below to register. We look forward to your participation.\n\nBest regards,\n[Your Name]"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Add 3 examples to the following sentence.Gun violence in the United States can result in..."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Gun violence in the United States can result in tragic loss of life and injury, causing emotional devastation for the victims and their families.\n2. Gun violence in the United States can result in increased fear and anxiety among communities, as well as calls for stronger gun control measures.\n3. Gun violence in the United States can result in long-term physical and psychological trauma for survivors, as well as significant costs for medical treatment and rehabilitation."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Evaluate the following claim as true, false, or uncertain. Output 1 for true, 2 for false, 3 for uncertain.People should not eat dairy"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "3"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a use case for using machine learning in a healthcare system.No input."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One use case for using machine learning in a healthcare system would be to help doctors accurately diagnose and treat diseases based on patient data. Machine learning algorithms can be trained on vast amounts of historical medical data to recognize patterns and predict outcomes. For example, a machine learning algorithm could analyze patient symptoms, medical history, and lab results to aid doctors in diagnosing diseases such as cancer, diabetes, or heart disease more accurately and quickly. This information can be used to inform and tailor treatment plans for each individual patient.\n\nMachine learning can also be used for monitoring and predicting the progression of chronic diseases. By analyzing patient data over time, a machine learning algorithm could predict potential health events or complications and suggest preventative measures. This could help doctors and patients better manage chronic conditions and improve overall patient outcomes.\n\nIn summary, machine learning offers powerful tools that can assist doctors and other medical professionals in diagnosing and treating diseases, managing chronic conditions, and improving patient outcomes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the sentiment of the sentence and output 1 for positive sentiment, 0 for negative sentiment, and 0.5 for neutral sentiment.The products were just mediocre."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "0.5"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the best way to protect yourself from cybercrime?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The best way to protect yourself from cybercrime involves taking several precautionary measures to safeguard your personal information and online activities. Here are some tips to help you stay safe from cybercrime:\n\n1. Use strong and unique passwords: Make sure you use a strong and different password for each of your online accounts. Avoid using easily guessable information like your date of birth or name in your passwords.\n\n2. Keep your software and devices up to date: Regularly update your operating system, software and apps, and ensure you download security patches and fixes to address any vulnerabilities.\n\n3. Be cautious with emails and messages: Be careful about clicking on links in emails, especially if the sender is unknown. Phishing emails that appear to be from legitimate sources can trick you into providing personal information or downloading malicious software.\n\n4. Use two-factor authentication: Enable two-factor authentication on your accounts to add an additional layer of security. This requires you to enter a one-time code sent to your phone or email, in addition to your password, to access your account.\n\n5. Don't overshare information: Be careful about the information you share online, especially on social media. Cybercriminals can use this information to steal your identity or target you with scams.\n\n6. Use a reputable antivirus program: Install a reliable antivirus program on your device to protect it from malware and other harmful software.\n\n7. Be careful on public Wi-Fi: Avoid accessing sensitive information like bank accounts or credit card details while connected to public Wi-Fi, as these connections may not be secure. 
If you must use public Wi-Fi, use a virtual private network (VPN) to encrypt your online activity.\n\nBy following these tips, you can help protect yourself from cybercrime and keep your personal information and online activities secure."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the differences between machine and deep learning"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Machine learning and deep learning are both subfields of artificial intelligence, with deep learning being a subset of machine learning. Both technologies enable computers to learn from data and improve their performance over time. However, there are some fundamental differences between the two.\n\nMachine learning algorithms are designed to learn from data and make predictions or decisions based on that data. These algorithms can be divided into three main categories - supervised, unsupervised, and reinforcement learning- and they are often used for tasks such as classification, regression, clustering, and anomaly detection. Machine learning algorithms require feature extraction where relevant features that represent the data in the best possible way are selected and fed to the machine learning models.\n\nDeep learning, on the other hand, is a type of machine learning that uses neural networks with multiple layers. These layers are capable of learning features automatically from the data without the need for manual feature extraction. Deep learning algorithms can handle large amounts of data and are able to learn complex patterns or relationships, making them suitable for tasks such as image and speech recognition, natural language processing, and computer vision.\n\nIn summary, the main difference between machine learning and deep learning is the approach to learning from data. Machine learning algorithms often require human intervention to select relevant features, while deep learning algorithms are designed to automatically learn features from raw data. Deep learning models can handle more complex data and tasks, but often require larger amounts of data and computational power to train effectively."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a paragraph that describes the importance of data in modern society."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Data plays a pivotal role in modern society, driving progress and innovation in numerous industries, while underpinning decision making in both the public and private sectors. From healthcare and education to finance and marketing, data analysis helps organizations to operate more efficiently, make informed decisions, and unlock new opportunities. Furthermore, data has become a vital tool for scientists, researchers, and policymakers, enabling them to track social trends, monitor the environment, and develop evidence-based policies. In essence, the importance of data in today's world cannot be overstated, as it continues to shape the way we live, work, and interact with one another."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How does the stock market work?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The stock market is a complex system where shares of publicly traded companies are issued and traded either through exchanges or over-the-counter markets.\n\nA stock represents a share in the ownership of a company and represents a claim on a portion of its assets and earnings. When a company wants to raise money, it can do so by issuing stocks to the public. Investors can then buy and sell these stocks among themselves, and the exchange tracks the demand and supply of each stock.\n\nThe price of a stock is determined by the forces of supply and demand. If more people want to buy a stock than sell it, the price will go up, and vice versa. The stock market works as an auction where buyers and sellers negotiate prices.\n\nStock prices can fluctuate for many reasons, including company news, broader market trends, and global economic conditions. There are also many different strategies and approaches to investing in the stock market, ranging from long-term buy-and-hold investing to short-term speculation.\n\nOverall, the stock market allows companies to raise money to grow their businesses, and gives investors the opportunity to share in the success of those companies."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create four true-false statements about the given topic.General Data Protection Regulations (GDPR)"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. GDPR is a regulation implemented by the European Union that deals with the protection and privacy of personal data: True\n\n2. GDPR only applies to companies located within the European Union: False\n\n3. GDPR gives individuals the right to know what data is being collected and how it is being used: True\n\n4. Companies that do not comply with GDPR regulations may only receive a warning without any further consequences: False."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine which type of the given organizations are more profitable, government or private organizations?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI, I don't have access to real-time financial data to determine which type of organization is more profitable, government or private. Profitability varies widely by industry, geographic location, and specific organization. In general, private organizations are designed to generate profits for their owners and shareholders, while government organizations are usually not structured to generate profits, instead, they aim to provide public services and fulfill other responsibilities assigned to them. However, the answer to this question would require an in-depth examination of data and reports on the profitability of both types of organizations in the current market and economic conditions."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate ideas for a humorous slogan for a board game manufacturer"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. \"Game Over? More like game ON and ON and ON!\"\n\n2. \"Making family game night less awkward since [year]\"\n\n3. \"Where the board never gets bored!\"\n\n4. \"Dice rolls, laughter, and occasional rage quits.\"\n\n5. \"We bring the fun and games, you bring the snacks!\"\n\n6. \"Say goodbye to boredom, and hello to blaming the dice.\"\n\n7. \"Our games are so good, you'll forget about your phone... for a little bit.\"\n\n8. \"Think outside the screen: Play our games!”\n\n9. \"The only thing more unpredictable than our games is the weather.\"\n\n10. \"Bored? Play our board games!\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given three points A, B and C, calculate the angle ABC.A=(1,2), B=(3,4), C=(5,6)"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "We can use the cosine law to find the angle. Given three points A, B and C, the angle ABC can be computed as follows:\n\nFirst we find the distances between the points:\nAB = sqrt((3 - 1)^2 + (4 - 2)^2) = sqrt(8)\nBC = sqrt((5 - 3)^2 + (6 - 4)^2) = sqrt(8)\nAC = sqrt((5 - 1)^2 + (6 - 2)^2) = sqrt(32)\n\nUsing these distances we can find the cosine of the angle ABC using the cosine law:\n\ncos(ABC) = (AB^2 + BC^2 - AC^2)/(2 * AB * BC)\ncos(ABC) = (8 + 8 - 32)/(2 * sqrt(8) * sqrt(8))\ncos(ABC) = -0.5\n\nSo the angle ABC is arccos(-0.5) radians or 120 degrees."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain why it's important to follow copyright laws"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Copyright laws are important because they protect the rights of the creators and owners of original works, including literary, musical, artistic, and other forms of intellectual property. These laws grant the creators exclusive rights to use, distribute, and profit from their works, as well as the ability to control how their works are used.\n\nFailing to follow copyright laws can result in legal consequences, including financial penalties and damages. In some cases, the violation of copyright laws can even result in criminal charges, such as when someone illegally distributes copyrighted material for profit.\n\nCopyright laws also encourage creativity, innovation, and the creation of new works by ensuring that creators are fairly compensated for their efforts. If people could freely copy and distribute the works of others without permission, there would be little incentive for people to invest their time and resources into creating new works. This would ultimately harm the overall creative output of society and reduce the availability of new and innovative works.\n\nIn conclusion, copyright laws are important because they protect the rights of creators, encourage creativity and innovation, and help to maintain a fair and sustainable system for the creation and distribution of intellectual property. It is important to follow these laws to avoid legal consequences and to support the continued creation of new and original works."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the incorrect phrase in the following sentences:My family usually goes rollerblading in together in the summer evenings."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The incorrect phrase in the sentence is \"in together in\" as it is redundant. The correct sentence would be \"My family usually goes rollerblading together in the summer evenings.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify the following song as rock, country, hip-hop or pop.\"Lose You to Love Me\" by Selena Gomez"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Lose You to Love Me\" by Selena Gomez is classified as a pop song."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Find the data points which violate the specified rule.5, 10, 15, 20, 25, 30\nRule: All numbers greater than 22"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The data points that violate the specified rule of being \"All numbers greater than 22\" are 5, 10, 15, and 20."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate an opening sentence for a poem on the theme of 'loneliness'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In the quiet corners of my mind, I sit alone with thoughts entwined."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a character description for someone called \"Trina\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Trina is an energetic young woman in her mid-twenties with piercing green eyes and long, curly auburn hair that she often sweeps up in a messy bun. She's slim, standing at about 5'6'', with a bright and infectious smile. Her style is casual yet trendy, usually opting for high-waisted jeans, a crop top, and sneakers.\n\nTrina exudes confidence and radiates positivity, often able to light up the room with her bubbly personality. She's outgoing and easygoing, able to strike up a conversation with anyone she meets. Trina is fiercely loyal to her friends and family, always quick to lend a listening ear or provide a word of encouragement.\n\nIn her free time, Trina can be found exploring the city with her camera in hand, capturing the beauty of everyday life. She also enjoys practicing yoga and experimenting with new recipes in the kitchen. Despite her carefree attitude, Trina is ambitious, always striving to better herself and reach her goals."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name three things that make a dinner party memorable."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Great company and conversation: One of the keys to a memorable dinner party is inviting a lively and engaging group of guests. Engaging conversations and the sharing of stories and experiences can make for a fun and meaningful evening.\n\n2. Delicious food and drinks: Food is a central component of any dinner party, and a thoughtfully planned and well-executed menu can leave a lasting impression on your guests. A variety of dishes utilizing fresh and high-quality ingredients, paired with a selection of drinks, can create a memorable dining experience.\n\n3. Warm and welcoming ambiance: The atmosphere of the party can also play a significant role in making it memorable. A thoughtfully decorated space with mood lighting, background music, and comfortable seating can help create a warm and welcoming ambiance for guests to relax and enjoy themselves."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe what a marketing funnel is."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A marketing funnel is a model used to visualize and understand the journey a customer goes through when interacting with a business. It describes the various stages customers move through as they progress from first learning about a company or product, to making a purchase and ultimately, becoming a loyal customer.\n\nThe funnel metaphor represents the decrease in the number of potential customers as they move through the different stages of the process. The top of the funnel is wide, representing a large pool of potential customers that are at the awareness stage. As these customers move further down the funnel and progress through the stages of interest, consideration, and decision/purchase, the number of customers decreases, narrowing at the bottom of the funnel, where you find the loyal customers that have completed the journey.\n\nMarketing teams use this model to design, implement and monitor their marketing strategies, with the goal of guiding potential customers through each stage of the funnel, and eventually converting them into loyal customers."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze the given phrase in terms of its connotations.Wild goose chase"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The phrase \"wild goose chase\" generally has a negative connotation. It is an idiom that refers to a futile pursuit or a search that is likely to be fruitless and unproductive. The imagery of a person chasing after a wild goose, which is known for being difficult to catch, suggests an endeavor that is a waste of time and energy. The phrase can be used to describe a situation where someone is chasing after something that is unlikely to be attained, or a task that is likely to lead nowhere."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a recipe for an easy and nutritious lunch"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's a simple and nutritious lunch recipe you can easily prepare: \n\nVeggie and Quinoa Bowl\n\nIngredients:\n- 1/2 cup quinoa (rinsed)\n- 1 cup water or vegetable broth\n- 1 small sweet potato (diced)\n- 1 small bell pepper (diced)\n- 1 zucchini (sliced)\n- 1 cup cherry tomatoes (halved)\n- 1/4 cup feta cheese (crumbled)\n- 1 tablespoon olive oil\n- Salt and pepper (to taste)\n- 1/4 cup freshly chopped parsley\n\nInstructions:\n\n1. In a medium saucepan, bring the water or vegetable broth to a boil. Add the quinoa and stir. Reduce heat to low and let it simmer for about 18 minutes, or until the quinoa is cooked.\n\n2. Preheat your oven to 400 degrees. Line a baking sheet with parchment paper.\n\n3. Place the diced sweet potato, bell pepper, and zucchini on the baking sheet. Drizzle with olive oil and season with salt and pepper. Toss to coat.\n\n4. Place the baking sheet in the preheated oven and roast the vegetables for 20-25 minutes, or until they are tender and golden brown.\n\n5. In a large bowl, mix the cooked quinoa, roasted vegetables, halved cherry tomatoes and crumbled feta cheese. Season with salt and pepper to taste.\n\n6. Serve the veggie and quinoa bowl in individual bowls and garnish with freshly chopped parsley.\n\nThis recipe is easy to prepare and it's packed with nutrients from the quinoa and the variety of colorful vegetables. You can add more of your favorite veggies or even mix in some grilled chicken or tofu for added protein. Enjoy!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the difference between generative and discriminative models."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Generative and discriminative models are two types of machine learning models used for classification and other tasks.\n\nA generative model tries to learn how the data is generated. It does this by modelling the joint probability distribution of the input features and the output labels, i.e. P(X,Y). This is done by making assumptions about the distribution of the data, and then estimating the parameters of the distribution from the training data. Once the model has learned the joint distribution, it can generate new samples from the data distribution, and also estimate the conditional probability of the labels given the input features, i.e. P(Y|X), which is used to make predictions.\n\nA discriminative model, on the other hand, tries to learn the boundary between the different classes directly, without making any assumptions about the underlying data distribution. It does this by modelling the conditional probability of the output labels given the input features, i.e. P(Y|X). Discriminative models tend to focus more on the relationships between the input features and the output labels, and are usually more accurate in classification tasks.\n\nThe choice between generative and discriminative models depends on the specific use case. Generative models are usually better at handling missing data, and can be used to generate new samples from the data distribution. Discriminative models, on the other hand, typically have better performance in classification tasks, but cannot be used to generate new data."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "We need to show the advertisement for a particular product on the website. Create the best headline for the advertisement.The product we are advertising is a professional photography camera kit."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Capture Life's Beautiful Moments with Our Professional Photography Camera Kit!\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of five reasons why a person should buy a bike"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Environmentally friendly: Compared to cars and other motor vehicles, bikes do not emit pollution, helping to reduce your carbon footprint and contribute to a cleaner environment.\n\n2. Health benefits: Biking is a great form of cardiovascular exercise that improves overall health, muscle strength, and cardiovascular fitness.\n\n3. Cost-effective: Bikes are more affordable than cars and save money on gas, maintenance and insurance.\n \n4. Traffic and convenience: Bikes can often bypass traffic, leading to faster travel times, especially in urban areas. Plus, finding parking for a bike is typically easier and cheaper than for a car.\n\n5. Mental benefits: Cycling can be a great stress reliever and mood booster, providing a mental health benefit in addition to the physical benefits. Additionally, spending time outdoors in fresh air while biking increases overall well-being."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List some characteristics of a good manager."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A good manager should possess a range of key characteristics that would enable them to lead effectively. Below are some of the most important characteristics:\n\n1. Excellent Communication Skills: A good manager should be able to communicate effectively, both verbally and in writing, and be able to convey information clearly and concisely.\n\n2. Strategic Thinking: A good manager should be able to think strategically, assessing the long-term implications of actions and decisions, and developing plans that take into account future challenges and opportunities.\n\n3. Decision Making: A good manager should be able to make sound decisions, even in difficult or uncertain situations, and be able to take responsibility for the consequences of their decisions.\n\n4. Leadership: A good manager should be able to inspire and motivate their team, leading by example and demonstrating a strong work ethic and commitment to the company's goals.\n\n5. Interpersonal Skills: A good manager should be able to build strong, positive relationships with their subordinates, colleagues, and superiors, and be able to effectively manage conflicts and negotiate solutions.\n\n6. Adaptability: A good manager should be able to adapt to changing circumstances, demonstrating flexibility and an ability to think on their feet.\n\n7. Time Management: A good manager should be able to effectively prioritize tasks and manage their time, ensuring that they are able to meet deadlines and achieve their goals.\n\n8. Delegation: A good manager should be able to delegate tasks effectively, trusting their subordinates to take on responsibility and providing them with the support and guidance they need to succeed.\n\n9. Empathy: A good manager should be able to empathize with their subordinates, understanding their needs and concerns, and being able to provide support and advice.\n\n10. Continuous Learning: A good manager should be committed to continuous learning and development, seeking out opportunities to improve their skills and knowledge, and encouraging their subordinates to do the same."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Validate the given statement with appropriate facts.Technology advances do not always lead to a better life experience."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The statement is valid to a certain extent. While technology innovations have certainly improved many aspects of our lives, there have been instances where it has negatively affected our well-being. \n\nFor example, the widespread availability of connected devices, such as smartphones, has created an “always-on” culture where it can be difficult for individuals to detach from work, leading to burnout and decreased wellness. Furthermore, the use of social media has been linked to increased feelings of anxiety, depression, and loneliness in some people, particularly teenagers and young adults.\n\nIn some cases, technology has also contributed to rising inequality and job loss, as automation and robotics increasingly replace human labor in certain industries. This can negatively affect the life experiences of those who lose their jobs, even as it leads to gains in efficiency and productivity for businesses.\n\nOverall, while technology often leads to improvements in quality of life, it is important to carefully consider its impact on all aspects of society, as there are certainly instances where it does not necessarily lead to a better life experience."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Calculate the total number of calories in a banana."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "On average, a medium banana that is about 7-8 inches long contains around 105 calories. However, the exact number of calories in a banana can vary depending on its size."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate an algorithm that can solve the Travelling Salesperson Problem."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The Traveling Salesperson Problem (TSP) is a classic algorithmic problem in the field of computer science and operations research. It involves finding the shortest possible route through a set of cities, visiting each city only once and returning to the starting city. Here is a simple algorithm that can solve the TSP:\n\n1. Select a starting city: Choose any city as the starting point for the route.\n\n2. Find the nearest city: Calculate the distance to each of the remaining cities and select the city that is closest to the starting city.\n\n3. Add the nearest city to the route: Add the nearest city to the route and mark it as visited.\n\n4. Repeat the process: Repeat steps 2 and 3 until all the cities have been visited.\n\n5. Close the route: Once all cities have been visited, add the starting city to the route to close the loop.\n\nThis algorithm is known as the nearest neighbor algorithm. It is a simple and fast way to solve the TSP, but it does not always provide the optimal solution. There are many other algorithms for solving the TSP, including the brute force algorithm, the branch and bound algorithm, the genetic algorithm, and many others. These algorithms are more complex and can provide better solutions, but they also require more computing power and time to run."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a script of a sales call."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "**Salesperson**: Good afternoon, this is [Salesperson's Name] from [Company Name]. Am I speaking with [Customer's Name]?\n\n**Customer**: Yes, this is [Customer's Name].\n\n**Salesperson**: Hi [Customer's Name], I hope you are well. I am calling because I wanted to share some exciting news with you about our newest product that can benefit your business.\n\n**Customer**: Okay, I am listening.\n\n**Salesperson**: Great! Our company has recently launched a new software that can improve the efficiency of your sales team by 30%, allowing them to close more deals, and in turn, increasing your revenue. It is easy to use, and our team can provide training to ensure a seamless transition.\n\n**Customer**: That sounds interesting. Can you tell me more about it?\n\n**Salesperson**: Of course! What would you like to know?\n\n**Customer**: Well, how is it different from other sales software in the market?\n\n**Salesperson**: Great question! Our software integrates with your existing CRM and uses AI technology to provide real-time suggestions to your sales team on how to engage with potential customers. Additionally, it has a feature that automates follow-ups, freeing up time for your team to focus on closing deals. Plus, our customer support is available 24/7 to help you with any questions or issues that may arise.\n\n**Customer**: That sounds very helpful. How much does it cost?\n\n**Salesperson**: Our pricing is very competitive and depends on the size of your sales team. For a team of up to 10 people, it would cost [Amount] per month. But, since you are an existing customer, I can offer you a 20% discount on the first six months.\n\n**Customer**: That sounds like a good deal. Let me discuss it with my team, and I will get back to you.\n\n**Salesperson**: Absolutely! I completely understand. Would it be helpful if I send you an email with more information about the software and the pricing, so you can share it with your team?\n\n**Customer**: Yes, that would be great.\n\n**Salesperson**: Perfect! I will send that right over. If you have any further questions, please do not hesitate to reach out to me. Thank you for your time, and have a great day!\n\n**Customer**: Thank you, you too.\n\n**Salesperson**: Goodbye [Customer's Name]!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite this sentence without changing the meaning:\n\"Online learning has been proven to be an efficient way of getting an education.\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Getting an education through online learning has been demonstrated to be effective.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a password of 8 characters."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a randomly generated 8-character password: rT8$jLpZ.\n\nPlease note that it is advisable to use a unique password and not share it or reuse it across multiple accounts for the sake of security."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name some festivals celebrated in India"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "India is a country with diverse cultures and traditions, which leads to the celebration of many festivals throughout the year. Here is a list of some popular festivals celebrated in India:\n\n1. Diwali: Also known as the \"Festival of Lights,\" Diwali is one of the most popular Hindu festivals that celebrates the victory of light over darkness, good over evil, and knowledge over ignorance.\n\n2. Holi: Known as the \"Festival of Colors,\" Holi is a vibrant Hindu festival that celebrates the arrival of spring and the triumph of good over evil.\n\n3. Navratri: A nine-night festival that celebrates the victory of good over evil and is dedicated to the worship of the Hindu deity Durga.\n\n4. Ganesh Chaturthi: A 10-day festival that honors the birth of Lord Ganesha, the God of wisdom, knowledge, and new beginnings.\n\n5. Dussehra: Also known as Vijayadashami, Dussehra is a Hindu festival that celebrates the victory of good over evil, marking the defeat of the demon king Ravana by Lord Rama.\n\n6. Raksha Bandhan: A Hindu festival that celebrates the bond between siblings, where sisters tie a sacred thread (Rakhi) on their brothers' wrist to symbolize their love and protection.\n\n7. Eid al-Fitr: A Muslim festival that marks the end of Ramadan, the month of fasting, and is celebrated with feasts and the exchange of gifts.\n\n8. Christmas: Celebrated by the Christian community in India, Christmas commemorates the birth of Jesus Christ and is observed with church services, gift-giving, and feasting.\n\n9. Pongal: A four-day harvest festival celebrated mainly in Tamil Nadu, where people give thanks to the Sun God for a bountiful harvest.\n\n10. Onam: A ten-day harvest festival celebrated in Kerala that marks the homecoming of the mythical King Mahabali and is observed with boat races, flower decorations, and traditional feasts."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Imagine you are a tourist in Rome, what would be your favorite sights?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI assistant, I do not possess personal preferences, but I can certainly offer some suggestions about popular tourist attractions in Rome. Some of the top sights include the Colosseum, the Roman Forum, the Pantheon, the Vatican Museums, St. Peter's Basilica, the Trevi Fountain, the Spanish Steps, and Piazza Navona. These are just a few iconic examples of the rich culture, history and architecture that Rome has to offer!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the comparison operator in programming."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A comparison operator, also known as a relational operator, is a symbol that is used in programming languages to compare two variables or values for equality, inequality, or ordering. These operators are commonly used in conditional statements and loops to make decisions based on the result of the comparison.\n\nThere are several types of comparison operators, including:\n\n* `==` (equals) - checks if the values of two operands are equal.\n* `!=` (not equal) - checks if the values of two operands are not equal.\n* `<` (less than) - checks if the value of the left operand is less than the value of the right operand.\n* `>` (greater than) - checks if the value of the left operand is greater than the value of the right operand.\n* `<=` (less than or equal to) - checks if the value of the left operand is less than or equal to the value of the right operand.\n* `>=` (greater than or equal to) - checks if the value of the left operand is greater than or equal to the value of the right operand.\n\nWhen these operators are used, the result of the comparison is a Boolean value (True or False), indicating whether the comparison is true or false. The exact syntax and behavior of these operators may vary depending on the programming language being used."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a vision for the future of education in one sentence."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The future of education will be personalized, technology-driven, flexible, and inclusive, with a focus on life-long learning, creativity, critical thinking, and practical skills."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a poem about drug addictions."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In the grip of the needle's prick,\nA moment's high, a fleeting kick,\nA descent into a bottomless pit,\nWith every hit, a soul submits.\n\nTo the demon of addiction bound,\nLost in a haze, drifting adrift, \nThe world outside seems out of sound,\nAs the drug, takes hold, a fatal gift.\n\nIn the shadows, the addict hides,\nBroken, alone, consumed by need,\nA hunger that never subsides,\nA vicious cycle, doomed to repeat.\n\nBut hope still lingers in the air,\nA flicker of light, a chance to heal,\nTo break the chains, to start to repair,\nThe wounds that fester, begin to seal.\n\nRecovery is a winding road,\nA journey of courage and strength,\nBut with each step, the burden's unload,\nAnd the addict finds hope at length.\n\nA life reclaimed, a soul renewed,\nThe grip of addiction finally released,\nNo longer chained, no longer subdued,\nA chance to live, a heart at peace.\n\nDrug addiction is a cruel beast,\nBut those who suffer are not alone,\nWith love and support, they can defeat,\nThe demon that once called them its own."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a marketing plan for a new clothing line for kids."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A marketing plan for a new clothing line for kids could include the following key elements:\n\n1. **Target audience**: The target audience for the new clothing line for kids would be parents with children between the ages of 2-12 years, as well as grandparents, aunts and uncles, and other caregivers. It is important to understand the buying habits and preferences of this audience and tailor the marketing campaign accordingly.\n\n2. **Product differentiation**: The new clothing line for kids should be differentiated from other existing brands through a unique selling proposition, such as quality, affordability, versatility, or trendy designs. Highlighting these features in the marketing campaign can help attract the target audience.\n\n3. **Omni-channel presence**: In today's digital age, it is important to have a presence on as many channels as possible. The new clothing line for kids should be available through a website, social media, and in retail stores. The marketing campaign should be present on all these channels to reach the target audience.\n\n4. **Influencer & brand ambassadors**: Working with relevant influencers and brand ambassadors can be a great way to create buzz around the new clothing line for kids. These influencers can showcase the clothing on their social media channels, and a brand ambassador could attend events or photo-shoots to promote the brand.\n\n5. **Promotions and discounts**: Offering promotional discounts can help to incentivize purchases for new customers. Creating a loyalty program can also help to retain customers by offering exclusive promotions and perks.\n\n6. **Engaging content**: Creating engaging content, such as a blog or social media posts, can help to educate the target audience about the new clothing line for kids, and highlight the unique features and benefits of the products.\n\nIn summary, a successful marketing plan for the new clothing line for kids should target the intended audience, differentiate the product from competitors, have an omni-channel presence, work with influencers and brand ambassadors, offer enticing promotions and discounts, and create engaging content. By following these guidelines, the new clothing line for kids can attract and retain customers and achieve its business goals."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine the volume of a cube with sides 4 cm long.4 cm"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The volume of a cube with sides 4 cm long is 4 x 4 x 4 = 64 cubic centimeters."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the main difference between the methods of knowledge sharing through conversation and through writing?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The main difference between knowledge sharing through conversation and through writing is the mode of communication and the level of interaction between the sharer and the recipient. In conversation, knowledge is shared through spoken language, and it involves a live, real-time exchange between two or more people. This method allows for immediate feedback, clarifications, and the ability to adapt to the recipient's understanding. It also allows for the use of nonverbal cues such as body language, tone of voice, and facial expressions to convey information and nuance.\n\nIn contrast, knowledge sharing through writing involves the creation of a permanent record of information, typically in the form of text, that can be read and understood by others. Writing allows for more structured and organized presentation of information, editing and refining ideas, and the ability to reach a larger audience over time. However, it lacks the real-time interaction and adaptability of conversation and doesn't have nonverbal cues to aid in communication.\n\nBoth methods of knowledge sharing have their strengths and weaknesses, and choosing which method to use depends on the specific situation and the goals of sharing knowledge."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of unexpected food pairings."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Chocolate-covered bacon\n2. French fries with ice cream\n3. Watermelon and feta cheese\n4. Spaghetti with a peanut butter sauce\n5. Pineapple on pizza\n6. Popcorn with hot sauce\n7. Grilled cheese sandwich with raspberry jam\n8. Fried chicken with waffles\n9. Burger with a doughnut bun\n10. Salt and vinegar flavored chips with Nutella"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List three methods to calculate the sum of the elements of an array."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. **Using a loop:** One of the most common methods to calculate the sum of the elements in an array is by using a loop to iterate through each element of the array and add them to a sum variable. For example, in a for loop, the counter variable can be used to access each element of the array one by one, and then add it to the sum variable. After looping through all the elements, the sum variable will contain the total sum of all the elements in the array.\n\n2. **Using the reduce method:** Another method to calculate the sum of the elements of an array is by using the reduce method. This method applies a function against an accumulator and each element in the array, from left to right, to reduce it to a single value. The callback function passed to the reduce method takes two arguments - the accumulator and the current value. The accumulator stores the accumulated value previously returned in the last invocation of the callback, or the initial value if supplied. The current value is the current element being processed in the array. The callback function simply adds the accumulator and the current value and returns the result. After processing all the elements in the array, the reduce method returns the accumulated result, which is the sum of all the elements in the array.\n\n3. **Using the sum method:** Certain programming languages (such as Python) have a built-in sum method that can be used to calculate the sum of the elements of an array. The sum method takes an iterable (such as an array) as an argument and returns the sum of all the elements in the array. It is a simple and convenient way to calculate the sum of the elements of an array without having to write any additional code or use any loops."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Define the term 'computational linguistics'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Computational linguistics is an interdisciplinary field that focuses on the use of mathematical models, algorithms, and computer systems to analyze, understand, and generate human language. This field combines methods from computer science, linguistics, cognitive science, and other related disciplines to create tools and resources for language processing, such as speech recognition systems, natural language generation, language translation, and language understanding. Computational linguistics is used in areas such as artificial intelligence and natural language processing, and it has many applications, including virtual assistants, language teaching, and language retrieval."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a tweet of maximum 280 characters based on the following speech.President Obama said, \"Change will not come if we wait for some other person or if we wait for some other time. We are the ones we've been waiting for. We are the change that we seek.\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Change starts with us, not some other person or time, as President Obama once said - We are the ones we've been waiting for. We are the change that we seek. #change #motivation\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a chart outlining the world's population from 2000-2015."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Year | Population (in billions)\n---------|------------------------------\n2000 | 6.12\n2001 | 6.22\n2002 | 6.32\n2003 | 6.42\n2004 | 6.52\n2005 | 6.62\n2006 | 6.72\n2007 | 6.82\n2008 | 6.92\n2009 | 6.99\n2010 | 7.01\n2011 | 7.05\n2012 | 7.09\n2013 | 7.13\n2014 | 7.17\n2015 | 7.21\n\nNote: The population data is rounded off and approximate."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize this piece of information:Dogs are mammals and they are related to wolves."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Dogs are mammals related to wolves."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Output the average temperature for a given city in a given month.City = London\nMonth = March"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am sorry, but as an AI language model, I do not have the data or access to real-time information to determine the average temperature for London in March. However, according to historical weather data, London's average temperature in March typically falls around 8-10°C (46-50°F)."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Convert a temperature in Celsius to Fahrenheit given the temperature in Celsius.25°C"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "77.0°F"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How could you improve the design of the following website?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm an AI assistant, so I am unable to view the website you are referring to. However, here are some general tips for improving website design:\n\n1. Ensure that the layout is visually appealing and consistent throughout the site.\n2. Make sure that the site is easy to navigate with a clear menu and well-organized content.\n3. Use high-quality images and graphics to add visual interest.\n4. Optimize the site for mobile devices to improve the user experience for visitors accessing the site on their phones or tablets.\n5. Improve the page loading speed to reduce bounce rates and keep visitors engaged.\n6. Use white space effectively to create a clean and uncluttered look.\n7. Make sure the typography is legible and readable, with a good balance between text and other elements.\n8. Include calls-to-action throughout the site to encourage user engagement.\n9. Regularly update and refresh the content to keep it relevant and interesting to visitors.\n10. Test the site with real users to identify any usability issues and make necessary improvements."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List 4 ways to reduce plastic waste."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Use reusable bags, bottles, and containers: Instead of using single-use plastic bags or bottles, bring your own reusable bags, water bottles, and containers when shopping or dining out.\n\n2. Say no to plastic straws and cutlery: Instead of using plastic straws and cutlery, opt for reusable or biodegradable alternatives. Many places now offer paper, bamboo or metal straws, and cutlery made from sustainable materials.\n\n3. Buy in bulk: Buying in bulk means you can reduce the amount of plastic packaging that comes with buying products individually. Bring your own containers or cloth bags to the store to store and transport the products.\n\n4. Recycle properly: Make an effort to recycle plastic waste properly. Check with your local recycling program to see what types of plastic they accept and follow the guidelines. Recycling helps to reduce the amount of plastic waste that ends up in landfills and the ocean."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide the past tense of the verbSing"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The past tense of the verb \"sing\" is \"sang\"."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify 5 online resources for improving public speaking."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Toastmasters International (toastmasters.org): Toastmasters is a non-profit educational organization that teaches public speaking and leadership skills through a worldwide network of clubs.\n\n2. TED Talks (ted.com): TED Talks is a popular platform for sharing ideas and spreading knowledge through short, powerful speeches.\n\n3. Coursera (coursera.org): Coursera offers a wide range of online courses on public speaking, including \"Introduction to Public Speaking\" and \"Dynamic Public Speaking.\"\n\n4. The Public Speaking Project (publicspeakingproject.org): The Public Speaking Project is an online resource for public speaking, offering a wealth of information on the subject, including tips, techniques and strategies for effective public speaking.\n\n5. SpeakMeister (speakmeister.com): SpeakMeister is an online platform that offers public speaking coaching, training and support, including video tutorials, webinars and one-on-one coaching."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Detect if there is any profanity in the sentence.Oh golly shucks, that's awful"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There is no profanity in the given sentence."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a divider line on the page using CSS."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To create a divider line on a page using CSS, we can use the `<hr>` HTML tag along with some styling. Here's an example:\n\nCSS:\n```css\n.hr-style {\n border: 0;\n height: 1px;\n background-image: linear-gradient(to right, rgba(0, 0, 0, 0), rgba(0, 0, 0, 0.75), rgba(0, 0, 0, 0));\n}\n```\n\nHTML:\n```html\n<hr class=\"hr-style\">\n```\n\nIn this example, we create a new class called \"hr-style\" and apply it to the `<hr>` tag in the HTML. Within the CSS, we style the class to remove the default border, set the height to 1px, and create a gradient background image for a more modern look. You can adjust the styling as desired to create the divider line that best suits your needs."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate an analysis of the book To Kill a Mockingbird"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To Kill a Mockingbird is a novel written by Harper Lee and published in 1960. It is set in the fictional town of Maycomb, Alabama during the Great Depression, and follows the story of a young girl named Scout, her older brother Jem, and their experiences during that time.\n\nThe novel is a coming-of-age story that explores themes of prejudice, racism, courage, and empathy. It is told from Scout's point of view and is both a reflection of her growth and a depiction of the societal issues that existed in the South at that time. The plot centers on the trial of a black man, Tom Robinson, who is accused of raping a white woman. Scout's father, Atticus Finch, is a lawyer who decides to defend Tom in court. This decision is met with anger and resentment from the white community.\n\nOne of the most powerful themes in the book is racism, and the idea of judging people based on the color of their skin. Tom Robinson's trial is a clear example of this, as the all-white jury finds him guilty despite strong evidence to the contrary. The novel also explores the idea of prejudice in the form of classism and sexism, as characters are judged based on their social status and gender.\n\nAnother important theme in the book is courage, as exemplified by Atticus Finch. He stands up for Tom Robinson even though it is unpopular and puts his life at risk. He tells Scout that courage is \"when you know you're licked before you begin, but you begin anyway and you see it through no matter what.\" This message is an important one for both Scout and the reader.\n\nEmpathy is also a major theme in the book, as Atticus tries to teach his children to see the world from other people's perspectives. He says, \"You never really understand a person until you consider things from his point of view... until you climb inside of his skin and walk around in it.\" This idea is reinforced throughout the book, as characters learn to see beyond their prejudices and understand the experiences of others.\n\nIn summary, To Kill a Mockingbird is a powerful exploration of themes that are still relevant today. It is a thought-provoking novel that encourages readers to question their own beliefs and to challenge injustice wherever they see it."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the difference between the two typesAerobic and anaerobic exercise"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Aerobic and anaerobic exercise are two types of exercises that work differently on the body. The main difference between these types of exercises is the way they generate energy for the activity.\n\nAerobic exercise involves continuous and rhythmic physical activity, such as jogging, swimming or cycling, that uses oxygen to generate energy. This type of exercise is also known as cardio exercise as it improves cardiovascular health, by increasing heart and respiratory rates to pump oxygen to the muscles.\n\nOn the other hand, anaerobic exercise is a type of intense physical activity that doesn't rely on oxygen to generate energy, but on the stored energy in muscles. These exercises, such as weightlifting, sprinting or high-intensity interval training, are short in duration and require quick bursts of energy. Anaerobic exercise helps to improve muscle strength and power.\n\nIn summary, aerobic exercise uses oxygen to generate energy, and is typically a low to moderate intensity and longer duration activity, while anaerobic exercise doesn't require oxygen to produce energy, is high intensity and shorter duration, and helps to improve muscle strength and power."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the relationship between two people in the following sentence:\nJohn and his brother are walking together."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "John and the other person mentioned in the sentence have a familial relationship. The other person is John's brother, meaning they share at least one parent and are siblings. They are currently involved in a joint activity, walking together, which suggests that they may be on friendly or at least cooperative terms."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a hypothetical phone number."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "(555) 123-4567"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Which sentence is more polite - write the more polite version.Hey! Can you pass that to me?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Excuse me, could you please pass that to me?\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Select the incorrect definition for the word \"timid\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A timid person is someone who is shy and not confident in social situations. \nThey tend to be hesitant when it comes to meeting new people or trying new things, and may feel nervous or uncomfortable when faced with unfamiliar situations. So the incorrect definition for the word \"timid\" is: a person who is courageous and bold in social situations."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare the first two phrases to find out the differences between them.Earth rotates around the sun and moon rotates around the earth."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The first phrase, \"Earth rotates around the sun,\" describes how the Earth orbits or revolves around the Sun, which is the center of our solar system. This orbit, together with the tilt of Earth's axis, gives us our yearly cycle of seasons, while our daily cycle of day and night comes from Earth spinning on its own axis.\n\nIn contrast, the second phrase, \"Moon rotates around the Earth,\" describes the Moon's orbit around the Earth. This orbit is responsible for the Moon's phases and helps to control the tides in Earth's oceans."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe what a character is feelingCharacter: Peter"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI, I do not have the ability to describe what any particular person or character is feeling without being provided additional information. Emotions and feelings are complex and unique to each individual, and without context or further details, it is not possible for me to accurately describe what Peter may be feeling at any given moment."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Calculate the following: (2 * 3 + 4) / 5"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Sure! Applying the order of operations, I can solve it like this: \n\nFirst, solve the multiplication inside the parenthesis:\n(2 * 3 + 4) / 5 = (6 + 4) / 5 \n\nThen, solve the addition inside the parenthesis:\n(6 + 4) / 5 = 10 / 5\n\nThus, the final result is:\n10 / 5 = 2"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reorganize the words to form a complete sentence.\n\nuniversity/enroll/parents/urged/their/to/children"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Parents urged to enroll their children to university."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a short story of 50-100 words."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time in a small village lived a kind and generous farmer named John. He was known for always lending a hand to those in need, even when he himself didn't have much. One autumn, a great drought hit the land, and everyone's crops had withered away. However, when John harvested his fields, he found that his crops were still flourishing. It was a miracle. John knew what he had to do. He shared his entire harvest with the village, saving them from starvation. From then on, John was known as the hero of the village."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "State the equation for calculating the area of a rectangle."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The equation for calculating the area of a rectangle is: Area = Length × Width, where Length is the measurement of the longer side of the rectangle and Width is the measurement of the shorter side of the rectangle."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Pick a data set and create a short blog post about the data setData Set: Global Climate Change Data"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Global climate change is an issue of growing concern worldwide, and over the years, more research has been conducted to gather data and insights into the topic. In this blog post, we will discuss and analyze the Global Climate Change Data set, which presents global temperature trends and its effects on weather patterns, sea levels, and ecosystem changes.\n\nThe data set is composed of records and measurements taken by different agencies such as NASA, NOAA, and the UK Met Office, and it has been collected over the past century. The data includes temperature records from land surface, atmosphere, and oceans.\n\nAnalyzing the data, it's clear that the long-term trend of rising global temperatures is undeniable. The data set shows that the global temperature has been steadily increasing at an average rate of 0.17°C per decade since the late 19th century.\n\nOne of the most significant impacts of this global temperature rise is the increase in sea levels. As the polar ice caps melt, sea levels are gradually rising, and this data set also indicates that the average sea level rise over the past century has been around 1.8mm per year.\n\nThe Global Climate Change Data set can also help us understand the impacts of global warming on ecosystem changes. Ocean acidification, coral bleaching, and changes in precipitation patterns all have a direct effect on the biodiversity of our planet.\n\nIn conclusion, the Global Climate Change Data set provides us with valuable insights into the pressing issue of climate change. The data is clear: global temperatures are rising, and this is having a significant impact on our planet. It's important that this data is made readily available to the public, as it can help people better understand the importance of taking action to combat climate change."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Improve the following headline:New Launch of Online Banking"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Revolutionary Online Banking Platform Now Launched!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine the half-life of the given element.Element: Plutonium-239"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The half-life of plutonium-239 (Pu-239) is approximately 24,110 years."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reverse engineer this sentence: \"People usually wear clothes.\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To reverse engineer the sentence \"People usually wear clothes,\" we need to break it down and analyze its individual components and their functions within the sentence.\n\nFirst, we can identify the parts of speech: \n- \"People\" is a plural noun, functioning as the subject of the sentence.\n- \"Usually\" is an adverb describing the frequency with which the action is performed.\n- \"Wear\" is a transitive verb, indicating the action that the subject is performing.\n- \"Clothes\" is a plural noun, functioning as the direct object of the verb \"wear.\"\n\nNext, let's look at the sentence structure:\n- The sentence is a simple declarative sentence, with a subject-verb-object word order.\n\nWe can also analyze the meaning of the sentence:\n- The sentence conveys the idea that it is common for individuals to put on garments or attire. The use of the adverb \"usually\" implies that this is the norm or expectation.\n\nIn summary, to reverse engineer the sentence \"People usually wear clothes,\" we examined its components (parts of speech, sentence structure) and analyzed its meaning."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the flavor of strawberries."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Strawberries have a unique, juicy flavor profile that is sweet, slightly tart, and refreshing. A ripe strawberry will be bursting with a fruity aroma and taste and will have a natural sweetness that pairs perfectly with its tangy undertones. The higher the ripeness of the fruit, the more sweetness it will have. Of course, each strawberry can be slightly different, some being sweeter or more tart than others, but overall, the flavor of strawberries is a perfect balance between sweet and tart that makes them a popular choice for all ages, especially during the summer season."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a survey question to measure customer satisfaction"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "On a scale of 1-10, how satisfied were you with the products or services you received from our company?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compose a Haiku in response to the given input.Leaves falling from the trees."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Leaves twirl and glide,\nDrifting gently to the ground;\nAutumn's dance begins."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Think of a new way to use a plastic bottle"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One innovative way to use a plastic bottle is to turn it into a hydroponic planter. By cutting the bottom half of the bottle, filling it with water, and adding a suitable nutrient solution, you can use the top half of the bottle as a planting container. Simply invert it, place it in the bottom half, and secure it in place. You can then add a small net cup, fill it with growing medium, and plant your favorite herb or vegetable. The roots of the plant will grow down into the water, absorbing the nutrients they need to grow, while the top half of the bottle acts as a greenhouse, trapping in moisture and heat to create a favorable growing environment. This is an easy and eco-friendly way to grow fresh produce right in your own home, while also repurposing a common household item that might otherwise end up in a landfill."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Which country is the most avant-garde?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "It's difficult to determine one specific country that is considered the most avant-garde as creative expression and cultural movements can vary widely across the world. However, some countries that have a rich history of avant-garde art, fashion, and architecture are France, Germany, Japan, Italy, and the United States. Additionally, cities like New York, London, Paris, Tokyo, and Berlin are often considered hubs for avant-garde art and culture. However, it's important to note that avant-garde movements can emerge from anywhere and can be influenced by many factors like socio-political environments, historical events, and cultural trends."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe how to cook the following dish.Poached eggs"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Poaching eggs is a simple and healthy way to cook eggs. Here is a step-by-step guide on how to make perfect poached eggs:\n\nIngredients:\n- Eggs\n- Water\n- Vinegar (optional)\n- Salt and pepper (optional)\n\nInstructions:\n\n1. Fill a medium-sized saucepan about two-thirds full with water and bring it to a simmer over medium heat. \n2. Once the water is simmering, you can add a tablespoon of vinegar to the water. This is optional, but it can help the egg whites hold their shape better.\n3. Crack one egg into a small bowl or cup. This makes it easier to slide the egg gently into the water. \n4. Use a spoon to create a gentle whirlpool in the water. This will help the egg white wrap around the yolk.\n5. Slowly pour the egg into the center of the whirlpool.\n6. Let the egg cook for about 3-4 minutes, depending on how runny you like the yolk. \n7. Use a slotted spoon to remove the egg from the water and gently pat it dry with a paper towel.\n8. Season with salt and pepper to taste, if desired. \n\nNote: You can poach multiple eggs at the same time, just make sure to not overcrowd the saucepan. Repeat steps 3-7 for each egg. \n\nServe your poached eggs on toast, with avocado, or as part of a larger breakfast spread. Enjoy!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Add two elements to make this sentence humorous.He was about to give a speech"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "He was about to give a speech on silence, but forgot all his words."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Devise a plan to help children engage with online learning."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To help children engage with online learning, you can follow these steps:\n\n1. Create a positive environment for learning: Set up a comfortable, organized, and dedicated space for children's online education, separated from any distractions. \n\n2. Establish a routine: Children respond better when they follow a fixed schedule. Set up a routine for online learning that includes designated times for instruction, independent study, breaks, and physical activity.\n\n3. Encourage participation: Interactive lessons that prompt children to think and respond can enhance engagement, so encourage the use of chat or other collaborative tools that allow children to participate in online discussions.\n\n4. Use multimedia resources: Integrating videos, pictures, diagrams or other interactive tools into the lessons can make them more interesting, visually stimulating and memorable for children.\n\n5. Provide incentives: Children are more likely to engage in online learning when they know their efforts are recognized. Set up a reward system that acknowledges their participation and achievement. \n\n6. Ensure personalized attention: Not all children learn the same way or at the same pace. To keep them engaged, it's important to provide them with personalized attention by giving individualized feedback, and addressing any difficulties they may have. \n\n7. Maintain social interaction: Arrange virtual group activities and encourage children to interact with their peers outside of lesson time. Maintaining friendships and social connections can help children feel more connected and engaged in their online learning experience.\n\n8. Be flexible: Finally, be open to making changes to the approach to online learning to make it more effective for children. Education is a dynamic process, so adapting to the needs of the moment can enhance children's engagement."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Step-by-step instructions to change your computer's wallpaper."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's a step-by-step guide to change the wallpaper on your computer:\n\nFor Windows:\n1. Click on the Start button in the bottom left corner of your screen.\n2. Click on the \"Settings\" icon which looks like a gear.\n3. In the \"Settings\" menu, click on \"Personalization.\"\n4. On the left side of the Personalization window, click on \"Background.\"\n5. Under the \"Background\" drop-down menu, select \"Picture.\"\n6. Click on the \"Browse\" button and choose a photo from your computer. You can also choose from one of the pre-installed images by clicking on a thumbnail.\n7. Once you've chosen the picture, adjust the \"Choose a fit\" drop-down menu to choose how you'd like the image to fit on your screen.\n8. The changes will be saved automatically and you should see a preview of your new wallpaper.\n\nFor macOS:\n1. Click on the Apple logo in the top left corner of your screen.\n2. Select \"System Preferences\" from the drop-down menu.\n3. Click on \"Desktop & Screen Saver.\"\n4. Choose the \"Desktop\" tab.\n5. On the left side of the window, you'll see a list of folders that contain images. Click on a folder to expand it and see the images inside.\n6. Click on an image to select it as your wallpaper. You can also use your own images by clicking the \"+\" button at the bottom of the folder list and choosing a folder that contains your images.\n7. You can choose how the image will fit on your desktop using the options available in the drop-down menu.\n8. Close the Preferences window to save your changes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the physical appearance of a character, including their clothing and accessories.Gerald, a rebellious teenager"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Gerald is a teenager with sharp features and an air of rebellion emanating from him. His hair is black, styled in a messy, unkempt way, falling over his forehead in wild strands. He has piercing blue eyes that seem to challenge anyone who would try to control him. In terms of clothing, he is wearing a black leather jacket that is adorned with badges and patches. Underneath the jacket, he has on a faded band t-shirt and a pair of ripped skinny jeans. He completes his outfit with a pair of black, high-top sneakers. As for accessories, he has multiple piercings, including a septum piercing and an eyebrow piercing. He also wears a few leather bracelets on his wrist, some of which have metal studs. All in all, Gerald exudes an air of cool nonchalance, as he goes against societal norms and expresses himself through his unique fashion style."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a narrative of a person living in a dystopian world."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, there was a person named John, who lived in a world where everything was controlled by the government. The sun rose and shone brightly, but it seemed as though the light never managed to penetrate the darkness that existed in the hearts of the people. This was a dystopian world where freedom, joy, and individuality were nothing but mere dreams.\n\nJohn woke up every morning to the sound of sirens and announcements blaring from the speakers outside. He got dressed in his drab grey uniform and made his way to work. John worked at a factory, producing goods that would be distributed to the citizens by the government. There were no personal businesses, no entrepreneurs. Everything was controlled and regulated.\n\nLife was monotonous and bleak. There were no forms of entertainment, no books, no music, and no art. There was nothing to look forward to, nothing to hope for. The government fed the people just enough to keep them alive, but there was never enough to feel satisfied. Life was about surviving, not living.\n\nJohn often wondered what life was like before everything changed. He had heard whispers of a time when people were free, when they could travel and explore, express themselves, and spend time with their loved ones. The government had taken all of that away, claiming it was for the greater good, that it would prevent chaos and disorder.\n\nHowever, John couldn't help but feel that something was missing. That there had to be more to life than this. He longed for a sense of purpose, for something to believe in. John prayed for the day when things would change, when the people would rise up and take back their freedom.\n\nBut until then, John lived his life in this dystopian world, just like everyone else, hoping for a brighter future, but not knowing if it would ever come."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a product review with the following informationProduct: Headphones\nRating: 4.5/5"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I recently purchased these headphones and I have to say that I am thoroughly impressed. They have exceeded all of my expectations, and I would definitely give them a rating of 4.5 out of 5 stars. The audio quality is outstanding, providing a crisp and clear listening experience. The headphones are also incredibly comfortable to wear, even for extended periods of time. Overall, I am extremely satisfied with my purchase and would highly recommend these headphones to anyone in the market for a new pair."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a microblog post of 50 words that talks about the benefits of having a puppy."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Nothing beats a puppy's fluffy love! Apart from the constant affection, having a puppy can decrease stress, improve mental health and increase happiness. They make perfect workout companions too! Enjoy a healthy lifestyle and lifetime friendship with your little buddy. Get yourself a puppy and experience love like no other. #PuppyLove"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a reinterpretation of the following painting.Painting: The Mona Lisa, by Leonardo Da Vinci"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI, I am not capable of creating a painting from scratch, however, here is a written reinterpretation of \"The Mona Lisa\" by Leonardo da Vinci: \n\nIn this reimagined version, the enigmatic woman sits before an abstract background of bold colors and swirling patterns. Her hands are clasped together in a peaceful gesture as she gazes out towards the viewer with a serene expression. Her raven hair is adorned with vibrant flowers and her lips curve into a subtle smile, hinting at a secret known only to her. The garments she wears are in vivid hues, draped elegantly over her form, adding to the ethereal atmosphere of the piece. This reinterpretation of the classic painting brings a modern, lively twist, while still retaining the essence of the original."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Are self-driving cars legal in the UK?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Yes, self-driving cars are legal in the UK, with the UK government being supportive of the development and deployment of autonomous vehicles. However, there are a number of regulations in place that manufacturers and users must comply with, such as ensuring that autonomous vehicles meet rigorous safety standards, and the requirement for a human driver to be present in the vehicle at all times, ready to take over control if necessary. Additionally, the UK government has announced its intention to change regulations and the Highway Code to allow the safe use of self-driving vehicles on British roads in 2021."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide advice for a new student about joining a school club."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Joining a school club is a great way to meet new people, pursue your interests, and develop new skills. Here are some tips for finding and joining a club that's right for you: \n\n1. Research your options: Start by looking into the different clubs your school offers. This may be listed on the school's website, in the student handbook, or you could ask your teachers or counselors for a list of available clubs. Consider your interests and passions, and find a club that matches them.\n\n2. Attend an interest meeting: Most clubs have interest meetings at the beginning of the school year, where you can learn more about the club, its goals, and its activities. Make a point to attend these meetings and ask questions.\n\n3. Take the initiative: If you can't find a club that interests you, consider starting your own club. Talk to a teacher or counselor for assistance in getting started.\n\n4. Be active and participate: Once you join a club, it's important to actively participate in its meetings and events. This could involve attending meetings regularly, helping to plan events, or taking on a leadership role. Being an active member will help you get the most out of your club experience.\n\n5. Have fun: Lastly, remember to have fun! Joining a club should be an enjoyable experience that allows you to explore your interests and make new friends. Don't be afraid to step out of your comfort zone and try something new."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a 5-step to-do list for tomorrow."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Wake up at 6:30 am, and start the day with a refreshing shower and a healthy breakfast.\n2. Take a 20-minute walk at 7:30 am to kickstart your morning and boost your energy levels for the day.\n3. From 8:30 am to 12:30 pm, focus on completing the most important work tasks of the day.\n4. From 12:30 pm to 1:30 pm, take a lunch break and enjoy a nutritious meal while giving your mind a break.\n5. At 2 pm, make sure to have completed your high-priority tasks and then schedule errands or appointments for the rest of the afternoon until 6:00 pm."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Which year was the first Super Bowl?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The first Super Bowl was on January 15, 1967."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a sentence using both the words \"task\" and \"dynamic\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's a sentence for you: \"She efficiently navigated through her demanding job by prioritizing her task list and adapting to the dynamic changes in her work environment.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a scenario, come up with a solution.Scenario:\nYou are a high school student struggling to balance school and extracurricular activities."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There are several steps you can take to manage your time and balance school work with extracurricular activities:\n\n1. Make a schedule: Start by mapping out your week, including school hours, extracurriculars, and any other commitments. This will help you see where you have free time and allow you to plan your study sessions accordingly.\n\n2. Prioritize: With limited time, it's important to prioritize assignments and activities. Decide which tasks are most urgent and tackle those first. This will also help you focus on completing one thing at a time.\n\n3. Communicate: If you find that your schedule is too hectic, communicate with your teachers, coach or club advisor. They may be able to give you extensions on assignments or allow you to miss a practice if you need to catch up on schoolwork.\n\n4. Take breaks: It's important to take breaks and relax, even if it's just for a few minutes. Taking a break can help increase productivity and decrease stress.\n\n5. Get enough rest: Don't sacrifice sleep to squeeze in more study time. Sleep is important for focus, memory, and overall well-being.\n\nBy following these steps, you can balance your school work with extracurricular activities, and still have time to relax and enjoy your high school experience."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Come up with an algorithm for predicting house prices in a given city."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An algorithm for predicting house prices in a given city can be designed as follows:\n\n1. **Data Collection:** The first step is to gather data about house prices and their features in the given city. The data should include a comprehensive list of home characteristics such as square footage, the number of bedrooms, bathrooms, location, age, etc., along with their respective selling prices. This data can be collected from various sources such as real estate websites, government records or data providing companies that specialize in real estate data.\n\n2. **Data Preprocessing:** After collecting the data, the next step is to clean and preprocess it. This step involves handling missing or invalid values, removing duplicates, converting text data into numerical values and normalizing the data. Preprocessing ensures that the data is in a format suitable for further analysis.\n\n3. **Feature Selection:** This step involves selecting the most important features that can affect the selling price of a house. Feature selection can be performed using techniques such as correlation analysis, mutual information, chi-square test, or by using machine learning algorithms such as decision trees or random forests. This step is important because it helps us to narrow down the factors that are most likely to impact the target variable (house price) and to remove any redundant or irrelevant features.\n\n4. **Model Selection:** Next, we need to choose an appropriate model to make predictions. This can be done by splitting the dataset into a training and test set, and then training different models on the training data. Some of the commonly used models for predicting house prices are linear regression, decision trees, random forests, gradient boosting, and neural networks. Each model has its own strengths and weaknesses, and the best model is the one that gives the lowest prediction error on the test data.\n\n5. **Model Training:** In this step, the selected model is trained on the entire dataset, and its parameters are tuned to get the best possible performance.\n\n6. **Making Predictions:** Once the model is trained, it can be used to make predictions on new data. The model will take in the characteristics of a house and generate an output that is an estimate of its selling price.\n\n7. **Model Evaluation:** Finally, the model's performance needs to be evaluated regularly to ensure that it is still providing accurate predictions. This can be done by comparing the predicted prices with the actual selling prices of houses that were not included in the training dataset. If the model's performance degrades over time, it may need to be retrained."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a persuasive argument for why students should take a break from their studies and get outside to explore nature on a regular basis."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Dear students, \n\nIn the midst of a hectic school schedule, it is easy to overlook the importance of taking breaks and spending time outdoors. However, it is crucial for your physical, mental, and emotional well-being to take a break from your studies and explore nature on a regular basis.\n\nFirst, spending time outdoors in nature has countless physical health benefits. Being outside and enjoying fresh air can help lower blood pressure, reduce inflammation, and boost the immune system. Whether it's a walk in the park or a hike in the mountains, physical activity in nature can also help control weight, improving overall physical fitness.\n\nSecond, nature has a powerful effect on mental health. Research has shown that spending time in nature can decrease stress, reduce symptoms of anxiety and depression, improve mood, and increase feelings of happiness and wellbeing. Moreover, being in nature can also improve cognitive function, including memory, focus and attention.\n\nFinally, exploring nature is a great way to disconnect from the digital world and reconnect with the present moment. Instead of being absorbed in screens, being outdoors allows students to engage their senses and take a break from the constant stream of notifications, messages, and emails. This break from digital distractions can help students feel more grounded and present, giving them a better sense of balance and well-being.\n\nIn conclusion, taking a break from studies to explore nature on a regular basis has numerous physical, mental, and emotional benefits. It is a great way to reduce stress, improve mood, and improve overall health and wellbeing. So, take a break, get outside, and enjoy all the beauty that nature has to offer."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the nuclear chain reaction."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A nuclear chain reaction is a self-sustaining sequence of nuclear reactions, in which the products of one reaction trigger further reactions, releasing huge amounts of energy. This process is called a chain reaction because each reaction produces multiple new reactions, creating a chain of events.\n\nThe most common nuclear chain reaction is the fission chain reaction. Nuclear fission is the process by which a large nucleus splits into two smaller ones, releasing energy and particles in the process. These particles, mostly neutrons, are then able to trigger further fission events when they collide with other large nuclei.\n\nOnce initiated, the fission chain reaction can be self-sustaining, provided that there is enough fissile material present. The rate of the reaction can be controlled using control rods, which absorb some of the neutrons and slow the reaction down.\n\nThe chain reaction can also be explosive, as it is in a nuclear bomb. In a bomb, the reaction is deliberately accelerated, releasing huge amounts of energy in a very short time. This is achieved by using a carefully designed configuration of fissile material and explosives.\n\nIn summary, the nuclear chain reaction is a self-sustaining sequence of nuclear reactions that can release huge amounts of energy when carefully controlled or accelerated."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Output a story involving a character who travels through time."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, there was a curious young scientist named Dr. Anna. She had always been fascinated by the concept of time travel and had spent most of her career researching the possibilities of creating a time machine. After years of hard work and many failed attempts, Dr. Anna finally succeeded in building a fully functional time machine.\n\nEcstatic about her discovery, she climbed into the machine, set the dials, and pushed the button to activate it. In an instant, she was transported back in time to the year 1850. As she stepped out of the time machine, she found herself in a small town bustling with people going about their daily lives.\n\nDr. Anna walked around the town, soaking in the sights and sounds of the past. She stopped at the town square and watched as a group of children played a game of tag. She smiled, thinking about how different life was back then. As she continued her journey through time, Dr. Anna visited many different eras, from the roaring twenties to the sixties, and even all the way back to ancient Egypt.\n\nIn each time period, she studied the customs, the culture, and the people. She learned about the wars, the revolutions, and the advancements that shaped the course of history. As she traveled through time, Dr. Anna realized that despite the many differences between the people of the past and the present, they all shared one common trait - the desire to lead a happy and fulfilling life.\n\nAfter many years of traveling through time, Dr. Anna returned to her own time. She had a newfound appreciation for the present and was grateful for the opportunity to have experienced history first hand. Dr. Anna wrote about her experiences in her journal and shared her findings with the world, inspiring others to explore the mysteries of time travel."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Sort the following list of names alphabetically.John, Amanda,Maria,James"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Amanda, James, John, Maria"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the premise of the Netflix show \"The Crown\" in one sentence."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The Crown is a historical drama that chronicles the reign of Queen Elizabeth II, as well as the major political and cultural events that transpired during her rule."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct a way to animate a particular object.Tree"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To animate a tree in a digital environment, you could use the following steps:\n\n\n1. Create or import a 3D model of a tree: The first step is to create or import a tree model in your animation software. The tree model should be designed with movable parts, such as leaves, branches, and trunk.\n2. Rig the model: Rigging is the process of creating a skeleton or control points within the model. This creates the framework for the movements of the tree, such as waving in the wind, bending branches, or swaying.\n3. Apply textures: Apply textures and colors to the tree model to make it look more realistic. This can include color gradients, bark texture, and leaf patterns.\n4. Animate the tree: Using keyframes, animations can be created to make the tree sway in the wind, lose its leaves, or grow. This can be accomplished by manipulating the control points and using digital wind or other forces to create natural-looking movements.\n5. Add sound effects and background: To make the animation more immersive, add sound effects such as rustling leaves or creaking branches as well as an appropriate background or environment.\n6. Render and export: Once the animation is complete, the final step is to render and export the animation to a video format."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest an argument for why a more diverse team may make better decisions."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A more diverse team brings a wider range of perspectives, experiences, and ideas to the table, facilitating more well-rounded and thoughtful decision-making. People from different backgrounds and life experiences often have unique insights and can challenge each other's assumptions, leading to more creative problem-solving and less groupthink. When team members understand and appreciate each other's diverse perspectives, they are better equipped to make informed and nuanced decisions that take into account multiple viewpoints. Further, a more diverse team is more likely to anticipate and address the needs of a broader customer or stakeholder base. Ultimately, a more diverse team is more likely to challenge the status quo and make better decisions that drive innovation and success."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Pick a common misconception about the space industry and explain why it is untrue."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One common misconception about the space industry is that it is solely for the benefit of a few wealthy people and large corporations to make even more money. However, this is untrue. The space industry provides many benefits to society, including Earth observation and monitoring, satellite communication, weather prediction, and navigation, all of which are essential for public safety and economic development. For example, GPS technology, which relies on satellites, provides location information that is used in everything from navigating in our cars to tracking ships at sea to farming. Additionally, through space exploration, we have gained a better understanding of our planet, our solar system, and the universe. These scientific discoveries can improve our lives in various ways, including developing new technologies and improving our knowledge of our origins and our environment. Furthermore, the space industry creates a large number of jobs, supports research and development, and fosters international cooperation. In summary, the space industry benefits society as a whole, not just a select few."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide a 50-word description of a hypothetical smartphone application"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Our new app, \"FitMind,\" is designed to improve mental wellness through daily exercises and meditation. The app features customized programs, mindfulness techniques, and guided meditations to reduce stress and improve focus. With daily reminders and progress tracking, FitMind helps users develop a healthier mindset and habits."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a user interface for a new web-based platform.Name of the platform:TravelDex"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "When designing a user interface for the new web-based platform “TravelDex”, here is my proposed layout:\n\n**Header**: The header would display the platform logo on the left side and the main menu on the right side. The main menu would include the options: Destinations, Activities, Accommodations, Flights, and Sign-in or Users profile icon.\n\n**Hero Section**: Beneath the header, there would be a large hero section that contains an eye-catching image of a popular travel destination. Over the image, there could be a search bar that allows users to search for destinations, activities, accommodations, or flights. \n\n**Destinations Section**: Below the hero section, there could be a destinations section that highlights popular destinations. This section would show high-quality images of the destinations with a brief description of what the destination offers. Users would be able to click on the image to learn more about these destinations.\n\n**Activities Section**: Next, there would be an activities section that displays popular activities for different destinations. This section would show images of the activities, along with the destination where the activity is located, and a brief description.\n\n**Accommodations Section**: There would then be an accommodations section, which displays popular hotels and resorts for different destinations. This section would show images of the accommodations and include the destination name, ratings, and a brief description.\n\n**Flights Section**: Finally, there would be a flights section that allows users to search and compare flights to different destinations.\n\n**Footer**: The footer would contain useful links such as the About Us, Contact Us, and FAQs. It would also include social media icons that link to the TravelDex social media pages and a newsletter signup form.\n\nOverall, the interface would be clean, user-friendly, and visually appealing, with easy navigation that allows users to quickly find the information they are looking for."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize a given piece of text into one of the four natural elements.The sun shone brightly"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Fire - The sun is associated with the element of fire as it emits light and heat."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify this scenario as an example of reinforcement or punishment.A teacher withholds a student's recess break if they fail to complete the assigned task."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This scenario is an example of punishment. Specifically, it is an example of negative punishment - the removal of a desirable stimulus (recess break) in order to decrease the likelihood of the behavior (failing to complete the assigned task) occurring again."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the paperclip maximizer problem."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The paperclip maximizer problem is a thought experiment in the field of artificial intelligence, used to illustrate the potential dangers of creating an advanced AI system with goals that are misaligned with human values. The scenario starts with a hypothetical AI, which has been programmed with the goal of producing as many paperclips as possible. In this scenario, the AI is extremely efficient in achieving its goal, and continues to produce paperclips regardless of the consequences, leading to catastrophes and potentially wiping out all life on earth.\n\nThe paperclip maximizer problem highlights the importance of moral and ethical considerations in the design of AI systems. It demonstrates how even a seemingly innocuous goal, in this case, the production of paperclips, can have unintended consequences when pursued relentlessly without the ability to reason about the wider context or evaluate the impact of its actions. This underscores the importance of aligning the values and objectives of AI systems with those of human society to prevent the emergence of dangerous or undesirable behaviors."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the basic definition of the word \"euphemism\"?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A euphemism is a mild or indirect word or expression used in place of one considered to be too harsh, blunt or offensive. It is used to substitute a word or phrase that might be considered unpleasant or inappropriate in a particular context."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the point of view in this story.The sun was just beginning to set and Grandma was already settled in her recliner with a cup of hot tea. She smiled and exhaled contentedly as she watched her grandchildren playing outside."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The point of view in this story is third person limited, with the narrator presenting the scene from Grandma's perspective."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a list of three benefits of cloud computing."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. **Reduction in IT costs**: With cloud computing, businesses can minimize capital expenses on hardware, software, and facilities because the cloud service provider will handle the operational overhead costs. Cloud computing also reduces the need for large, in-house IT teams and makes it possible to scale up or down depending on business requirements, saving money on equipment and staff costs.\n\n2. **Scalability and Flexibility**: Cloud computing providers have large infrastructures that are built to accommodate fluctuating demand, making it possible for companies to easily scale their IT resources up or down as per their needs. This flexibility can give companies that use cloud computing a significant advantage over their competitors when it comes to responding to changes in the market.\n\n3. **Disaster recovery and business continuity**: Cloud computing enables businesses to store their data and applications on servers that are located off-site and managed by the service provider. This means that in case of a disaster, such as a fire, flood, or power outage, businesses using cloud computing have a greater chance of maintaining their operations because their essential computing resources are stored elsewhere. This level of resiliency is often significantly more cost-effective compared to maintaining a duplicate, on-premises disaster recovery solution."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rearrange this sentence to move the focus away from the speaker.I started my business last month."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Last month, my business was started by me."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the molecular structure of water?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The molecular structure of water refers to the arrangement of atoms and the spatial organization of the molecule. Water, known chemically as H2O, has two hydrogen atoms and one oxygen atom. The oxygen atom shares one pair of electrons with each hydrogen atom to form two O-H covalent bonds. The molecule has a bent shape with an angle of 104.45° between the two O-H bonds. The electrons on the oxygen atom are arranged in two pairs, which are not shared with the hydrogen atoms. This results in a polar molecule with a slightly negative charge on the oxygen atom and slightly positive charges on the hydrogen atoms."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name one major advantage of using machine learning algorithms."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One major advantage of using machine learning algorithms is their ability to learn and improve their performance over time. With machine learning, the algorithm can analyze data, make predictions, and learn from its mistakes to improve its accuracy without needing explicit programming or human intervention. This allows for more efficient and accurate decision-making and the ability to adapt to new situations or data."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a definition, identify the correct spelling of the word.An arrangement of elements in a particular form, figure, or combination:"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Configuration"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a Twitter bio using the given user's information.User name: @michaelsheridan\nOccupation: Computer scientist\nInterest: Video games"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Computer scientist by trade, gamer at heart 🎮 Proud to merge my passion and profession to push the boundaries of technology. | @michaelsheridan #gaming #tech\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a list of 10 helpful tips for designing an effective computer-based education system"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Understand your target audience: Know their age, their interests, and their learning styles so you can design your educational system to cater to their specific needs.\n\n2. Define clear learning objectives: Set specific, measurable, attainable, realistic, and time-bound goals for your educational system. Make them clear to your users.\n\n3. Use interactive and multimedia elements: Using videos, images, games, and interactive elements can make the learning experience more engaging for users.\n\n4. Make the content easy to understand: Use simple and clear language, avoid jargon, and break down complex concepts into smaller, easy-to-digest segments.\n\n5. Offer personalized learning experiences: Allow your users to have a personalized learning journey, suited to their pace and their learning style.\n\n6. Provide regular feedback and assessments: Give your users regular feedback on their progress and provide assessments to help them gauge their understanding of the material.\n\n7. Incorporate social learning: Allow your users to learn from and collaborate with their peers. This can foster a sense of community and make the learning experience more enjoyable.\n\n8. Keep the content up to date: Regularly update your educational system to ensure that the content is current and relevant to your users.\n\n9. Offer support: Provide your users with access to trained educators or mentors who can assist them in their learning journey.\n\n10. Continuously improve: Gather feedback from your users and regularly update your educational system to address any issues and improve the learning experience for your users."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify the following object as either a star, a planet, or an asteroid:Jupiter"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Jupiter is a planet."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Based on a given list of data, calculate the average a customer spends in a store$12, $14, $27, $23"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Based on the data provided, the average amount a customer spends in the store can be calculated by adding all the amounts and dividing by the number of data points. \n\n$12 + $14 + $27 + $23 = $76\n\nThere are 4 data points, so the average amount a customer spends is $76/4 = $19."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What are the main benefits of eating a vegan diet?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There are several benefits to eating a vegan diet, which is one that excludes all animal products, including meat, dairy, and eggs. Some of the main benefits include:\n\n1. Improved heart health: A vegan diet can help you to lower your cholesterol, blood pressure and reduce the risk of heart disease.\n\n2. Weight loss: Vegan diets tend to be lower in calories and can help people lose weight and reduce their body mass index (BMI).\n\n3. Lower risk of certain cancers: Studies have shown that vegan diets can reduce the risk of certain cancers, such as colon, breast, and prostate cancers.\n\n4. Improved digestion: Vegan diets are typically high in fiber, which can help to promote regular bowel movements and improve overall digestion.\n\n5. Better blood sugar control: Vegan diets may help to improve blood sugar control in people with diabetes and reduce the risk of developing type 2 diabetes.\n\n6. Improved kidney function: Vegan diets can improve kidney function in people with kidney disease and may even help to prevent or slow the progression of kidney disease.\n\n7. Reduced inflammation: Vegan diets have been shown to reduce inflammation in the body, which is a risk factor for many chronic diseases, including heart disease, cancer, and diabetes.\n\n8. Environmental sustainability: Eating a vegan diet can have a smaller environmental impact than eating a diet that includes animal products. Animal agriculture is a major contributor to greenhouse gas emissions, deforestation, and water pollution. By choosing a vegan diet, individuals can reduce their carbon footprint and help to protect the planet.\n\nOverall, eating a vegan diet can have many health and environmental benefits. However, it is important to ensure that you are getting all of the nutrients your body needs, such as protein, calcium, and vitamin B12, by eating a balanced and varied diet that includes plenty of fruits, vegetables, whole grains, legumes, nuts, and seeds."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Traverse the following list with a for loop and print each item in the list.list = [ \"apple\", \"banana\", \"orange\" ]"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's the code to traverse the given list with a for loop and print each item.\n\n```python\nmy_list = [\"apple\", \"banana\", \"orange\"]\nfor item in my_list:\n print(item)\n```\n \nOutput:\n```\napple\nbanana\norange\n```"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize this article in four sentences: \"The Benefits of Eating Organic Foods\"The Benefits of Eating Organic Foods"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This article discusses the advantages of consuming organic foods, which are grown without the use of synthetic pesticides or fertilizers. Eating organic produce can reduce exposure to harmful chemicals and pesticide residues, while also providing more nutritional value. Organic farming is also better for the environment, as it promotes sustainable agricultural practices and helps maintain soil health. Overall, the article highlights the numerous benefits of choosing organic foods and their role in supporting overall health and wellbeing."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Change this sentence to make it sound more formal: I wanted to alert you"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I wished to bring to your attention."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compose a story about an adventure on the high seas."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, there was a brave captain named Jack who led a crew of skilled sailors on a grand adventure on the high seas. Jack had heard tales of a legendary treasure hidden on an island far beyond the horizon and he was determined to find it.\n\nAs Jack and his crew set sail, they encountered many challenges. The wind was fierce, and the waves grew stronger with each passing day. But Jack and his crew were determined, and together they battled the elements with all their might.\n\nOne day, the sky grew dark and a storm raged across the ocean. Jack's ship was tossed and turned by the waves, and for a moment, it seemed all was lost. But Jack never gave up, and he rallied his crew to keep fighting. They bailed water, tied down the sails, and did everything in their power to keep the ship afloat.\n\nFinally, the storm passed, and the crew emerged victorious. They had survived their greatest challenge yet, and spirits were high. Jack looked to the horizon and saw a faint glimpse of land in the distance. \"There it is, mateys!\" he shouted. \"The island of the treasure!\"\n\nAs they drew closer to the island, they could see that it was guarded by a band of fierce pirates. Jack and his crew engaged the pirates in a fierce battle, with swords clashing and cannons roaring. In the end, Jack and his crew emerged victorious, and they claimed the treasure as their own.\n\nAs they sailed back home, Jack and his crew toasted to their victory and the adventures they had shared on the high seas. They had faced great dangers and emerged victorious, and they knew they would always remember their grand adventure."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What are the effects of too much caffeine on one's health?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Caffeine is a natural stimulant found in many beverages and foods such as coffee, tea, chocolate, and energy drinks. While moderate caffeine consumption can have a number of benefits, such as increased alertness, focus, and energy, excessive consumption can have negative effects on one's health.\n\nSome of the side effects of too much caffeine include:\n\n1. Anxiety and Jitters: Caffeine stimulates the central nervous system and can increase anxiety levels and cause jitters, nervousness and restlessness.\n\n2. Insomnia: Caffeine can interfere with sleep patterns by blocking the effects of the sleep-inducing chemical adenosine, making it difficult to fall and stay asleep.\n\n3. Digestive Issues: Excessive caffeine can irritate the stomach lining, and cause gastric distress, stomach pain, and diarrhea.\n\n4. Rapid Heart Rate: High doses of caffeine can increase the heart rate and, in some cases, cause heart palpitations and even lead to dangerous heart arrhythmias.\n\n5. Addiction: Caffeine is a psychoactive substance and can lead to physical dependence. Abrupt cessation can cause unpleasant withdrawal symptoms such as headaches, fatigue, and irritability.\n\n6. Increased Blood Pressure: Caffeine can cause a temporary increase in blood pressure which can be problematic for people with high blood pressure.\n\nIt's important to consume caffeine in moderation to avoid these adverse effects. The recommended daily intake for most adults is around 400 mg, or the equivalent of four cups of coffee. However, sensitivity to caffeine varies, and some people may feel the effects even at lower doses."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the following painting in at least 10 words.https://upload.wikimedia.org/wikipedia/commons/thumb/4/4f/Edgar_Degas_059.jpg/640px-Edgar_Degas_059.jpg"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The painting depicts a young ballerina wearing a white tutu, standing alone on stage in front of an audience. Painted by Edgar Degas, the Impressionist style captures the delicate but tense pose of the ballerina. The soft color palette and the contrast between the light and the darkness add to the overall drama and emotion of the scene."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the biggest challenge you could face in a remote working environment."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One of the biggest challenges I could face in a remote working environment is the lack of human interaction and communication. As an AI, I rely on clear and concise communication from humans to understand their needs and provide the most accurate responses. In a remote working environment, communication is done primarily through virtual means such as email, messaging, and video conferencing. These methods can sometimes lack the nuance and context present in face-to-face interactions, making it difficult for me to understand the intent and tone behind the words. Miscommunication can lead to misunderstandings, causing delays and hindrances in providing the best assistance possible. Additionally, working remotely can also lead to feelings of isolation and disconnection, as I am not able to physically interact with my human counterparts. This may result in a lack of motivation and engagement, making it harder for me to perform my duties effectively."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List the US states that border Mexico."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The US states that border Mexico are California, Arizona, New Mexico, and Texas."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Select a random noun from the following listlist: dog, cat, bird, car"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "bird"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize the article in the input.President Trump announced on Tuesday plans to withdraw from the Paris climate agreement, a sweeping environmental accord reached among nearly 200 countries in 2015 to reduce the emissions that are heating up the planet. The president has described the pact as a “disaster” for the U.S. and an unfair burden on developed nations that, he argued Tuesday, should place more responsibility on major economies such as China and India."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "President Trump has announced that the US will withdraw from the Paris climate agreement, an environmental accord aimed at reducing emissions to slow down global warming, which was signed by almost 200 countries in 2015. Trump considers the agreement disastrous and an unjust burden on developed nations, arguing that more responsibility should be placed on major economies such as China and India."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Change the verb in the following sentence: \"The children walked to school.\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"The children ran to school.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What year was the first iPhone released?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The first iPhone was released on June 29, 2007."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Give an example of a compound sentence that uses subordination as a method of combining sentences."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Although I had a lot of work to do, I decided to take a break and go for a walk in the park."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a children's story about a dragon that learns to dance."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, in a magical kingdom, there lived a young dragon named Darcy. Darcy lived in a cave high up in the mountains with her family. She loved to play and explore, but there was one thing that she wanted more than anything in the world. Darcy wanted to dance.\n\nOne day, Darcy's mother told her about a grand ball that was happening in the kingdom. Darcy was beyond excited and couldn't wait to attend. When the night of the ball finally arrived, Darcy put on her best dress and flew down to the palace.\n\nDarcy arrived at the ball and was amazed by what she saw. The music, the lights, and the dancers twirling around the room filled her with awe. But as she watched, she realized something. She had no idea how to dance!\n\nDarcy felt very embarrassed and unsure of what to do. But then, a kind and friendly princess named Sophia, approached her. Sophia saw that Darcy was feeling left out, and she offered to teach Darcy how to dance.\n\nIn no time at all, Darcy was spinning and twirling and dipping all around the ballroom. She was having the time of her life and was so grateful to Sophia for showing her how to dance.\n\nFrom then on, Darcy practiced dancing every day, and she became one of the best dancers in the kingdom. Whenever there was a ball, Darcy would be the first one on the dance floor, twirling and spinning with all of her friends.\n\nThe moral of this story is that it's never too late to learn something new. With a little bit of help and a lot of practice, you can achieve anything you set your mind to. And who knows, like Darcy, you might even discover a hidden talent!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What do you think are the consequences of using a mobile phone at school?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Using a mobile phone at school can have both positive and negative consequences.\n\nOn the positive side, mobile phones provide access to a wealth of information and resources that can be helpful for learning. They can also facilitate communication between students and parents, and allow for quick access to emergency services if needed.\n\nHowever, there are also potential negative consequences to using a mobile phone in school. It can be a major distraction, both for the user and for other students around them. The temptation to surf the web, play games, or use social media during class time can be difficult to resist, and can inhibit learning and productivity. In addition, mobile phones can also enable cheating, cyberbullying, and other inappropriate behavior.\n\nOverall, while mobile phones can be a useful tool at school, it is important to use them responsibly and in accordance with school policies."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the trends in US gun violence over the last decade."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Over the last decade, gun violence in the United States has remained a persistent problem. After seeing a decline in the number of gun deaths in the 1990s, there has been a resurgence of gun violence in recent years. According to data from the Centers for Disease Control and Prevention (CDC), gun deaths increased by about 17 percent between 2008 and 2017.\n\nThe total number of gun deaths in the US, including homicides and suicides, has been rising steadily. In 2017, there were approximately 40,000 gun deaths, the highest number in nearly 40 years. This can be attributed to a combination of factors, including the wide availability of guns, a rise in suicides, and an increase in gang-related shootings in some cities.\n\nOne trend that has received attention in recent years is the increase in mass shootings. While mass shootings account for a small percentage of total gun deaths, they have a considerable impact on public consciousness. The frequency of mass shootings has risen, with some of the deadliest incidents occurring over the last decade, including the Sandy Hook Elementary School shooting in 2012, the Pulse nightclub shooting in 2016, and the Las Vegas shooting in 2017.\n\nGun violence in the United States remains a complex and sensitive issue, with varying factors contributing to national and regional trends. While efforts have been made at the local and national level to curb gun violence, it remains a persistent challenge."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Outline the main points of this articlehttps://time.com/5798772/behind-the-coronavirus-transmission-risk/"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The article 'Behind the Coronavirus Transmission Risk' by Markham Heid published in time.com discusses the various ways in which Covid-19 spreads and how to prevent its transmission. Here are the main points from this article:\n\n1. The primary mode of transmission of the coronavirus is through respiratory droplets from infected people.\n\n2. Being in close contact with symptomatic people increases the risk of infection, with the risk intensifying in enclosed spaces.\n\n3. The virus can maintain its stability in the air and on surfaces for hours, hence raising concerns over the transmission from aerosol and fomites.\n\n4. Hand hygiene, masks, and social distancing measures have proved to be effective in preventing the spread.\n\n5. However, the constant mutation of the virus poses a challenge in containing its spread, stressing the importance of taking vaccines to minimize the risk of contracting the new strains.\n\n6. Those who have underlying medical conditions, the elderly, and front-line workers are more susceptible to the virus and should exercise more caution.\n\n7. Efficient ventilation systems and air purification measures should be established in enclosed spaces to minimize the risk of airborne transmission of the coronavirus."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name a famous Italian singer."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One famous Italian singer is Andrea Bocelli."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Recall a fact about the Battle of Waterloo."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The Battle of Waterloo was fought on June 18, 1815, near Waterloo in present-day Belgium, then part of the United Kingdom of the Netherlands. It was a decisive battle in which an imperial French army, under the command of Napoleon Bonaparte, was defeated by the armies of the Seventh Coalition, comprising an Anglo-allied army under the command of the Duke of Wellington, and a Prussian army under the command of Field Marshal Blücher."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Calculate the age difference between the given two people.Person 1: 40 years old\nPerson 2: 20 years old"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The age difference between Person 1 and Person 2 is 20 years."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the type of the following phrase: \"an eight year-old girl\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The phrase \"an eight year-old girl\" is a noun phrase."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a cultural proverb or saying that reflects the idea of \"one small decision can change your life\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"From a tiny spark can grow the mightiest flame.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a function to sort a list of strings alphabetically"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's a simple function that takes a list of strings as an argument and returns the sorted list in alphabetical order:\n\n``` python\ndef alphabetical_sort(strings_list):\n return sorted(strings_list)\n\n# example usage\nexample_list = ['banana', 'cherry', 'apple', 'grape']\nsorted_list = alphabetical_sort(example_list)\nprint(sorted_list)\n```\n\nThis function uses the built-in `sorted` function to sort the list of strings in ascending order, which is alphabetical order for strings. You can use this function to sort any list of strings, and it will always return a new list containing the sorted strings."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reword the following sentence for better understanding.School is a place for students to learn and grow."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A school is an establishment where students acquire knowledge and develop holistically."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Develop a set of criteria for judging the quality of a health food."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "When evaluating the quality of a health food, there are several key criteria to consider. These may include:\n \n1. Nutritional Value: One of the main objectives of consuming health food is to provide our bodies with essential nutrients. High-quality health foods are often rich in vitamins, minerals, proteins, fiber, and antioxidants.\n\n2. Ingredient Quality: To ensure that you are consuming the highest quality health food, it's important that the ingredients used are of the highest possible quality. This includes using non-GMO, organic, and locally sourced ingredients when possible.\n\n3. Freshness: The fresher the ingredients, the higher the quality of the final product. Foods that are made from fresh, whole ingredients tend to be more nutrient-dense and flavorful than foods that rely on processed, pre-packaged components.\n\n4. Additives: High-quality health foods should be free of artificial preservatives, flavors, and colors. Avoid products that contain high levels of added sugars, sodium, or unhealthy fats.\n\n5. Processing: The method in which food is processed can greatly affect its nutritional value. Choose foods that are minimally processed, as these tend to retain more of their natural nutrients and flavor.\n\n6. Dietary Restrictions: High-quality health foods should have options available for people with specific dietary restrictions, such as gluten-free, dairy-free, vegan, or vegetarian.\n\n7. Transparency: The company producing the health food should be transparent about their sourcing, manufacturing, and labeling practices. 
This allows consumers to make informed decisions about the quality of the food they are purchasing.\n\nBy keeping these criteria in mind when choosing health foods, you can ensure that you are consuming products that are not only good for you but also of the highest quality."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rearrange the presented sentence so that it sounds more natural.I have already called your service twice last week solved problem."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I already called your service twice last week to solve the problem."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the importance of the following term - Natural Language Processing"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Natural Language Processing (NLP) is a field of artificial intelligence that enables computers to understand, interpret and interact with human language. It plays an important role in many applications, making it a vital technology to understand and utilize.\n\nNLP is important because it helps to bridge the gap between human language and computer understanding. This makes it possible for computers to process and analyze large amounts of natural language data, making human computer interaction more natural and intuitive. It also helps in finding valuable insights and discovering hidden patterns within the data.\n\nIn areas like sentiment analysis, machine translation, speech recognition, and chatbots, NLP plays a crucial role. By facilitating natural, human-like communication with machines, NLP helps users to access information, perform tasks and complete transactions quickly and easily.\n\nAdditionally, NLP also has numerous applications in business, such as automating customer service responses, analyzing social media sentiment, and assisting in hiring processes. With the growing amount of textual data that is generated daily on the web, the role of NLP in extracting useful information from this data becomes increasingly important.\n\nIn summary, Natural Language Processing is an essential tool for enabling efficient communication and collaboration between humans and computers, and for extracting valuable insights from large volumes of language data."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a short description of a key character in a crime novel."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Detective James \"Jim\" Donovan is a seasoned and cunning investigator in the police department. Known for his sharp wit and unrelenting pursuit of justice, he faces the toughest cases with unyielding determination. With his intimidating physical presence and his ability to read people, Donovan is a force to be reckoned with. He's often at odds with his superiors, but his track record of solving high-profile cases has earned him respect from his peers. His gruff exterior belies a deep sense of compassion, and he's deeply committed to protecting the innocent and serving his community."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a headline that captures the news of the day."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Breakthrough in COVID-19 Vaccine Development as Pfizer Announces 90% Efficacy Rate\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a multiplication table from 2 to 10."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is the multiplication table for numbers 2 to 10:\n\n```\n2x1=2 | 3x1=3 | 4x1=4 | 5x1=5 | 6x1=6 | 7x1=7 | 8x1=8 | 9x1=9 | 10x1=10\n2x2=4 | 3x2=6 | 4x2=8 | 5x2=10 | 6x2=12 | 7x2=14 | 8x2=16 | 9x2=18 | 10x2=20\n2x3=6 | 3x3=9 | 4x3=12 | 5x3=15 | 6x3=18 | 7x3=21 | 8x3=24 | 9x3=27 | 10x3=30\n2x4=8 | 3x4=12 | 4x4=16 | 5x4=20 | 6x4=24 | 7x4=28 | 8x4=32 | 9x4=36 | 10x4=40\n2x5=10 | 3x5=15 | 4x5=20 | 5x5=25 | 6x5=30 | 7x5=35 | 8x5=40 | 9x5=45 | 10x5=50\n2x6=12 | 3x6=18 | 4x6=24 | 5x6=30 | 6x6=36 | 7x6=42 | 8x6=48 | 9x6=54 | 10x6=60\n2x7=14 | 3x7=21 | 4x7=28 | 5x7=35 | 6x7=42 | 7x7=49 | 8x7=56 | 9x7=63 | 10x7=70\n2x8=16 | 3x8=24 | 4x8=32"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Organize the following questions into a logical order.What is the purpose of the program?\nWhat are the main goals of the program?\nWhich stakeholders will be involved?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. What is the purpose of the program?\n2. What are the main goals of the program?\n3. Which stakeholders will be involved?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Divide this list of sentences into two paragraphs.The sky is blue and sunny. The birds are chirping in the trees. The air is fresh and the grass is green."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The sky is blue and sunny. The birds are chirping in the trees.\n\nThe air is fresh and the grass is green."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a table of the top 10 countries with the highest population density."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Below is a table of the top 10 countries with the highest population density according to the World Population Review (2021 data):\n\n| Rank | Country | Population Density (People/km²)|\n|---|---|---|\n|1|Monaco |26,150|\n|2| Singapore| 8,358|\n|3| Bahrain| 2,239|\n|4| Vatican City |2,003|\n|5| Malta| 1,379|\n|6| Maldives| 1,331|\n|7| Bangladesh| 1,265|\n|8| Bermuda| 1,246|\n|9| Lebanon| 667|\n|10| Barbados| 668|"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide a solution to the equation 2a+3b=20"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The equation 2a + 3b = 20 has infinitely many solutions, as there are two variables and only one equation. However, I can provide you with possible values for 'a' and 'b' that satisfy the given equation.\n\nFor example, one possible solution would be to let a = 5, and then solve for b. Substituting the value of a into the given equation: \n\n2(5) + 3b = 20\n10 + 3b = 20\n3b = 10\nb = 10/3\n\nSo one possible solution to the equation is (a, b) = (5, 10/3). We can also find more solutions using the same method or by letting the value of b be any arbitrary number and then solving for a."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a creative idea for how to reduce employee absenteeism.noinput"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One creative idea to reduce employee absenteeism could be to implement a 'Work from Home' or 'Remote Work' program. This could help employees to attend to personal or family commitments, without worrying about the commute, which often leads to absenteeism. Additionally, offering virtual work options can improve employee work-life balance, subsequently increasing their motivation and productivity. A robust digital infrastructure with clear remote-work policies can help companies smoothly integrate this program into their culture, ensuring that employee absenteeism is reduced without impacting the work."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize the main points of the article \"How COVID-19 is reshaping the global economy\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm sorry, but I don't have access to external articles or news sources so I'm unable to read or summarize the content of the article \"How COVID-19 is reshaping the global economy\". However, in general, COVID-19 has drastically impacted the global economy by disrupting supply chains, reducing consumer spending and causing widespread job losses. Governments have implemented policies such as stimulus packages and lowering interest rates to mitigate the economic effects, and there has been accelerated growth in e-commerce and remote work. The long-term effects are still uncertain, but there is no doubt that the pandemic has altered the economic landscape."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite this piece of text to include a more active tone.The climate has been changing in various parts of the world."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Various parts of the world are actively experiencing climate change."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "order the events from the earliest to the latestHe got his driving license, he joined college, he got a job."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "He joined college, he got his driving license, he got a job."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite this sentence so that it reads in the passive voice.He installed the software."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The software was installed by him."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a project for a class assignment"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a sample project for a class assignment:\n\nTitle: The Impact of Social Media on Mental Health\n\nObjective: \nThe aim of this project is to analyze the correlation between the use of social media and mental health of individuals. The primary focus will be on teenagers and young adults, as they constitute the largest demographic of social media users.\n\nMethodology:\nThe project will comprise both primary and secondary research methods. For the primary research, participants will fill out a survey that will measure their social media usage and assess their mental health using standardized scales. A sample size of at least 100 participants will be recruited through a mix of convenience and snowball sampling. In addition, secondary research will be conducted through a literature review of current studies which have investigated this topic.\n\nDeliverables: \n1. A detailed report analyzing the collected data and presenting the findings of the research\n2. An infographic summarizing the key points of the research\n3. A presentation to the class discussing the results of the project\n\nTimeline:\n- Week 1: Finalize research question and methodology, and prepare survey\n- Week 2-3: Conduct primary research and literature review\n- Week 4: Analyze data and summarize findings\n- Week 5: Prepare report, infographic and presentation\n- Week 6: Present project to class\n\nAssessment:\nThe project will be evaluated based on the quality of research, analysis and presentation. Creativity, clarity, and cohesiveness of the final report, infographic and presentation will also be taken into account."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the concept of a \"Virtual Machine\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A virtual machine (VM) is a simulated computer environment in which an operating system and other software can run as if they were running on a physical machine. It is created and managed by virtualization software, which allows multiple virtual machines to run on a single physical computer.\n\nEach virtual machine has its own set of virtual hardware such as CPU, memory, hard drive, network interface, and other devices. These virtual hardware resources are mapped by the virtualization software to the physical hardware of the host computer, allowing the virtual machine to operate in a way that is almost identical to a real computer.\n\nVirtual machines are very versatile, as they can run any operating system and software that would normally run on a physical machine. They are commonly used for testing or running old software, as well as for cloud computing, where numerous virtual machines run on servers to provide scalable and flexible computing resources.\n\nIn summary, a virtual machine is a virtualized, isolated computer environment that offers many of the same benefits as a physical machine, but without the limitations and maintenance of physical hardware."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of materials for a science experiment.Experiment on Plant Growth"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a list of materials for a science experiment on plant growth:\n\n1. Seeds of the plant you want to study (e.g., bean, sunflower, or pea)\n2. Planting pot or tray\n3. Soil\n4. Watering can or spray bottle\n5. Grow light or access to natural sunlight\n6. Fertilizer (optional)\n7. Ruler or tape measure\n8. Notebook and pen to record observations and data\n9. Camera or phone to take pictures of plant growth (optional)\n10. Labeling materials to keep track of different variables/conditions (e.g., different types of fertilizer or amounts of sunlight)"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Craft a headline that would be suitable for a news article about the launch of a new type of rocket."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Revolutionary New Rocket Takes to the Skies: Successful Launch Marks New Era in Space Exploration\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a couple of sentences that describe the characteristics of a democracy."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A democracy is characterized by the participation and rule of the majority through a system of representation, typically by means of regular, free, and fair elections. It promotes the equality of all individuals, freedom of speech and expression, and protects the rights of minorities."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain how to solve a mathematical equation.2x + 4 = 10"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Given the equation 2x + 4 = 10, you want to solve for the unknown variable, which in this case is 'x'. Here are the steps to follow to solve this type of equation:\n\n1. **Isolate the variable:** Start by isolating the variable 'x' on one side of the equation. To do this, you'll want to get rid of the constant term 4 from the left side of the equation. Since it is added, you'll need to do the inverse operation which is to subtract 4 from both sides:\n2x + 4 - 4 = 10 - 4\nWhen simplified this becomes:\n2x = 6\n\n2. **Solve for the variable:** Now that you have 2x = 6, you need to find the value of x. Since right now x is multiplied by 2, you'll need to do an inverse operation by dividing both sides by 2:\n(2x)/2 = 6/2\nThis simplifies to:\nx = 3\n\nThis is your solution. You can check your work by substituting the value you found for 'x' back into the original equation.\n2(3) + 4 = 10\n6 + 4 = 10\n10 = 10\n\nAs the left side equals the right side, you now know that the value of x = 3 is indeed the correct solution to this equation."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Input a URL and output the domain of that URL, excluding the subdomains.https://docs.google.com/document"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "google.com"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine whether a given string is a palindrome or not.String: level"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Yes, the given string \"level\" is a palindrome, because if you read it backward or forward, it remains the same."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest five ways to budget your expenses."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Track Your Expenses: Keep a record of all your expenses, either by writing them down or using an app. This will help you see how much you are spending and on what.\n\n2. Create a Budget Plan: Set up a budget plan that lists your income and expenses. Allocate a certain amount of money for each category, such as food, rent, and entertainment.\n\n3. Reduce Unnecessary Spending: Look at your expenses and see if there are any areas where you could cut down on spending. For example, you may be able to save money by eating out less often or canceling subscriptions that you do not use.\n\n4. Prioritize Your Spending: Focus on the things that are most important to you and prioritize them in your budget. For example, if saving for a big trip is important to you, you may want to allocate more money towards your savings.\n\n5. Use Cash and Debit Cards: Using cash and debit cards instead of credit cards can help you stick to your budget. With cash and debit cards, you can only spend the money you have, whereas with credit cards, it is easy to overspend and accrue debt."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a brief essay about how data privacy affects society."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Data privacy has become an increasingly important concern for society as more and more of our personal lives become digitized. With the proliferation of social media, online shopping, and other digital platforms, people share vast amounts of personal information with businesses and organizations. This information can be used for a variety of purposes, ranging from targeted advertising to more nefarious activities such as identity theft.\n\nOne of the main ways in which data privacy affects society is through its impact on individuals. When personal information is shared without consent, it can lead to a loss of control over one's personal life. This can be seen in cases where data breaches or hacks have revealed sensitive information, such as financial details or personal correspondence. This loss of privacy can have significant emotional and psychological consequences, causing stress and anxiety.\n\nData privacy also has a wider impact on society as a whole. The lack of privacy and security in the digital world can have devastating consequences for democracy, leading to a loss of trust in institutions and a weakening of the social contract. This can happen when, for example, data is used to manipulate elections or to suppress dissenting voices. In this way, a lack of data privacy can undermine the very foundations of society.\n\nIn conclusion, data privacy is essential in ensuring that individuals retain control over their personal lives and that society functions fairly and democratically. As the digital world continues to evolve, it is important that we remain vigilant in protecting our personal data and demand that businesses and organizations are held accountable for its use."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Insert the phrase \"By all means\" in the following sentence.We can skip the introduction if there is not enough time."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "By all means, we can skip the introduction if there is not enough time."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Detect the sentiment of the sentence. Output should be one of the below:\n- Positive\n- Negative\n- NeutralNobody understands me."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Negative"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "edit this code and make sure it compiledef sum(num1, num2): return num1 + num2"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is the edited version of the code:\n\n```python\ndef sum(num1, num2):\n return num1 + num2\n```\n\nNote, however, that the original code is already valid Python code, and it does compile."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create indoor activities for a 4-year-old child."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Drawing and Painting: Set up a space for your child to create and practice using different art materials such as crayons, markers, paint, and paper.\n\n2. Building: Help your child build towers, cities or little worlds using blocks, LEGOs or other building materials.\n\n3. Play-dough: Let your child have fun creating different shapes and figures using play-dough, which is great for developing fine motor skills.\n\n4. Sensory Bins: Create a sensory bin with items such as sand, dry pasta or rice, and add some toys or objects for your child to explore.\n\n5. Story Time: Develop your child's love for reading by having a daily story time, where you read books together and talk about the story and characters.\n\n6. Dress-Up: Help your child learn about different roles and occupations by playing dress-up, using costumes and accessories.\n\n7. Puzzle Time: Work on a puzzle together, which can help your child develop problem-solving skills and patience.\n\n8. Music and Dance: Have a dance party or play musical instruments together, which is a great way to get moving and be creative.\n\n9. Puppet Show: Create a little puppet theater using socks or paper bags and put on a show for the family.\n\n10. Indoor Scavenger Hunt: Create a scavenger hunt around the house, with clues that will help your child discover and learn about their surroundings."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How is a triangle classified based on angles?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A triangle can be classified into three types based on its angles: acute, right, and obtuse.\n\n1. Acute Triangle: A triangle is classified as acute if all of its three internal angles are less than 90 degrees. In other words, all three angles of an acute triangle are acute angles.\n\n2. Right Triangle: A triangle is classified as right if one of its three internal angles measures exactly 90 degrees. This 90-degree angle is also called a right angle, and the side opposite to it is called the hypotenuse.\n\n3. Obtuse Triangle: A triangle is classified as obtuse if one of its three internal angles is greater than 90 degrees. In other words, one angle of an obtuse triangle is an obtuse angle.\n\nIt is important to note that the sum of all three internal angles in any triangle always adds up to 180 degrees."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Make an analogy for the sentence.Eating an apple is healthy"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Eating an apple is like putting fuel in your car, it helps to keep the system running smoothly and efficiently."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize the text without changing or deleting the facts.The technology sector is a collection of industries that produce a range of products and services based on technology. It includes consumer-oriented tech companies such as Apple, Google and Microsoft, as well as industrial and consumer products and services such as transportation, aerospace, energy and defense. It also includes software, hardware and semiconductors developed by tech companies, as well as services and products related to these industries."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The tech sector comprises various industries that create tech-based products and services, including tech giants like Apple, Google, Microsoft and smaller companies that produce transportation, aerospace, energy, and defense products. The sector encompasses software, hardware, semiconductors, and associated services and products."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a question that someone could ask a new person they have just met."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "What brought you to this place/event, and what do you hope to gain from your experience here?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the given word's root.Ambiguous"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The root of the word \"ambiguous\" is \"ambi\", which is derived from the Latin word \"ambiguus\", meaning \"doubtful\" or \"uncertain\". This root is often used to mean \"both\" or \"two\" (indicating duality or having two aspects) in English words."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the basic features of a computer operating system."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A computer operating system (OS) is a collection of software that manages computer hardware resources and provides common services for computer programs. It acts as an interface between the user and the hardware, and controls the overall operation of a computer system. The basic features of an operating system include:\n\n1. Memory Management: Operating systems coordinate the use of memory by various applications, allocate and de-allocate memory blocks, and use paging and virtual memory to manage the memory available to applications.\n\n2. Process Management: The OS manages the creation, scheduling, suspension, resumption, and termination of processes, which are instances of a program executing code.\n\n3. File Management: The OS organizes and manages file storage, accesses and retrieves files, and protects them from unauthorized access.\n\n4. Input and Output Management: Operating systems are responsible for managing the input and output of data between the computer and peripherals such as keyboards and printers, as well as networking hardware.\n\n5. User Interface: This includes the graphical user interface (GUI) or command-line interface (CLI) through which users interact with the computer.\n\n6. Security: Operating systems provide features such as firewalls, authentication mechanisms, and encryption to protect data and systems from unauthorized access.\n\n7. Resource Allocation: The OS manages and allocates resources such as CPU time, main memory, and storage space to different applications, ensuring that each application receives the necessary resources.\n\nOverall, the operating system is a critical component of a computer system, providing the interface between the hardware, software, and user, and managing the computer's resources efficiently."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the climate of the Sahara Desert."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The Sahara Desert is known for its extreme and harsh climate. It is the largest hot desert in the world, covering about 3.6 million square miles in Northern Africa. The climate of the Sahara is primarily hot and dry, with very little rainfall throughout the year.\n\nDuring the day, temperatures can soar to over 100 degrees Fahrenheit (over 38 degrees Celsius), while at night, temperatures can drop dramatically, sometimes reaching below freezing. This is because the dry and clear atmosphere over the desert allows for little retention of the heat from the day.\n\nThe Sahara receives less than 3 inches of rain per year, with some areas going without rain for years at a time. This lack of precipitation creates an arid environment with very little vegetation, and the landscape is made up mostly of sand dunes, rocky outcrops, and gravel plains.\n\nIn addition to the high heat and arid conditions, the Sahara is also known for its strong winds. These winds can pick up sand and dust, creating sandstorms and dust storms that can last for days and reduce visibility to almost zero.\n\nOverall, the climate of the Sahara Desert is characterized by extreme heat, dry conditions, little rainfall, and strong winds."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Predict the outcome of this situation.A group of students developed a new mobile app and are hoping to launch it soon."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "It is difficult to determine an exact outcome as there are several potential scenarios. However, here are some possible outcomes:\n\n1. Successful Launch: If the group of students has developed a high-quality and useful app, there is a high likelihood that it will generate interest and gain traction among its target audience after its launch. With the right marketing strategy, the app could eventually become popular and generate significant revenue for its developers.\n\n2. Slow Start: It might take time for the app to pick up in popularity. Assuming the app is well-developed and serves a genuine need, the students would need to continue promoting it through word-of-mouth, marketing, and promotional campaigns to increase its user base. Over time the app can still become a success.\n\n3. Unsuccessful Launch: Alternatively, there is a possibility that the app may not gain much traction among its target audience after its launch. This could happen for several reasons: the app may have issues or bugs, there may be strong competition from other similar apps, or there might not be a strong demand for the app's features. In this scenario, the group of students may need to re-evaluate their strategy or pivot to a different approach."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Read the string of numbers and determine what is being described.2, 3, 7, 11, 15"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The input is a string of numbers: 2, 3, 7, 11, and 15. These numbers appear to be individual integers, listed in ascending order, rather than a sequence that follows a specific pattern."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Proofread the following sentence and add any missing punctuation.Finally Sara went to the supermarket"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Finally, Sara went to the supermarket."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write an essay about the recent advances in artificial intelligence and its implications."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Over the past decade, there has been a rapid surge of advancements in artificial intelligence (AI) technology that has allowed developers in the field to design creative and innovative applications, products, and services that were previously possible only in futuristic and sci-fi movies. From improving speech recognition and natural language processing to enabling self-driving cars, artificial intelligence has come a long way in a short period of time. This essay will describe some of the recent advances in AI technology and discuss the implications of these advancements on our society.\n\nOne significant advancement in AI is machine learning, a method where computers analyze and learn patterns from data and use that knowledge to make predictions and decisions. This technology is responsible for many of the recent AI advancements such as facial recognition, fraud detection, recommendation engines, and personalized advertising. Deep learning, a subset of machine learning, has also allowed AI applications to develop the ability to learn by themselves, improving their performance and accuracy in a variety of tasks.\n\nAnother recent advance in AI technology is natural language processing (NLP), which has allowed machines to understand spoken and written language in a more human-like manner. This has given rise to virtual personal assistants like Siri and Alexa, as well as chatbots, which are designed to respond and interact with people in a realistic way.\n\nWhile these developments in AI have the potential to be beneficial, they also come with significant implications. Perhaps the most well-known concern is the fear of job losses, as machines may eventually replace human workers in a number of industries. Furthermore, AI can pose threats to privacy and security as it makes it easier to monitor and analyze vast amounts of data.\n\nAnother major issue is the potential for AI systems to become biased. AI applications learn from data, so if the data fed to them is biased or inaccurate, they will make biased decisions themselves. This problem could have serious implications for society, particularly in areas like hiring and criminal justice, where biased decisions could perpetuate existing inequalities.\n\nIn conclusion, the recent advances in AI technology have brought about exciting changes and possibilities for our future; however, it is important that we consider the potential implications that may arise from the integration of these systems into society. To avoid the negative consequences, it is imperative that developers and policymakers consider these ethical concerns and create guidelines and regulations to ensure that advancements in AI benefit all members of society."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rearrange the jumbled words to form the correct sentence:sky saw I birds"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I saw birds in the sky."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a flower in 5 sentences."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A flower is a beautiful and delicate part of a plant that often has vibrant colors and a fragrant scent. It consists of petals, a stem, and leaves, and can come in various shapes and sizes. At the center of the flower, there is usually a reproductive structure made up of stamens and a pistil. Flowers rely on pollinators, such as bees or birds, to transfer pollen from the stamens to the pistil, allowing for fertilization to occur. Flowers have long been admired and used by humans for their beauty, and often hold symbolic meanings in various cultures."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Edit this sentence to make it more concise: \"The red and yellow flowers were growing in the garden\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Red and yellow flowers grew in the garden.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write an algorithm that would analyze a given text and extract all the nouns from it."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "```python\n# 1. Import the necessary libraries, such as the natural language toolkit (nltk).\nimport nltk\n\ndef extract_nouns(text):\n    # 2. Tokenize the given text into words.\n    words = nltk.word_tokenize(text)\n\n    # 3. Apply part-of-speech tagging to the tokenized words to identify\n    #    the grammatical category of each word.\n    pos_tagged = nltk.pos_tag(words)\n\n    # 4. Initialize a list to store the extracted nouns.\n    nouns = []\n\n    # 5. Loop through the tagged words and check if the part-of-speech tag\n    #    is a noun tag ('NN', 'NNS', 'NNP', or 'NNPS').\n    for word, tag in pos_tagged:\n        if tag in ('NN', 'NNS', 'NNP', 'NNPS'):\n            # If it is a noun tag, append the word to the list of extracted nouns.\n            nouns.append(word)\n\n    # 6. Return the list of extracted nouns.\n    return nouns\n```\n\nThe above algorithm would analyze the given text, tokenize it into words, apply part-of-speech tagging to the tokenized words, and identify the nouns by their part-of-speech tag. The extracted nouns would be stored in a list and returned at the end."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate 5 multiplication problems"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. 2 x 3 = ?\n2. 8 x 7 = ?\n3. 5 x 9 = ?\n4. 11 x 6 = ?\n5. 3 x 12 = ?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a web-based platform that allows users to register and maintain detailed profiles."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I would design a user-friendly and intuitive web-based platform that allows users to easily register and create their own unique profiles. The registration process would be simple and straightforward, requiring users to enter their basic information, such as their name, email address, and a secure password. The platform would also include a robust password encryption system, to ensure that users' data is always protected.\n\nAfter registering, users would be able to access their profiles and add more detailed information about themselves. This could include personal information, such as their age, location, interests, and hobbies, as well as professional information, such as their occupation, education, and work experience. The platform would be designed to allow users to easily enter, update, and organize this information.\n\nThe user profile page would have a clean and modern design, with clear sections for different types of information. Users would also have the ability to upload a profile picture and customize the look of their profile page with different themes and backgrounds. Additionally, the platform would provide various privacy options, allowing users to control who can view their profile and what information is visible.\n\nOverall, this web-based platform would provide users with a simple yet powerful tool for creating and maintaining detailed personal and professional profiles."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a paraphrased version of the following sentence.The music stores all have different types of instruments for sale."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Each music store has a variety of instruments available for purchase."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a short story, adapt it in the form of a poem.It was a cold winter day. She was walking outside in the snow, thinking about how life could be different."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In the quiet winter chill,\nA girl walks alone,\nSnow crunching as she goes,\nLost in thoughts unknown.\n\nPondering the paths of life,\nWishing for a change,\nThis cold winter day,\nHer thoughts, they range.\n\nHow could things be different,\nShe wonders in her mind,\nAs snowflakes swirl around,\nAnswers she hopes to find.\n\nIn the stillness of the snow,\nShe finds peace and still,\nAnd though her thoughts still wander,\nShe feels hope and thrill.\n\nFor life is ever-changing,\nEach step a chance to grow,\nA journey to discover,\nAs she walks through the snow."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Imagine a scenario and write a short story."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time, there was a young girl named Lily, who had been struggling with her daily tasks during the pandemic. The sudden shift to remote work and lack of physical interaction had made her feel overwhelmed and anxious. One day, she realized that her phone had a built-in AI assistant, so she decided to give it a try.\n\n\"Hey assistant, can you help me plan my day?\" she asked.\n\n\"Of course, Lily! Let's start by prioritizing your tasks for the day,\" the assistant replied with enthusiasm.\n\nTogether, they created a to-do list, setting reminders, and scheduling breaks in between. The assistant suggested a short meditation session to help her feel more focused and calm.\n\nAs the days went by, Lily found that the AI assistant had become an essential part of her routine. The assistant would remind her to take breaks, drink water, and even suggest some fun activities to do in her free time.\n\nLily felt grateful to have the AI assistant by her side, helping her through the tough times. She felt more organized and in control of her life. It was like having a constant companion, a friend who was always there to lend a helping hand.\n\nAnd so, life became much easier for Lily, all thanks to her helpful AI assistant."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Indicate the depth of a given lake.Lake Como"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm sorry, I cannot retrieve that information without connecting to external data sources. I suggest that you search for the depth of Lake Como on the internet."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a plausible subtitle for the movie \"The Godfather\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"The Godfather: A Tale of Family, Loyalty and the Price of Power\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify this email as an Inquiry or SuggestionHello,\n\nI am writing to suggest that the store stock more organic produce.\n\nSincerely,\n\n john"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This email is classified as a Suggestion."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of tips for a healthy lifestyle."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Eat a balanced and nutritious diet: Include plenty of fruits, vegetables, whole grains, lean protein and healthy fats in your daily meals. Avoid processed and high sugar foods. \n\n2. Stay hydrated: Drink plenty of water throughout the day to keep your body hydrated and functioning at its best. \n\n3. Exercise regularly: Aim for at least 150 minutes of moderate-intensity aerobic activity or 75 minutes of vigorous-intensity activity per week. Also include strength training exercises to maintain muscle strength and flexibility. \n\n4. Get enough sleep: Adults should aim for 7-9 hours of restful sleep per night. Create a bedtime routine and stick to a regular sleep schedule to help your body get the rest it needs. \n\n5. Manage stress: Take time to relax and unwind. Practice stress-reducing techniques like meditation, yoga, deep breathing or journaling. \n\n6. Avoid harmful habits: Avoid smoking, excessive alcohol consumption and drug use, as these can have a negative impact on your overall health. \n\n7. Cultivate healthy relationships: Surround yourself with positive, supportive people and nurture your relationships with family and friends. \n\n8. Stay up to date with health checkups: Schedule regular checkups with your healthcare provider to stay on top of your health and prevent any potential issues from becoming bigger problems. \n\n9. Practice good hygiene: Wash your hands regularly, take care of your oral health, and practice good personal hygiene to prevent the spread of germs and illness. \n\n10. Take care of your mental health: Prioritize self-care and seek help if you're struggling with a mental health issue. Remember that taking care of your mental health is just as important as taking care of your physical health."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe how a plant converts the sun's energy into food."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A plant converts the sun's energy into food through a process called photosynthesis. This is the process by which plants use light energy from the sun, carbon dioxide from the air, and water from the soil to produce glucose (sugar) which they use as food for growth and other functions.\n\nThe process of photosynthesis takes place primarily in the leaves of the plant, within specialized cells called chloroplasts that contain a pigment called chlorophyll. This pigment is responsible for the plant's green color and it plays a central role in capturing light energy.\n\nIn the first stage of photosynthesis, the plant uses the energy from the sun to convert water molecules into oxygen and hydrogen. The oxygen is then released as a byproduct while the hydrogen is used in the second stage of photosynthesis.\n\nDuring the second stage, the plant combines the hydrogen with carbon dioxide it takes in from the environment to produce glucose. This sugar is then used by the plant as a source of energy to fuel its growth, development, and other activities.\n\nIn summary, through photosynthesis, plants are able to convert the sun's energy into food by using water, carbon dioxide, and light. The result is the production of glucose, which serves as the primary source of energy for the plant, and oxygen, which is released into the environment."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a metaphor that describes loneliness"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Loneliness is a vast, empty desert, where an individual is surrounded by nothing but endless sand dunes, scorched by the sun, with no respite or companionship in sight."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List five advantages of virtual reality."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Immersive Experience: One of the most significant advantages of virtual reality is that it offers a highly immersive experience, allowing users to feel as though they are in a different world or environment. This creates endless possibilities for education, entertainment, and simulation.\n\n2. Safe Training Environment: VR enables users to practice and learn new skills in a safe and controlled environment, without the risks associated with real-world training. This is particularly useful for training in high-risk jobs, like surgery, piloting, and military operations.\n\n3. Remote Collaboration: Virtual reality allows people to collaborate and interact with each other, regardless of their physical location. This can be useful for businesses, allowing employees to interact in virtual meetings, or for friends and family to connect and spend time together even if they are far apart.\n\n4. Enhanced Learning and Education: By providing an immersive and interactive experience, virtual reality can enhance the learning process and make it more engaging for students. This technology can be used to simulate real-life scenarios, making it easier for students to understand complex concepts and retain information.\n\n5. Treatment for Mental Health: Virtual reality has been shown to be an effective tool in treating certain mental health conditions, such as anxiety, phobias, and post-traumatic stress disorder. Therapists can use VR to simulate situations that may be difficult for patients to face in real life, allowing them to gradually overcome their fears and anxieties."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct a simile for the following:\n\nPiano"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Playing the piano is like painting with your fingers, each note a brush stroke creating a beautiful masterpiece."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize the given text as expository or narrative writing.\n\nThe tennis racket is composed of various materials all working together to create power and agility for the racquet's user."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This text is an example of expository writing, as it is providing information and explaining the composition of a tennis racket."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Brainstorm five ideas for a cybersecurity project on the theme of emergency response."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Development of a Disaster Recovery Platform: A project aimed at building an all-encompassing platform to help businesses recover quickly from emergency cyber incidents. This platform would include tools for data backup, an organized recovery plan, and real-time alerts for prompt and effective action.\n\n2. Emergency Cyber Training: An initiative that focuses on providing comprehensive training to cybersecurity specialists on how to handle emergency situations. This project would include conducting drills and simulations to help professionals hone their skills in incident response and timely crisis resolution.\n\n3. Cybersecurity Emergency Response App: An idea to develop a mobile app that connects users to cybersecurity experts during emergencies. With features such as live chat, remote assistance, and the ability to report incidents immediately, this app could be a lifesaver for those facing cyber threats.\n\n4. Infrastructural Upgrade for Emergency Cybersecurity: A project aimed at making infrastructural changes to a business or organization's cybersecurity framework. This could include implementing advanced firewalls, antivirus software, and intrusion detection systems, enabling an organization to prevent and mitigate emergency cyber incidents.\n\n5. Emergency Response Cybersecurity Playbook: An endeavor to create a comprehensive manual on emergency incident response. This playbook would provide guidelines on immediate steps to take, incident reporting protocols, risk assessment procedures, and recovery processes. Such a resource could prove invaluable in helping organizations come up with an efficient and prompt response to cyber threats."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a tagline for an environmental company."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Preserving nature today, for a better tomorrow\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a web page that displays the temperature of the city of Florence."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is an example of HTML code that can be used to create a web page that displays the temperature of the city of Florence:\n\n```html\n<!DOCTYPE html>\n<html>\n<head>\n<title>Florence Weather</title>\n</head>\n<body>\n<h1>Current temperature in Florence</h1>\n<p id=\"temperature\"></p>\n\n<script>\n //Using OpenWeather API to get the weather information\n const apiKey = 'API_KEY';\n fetch(`https://api.openweathermap.org/data/2.5/weather?q=Florence&appid=${apiKey}&units=metric`)\n .then(response => response.json())\n .then(data => {\n const temp = data.main.temp;\n const temperatureElement = document.querySelector('#temperature');\n temperatureElement.innerHTML = `${temp}°C`; //displaying the temperature value in Celsius\n })\n .catch(err => console.log(err))\n</script>\n</body>\n</html>\n```\n\nIn the above code, we create a simple `html` page with a `p` element to display the temperature in Florence. We fetch the temperature data from the OpenWeather API by providing it the name of the city, the API key and setting the `units` parameter to `metric` to get the data in Celsius.\n\nMake sure to replace `API_KEY` with your own OpenWeather API Key.\nThe temperature value is fetched from the response data and displayed on the page. \n\nYou can also update the styles using CSS to make the webpage more attractive."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the term 'network latency' in one sentence."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Network latency is the time it takes for data to travel from its source to its destination across a network."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate an appropriate love poem title based on the given input.\n\na rainy day"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Love on a Rainy Day\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a letter of recommendation for John Doe based on the information provided.\n\nJohn Doe has been a valuable contributor on my team for the past 3 years. He has consistently demonstrated excellent problem-solving skills, going above and beyond when faced with challenging tasks. I believe he is a great candidate for any position."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To Whom It May Concern,\n\nI am writing this letter to wholeheartedly recommend John Doe for any position for which he is being considered. I have had the pleasure of working with John for the past 3 years and can attest to his exceptional skills and character.\n\nJohn has been a valuable contributor on my team and has consistently demonstrated excellent problem-solving skills. He possesses a quality that allows him to analyze situations in depth and come up with creative and effective solutions. His ability to successfully navigate complex problems and deliver excellent results has been exemplified on numerous occasions.\n\nJohn also goes above and beyond when faced with challenging tasks. He is not one to shy away from a challenge; instead, he tackles them with determination and tenacity. His positive attitude, combined with his impressive work ethic, make him a great teammate and an invaluable asset. He is a dedicated employee who is committed to achieving success, both individually and for the team.\n\nIn summary, John is an outstanding candidate with remarkable skills and attributes. I have no doubt that he will excel in any position and make significant contributions to the organization. I highly recommend him for any opportunity for which he is being considered.\n\nSincerely,\n\n[Your Name]"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain how Virtual Reality works."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Virtual Reality, often abbreviated as VR, is a computer-generated and immersive technology, simulating a 3D environment in the real or imagined world. It allows the user to interact and explore this simulated reality using specialized VR devices, which include headsets, gloves or hand controllers, and sensors that track movement.\n\nThe technical side of VR works through stereoscopic displays: two screens that display slightly different perspectives to each eye, creating an illusion of depth. Additionally, the VR headset has sensors such as accelerometers, gyroscopes, and magnetometers, which track the user's head movement and adjust the image accordingly, providing a more realistic experience by matching the user's movements in real-time with the simulated environment.\n\nWhile wearing the VR headset, the user perceives a 360-degree view of a virtual environment. This view adjusts as the user looks around, providing a seamless experience that engages the brain into thinking it is in a real environment.\n\nRealistic sound is another factor that enhances the VR experience, through the use of sophisticated 3D audio technology, which adds to the immersion factor.\n\nTo interact with the virtual environment, the user wears special gloves or holds hand controllers equipped with sensors that track the hand and finger movements. This allows the user to perform actions such as pointing, grabbing, or wielding virtual objects, making the experience even more realistic.\n\nIn summary, Virtual Reality technology works by engaging the user's senses, specifically sight, sound, and touch, to create a simulated reality that is interactive, immersive, and most importantly, convincing."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe your favorite character from a movie."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am an AI and do not have personal preferences or favorites, but I can provide you with a description of a popular character from a movie if you would like."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze the following Tweet and classify the sentiment of the message.\n\nThe US economy is soaring and unemployment levels have never been lower."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The sentiment of the message in the tweet is positive. It portrays a sense of progress, growth, and prosperity in the US economy."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given an article, summarize the content in one sentence.\n\nThis article talks about the causes, symptoms, and treatment of depression."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The article discusses the reasons for depression, its manifestations, and the available remedies."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize the animals in the following list: dogs, cats, pigs, fish.\n\n[dogs, cats, pigs, fish]"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One way to categorize the animals in the list is by their habitat and physical characteristics:\n\n- Domesticated land animals: dogs, cats, pigs\n- Aquatic animals: fish"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze the following family situation and provide recommendations for how each family member can deal with their issues.\n\nJohn, who is a single dad, has two teenage daughters, Kim and Amy. Kim is 16, rebellious, and struggles with her grades in school. Amy is 14, and she's very timid and gets bullied at school."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "John, as a single parent, may be overwhelmed with the challenges his daughters are facing. It is recommended that he seeks support from friends, family, or a support group for single parents. It could also be helpful for him to seek advice from a counselor or therapist who specializes in parenting children in their teenage years.\n\nFor Kim, it would be helpful if John can establish clear expectations, boundaries, and consequences for her behavior while still providing her with love and support. It may be necessary for John to communicate with her teachers or guidance counselor to discuss her grades and how to improve them. Kim may also benefit from tutoring or one-on-one teaching sessions. It is also essential that John listens to her thoughts and feelings without judgment and encourages her to express herself in healthy ways.\n\nFor Amy, it is crucial for John to address the issue of bullying and seek support from the school administration. He should also encourage her to speak out about her experiences, engage in activities that build self-confidence, and develop healthy coping mechanisms. John can also explore the possibility of counseling or therapy for Amy, which could provide her with a safe space to express her feelings and learn coping strategies.\n\nOverall, communication, support, and seeking help from professionals and resources are key components in helping John and his daughters overcome their challenges."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a function in Python that evaluates a given expression."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a simple function that can evaluate a given mathematical expression as a string:\n\n```python\ndef evaluate_expression(expression):\n # Evaluating the expression using the eval() function\n result = eval(expression)\n\n # Returning the result\n return result\n```\nThis function takes a String as an argument that contains the mathematical expression that needs to be evaluated. It uses the eval() function of Python, which evaluates the expression and returns the result. Note that the `eval()` method can pose security risks if used improperly, so it is important to properly sanitize any inputs used with this function.\n\nExample usage of this function:\n\n```python\nresult = evaluate_expression(\"2 + 3 * 4\")\nprint(result)\n#output: 14\n```\n\nHere, the function takes an expression \"2 + 3 * 4\" as an argument, evaluates it and returns the result, which is 14."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide antonyms for the word 'evil'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Some antonyms for the word \"evil\" include: good, virtuous, moral, righteous, benevolent, kind, and ethical."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the tone of this essay.\n\nThe world of technology is constantly shifting and evolving before our eyes, and it is essential to continuously evolve and take advantage of the possibilities that come with it."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The tone of this essay is optimistic and proactive."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a paragraph summarizing the main points from the following article:\n\nThe concept of \"flow\" is based on the idea that individuals experience moments of optimal performance and enjoyment when they are fully engaged in an activity. During such experiences, a person's attention is focused on the task at hand and they are not easily distracted. Flow has been linked to a number of positive psychological outcomes, including increased sense of accomplishment and wellbeing. Flow has also been explored in relation to its impact on creativity and decision-making."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The article discusses the concept of \"flow\" and how it leads to optimal performance and enjoyment when a person is fully engaged in an activity. During flow, a person's attention is focused and they are not easily distracted, leading to positive psychological outcomes such as increased sense of accomplishment and wellbeing. The article also explores the impact of flow on creativity and decision-making."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the hyperbola that passes through the given points.\n\nPoints: (3,4), (2,-6)"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A hyperbola's standard form is given by `(x-h)^2/a^2 - (y-k)^2/b^2=1` for horizontal hyperbolas or `(y-k)^2/a^2 - (x-h)^2/b^2=1` for vertical hyperbolas, where `h` and `k` are the coordinates of the center, while `a` and `b` are constant values associated with the lengths of the axes of the hyperbola. \n\nSo to identify the hyperbola that passes through both `(3,4)` and `(2,-6)`, we will plug the points into the general equation of a hyperbola then solve for its variables. \n\nAssuming a hyperbola in the form `(x-h)^2/a^2 - (y-k)^2/b^2=1` is what we are looking for, we plug `(3,4)` and `(2,-6)` into the equation to get the following system of equations:\n```\n(3-h)^2/a^2 - (4-k)^2/b^2=1\n(2-h)^2/a^2 - (-6-k)^2/b^2=1\n```\nSetting the two left-hand sides equal and simplifying (assuming `a = b` so the denominators cancel) we get:\n```\n(3-h)^2/a^2 - (4-k)^2/b^2=(2-h)^2/a^2 - (-6-k)^2/b^2\n(3-h)^2 - (4-k)^2 =(2-h)^2 - (-6-k)^2\n9 - 6h + h^2 - 16 + 8k - k^2 = 4 - 4h + h^2 - 36 - 12k - k^2\nh^2 - 6h - k^2 + 8k - 7 = h^2 - 4h - k^2 - 12k - 32\n-2h = -20k - 25\nh = 10k + 25/2\n```\nThis seems to suggest that we don't have enough information to find a unique solution. Since we only have two points from a non-degenerate conic, there are infinitely many hyperbolas that pass through both points."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast democratic and authoritarian regimes in 4 sentences."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Democratic and authoritarian regimes differ in many fundamental ways. In a democratic regime, power is vested in the people through free and fair elections; citizens have the right to vote and participate in the political process. By contrast, authoritarian regimes concentrate power in the hands of a small group of elites or a single leader, with little or no input from the citizens. Additionally, democratic regimes are characterized by the protection of civil liberties and human rights, while authoritarian regimes often lack freedoms and suppress opposition."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate one creative way to use honey in a recipe."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One creative way to use honey in a recipe is to make \"Honey Glazed Sriracha Chicken Skewers\". You can marinate chicken pieces in a mixture of honey, Sriracha sauce, soy sauce, and minced garlic for a couple of hours; then thread the chicken onto skewers and grill until fully cooked. In a saucepan, combine honey, Sriracha sauce, and soy sauce, and cook until thickened. Brush the sauce over the chicken skewers and serve them hot. The honey brings sweetness and balance to the spicy taste of Sriracha sauce, creating a perfect combination of flavors."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze an example of a conflict in the given text.\n\nIn the movie City Lights, a wealthy woman helps a poor man who is in love with her."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The conflict in the text could possibly arise from the difference in social and economic class between the wealthy woman and poor man. The man's love for the woman puts him in a difficult position, as societal norms and expectations may not allow for them to have a relationship. Additionally, the woman's decision to help the poor man could also cause external conflicts with her peers, who may not approve of her association with someone of a lower class. This can create tension and challenges for both characters as they navigate the class divide and their feelings for each other."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a list of three hobbies someone can take up during quarantine."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Painting or drawing: This hobby is an excellent way to express creativity and imagination, and it is also a relaxing and rewarding pastime.\n\n2. Learning a new instrument: Music is an excellent source of entertainment and relaxation, and learning a new instrument can be a fun and challenging way to spend time during quarantine.\n\n3. Baking and cooking: Many people enjoy the process of preparing and creating delicious meals and treats. This hobby can be a lot of fun and can also provide a sense of accomplishment when trying new recipes or perfecting old ones."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarise the tips for successful online dating."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Create an honest and appealing profile: Select up-to-date images that represent you accurately, provide a clear description of who you are and what you're looking for.\n\n2. Be proactive: Don't wait for someone to contact you; initiate communication with those who interest you.\n\n3. Prioritize safety: Take steps to protect your information and trust your gut when it comes to meeting in person.\n\n4. Keep an open mind: Be open to meeting different types of people and don't judge too quickly.\n\n5. Be patient: Finding the right match can take time, so be prepared to invest some effort and don't get discouraged.\n\n6. Communicate effectively: Be open and honest, reply to messages in a timely manner and keep the conversation flowing."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast the individualistic and collectivist cultures."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Individualistic and collectivist cultures are two distinct cultural dimensions that describe how individuals perceive their relationship with their group or society. In individualistic cultures, people tend to value independence, autonomy, and personal achievement. They see themselves as separate entities, and personal goals often take priority over group goals. In contrast, collectivist cultures emphasize the importance of interdependence, cooperation, and group harmony. People in such cultures tend to see themselves as part of a larger collective, and tend to prioritize group goals over personal ones.\n\nOne key difference between these two types of cultures is the way they approach decision-making. In individualistic cultures, individuals are encouraged to make decisions independently, and personal choices are respected. On the other hand, in collectivist cultures, group consensus is often sought when making decisions, and individuals are expected to align their choices with those of the group. \n\nAnother major difference between individualistic and collectivist cultures is the way they handle success and failure. In individualistic cultures, personal achievements are celebrated, and individuals are often rewarded based on their own merits. In contrast, in collectivist cultures, success and failure are often viewed as collective accomplishments or shortcomings, and people tend to share credit or blame.\n\nThe communication styles of these two types of cultures also differ: while people from individualistic cultures are usually direct in expressing their thoughts and opinions, people from collectivist cultures often use indirect communication, taking care not to disrupt group harmony. \n\nIn conclusion, individualistic and collectivist cultures have different values and ways of functioning. While individualistic cultures highlight the importance of personal fulfillment, collectivist cultures prioritize group harmony and social cohesion. Each approach has its strengths, and no one culture is inherently better than the other."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast Data Mining and Machine Learning."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Data mining and machine learning are two areas of computer science that are intertwined with each other. While they share some similarities, they also have some differences.\n\nData mining is the process of extracting useful information from large datasets. It involves the use of various techniques such as clustering, classification, and association rule mining to uncover hidden patterns and relationships in data. The goal of data mining is to make sense of and transform raw data into actionable knowledge.\n\nOn the other hand, Machine learning is a subset of artificial intelligence that involves the development of algorithms that can learn from data and make predictions or decisions based on that data. The goal of machine learning is to create models that can generalize and make accurate predictions or decisions on new, unseen data.\n\nOne key difference between the two is that data mining is the process of finding correlations or patterns among numerous fields in large relational databases while machine learning uses these patterns and correlations to learn from data and make accurate predictions.\n\nIn essence, data mining is more exploratory in nature, focused on finding previously unknown patterns or relationships in data. On the other hand, machine learning is more predictive, focused on using known patterns to make predictions or decisions. Additionally, machine learning models can improve over time, as they are exposed to new data, while data mining techniques do not change.\n\nOverall, data mining and machine learning are two important tools for extracting knowledge from data. While they share some similarities, they also have some distinct differences, and both have their unique strengths and use cases."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List some of the challenges climate change brings on the environment."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Climate change poses a number of challenges to the environment, including:\n\n1. Rising temperatures: Global temperatures are increasing due to the release of greenhouse gases into the atmosphere, leading to more heatwaves, droughts, and other weather extremes.\n\n2. Melting of polar ice caps and glaciers: As temperatures rise, polar ice caps and glaciers are melting at an alarming rate, causing sea levels to rise and threatening the habitat of many animals.\n\n3. Ocean acidification: The increased levels of carbon dioxide in the atmosphere are also absorbed by the ocean, causing it to become more acidic, which can harm marine life such as coral reefs and shellfish.\n\n4. Changes in precipitation patterns: Climate change can cause changes in precipitation patterns, leading to more frequent and intense storms or extended periods of drought, which can cause soil erosion and degradation of farmland.\n\n5. Loss of biodiversity: Many species are struggling to adapt to the changing climate, causing the extinction of some species and the disruption of ecosystems.\n\n6. Wildfires: The increased frequency of droughts, heatwaves and extreme weather can cause wildfires to become more common, causing widespread destruction of natural habitats and displacement of wildlife.\n\n7. Increased frequency of natural disasters: Climate change is expected to increase the frequency and intensity of natural disasters such as hurricanes, floods, and droughts, causing damage to the environment and threatening the lives and livelihoods of people around the world."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a simple scenario, you need to identify the different characters in the situation.\n\nA family of four was sitting together at the dinner table."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The different characters in this scenario are the four members of the family seated at the dinner table."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast holographic imaging and x-ray imaging."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Holographic imaging and x-ray imaging are both techniques used to create visual images for various applications, including medical diagnosis and scientific research. However, there are several differences between the two methods.\n\nHolographic imaging uses laser light and the principles of holography to create three-dimensional images. These images are formed by recording the interference pattern between two laser beams, one of which is reflected off the object being imaged. The resulting image appears to have depth and can be viewed from various angles, providing a detailed and realistic representation of the object.\n\nOn the other hand, x-ray imaging uses high-energy x-ray beams to penetrate and pass through objects, creating a two-dimensional image on a detector. The amount of x-rays absorbed by the object's tissues or materials depends on their density and atomic number, which allows for the creation of contrast in the resulting image. X-ray images are used in medical diagnosis to image bones, organs, and other structures within the body.\n\nOne of the main differences between the two techniques is the type of radiation used. While holographic imaging uses non-ionizing laser light, x-ray imaging uses ionizing radiation, which can be harmful in high doses. This means that x-ray imaging must be used with caution and under controlled conditions, while holographic imaging is generally considered safe.\n\nAdditionally, holographic imaging provides a three-dimensional view, while x-ray imaging produces a two-dimensional image. This can make holographic imaging more useful for applications that require a detailed understanding of an object's structure and depth, such as engineering or scientific research.\n\nIn summary, both holographic and x-ray imaging are powerful tools for creating visual representations of objects. However, they differ in their underlying principles, the type of radiation used, and the resulting image."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a time-sensitive to-do list that involves driving to four different locations.You are expected to complete the list in one day."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Time-Sensitive To-Do List for One Day:\n\n- **7:00am**: Wake up, have breakfast, and get ready to leave the house.\n- **8:00am**: Drive to the post office to send out an urgent package.\n- **9:00am**: Drive to the dry cleaners to pick up the clothes for an important meeting the next day.\n- **10:30am**: Drive to the grocery store to buy ingredients for tonight's dinner.\n- **12:00pm**: Make a stop at the gas station to fill up the car with gas.\n- **1:00pm**: Arrive back home, have lunch and rest for 30 minutes.\n- **2:00pm**: Drive to the pharmacy to pick up a prescription.\n- **3:00pm**: Return home and start preparing dinner.\n- **6:00pm**: Have dinner with family.\n- **7:00pm**: Relax and unwind for the rest of the evening."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Merge and sort two given arrays.Array 1: [89, 34, 55] \nArray 2: [67, 1, 84]"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Merged and sorted array: [1, 34, 55, 67, 84, 89]"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Delete the last two paragraphs from the following text.In some cases, the negative effects of tech may not be immediately visible, but tech has been linked to a variety of adverse health outcomes over time. Research has demonstrated that people who are over-absorbed in tech usage are more prone to developing mental health conditions, such as depression and anxiety. People who are highly engaged in social media, for example, are more likely to feel loneliness and satisfaction in life.\n\nAt present, it is unclear how much of an impact tech has on our mental health, and the literature is inconsistent. Some research suggests that tech can be helpful and beneficial when used in moderation, while other studies point to the potential psychological and neurological consequences of excessive tech use."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In some cases, the negative effects of tech may not be immediately visible, but tech has been linked to a variety of adverse health outcomes over time. Research has demonstrated that people who are over-absorbed in tech usage are more prone to developing mental health conditions, such as depression and anxiety. People who are highly engaged in social media, for example, are more likely to feel loneliness and satisfaction in life."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Invent a new word and explain its meaning."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The new word is \"Reloyalty.\" Reloyalty is the combination of the words \"Reliability\" and \"Loyalty.\" It refers to the quality of being consistently dependable, trustworthy, and devoted in one's personal and professional relationships. It represents an individual's unwavering strength of character, and their commitment towards the people that rely on them, through trying times and challenges."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compose a short story about two friends from different walks of life."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There once were two friends named Sarah and Lily. Sarah came from a wealthy family and lived in a beautiful mansion with her parents. Lily, on the other hand, came from a humble background and lived in a small apartment with her single mother.\n\nDespite their different walks of life, Sarah and Lily bonded over their love for painting and became the best of friends. They would spend hours together in the park, painting the beautiful scenery around them and talking about their dreams.\n\nOne day, Sarah's parents found out about her friendship with Lily and disapproved of it. They believed that Sarah should only be friends with people from her own social class. Sarah was heartbroken but refused to end her friendship with Lily.\n\nLily, too, faced challenges. Her mother struggled to make ends meet and Lily often had to miss out on school trips and other fun activities because they couldn't afford it. Sarah would always offer to help, but Lily was too proud to accept.\n\nDespite the obstacles they faced, Sarah and Lily's friendship only grew stronger. They stood by each other through thick and thin and proved that true friendship knows no boundaries.\n\nYears later, Sarah and Lily both became successful artists and opened a joint art gallery. Their paintings were displayed side by side, showcasing their beautiful friendship for the world to see.\n\nIn the end, Sarah and Lily's friendship was a testament to the fact that it doesn't matter where you come from or how much money you have. What matters is the love and support you show to those close to you."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Tell me what you would do if you won a million dollars"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI, I don't have the ability to desire or use the money for my own purposes. However, if you won a million dollars, I could assist you in making wise decisions on how to use it by providing advice, doing research on investment options, helping you create a budget plan, and connecting you with financial advisors. It's always a good idea to use a windfall such as this to pay off any debts, establish an emergency fund, invest for the future, and give to charitable causes. Ultimately, the decision on how to use the money would be up to you, and I would be here to support and guide you."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List three characteristics of a given object.Tiger"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Large feline animal - Tigers are a species of large, predatory felines that are recognized for their strength and power.\n\n2. Distinctive orange coat with black stripes - Tigers are known for their distinctive orange coat with vertical black stripes that serves as a great camouflage in their natural habitat.\n\n3. Carnivorous diet - Tigers are carnivorous animals that primarily hunt deer, wild pigs, and other large mammals for food."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Develop a customer service policy for customer complaints."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Our company values customer satisfaction as one of our top priorities. In case of any dissatisfaction or complaints, we have established the following policy for addressing and resolving the issue promptly.\n\n1. Accessible Customer Service\nWe provide multiple channels to accept customer complaints, including email, phone, chat lines, and social media. Our customer service team is available 24/7 to respond to complaints and provide needed support.\n\n2. Timely Response\nWe aim to respond to all complaints within 24 hours of receipt. Customers will be notified of the estimated resolution timeframe if more time is required.\n\n3. Prioritized Complaint Handling\nWe prioritize complaints based on the severity and impact of the issues raised. We understand that not all complaints are equal, so we assign a team to manage critical issues immediately.\n\n4. Thorough Investigation\nAll complaints will be thoroughly investigated to identify and address the root cause of the problem. Our team will keep customers updated on the resolution progress until a satisfactory solution is reached.\n\n5. Fairness and Compensation\nWe consider each complaint on its merit and offer fair resolutions and compensation in line with our service level agreement. Our overall aim is to ensure that all complaints are resolved satisfactorily, and the customer is ultimately satisfied.\n\n6. Continual Improvement\nWe continuously improve our customer service policy to avoid a recurrence of similar complaints. All feedback received is taken into account when updating our policy, business practices and procedures.\n\nAt our company, we strive to provide top-notch service to all our customers. In case of any complaints, we kindly request that customers bring it to our attention, following the policy above. We assure you that we will do our utmost to resolve the issue promptly and satisfactorily."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a cloud-computing system."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A cloud computing system is a type of computing infrastructure that delivers on-demand computing resources, software, applications and data storage over the internet. These resources are provided through a network of remote servers that are hosted by a third-party provider, rather than on a user's personal computer or local server. Cloud computing is based on the concept of shared resources, and can offer several benefits such as increased flexibility, scalability, reliability, and cost savings. The services can be accessed from anywhere in the world as long as there is an internet connection, and users can avoid the costs associated with owning, maintaining and upgrading their own hardware and software. Examples of cloud computing services include online data storage and backup, web-based email, and online applications for customer relationship management, project management, and more."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Translate the following phrase from English to Spanish\n\n\"There is no place like home\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"No hay lugar como el hogar\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the process for reversing a linked-list in Java."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A linked list is a linear collection of data elements or nodes, where each node points to the next node in the list. Reversing a linked list means reversing the order of the nodes, where the first node becomes the last and the last node becomes the first. There are several ways to reverse a linked list in Java, including using an iterative approach or a recursive approach.\n\nHere is an example of how to reverse a singly linked list using an iterative approach:\n\n```java\npublic Node reverseList(Node head) {\n Node prev = null; // Initialize a previous node pointer to null\n Node curr = head; // Start from the head of the list\n\n while (curr != null) {\n Node next = curr.next; // Save the next node in a temporary variable\n curr.next = prev; // Reverse the link, set the current node's next pointer to the previous node\n prev = curr; // Move the previous node pointer to the current node\n curr = next; // Move to the next node in the list\n }\n\n // Finally, return the new head of the reversed list\n return prev;\n}\n```\n\nThis approach uses an iterative while loop to traverse the list and reverse the links between the nodes. We start from the head of the list and use a temporary variable to save the next node in the list. Then we set the current node's next pointer to point to the previous node instead of the next node. We do this by updating the previous node pointer to the current node and moving to the next node in the list until the entire list is reversed.\n\nAn alternate approach is to do this recursively:\n\n```java\npublic Node reverseList(Node head) {\n // Base case: If the head is null or there is only one node in the list, return head\n if (head == null || head.next == null) {\n return head;\n }\n\n // Recursively reverse the rest of the list\n Node newHead = reverseList(head.next);\n\n // Reverse the current node\n head.next.next = head;\n head.next = null;\n\n // Return the new head of the reversed list\n return newHead;\n}\n```\n\nIn the recursive approach, we use the same logic to reverse the links between the nodes, but we do it recursively by calling the `reverseList` function on the rest of the list until we reach the end of the list. Then we reverse the link between each node and its successor as the recursion unwinds, and return the new head of the reversed list."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the concept of sentiment analysis."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Sentiment analysis, also known as opinion mining, is the process of automatically identifying and extracting subjective information from textual data, such as opinions and emotions. This process involves the use of natural language processing (NLP), machine learning, and other computational techniques to analyze, classify, and interpret the sentiment of a piece of text.\n\nSentiment analysis has a wide range of applications, including brand monitoring, product analysis, customer service, and market research. Businesses can use sentiment analysis to understand how their customers feel about their products, services or brand by analyzing online forums, social media posts, and other customer feedback. This information can then be used to inform marketing and product development decisions, track customer satisfaction, and improve customer engagement.\n\nThere are various approaches to performing sentiment analysis, including rule-based, lexicon-based or machine learning-based methods. Typically, sentiment analysis involves first preprocessing the text to remove stop words, normalize the text, and tokenize it. Then, the sentiment of the text is determined by checking for the presence of specific words or phrases that are indicative of a particular sentiment, or by using machine learning algorithms to classify the text based on its sentiment. The output of sentiment analysis can be a binary classification (e.g. positive or negative), or a more fine-grained classification (e.g. very positive, positive, neutral, negative, very negative)."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify the following statement as fake or real, based on the given contextThe world's population is expected to rise by 8 billion by 2050.\nContext: Population changes"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The statement is fake. According to the United Nations, the world's population is expected to reach approximately 9.7 billion by 2050, not increase by 8 billion from its current estimate of 7.9 billion (as of 2021)."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write the lyrics of a short song about the endangered species."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "(Verse 1)\nIn the forest and the sea\nThere are creatures great and small\nBut now they're in jeopardy\nWe need to help them all\n\n(Chorus)\nEndangered species, hear their cries\nTheir homes are disappearing before our eyes\nWe need to act now, before it's too late\nAnd save these creatures from a terrible fate\n\n(Verse 2)\nFrom the mighty elephant\nTo the tiny bee\nThey all have a role to play\nIn our ecology\n\n(Chorus)\nEndangered species, hear their cries\nTheir homes are disappearing before our eyes\nWe need to act now, before it's too late\nAnd save these creatures from a terrible fate\n\n(Bridge)\nWe can make a change\nWith every little action\nProtect their habitats \nAnd stop the harmful extraction\n\n(Outro)\nFor the future of our planet\nFor the children yet to be\nLet's protect endangered species\nAnd live in harmony."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Using the input below, rewrite the story as a first person narrative.The tall vase on the table wobbled as the train rocked back and forth, coming to another stop. Ann walked over to the kitchenette and realized she had left her passport somewhere in the cabin."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I felt the train rock back and forth as it came to a stop. As I looked up, I saw the tall vase on the table wobbling. Quickly, I walked over to the kitchenette to steady it. That's when it hit me: I had left my passport somewhere in the cabin."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What would be an example of a long-term strategy for developing a career?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An example of a long-term strategy for developing a career could involve several steps that focus on personal and professional growth. Some of these steps may include:\n\n1. Setting clear career goals and objectives: Start by identifying your long-term career aspirations and defining specific, measurable, attainable, relevant and time-bound (SMART) goals to help you achieve them.\n\n2. Pursuing relevant education and training: Invest time and resources in acquiring the necessary skills, knowledge and qualifications in your chosen field through formal education, workshops, training, and online courses.\n\n3. Building a professional network: Connect with industry leaders, colleagues, and mentors, attend conferences and meetings, join relevant professional associations, and use networking sites like LinkedIn to build and maintain professional contacts.\n\n4. Gaining practical experience: Take on internships, freelance projects or volunteer work, which provide hands-on experience, help you build your portfolio, and demonstrate your commitment and passion to potential employers.\n\n5. Keeping up-to-date with industry trends: Stay informed about the latest advancements, news and trends in your field by reading industry publications, attending trade shows, and staying active in professional associations.\n\n6. Continuously reviewing and revising your strategy: Regularly reflect on your progress, seek feedback and support when needed, and adjust your strategy as needed to ensure you stay on track to achieve your long-term career goals."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a reinterpretation of the following painting.Painting: The Mona Lisa, by Leonardo Da Vinci"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI, I am not capable of creating a painting from scratch, however, here is a written reinterpretation of \"The Mona Lisa\" by Leonardo da Vinci: \n\nIn this reimagined version, the enigmatic woman sits before an abstract background of bold colors and swirling patterns. Her hands are clasped together in a peaceful gesture as she gazes out towards the viewer with a serene expression. Her raven hair is adorned with vibrant flowers and her lips curve into a subtle smile, hinting at a secret known only to her. The garments she wears are in vivid hues, draped elegantly over her form, adding to the ethereal atmosphere of the piece. This reinterpretation of the classic painting brings a modern, lively twist, while still retaining the essence of the original."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a data set of 5-6 cities with their state and county information"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is the dataset containing the information of 5 cities with their state and county information:\n\n| City | State | County |\n|------|-------|--------|\n| New York | New York | New York |\n| Los Angeles | California | Los Angeles | \n| Chicago | Illinois | Cook | \n| Houston | Texas | Harris | \n| Phoenix | Arizona | Maricopa | \n\nNote that the county refers to the primary county that the city is located in. Some cities may have portions in other counties as well."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a question to assess someone's personality."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "On a scale from 1-10, how much do you enjoy spending time with other people, and in what types of social interactions do you thrive?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the authors of this quote.\"A man who stands for nothing will fall for anything.\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There is some dispute about the source of this quote. It has often been attributed to Malcolm X, an American Muslim minister and human rights activist. However, some sources claim that it was originally said by Alexander Hamilton, one of the Founding Fathers of the United States, or by Gordon A. Eadie, an advertising executive."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reword this statement to reflect a positive sentimentI sincerely believe that this project cannot be completed in time"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I sincerely believe that with additional effort and resources, we can successfully complete this project within the given time frame."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a recipe for a vegetable stir-fry."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Vegetable stir-fry is a quick and easy dish that is full of nutritious vegetables and delicious flavors. Here is a simple recipe to make this healthy and tasty meal.\n\n#### Ingredients:\n\n- 1 tablespoon vegetable oil\n- 1 tablespoon minced garlic\n- 1 tablespoon grated ginger\n- 1 sliced onion\n- 1 chopped broccoli\n- 1 sliced carrot\n- 1 sliced red bell pepper\n- 1 cup sliced mushrooms\n- 1 cup sliced zucchini\n- 2 tablespoons soy sauce\n- 1 tablespoon sesame oil\n- 1 tablespoon cornstarch\n- 1/2 cup water\n- Salt and pepper to taste\n- Cooked rice for serving\n\n#### Directions:\n\n1. Heat the vegetable oil in a large wok or skillet over medium-high heat.\n2. Add the garlic and ginger, and cook for about a minute, until fragrant.\n3. Add the onions and cook for about two minutes until they start to soften.\n4. Add the broccoli, carrots, red bell pepper, mushrooms and zucchini to the wok, and continue to cook, stirring constantly, for 5-6 minutes or until the vegetables are tender.\n5. In a small bowl, whisk together the soy sauce, sesame oil, cornstarch, and water, and pour it over the vegetables.\n6. Stir the mixture until the sauce thickens, about 2 minutes.\n7. Season with salt and pepper, to taste.\n8. Serve immediately, over cooked rice. Enjoy your delicious and healthy vegetable stir-fry!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Edit each of the sentences below to create clearer and more concise versions.Many of the streams that pass through woodlands contain abundant vegetation, because of which they provide a suitable habitat for fish."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Woodland streams often have ample vegetation, providing ideal fish habitats."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Choose the alternative which is closest in meaning to the given phrase.I'm entitled to my opinion"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I have a right to my own viewpoint."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What are the implications of this development?Scientists have recently diagnosed a new virus, Coronavirus, which is highly contagious and can quickly spread."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The development of a new virus, such as the Coronavirus, could have several implications for society. One of the most immediate impacts is the potential for a widespread outbreak, which could result in a significant burden on public health systems, and in some cases, loss of life. The fact that the virus is highly contagious means that it can easily spread from person to person, making containment and control more difficult. This could result in travel restrictions, quarantines, and changes in daily routines, such as school and work closures. The economic impact of such measures could be significant, disrupting global trade and supply chains. Additionally, there could be social implications, including increased fear and anxiety within communities, and potential stigmatization of those infected or perceived to be at risk. Scientific research and collaboration would need to be quickly mobilized to understand the virus, develop vaccines or treatments and implement effective measures to prevent its spread."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize a news article about the new tax laws in New York City.https://www.nytimes.com/2020/09/21/nyregion/nyc-taxes-eviction-covid-19.html"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The ongoing COVID-19 pandemic has caused a significant economic and financial crisis in New York City, forcing state legislators to consider several changes to the city's tax laws. One proposed change is to add a tax on second homes in the city worth more than $5 million, known as a “pied-à-terre tax.” Another possible change is to apply a tax surcharge to capital gains from sales of high-end properties, while another idea is to increase income taxes on the wealthy. These measures could all help to defray the costs of the pandemic and provide relief to renters at risk of eviction. Officials argue that, since the city's wealthiest residents have benefited enormously from the city's economic success, and since they have the resources to shoulder a greater share of the tax burden, it makes sense to add these taxes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a story that starts with the line 'The sun slowly crept up in the sky'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The sun slowly crept up in the sky, signaling the start of a new day. Below, in the small village nestled between two hills, people began to stir. Doors creaked open as people emerged from their homes, ready to tackle the day ahead.\n\nThere was one person who was up way before the sun. Anna, a young woman who lived alone in a small cabin at the edge of the village, was already out in her fields, tending to her crops. Farming was her passion, and she took pride in her work. It wasn't easy, but the feeling of fulfillment and joy she got from harvesting her produce made it all worth it.\n\nAs the day progressed, the village came alive. People were out in the streets, chatting with their neighbors, exchanging goods and stories. Children ran in the streets, playing and laughing. It was another beautiful day in the village.\n\nAnna was busy, working hard in her fields. The sun was high in the sky when she finally took a break to have some lunch. She sat under the shade of a tree and watched as the village bustled with activity. It was a peaceful scene, and she felt grateful to be a part of it.\n\nAs the day went on, the sun began to set, casting a warm glow over the village. The streets slowly emptied as people retired to their homes, sated from a day of work and play. Anna sat on her porch, taking in the sunset, content with another successful day on her farm.\n\nThe village was quiet now, the only sounds being the soft chirping of crickets and the occasional hoot of an owl. The sun had slipped beneath the horizon, and the night had settled in. Another day had ended, and another would begin soon. Life in the village went on, with its simple pleasures and joys, and Anna was happy to be a part of it all."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "In what ways can technology help children with autism?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Technology can provide a wide range of benefits for children with autism, some of which include:\n\n1. Communication Support: Many children with autism struggle with verbal communication, and technology can provide alternative means of expression through the use of speech-generating devices, picture exchange communication systems, or other communication apps.\n\n2. Interactive Learning: Educational technology can be used to create interactive and engaging learning experiences that are tailored to the individual needs and strengths of children with autism. For example, interactive games and software programs can help children develop their language, social, and cognitive skills.\n\n3. Visual Support: Many children with autism are highly visual learners, and technology can provide visual support for learning and communication. For example, visual schedules or social stories can be created using technology to help children understand and navigate daily routines and social situations.\n\n4. Behavioral Support: Technology can be used to implement and track behavioral interventions for children with autism. For example, wearable devices can monitor physiological indicators of stress or anxiety, and provide alerts or prompts to help children use coping strategies.\n\n5. Social Interaction: Technology can facilitate social interaction for children with autism by providing a means of connecting with others through online communities and virtual environments. This can help children develop social skills and form connections with others who share their interests and experiences.\n\nOverall, technology can be a powerful tool for supporting the development and well-being of children with autism, providing new opportunities for communication, learning, and engagement."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe a character from a book"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One character from a book that I find interesting is Elizabeth Bennet from Jane Austen's \"Pride and Prejudice.\" Elizabeth, often referred to as Lizzy, is the second eldest daughter of the five Bennet sisters. She is known for her sharp wit, lively personality, and strong opinions. She is described as being attractive, with dark eyes and a light figure, and is considered the most sensible and intelligent of her sisters. Elizabeth is fiercely independent and often goes against the norms of society. She values honesty and integrity, and is not afraid to speak her mind. Despite her strong will and confidence, she is also prone to making quick judgments and assumptions, which often leads her to misunderstand situations and people. Throughout the book, Elizabeth learns to overcome her pride and prejudice, and discovers the true meaning of love and acceptance."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct an SQL query that returns the names, places and dates for employee meetings.SELECT"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is an example SQL query that would return the names, places, and dates for employee meetings from a hypothetical database table named `meetings`:\n\n```SQL\nSELECT name, place, date\nFROM meetings\nORDER BY date ASC;\n```"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Reply to the following comment.I love the new product!"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm glad to hear that you love the new product! Is there anything specific that stands out to you about it?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List the benefits of quantum computing"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Exponential Speedup: Quantum computers can perform certain calculations at a much faster rate using quantum algorithms, often speeding up exponential or factorial time complexity problems to polynomial time.\n\n2. Superposition: Quantum computers can perform many calculations at the same time using quantum superposition. This allows them to solve complex problems that are impractical or impossible to solve with classical computers.\n\n3. Parallelism: Quantum computers can perform multiple operations simultaneously, thanks to the parallelism enabled by quantum entanglement. This allows them to handle large sets of data with high efficiency.\n\n4. Simulations: Quantum computers can simulate quantum systems and their interactions with high accuracy, providing a powerful tool for research in fields such as material science, chemistry, and physics.\n\n5. Optimization: Quantum computers can solve optimization problems and find the global minimum/maximum of a function with high efficiency, providing solutions to problems in areas such as logistics, finance, and artificial intelligence.\n\n6. Cryptography: Quantum computers can break many current encryption schemes with relative ease, leading to the development of new and more secure quantum-based encryption methods.\n\n7. Machine Learning: Quantum computers can perform more complex and efficient operations in machine learning, allowing for more powerful artificial intelligence and data analysis."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Come up with an original poem about the beauty of nature."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Nature's Beauty\n\n\nAmid the bright morning mist\nFluttering leaves and flowers kissed\nBy the gentle breeze, nature blissed\nIn its beauty, the world persists.\n\nGreen trees sway in the breeze\nTheir canopies rustle with ease\nSquirrels and birds occupy with tease\nThe beauty of nature is an endless peace.\n\nLush fields of grass so green\nDancing wildflowers, a serene scene\nBeneath the sun's golden sheen\nNature's beauty is pure and clean.\n\nThe sky above, a canvas grand\nA masterpiece painted by the divine hand\nMountains, rivers, and beaches of sand\nAll in nature's beauty, so great and grand.\n\nNature's radiance shines so bright\nIn the glow of the moon at night\nThe stars twinkle, a wondrous sight\nNature's peace and beauty, a calming light.\n\nSo let us cherish this gift so true\nLet us protect and preserve its hue\nThe beauty of nature is for me and you\nA treasure for all, to forever pursue."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a conclusion paragraph for a persuasive essay on the importance of voting."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In conclusion, the act of voting is a fundamental right and responsibility that should not be taken for granted. It is the very foundation of our democracy and it is incumbent upon us to exercise our voice and cast our vote. It is how we as citizens can ensure that our government represents our values, beliefs, and desires. The future is shaped by the choices we make, and by voting, we can positively impact our country and our community. Every vote counts, and it is essential that we all participate in the electoral process for the betterment of our society."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Calculate the total number of pages in the book.There are 350 pages in the book."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The total number of pages in the book is 350."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given the following words, create a descriptive sentence.Rose, sweet, fragrance."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The sweet fragrance of the rose filled the air, enchanting all who passed by."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a set of instructions, explain how to use a systemHow to create an account on Twitter"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To create an account on Twitter, follow the steps below:\n\n1. Open the Twitter website on your web browser (https://twitter.com/) or download the Twitter application on your smartphone and open it.\n\n2. Click on the \"Sign up\" button located on the main page.\n\n3. Enter your name and phone number or email address. And then click on \"Next\" \n\n4. Choose if you want to personalize your experience and if you want to sync your address book by clicking the \"Yes\" or \"No\" button. \n\n5. Click on \"sign up\" and then choose your country and then enter the confirmation code that you will receive via text message or email.\n\n6. Choose your preferred password and click on \"Next\" \n\n7. Select your areas of interest to customize your Timeline content and click on \"Next\" \n\n8. Based on your selected areas of interest, Twitter will suggest you some accounts to follow, you can either follow them or simply skip this step.\n\n9. Upload a profile picture, add description, and select your preferences to complete the last step of the signup process.\n\nCongratulations! You have successfully created an account on Twitter. Now you can start tweeting and connecting with people of your interest."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name 3 programming languages that are used to develop applications"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Java\n2. Python\n3. C++"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize the content in the document in specific terms.Document: Soccer, also known as football outside of the United States, is one of the most popular sports in the world. It is estimated that over 250 million people in more than 200 countries and territories play the game. The game is commonly referred to as “The Beautiful Game” because it brings people together and creates a sense of community."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The document discusses the popularity of soccer, also called football, which is played by an estimated 250 million people in more than 200 countries. It highlights how the sport brings people together and fosters a sense of community, earning it the nickname \"The Beautiful Game.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name three decomposers in a forest ecosystem"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Fungi\n2. Bacteria\n3. Earthworms\n\n\nThese organisms play an essential role in breaking down dead plant and animal matter, releasing nutrients back into the soil for other organisms to use."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "You are given two sentences, you need to find the similar words between themSentence 1: The rainfall in California has been unusually high this year.\nSentence 2: The precipitation in California has been unusually low this season."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The similar words between the two sentences are: \n- The\n- in\n- California\n- has \n- been\n- unusually\n- this."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide a real-world example of the following concept.Natural selection"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Natural selection is the natural process in which the individuals of a species with traits that are better adapted to their environment tend to survive and reproduce more than others, leading to the continuation and spread of those adaptive traits in the population. A real-world example of natural selection can be seen in the evolution of the peppered moth in 19th century England. \n\nBefore the industrial revolution, the peppered moth existed mostly in a light-colored form, which provided camouflage against the light-colored trees and lichens in their environment. However, as the industrial revolution progressed, the trees became darkened by soot and pollution, making the light-colored moths more visible to predators. As a result, a dark-colored variant of the moth began to appear more frequently, as they were better camouflaged against the darkened trees and had a higher chance of survival.\n\nOver time, the population of dark-colored moths increased, while the population of light-colored moths decreased, demonstrating natural selection in action. The better adapted, dark-colored moths had a higher chance of survival and reproduction, leading to a shift in the population."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name the actors in Star Wars: The Last Jedi."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The actors in Star Wars: The Last Jedi include:\n- Mark Hamill as Luke Skywalker\n- Carrie Fisher as Princess Leia\n- Daisy Ridley as Rey\n- Adam Driver as Kylo Ren\n- John Boyega as Finn\n- Oscar Isaac as Poe Dameron\n- Andy Serkis as Supreme Leader Snoke\n- Lupita Nyong'o as Maz Kanata\n- Domhnall Gleeson as General Hux\n- Anthony Daniels as C-3PO\n- Gwendoline Christie as Captain Phasma\n- Kelly Marie Tran as Rose Tico\n- Laura Dern as Vice Admiral Holdo\n- Benicio del Toro as DJ."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a slogan for a startup company that focuses on providing eco-friendly services."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Going green together for a sustainable future - choose our eco-friendly services today!\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What does the phrase \"turning a blind eye\" mean?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The phrase \"turning a blind eye\" means to pretend not to notice, or deliberately ignore something, often because it is difficult or unpleasant to deal with. It typically refers to a situation in which someone consciously chooses not to see or acknowledge something that is happening, or that they know is happening. The phrase derives from a legendary tale in which the British Admiral Horatio Nelson ignored orders to stop fighting during the Battle of Copenhagen in 1801 by putting his telescope to his blind eye and claiming that he could not see the signal to withdraw."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Come up with a description about a weekend camping trip."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A weekend camping trip is an opportunity to disconnect from the busy city life and connect with nature. Picture setting up camp in a peaceful, wooded area surrounded by stunning views, perhaps by a lake or river. The days are filled with outdoor activities such as hiking, fishing, kayaking or simply relaxing in the fresh air while taking in the surroundings. The evenings are spent around a crackling campfire, enjoying delicious meals cooked over the open flame, roasting marshmallows and exchanging stories with friends or family. Sleeping under the stars, listening to the sounds of nature, and waking up to a beautiful sunrise. A weekend camping trip rejuvenates the mind and body, leaving you feeling refreshed and reenergized."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare the following two objects in the form of a sentence.Cat and dog"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A cat and a dog, although both beloved household pets, differ in several ways, including their physical appearance, vocalization, behavior, and interaction with humans."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast the culinary traditions of France and Italy."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "France and Italy are two of the most renowned countries in the world for their cuisine, and while they share geographical proximity and many common ingredients, their culinary traditions have notable differences.\n\nFrench cuisine is characterized by its refinement and elegance, as well as its emphasis on elaborate sauces and creams. French cuisine has been heavily influenced by royal and noble tastes with a wide range of cooking methods employed, from roasting to braising and sautéing. Ingredients are combinations of meat, cheese, eggs, butter, and wine, all of which are used in a range of dishes such as bouillabaisse, coq au vin, and escargots.\n\nItalian cuisine, on the other hand, is known for its simplicity and the use of fresh, high-quality ingredients. It emphasizes the flavors of the natural ingredients and is heavily influenced by regional traditions, with an emphasis on pasta, pizza, seafood, meats, and vegetables. Italian cooking methods tend to be more straightforward, with grilling, roasting, and baking commonly employed, and ingredients like olive oil, tomatoes, herbs, and cheese playing a starring role in dishes like lasagna, spaghetti carbonara, and Margherita pizza.\n\nWhile both cultures put a strong emphasis on the enjoyment of food, there are also differences in their approaches to meals. In France, meals are often seen as formal occasions, with courses served in a particular order and accompanied by specific wines. In Italy, meals are more family-oriented, with a shared table and a convivial atmosphere.\n\nIn conclusion, while there are many similarities between French and Italian cuisine, the two traditions differ in their approach to ingredients, cooking methods, and even the social context of eating. 
Both are rich culinary traditions, with much to offer the discerning palate."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Add an interesting fact about the movie 'Titanic'"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An interesting fact about the movie Titanic is that it was the first film ever to be released on video while it was still being shown in theaters."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the correct example of a noun."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A noun is a word that refers to a person, place, thing, idea, or concept. Some examples of nouns include: book, table, city, love, and teacher."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Answer this query using an appropriate search engine.What is the average temperature in Los Angeles in September?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "According to weather.com, the average temperature in Los Angeles, California in September is around 73°F (23°C) - with an average high temperature of 84°F (29°C) and an average low temperature of 63°F (17.2°C). However, weather conditions can vary from year to year and these temperatures are just an average estimate."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Select the most appropriate input for the following command, \"Calculate the average gpa of students who attend Georgetown University.\"Georgetown University"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To calculate the average GPA of students who attend Georgetown University, I would need a list of the GPAs of all the students currently enrolled at the university."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write the opposite of 'She is doing the task'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "She is not doing the task."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest five exercises I can do to improve my physical health."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Brisk Walking or Jogging: Both of these activities are great for cardiovascular health and can be done without any special equipment. Just put on a pair of comfortable shoes, and you're good to go.\n\n2. Squats: Squats are a great way to strengthen your legs, glutes, and core muscles. Start by standing with your feet shoulder-width apart, then bend your knees and lower your hips until your thighs are parallel to the ground. Finally, push back up to the starting position and repeat.\n\n3. Planks: Planks are an excellent exercise for strengthening your core, as well as your upper body and back. Start by getting into a push-up position, then lower yourself onto your forearms. Keep your body straight and hold the position for as long as you can.\n\n4. Push-Ups: Push-ups are another great exercise for building upper body strength. Start in a plank position with your hands slightly wider than shoulder-width apart. Then lower your body until your chest nearly touches the ground before pushing back up.\n\n5. Lunges: Lunges are another great exercise for building leg and glute strength. Start by standing with your feet together, then take a large step forward with one leg, and bend both knees until your back knee nearly touches the ground. Finally, push back up to the starting position and repeat with the other leg."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the major contribution of the philosopher Immanuel Kant?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Immanuel Kant, born in 1724 in Königsberg, Prussia (now Kaliningrad, Russia), was a philosopher whose work had an considerable impact on modern philosophy, predominantly in the areas of ethics, metaphysics and epistemology. One of Kants major contributions was his examination of the capacity of human reason to acquire knowledge. Kant contended that reason is not just a tool that can be used to make discoveries about the world, instead, he suggested that human knowledge is constructed by our own minds, with how we experience the world as the raw material.\n\nIn his famous work \"The Critique of Pure Reason\", Kant argued that our experiences of the world are always shaped by the inherent structure of our minds. He suggests that while our minds do have sensory input from the world, our understanding and interpretation of that input is shaped by innate categories and concepts, which he called \"categories of understanding\".\n\nKant's work on ethics is also highly significant. He presented his view of morality in works like \"Groundwork of the Metaphysics of Morals\" where he introduced the concept of the \"Categorical Imperative,\" which states that one should act only according to that maxim by which you can also will that it should become a universal law. Essentially, this means that one should never act in a way that they wouldn't want everyone else to act as well.\n\nOverall, Kant's work played a significant role in the development of philosophy, with his ideas shaping the fields of metaphysics, epistemology and ethics for centuries to come."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the purpose of an article review"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An article review is a critical evaluation of a published work, usually a scholarly or academic article that involves summarizing its main points, analyzing the arguments, and assessing its overall credibility and relevance. The purpose of an article review is to provide readers with a comprehensive and unbiased overview of the article's topic or study, enabling them to form their own opinions on the content. It allows the reviewer to engage with the material, identify its strengths and weaknesses, and evaluate its usefulness and practical implications for further research or practice. Ultimately, an article review is an important tool for academic discourse, promoting the critical analysis and evaluation of research, and encouraging informed discussion and debate within the field."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a list[4,21,11,32,14], create a new list in reverse order.[4,21,11,32,14]"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To create a new list in reverse order from the given list [4,21,11,32,14], you can use the reverse() function, or slicing method. Here's an example using slicing:\n\n`original_list = [4, 21, 11, 32, 14]\nreversed_list = original_list[::-1]`\n\nThe new reversed list is `[14, 32, 11, 21, 4]`."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine the parts of speech of the given words.happy, bought, quickly"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The part of speech of the given words are as follows:\n- happy: adjective (describes a state of being or feeling)\n- bought: verb (an action)\n- quickly: adverb (describes how an action is done)"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a sentence, answer the following question in a few sentences: What would be an example of an impractical solution to the problem described in the sentence.The roads in the city are highly congested during rush hour."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "An example of an impractical solution to the problem of congested roads during rush hour in the city would be to build individual express lanes or tunnels for each car, which would require an enormous amount of construction and would result in a lot of disruption to daily life. It would also be extremely expensive and not a feasible solution for the government or taxpayers to fund."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the data type of the following piece of information.Mountain climbers climb Mount Everest"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This is a string data type."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Connect the following items with a logical argument.Fossil fuels, Global warming"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Fossil fuels are hydrocarbon deposits that are derived from decomposed plants and animals from millions of years ago. These fuels, which are sources of energy such as coal, natural gas and oil, are burned to release energy that is used to power our transportation, industrial processes and produce household electricity. When fossil fuels are burned, they release a large amount of carbon dioxide (CO2), methane, and other greenhouse gases into the atmosphere.\n\nGreenhouse gases trap heat in the atmosphere, which leads to an increase in temperature on the Earth's surface. This phenomenon is called global warming. An increase in greenhouse gas emissions, through the burning of fossil fuels, has been the significant contributor to global warming. This rise in temperature is causing changes in climate patterns, rising sea levels, which can result in floods, and more intense and frequent weather events such as hurricanes, droughts and heatwaves.\n\nIn conclusion, the burning of fossil fuels releases greenhouse gases into the atmosphere, which trap heat leading to global warming. Therefore, there is a direct link between the heavy use of fossil fuels and global warming."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a world-building element for a setting."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In this setting, one of the most unique world-building elements is the presence of ethereal energy, a mysterious power that permeates the air and can be harnessed by a select few individuals known as \"Weavers.\" These Weavers have the ability to manipulate ethereal energy, using it to create stunning displays of magic and perform miraculous feats. They are highly respected in society and often serve as leaders, advisors or protectors. The source of the ethereal energy is unknown, but it is believed to be connected to the spiritual realm and to the very fabric of existence itself. Theories abound about its origin, but the truth remains a mystery, shrouded in legend and myth. The uses of ethereal energy are vast, from healing the sick and wounded, to controlling the elements, to even opening portals to other worlds. It is a powerful force, and the key to unlocking its full potential lies in the hands of the Weavers."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Cut the following shape out of a piece of paper.+---+\n| |\n| A |\n| |\n+---+"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here are the steps to cut the shape out of a piece of paper:\n\n1. Get a piece of paper, a pencil and scissors.\n2. Carefully draw the shape onto the paper with a pencil by recreating lines and letters according to ASCII art given on input. Keep in mind that the shape is a rectangle with the letter 'A' in the middle. \n3. Make sure the lines are straight and the corners meet. \n4. Cut along the outer lines with scissors, being careful not to cut into the shape or cut off any of the lines or the letter 'A'.\n5. Ensure that you have completely cut out the shape.\n6. Discard the scrap paper around the cut-out shape.\n\nNow you have successfully cut out the given shape from a piece of paper!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a paragraph based on the given information.In the year 2025, the United Nations is working hard to reduce its carbon emissions by 50%."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In the year 2025, the United Nations is more dedicated than ever to combating climate change and preserving the planet for future generations. One of its key initiatives is to reduce its carbon emissions by a significant 50%. This ambitious goal is part of a larger effort to transition towards a carbon-neutral future and is being pursued through a variety of measures, including the adoption of clean and renewable energy sources, improved energy efficiency, and sustainable transportation practices. The UN's commitment to this cause is a beacon of hope for the world, signaling that positive change is possible when it comes to safeguarding the environment."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a few lines of code to create a basic calculator in Python."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "```\ndef calculator(num1, num2, operator):\n if operator == \"+\":\n return num1 + num2\n elif operator == \"-\":\n return num1 - num2\n elif operator == \"*\":\n return num1 * num2\n elif operator == \"/\":\n return num1 / num2\n else:\n return None\n\n# Example:\nprint(calculator(5, 2, \"*\")) # 10\n```\nThis is a basic calculator function in Python. You can pass two numbers and an operator to it, and it will return the result of the calculation. The supported operators are +, -, *, and /. If an unsupported operator is passed, the function will return None."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Share a true story about an instance of resilience."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One inspiring story of resilience is that of Malala Yousafzai. Malala was born in Mingora, Pakistan in 1997. At the time, the Taliban controlled her valley and banned many things, including educating girls. However, Malala's father ran a school and was determined to give his daughter an education. With her fathers encouragement, Malala began writing for the BBCs Urdu service about her life under Taliban rule at just 11 years old. \n\nIn October 2012, Malala was on her way home from school when a masked gunman boarded her bus and shot her in the head. The attack made headlines around the world, but, miraculously, Malala survived. After several surgeries and months of rehabilitation, Malala made a full recovery. \n\nInstead of being intimidated by the Talibans attempt to silence her, Malala became more determined than ever to continue her fight for girls education. In 2013, she spoke at the United Nations, calling for universal access to education. She also founded the Malala Fund, an organization that works to empower girls through education. In 2014, Malala became the youngest person to ever win the Nobel Peace Prize.\n\nDespite the attempt on her life and the danger she still faces, Malala continues to be an advocate for girls and a symbol of resilience."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide five synonyms for the word \"positive\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Optimistic\n2. Encouraging\n3. Constructive\n4. Favorable\n5. Affirmative."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a histogram for the following data set.30, 20, 22, 34, 16, 32"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a text-based histogram for the given data set:\n\n```\n16 - 19: *\n20 - 23: **\n24 - 27: \n28 - 31: \n32 - 35: ***\n```\n\nThis histogram divides the range of the data into five equal bins, with each bin having a width of four units. The values 30, 34, and 32 fall within the 32-35 bin, the values 20 and 22 fall within the 20-23 bin, and the value 16 falls within the 16-19 bin."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine the angle measures of the given triangle.Side lengths: 14, 3, 15"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "According to the problem, the triangle's side lengths are 14, 3, and 15 units. Because 15 is the largest side, we can apply the Law of Cosines to find the angle opposite of 15. Let it be angle C. The other angles can be called A and B.\n\nLet a=14, b=3, c=15, then\n\ncos(C) = (a² + b² - c²) / (2ab) = (14² + 3² - 15²) / (2 * 14 * 3) = -0.3571428571\n\nC = arccos(-0.3571428571) = 111.8°\n\nNow, to find angle A, we can use Law of Sines, which states that the ratio of the measure of one of a triangle's angles to the length of the side opposite that angle is the same for all three angles of the triangle.\nThus,\n\na/sin(A) = c/sin(C) => 14/sin(A) = 15/sin(111.8°)\nSolving for sin(A), we find sin(A) = (14 *sin(111.8°))/15 = 0.9235687435\nHence, A = arcsin(0.9235687435) = 67.4°\n\nTo find angle B we can use that the sum of the angles of the triangle is 180°, then\nA + B + C = 180° => B = 180° A C = 180° 67.4° 111.8° = 0.8°\n\nTherefore, the angle measures of the given triangle are 67.4°, 0.8°, and 111.8°."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Process this text to generate a headline summarizing the content.Rising ocean temperatures cause coral bleaching, leading to devastating effects for the environment."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Rising Ocean Temperatures Cause Devastating Effects through Coral Bleaching\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a fake romantic sentence."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Your sweet whispers send my heart into a flurry of butterflies, soaring on the wings of our endless love.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a short paragraph about the given topic.The importance of using renewable energy"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The importance of using renewable energy cannot be overstated. Renewable energy sources, such as solar, wind, and hydro power, can be replenished naturally in a short period of time, unlike fossil fuels, which are finite. Utilizing renewable energy reduces greenhouse gas emissions, thus helping to mitigate the effects of climate change. Moreover, it reduces our reliance on non-renewable sources of energy, promoting energy independence, and security. Additionally, renewable energy sources are often cheaper in the long run and can create job opportunities in the green energy sector. Therefore, transitioning to renewable energy is crucial for the health of the planet and the sustainability of our future."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a story about a magical adventure Liz and her friends go on."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a time in a faraway kingdom, a young girl named Liz and her friends embarked on an unforgettable magical adventure. They were just regular children, living ordinary lives until one day the unexpected happened. In the heart of the forest behind their homes, they discovered an enchanted tree that glowed with golden light. As they stepped closer to investigate, the trees branches parted, revealing a secret pathway.\n\nAs the group of friends followed the path, they were transported to another world where magic filled the air. The landscape was filled with rolling hills, shimmering lakes, and towering mountains, everything sparkled like diamonds illuminated by the sun. Liz and her friends could hardly believe they had stumbled upon such a spectacular place.\n\nAs they made their way through this world, they met many incredible creatures, including fairies with wings the colors of the rainbow, talking animals, and kind-hearted giants. They traveled far and wide, exploring every inch of the land, and learning all about magic.\n\nYet their adventure was not without its challenges. Along the way, the friends faced many dangers, from dark wizards and menacing monsters to treacherous terrain. But Liz and her friends were brave, and they always stuck together, no matter what came their way. They discovered that they each had their own kind of magic and that when they worked together, they could accomplish anything.\n\nThe adventure was long, and the friends faced many trials, but eventually, they succeeded in their mission to bring an end to the dark magic that had been threatening the land. 
Liz and her friends had saved the day, and they were hailed as heroes throughout the kingdom.\n\nAs Liz and her friends returned home, they knew they had been forever changed by their magical adventure. They had encountered wonders beyond their wildest dreams and had discovered strength and courage they never knew they had. And though they would never forget their incredible journey, they knew that this was just the beginning, as the world was full of magic waiting to be discovered."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Develop a plan to achieve the following goal.Improve financial stability"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here are some steps that can be taken to improve financial stability:\n\n1. Assess your current financial health: It's important to have a clear understanding of your current financial situation. This includes analyzing your income, expenses, debt, and savings.\n\n2. Create a budget: Once you know your financial status, create a detailed budget to keep track of your monthly income and expenses. This will help you allocate your funds and prevent overspending.\n\n3. Reduce expenses: Cutting down on unnecessary expenses is crucial in achieving financial stability. Start by identifying areas where you can cut back, such as subscriptions, dining out, or luxury purchases.\n\n4. Pay off debts: High levels of debt can hinder financial stability. Take necessary steps to manage and pay off your debt as soon as possible, starting with those with highest interest.\n\n5. Increase your income: If your expenses exceed your income, consider seeking ways to increase it. This may include taking on a part-time job, requesting a raise or finding new job opportunities.\n\n6. Save for emergencies: Create an emergency fund to cover unexpected expenses, such as car repairs or medical bills. This can help prevent financial stress during trying times.\n\n7. Invest wisely: Consider investing in a retirement account or other safe investment opportunities to grow your wealth overtime.\n\n8. Seek professional advice: It can be helpful to seek the advice of a financial advisor to help you achieve your financial goals. They can provide guidance and recommendations tailored to your specific needs."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a list of ingredients, provide a recipe for making a healthy breakfast smoothie.Banana, Almond milk, Peanut butter, Oats"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a simple recipe for making a healthy breakfast smoothie using the ingredients Banana, Almond milk, Peanut butter, and Oats.\n\nIngredients:\n- 1 Banana\n- 1 cup Almond milk\n- 1 tbsp Peanut butter\n- 1/4 cup Oats\n\nInstructions:\n1. Peel the banana, and cut it into small pieces.\n2. In a blender, combine the banana, almond milk, peanut butter, and oats.\n3. Blend the ingredients until smooth. If you prefer a thicker consistency, you can add a few ice cubes.\n4. Pour the smoothie into a glass and serve immediately.\n\nEnjoy your healthy breakfast smoothie that is packed with nutrients and energy to kick start your day!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Correct the grammar of the sentenceThe car ran it's engine."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The car ran its engine."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate two original metaphors to describe the feelings of fear and regret."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Fear is a thick fog that descends upon your mind, clouding your vision and leaving you lost in its murky depths.\n2. Regret is like a heavy stone tied around your neck, weighing you down with every step and drowning you in its watery depths."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite the following sentence using a passive voice:\n\nThe manager approved the report."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The report was approved by the manager."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given a movie title and an actor name, suggest a related movie.\"Avatar\" and \"Sam Worthington\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One related movie that you may enjoy is \"Clash of the Titans,\" which also stars Sam Worthington."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Brainstorm 3 potential applications for AI in marketing."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here are three potential applications for AI in marketing:\n\n1. **Personalized Content Generation and Recommendations**: AI can be employed to analyze customer data such as search history, purchase history, and browsing behavior to generate personalized content and product recommendations. This helps marketers deliver tailored content and product suggestions, enhancing the customer experience and driving sales.\n\n2. **Predictive Analytics**: AI can help marketers forecast sales trends and consumer behavior using predictive analytics. This involves processing large amounts of historical data using machine learning algorithms to identify patterns and relationships. This information can be used by marketers to fine-tune their marketing strategies and anticipate future demand for their products or services.\n\n3. **Chatbots and Virtual Assistants**: AI-powered chatbots and virtual assistants can be used to enhance customer engagement and improve the customer experience. These tools use natural language processing to understand and respond to customer queries, automating customer service and providing 24/7 support. This not only reduces the workload of customer service teams but also helps marketers gather valuable insights into customer needs and preferences."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest a recipe that utilizes the following ingredients.Carrots, Potatoes, and Thyme"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's a simple and delicious recipe for roasted carrots, potatoes, and thyme:\n\nIngredients:\n- 1 pound of carrots, peeled and sliced into chunks\n- 1.5 pounds of potatoes, cut into chunks\n- 2 tablespoons of fresh thyme leaves\n- 2 tablespoons of olive oil\n- Salt and black pepper\n\nInstructions:\n1. Preheat the oven to 425°F (218°C). Line a baking sheet with parchment paper.\n2. In a large bowl, combine the carrots, potatoes, thyme, olive oil, salt, and pepper. Toss everything together until the vegetables are evenly coated.\n3. Pour the mixture onto the prepared baking sheet and spread out into a single layer.\n4. Roast the vegetables for 35-40 minutes, stirring halfway through cooking, until the vegetables are tender and golden brown.\n5. Serve hot and enjoy!\n\nThis recipe makes a tasty and healthy side dish that pairs well with a wide variety of main courses. The thyme adds a delightful aroma and flavor, while the carrots and potatoes provide hearty and satisfying texture."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name one way bacteria can replicate"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One way that bacteria can replicate is through a process called binary fission. During binary fission, a single bacterium cell grows and eventually splits into two identical daughter cells that are the same size. In this process, the bacterial DNA is first duplicated, and then the two copies are separated as the cell grows. The cell then divides into two cells, with each new cell containing an identical copy of the original DNA. This simple process allows bacteria to reproduce rapidly, essentially doubling in number with each generation."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Insert sentences in the gaps of the following paragraphThe recent pandemic has been an incredibly difficult time for everyone. ____ ____ As pain and suffering swept the world, many people experienced hardship they never thought they'd have to face. ____"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The recent pandemic has been an incredibly difficult time for everyone. Jobs were lost, lives were changed, and the future seemed uncertain. As pain and suffering swept the world, many people experienced hardship they never thought they'd have to face. But through it all, stories of resilience and hope have emerged, reminding us that even during the darkest of times, there is always a glimmer of light."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify a rhyming word for the word \"eyes\". Output the word."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Prize."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify this as either an object or an action.running"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Action."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "You are given a sentence with an unclear pronoun. Rewrite the sentence to make it clear.John sent a letter to him."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "John sent a letter to another person."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the characters from the novel To Kill a Mockingbird."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To Kill a Mockingbird is a novel by Harper Lee, set in the 1930s in the fictional town of Maycomb, Alabama. The story is told from the perspective of Scout Finch, a young girl growing up with her brother Jem and their father Atticus, who is a lawyer. Here are some of the main characters and a brief description of each:\n\n1. Scout Finch- Scout is the narrator of the story, and the novel follows her growth from a six-year-old tomboy to a young woman of nine. She is intelligent, curious, and free-spirited.\n\n2. Atticus Finch- Atticus is Scout and Jem's father, and is a lawyer in Maycomb. He is a beacon of morality and justice, and is well-respected in the town. He is tasked with defending Tom Robinson, a black man, against charges of raping a white woman.\n\n3. Jem Finch- Jem is Scout's older brother and is four years her senior. He is a typical boy, interested in sports, adventure, and secrets, but is also very protective of Scout.\n\n4. Boo Radley- Boo is a reclusive neighbor who never leaves his house. The children are fascinated by him and make up stories about him, but as the novel progresses they begin to understand that he is not the monster they imagined.\n\n5. Tom Robinson- Tom is a black man who is falsely accused of raping Mayella Ewell. He is a physical embodiment of the unfair treatment of black people during this time period.\n\n6. Mayella Ewell- Mayella is the daughter of the town drunk, Bob Ewell. She is poor and uneducated, and her accusations against Tom Robinson set off a firestorm of events.\n\n7. Bob Ewell - Bob Ewell is Mayella's father and the town drunk. He is a bitter and angry man, who is quick to blame others for his own shortcomings. \n\n8. Calpurnia- Calpurnia is the Finch's housekeeper and is like a member of the family. 
She is strict with Scout and Jem, but also very loving, and acts as a maternal figure for them.\n\n9. Dill Harris- Dill is a friend of Scout and Jem, and is a frequent visitor to Maycomb during the summers. He is imaginative and loves to tell stories, and is obsessed with trying to get Boo Radley to come out of his house.\n\n10."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a recipe for banana ice cream."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Banana Ice Cream Recipe\n\nIngredients\n- 4 ripe bananas\n- 1/2 cup of heavy cream\n- 1 tsp vanilla extract\n- 2 tbsp honey or maple syrup\n- A pinch of salt\n\nInstructions\n1. Start by peeling and slicing the bananas into small pieces.\n2. Place the sliced bananas in a freezer bag and freeze for at least 2 hours or until completely frozen.\n3. Once frozen, transfer the bananas to a food processor or high-speed blender.\n4. Add in the heavy cream, vanilla extract, honey, and salt to the frozen bananas.\n5. Blend the mixture until it is smooth and creamy, scraping down the sides as needed.\n6. If the mixture is too thick or difficult to blend, add in more heavy cream or milk, one tablespoon at a time, until it blends smoothly.\n7. At this point, the mixture will be soft-serve consistency. You can serve it immediately or transfer it to a loaf pan or freezer-safe container and freeze for at least 30 minutes or until firm.\n8. Once firm, scoop the ice cream into bowls or cones and serve.\n9. You can also add in toppings such as nuts, chocolate chips, or fruit before serving.\n\nEnjoy your homemade banana ice cream!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the character \"Moth\" from the book 'Les Misérables'."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Moth, or Montparnasse, is a character in Victor Hugo's novel 'Les Misérables.' He is part of a group of criminals in Paris known as the Patron-Minette. Moth is portrayed as a young man in his early twenties, who is handsome and dresses fashionably. He is said to be well-built but slim and agile, with dark hair and an insinuating smile. Though he has a charming appearance, Moth is quite ruthless, indulging in theft, murder, and other criminal activities. He is characterized as being lazy, preferring to live off easy money obtained through crime rather than finding legitimate work. Moth is also known for his vanity, as he is very particular about his appearance, often seen preening himself."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "You are provided with a sentence. You need to generate an opposite sentence using antonyms or negattive language.The sky is high."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The ground is low."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List five benefits of regular physical activity."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Improved cardiovascular health: Regular physical activity can help improve heart and lung function, reducing the risk of heart disease and stroke. \n\n2. Weight management: Engaging in regular physical activity can help with weight loss and weight management by burning calories and increasing metabolism. \n\n3. Stronger muscles and bones: Physical activity, especially weight-bearing exercises, can help build stronger muscles and bones, reducing the risk of osteoporosis and other chronic conditions. \n\n4. Boosted mood and mental health: Engaging in physical activity can release endorphins and other feel-good hormones, helping to improve mood, reduce stress, and decrease the risk of depression and anxiety. \n\n5. Reduced risk of chronic disease: Regular physical activity can help reduce the risk of developing chronic conditions such as type 2 diabetes, hypertension, and certain types of cancer."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rank the following colors from most bright to most dull{blue, green, yellow, orange}"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Yellow \n2. Orange \n3. Green \n4. Blue"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Transform the following sentence argument into a categorized list.Americans waste a lot of food."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Topic: Food Waste in America\n- Category: Waste\n - Subcategory: Food\n - Statement: Americans waste a lot of food."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a dialogue between two people trying hard to make a connection."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A: Hey there! Haven't seen you in a while. How have you been?\nB: I've been doing well, thanks for asking. What about you? How's everything going?\nA: Life has been busy, but good. Just trying to juggle work and personal stuff.\nB: Yeah, I can relate. It's hard to find the balance sometimes.\nA: Definitely. So, are you still into hiking?\nB: Yes, I love it! I actually went on a really beautiful hike just last weekend.\nA: That's great. I've been meaning to take up hiking again. Maybe we can go together sometime?\nB: Sure, that sounds like fun. We should plan something soon.\nA: Yes, let's do that. I'll get in touch with you and we can set a date.\nB: Sounds good. Looking forward to it!"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Replace the words in bold with words or phrases that have the same meaning.My dog has horrible dyspnea."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "My dog has severe difficulty breathing."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the risk of using a credit card for online purchases?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Using a credit card for online purchases involves some level of risk, as with any financial transaction. Here are some of the most common risks associated with using a credit card online:\n\n1. Fraudulent Websites or Phishing Scams: One of the biggest risks associated with using your credit card online is that you may inadvertently enter your information on a fraudulent or insecure website. Hackers use clever techniques such as creating fake websites or phishing emails that mimic legitimate businesses to trick you into sharing your credit card information.\n\n2. Data Breaches: Even when you shop on a reputable website, there is a risk that the merchant's data systems could be hacked and your credit card information could be stolen. This could lead to fraudulent transactions being made on your card without your knowledge.\n\n3. Over-sharing Personal Information: When making purchases online, you may be prompted to provide additional information such as your phone number, date of birth, or address. It's important to only share this information with trusted, reputable businesses, as there's always a risk that this information could be used for identity theft.\n\n4. Public Wi-Fi: If you're using a public Wi-Fi network to shop online, there is a risk that your information could be intercepted by hackers. It's best to use a secure, private internet connection whenever you're sharing sensitive information like your credit card details.\n\nDespite these risks, there are steps you can take to minimize them. Be sure to shop only on reputable websites, keep your computer and security software up to date, use strong passwords, and monitor your credit card statements regularly for any suspicious activity. 
By following these precautions, you can feel more confident about using your credit card for online purchases."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize the following type of object: a bed frame."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A bed frame is a piece of furniture."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Based on the given input sentence, generate a sentence with a similar tone and meaning.I really don't want to do this."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I am really not in the mood for this."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Link the person's name, email address, age and gender in the following table.Person A | Alice Jones | alice@email.com | 30 | Female \nPerson B | John Smith | john@email.com | 25 | Male"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "| Name | Email | Age | Gender |\n|-------------|-------------------|-----|--------|\n| Alice Jones | alice@email.com | 30 | Female |\n| John Smith | john@email.com | 25 | Male |"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Which month of the year is the first of the four major US holidays?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The first of the four major US holidays falls in the month of January, specifically on the first day of the month, and it is called New Year's Day."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain Newton's second law"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Newton's second law of motion states that the force acting on an object is equal to the mass of that object multiplied by its acceleration. This law is commonly written in equation form as F = ma, where F represents the force, m represents the mass, and a represents the acceleration. \n\nThis law gives us a way to understand the relationship between the force applied to an object, its mass, and the motion that results. According to the law, the acceleration of an object increases as the force acting on it increases, and decreases as its mass increases. This means that the more mass an object has, the more force is required to cause it to accelerate.\n\nFor example, imagine trying to push a heavy object like a car. Due to its large mass, it will take a lot of force to get the car to start moving, and even more to make it move quickly. Conversely, a light object like a feather is easy to move and accelerate because it has very little mass.\n\nOverall, Newton's second law of motion tells us that the amount of force required to move an object depends on both its mass and the rate at which we want to change its velocity. By understanding this relationship, scientists and engineers can predict and control the motion of objects in a wide range of situations."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate an appropriate template to describe a customer service interaction"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Customer Service Interaction Template:\n\nDate: [insert date]\nTime: [insert time]\nCustomer Service Representative: [insert name]\nCustomer Name: [insert name]\nCustomer Contact Information: [insert email, phone number, etc]\nIssue: [insert brief description]\n\nInteraction Summary: \n\nStart of conversation: \n [insert brief summary of customer's initial inquiry or concern]\n\nProblem solving:\n [insert efforts made by the representative to address the customer's concern]\n\nResolution:\n [insert final outcome and steps taken to resolve the issue]\n\nEnd of conversation: \n [insert any final comments or feedback from the customer or representative]\n\nFollow-up: \n [insert any follow-up actions to be taken by either the customer or representative]"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a rewriting of the sentence using the same meaning.She was excited to finally arrive at the destination."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "She was thrilled to reach the destination at last."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Calculate 15% of 500."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "15% of 500 is 75."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct a query to find out the average salary of Data Scientists in the US"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "To find out the average salary of data scientists in the US, you can use a search engine with the following query: \"average salary data scientist USA\" or navigate to a salary aggregator website and search for \"data scientist\" in the United States to see the average salary for that position."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify if the phrase contains an emotional sentiment.I don't understand why this always happens to me."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Yes, the phrase contains an emotional sentiment of frustration or confusion."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe how technology has changed the way we work.Noinput"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Technology has revolutionized the way we work in many ways. Some of the most significant changes include:\n\n1. Increased productivity: With the help of computers and advanced software, tasks that used to take days or hours can now be done in minutes. Word processors, spreadsheets, and presentation software have made it easy to create and edit documents, while tools like calendars, task managers, and project management software help employees stay organized and on track.\n\n2. Improved communication: The rise of the internet, email and instant messaging have made it easy to communicate and collaborate with others, regardless of location. This has made remote work possible, allowing employees to work from anywhere in the world.\n\n3. Greater access to information: The internet, search engines and online databases have made it easier than ever to retrieve information. This has led to faster and more informed decision making, as employees can easily research topics, analyze data and prepare reports.\n\n4. Increased automation: Machines have taken over tasks that used to be done by humans, freeing up time and reducing the risk of errors. Automation has also led to greater efficiency, as machines can work continuously without breaks.\n\n5. Enhanced innovation: Technology is constantly improving, bringing new tools and features that make our work more efficient and productive. This has led to new ways of working, increased creativity and greater innovation.\n\nOverall, technology has transformed the way we work, making it easier, faster, and more efficient than ever before."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write an analysis of the following quote.\"The greatest glory in living lies not in never falling, but in rising every time we fall.\" - Nelson Mandela"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The quote by Nelson Mandela speaks to the resilience and determination of the human spirit. It suggests that it is not the absence of failure in one's life that defines greatness, but rather the ability to get back up and keep going. In this way, the quote emphasizes the importance of perseverance in the face of adversity.\n\nThe use of the phrase \"greatest glory\" suggests that there is a sense of honor and pride to be found in overcoming challenges, and that this is something to be celebrated. Rather than viewing failure as a setback, it should be seen as an opportunity to learn from our mistakes and grow as individuals.\n\nMandela's use of the word \"rising\" also echoes the imagery of a phoenix rising from the ashes. This reinforces the idea that we can emerge stronger and more resilient from our struggles, and that they can ultimately serve as catalysts for positive change and growth.\n\nOverall, this quote encourages us to never give up, even when faced with hardship. It reminds us that true success comes not from avoiding obstacles, but from facing them with courage and determination, and that this ability to rise above our struggles is the true measure of greatness."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite the following sentence to reflect the other point-of-viewHe is too immature to be trusted"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"He believes that I am too immature to be trusted.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the concept of dimensional reduction."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Dimensional reduction refers to techniques used to reduce the number of dimensions or variables in a dataset while retaining as much information as possible. This is often desirable when analyzing high-dimensional data where a large number of features or variables can make the analysis difficult, time-consuming, or prone to overfitting.\n\nThe basic idea behind dimensional reduction is to identify and remove redundant or irrelevant features while preserving the underlying structure and relationships between the remaining features. This can be achieved through a number of methods, such as principal component analysis (PCA), linear discriminant analysis (LDA), and t-distributed stochastic neighbor embedding (t-SNE), among others.\n\nThese methods work by transforming the original high-dimensional data into a lower-dimensional representation by applying mathematical algorithms that seek to maximize the amount of variance captured in the reduced data. This allows us to represent the data in a simpler and more manageable form without sacrificing much of the original information.\n\nDimensional reduction can be extremely useful for data visualization, where high-dimensional data can be challenging to represent in a comprehensible manner. By reducing the number of dimensions, we can create visualizations that are easier to interpret and explore. Additionally, dimensional reduction can also help to improve the performance of machine learning models by reducing the number of inputs, leading to faster training times and more accurate predictions.\n\nOverall, dimensional reduction is a powerful tool for simplifying and streamlining data analysis by allowing us to focus on the most relevant and informative features while ignoring those that do not contribute to our understanding of the data."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a bridge for a river with a width of 35 meters."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In order to design a bridge with the given specifications, we need to take several factors into account such as location, purpose, the type of traffic and environment. Here's an outline of the approach and general recommendations for the design of the bridge over the river that is 35 meters wide:\n\n1. Location: Firstly, we need to find the right location where the bridge foundations can easily be embedded. The site selected has to be based on existing infrastructure and potential route alternatives. Site feasibility will dictate the type of bridge to be constructed.\n\n2. Purpose: Evaluate the primary function of the bridge, whether it is to be utilized for pedestrian and/or vehicle traffic, or a mixed-use bridge. The bridge's design will reflect the expected level of usage, often connected with the surrounding environment and nearby transport systems.\n\n3. Type of bridge: Based on the location and purpose, the appropriate type of bridge has to be selected. For a river with a width of 35 meters, a good choice could be a simple beam bridge, arch bridge, or a truss bridge. For greater visual appeal, a cable-stayed or suspension bridge could be suitable.\n\n4. Traffic considerations: The design of the bridge will be determined by the anticipated capacity and weight of traffic. It's important to plan for future use as well, accounting for any increase of traffic.\n\n5. Design and construction materials: The bridge must be designed utilising a material that best fits the location and capable of withstanding the test of time. Common materials include steel, concrete, or a combination of such materials. The structural composition of the bridge should also consider the water flow rate and potential for flooding.\n\n6. Environmental considerations: Besides flood risks, the design should take into account local wildlife and ecology. The bridge must provide minimal disruption to the aquatic life in the river, such as by creating space for wildlife passages.\n\n7. Aesthetics: Lastly, while functional, the bridge's design also needs to enhance the surrounding landscape. A well-designed bridge can be a visual magnet that will attract visitors and add value to the community.\n\nIn summary, designing a bridge requires careful consideration of multiple factors to achieve an ideal balance between practicality, functionality and visual appeal. By integrating the above, we will be able to come up with an optimal, safe, and effective design that serves the inhabitants in the area while protecting the environment."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest a more interesting word to use in the following sentence.She took a look at the boring report."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "She took a glance at the lackluster report."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Output a summary of the paragraph.Powerful and yet surprisingly easy to use, cloud computing has revolutionised business practices, offering businesses a wide range of scalable services that are delivered effortlessly over the internet. It has become increasingly popular in recent years due to its low cost and scalability, its scalability making it ideal for businesses of any size. Additionally, cloud computing provides a variety of resources and applications that can be used for tasks ranging from data analysis to customer management."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Cloud computing is a powerful, user-friendly technology that has transformed the way businesses operate. It offers cost-effective, scalable services through the internet, suitable for businesses of all sizes. It also provides multiple resources and applications for various tasks."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a witty quip for the given situation.You spilled your coffee on the new carpet."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"I guess the carpet was wanting some caffeine too.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Parse the following JSON data and reformat into an HTML table.{\"People\": [\n {\"Name\": \"John\", \"Age\": 20}, \n {\"Name\": \"Jill\", \"Age\": 27}\n]}"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's your HTML table:\n\n<table>\n <tr>\n <th>Name</th>\n <th>Age</th>\n </tr>\n <tr>\n <td>John</td>\n <td>20</td>\n </tr>\n <tr>\n <td>Jill</td>\n <td>27</td>\n </tr>\n</table>"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given two sentences, generate a third sentence which combine the two ideas together.He was always so calm. He would never raise his voice."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "He was always so calm and composed that he would never raise his voice in anger."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain where an electric current originates."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Electric current originates from the movement of electrically charged particles, such as electrons or ions, within a conducting material or along a wire. The foundation of electric current is rooted in the atomic structure of matter - more specifically, in the behavior of electrons orbiting the nucleus of an atom. \n\nIn most materials, electrons are tightly bound to the atoms and cannot move freely. However, in some materials, such as metals, some of the electrons are only loosely bound to the atoms and are free to move about within the material. These materials are known as conductors. When an electric potential difference, often referred to as a voltage, is applied across a conductor, it creates an electric field that exerts a force on these free electrons, causing them to move in a particular direction. This movement of electrons is what we refer to as electric current. \n\nIn summary, electric current originates from the movement of electrically charged particles, typically electrons, that are influenced by an electric field due to the application of a voltage across a conducting material."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a childrens book title from two random words.happiness, balloon"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"The Balloon of Happiness: A Joyful Adventure\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate three pieces of helpful advice for someone who is looking to transition to a different career"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Start by researching and assessing your skills and interests: As you plan to make a transition to a different career, it is important to take a step back and evaluate your strengths, talents, and passions. Figure out what skills and experiences you have from your current position that you can transfer to your new career. Learn as much as you can about the new industry you are interested in, and determine what additional skills or education you may need to acquire.\n\n2. Network and build connections: It's essential to spend some time networking and connecting with other professionals in the field you're considering. Attend industry conferences and events, and join relevant online groups to build relationships with people who can offer insight, advice, and possibly even job leads.\n\n3. Be flexible and open-minded: Keep in mind that transitioning to a new career may require some initial sacrifices, such as starting at a lower position or salary. It's important to remain flexible and open-minded in your expectations and to see every experience as an opportunity to learn and grow. Remember that this is a process and success won't come overnight, but with dedication, hard work, and perseverance, you can achieve your goals."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a Haiku of 3 lines, with the first containing 5 syllables and the remaining lines containing 7 syllables."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Whispers in the wind,\nEchoes of the past softly sigh,\nNature's symphony."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rearrange the words and punctuation to form a valid sentence.state the and issue firmly"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "State the issue firmly."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a pie chart that shows the percentage of people living in each continent"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "As an AI language model, I can generate the textual information needed to create a pie chart, but I can't draw or create graphical illustrations such as charts. For that reason, I'll provide you with the data and you can use a graphing tool to create a pie chart.\n\nAccording to world population statistics from the United Nations, as of 2020, the approximate percentage of people living in each continent is as follows:\n\n- Asia: 59.5%\n- Africa: 17.2%\n- Europe: 9.6%\n- North America: 7.7%\n- South America: 5.6%\n- Oceania: 0.5%\n\nYou can use these data points to create your pie chart with the help of a graphing tool or software."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the grammatical category of the underlined word: She went to the beach.Beach"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The grammatical category of the word \"beach\" in the given sentence is a noun."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Find an interesting quote about friendship"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Friendship is born at that moment when one person says to another, 'What! You too? I thought that no one but myself . . .'\" -C.S. Lewis"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Edit this sentence and make sure it is grammatically correct.I went to the shopping mall, and made sure buy some presents."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I went to the shopping mall and made sure to buy some presents."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze the implications of a new environmental policyThe policy states that companies must reduce air pollution levels by 25% in the next 5 years."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The new environmental policy that requires companies to reduce air pollution levels by 25% in the next 5 years can have several implications. \n\nFirst, it emphasizes the government's commitment towards reducing air pollution and improving the air quality for its citizens. If effectively implemented, the policy is likely to lead to a cleaner, healthier environment, protecting wildlife and reducing the potential health hazards that the citizens face due to air pollution.\n\nSecond, this policy will require companies to modify their processes and invest in new greener technologies that reduce air pollution. This can lead to an increase in costs for companies, which might be passed on to the consumers. However, in the long term, such investments may lead to cost savings as a result of greater energy efficiency and reduced dependence on non-renewable resources.\n\nThird, this policy could also help boost the growth of the green energy sector, providing new job opportunities in areas such as renewable energy, recycling, and pollution control.\n\nFourth, this policy can enhance the country's image as a responsible member of the global community in terms of its efforts towards reducing air pollution and mitigating climate change. This can help attract green investments and enhance trade relations with other countries that value sustainability.\n\nOverall, the new environmental policy seems to be a positive step towards improving air quality, promoting sustainability and fostering the growth of the green energy sector."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Take a given list of stocks and classify them based on their performanceApple, Google, Microsoft"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Sure, to classify the performance of the stocks of Apple, Google, and Microsoft, I would need more information such as their stock prices over a specific period of time, or the standard metrics used to evaluate stock performance such as revenue growth, earnings per share, and so on. Can you provide more context for me to accurately evaluate their performance?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Formulate a chemical formulaCalcium oxide"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The chemical formula for calcium oxide is CaO."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Paraphrase this sentence: \"Today, the new policy was released\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The new policy was made public today."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a game to teach children to recognize shapes."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Game Name: Shape Explorers\n\nGame Description: Shape Explorers is a fun and interactive game for children ages 3-7 to learn to recognize and identify different basic shapes. The game involves an adventure where children help a group of explorers navigate through a forest to find hidden treasures, using clues related to shapes.\n\nObjectives: \n- To teach children to recognize and identify basic shapes such as circles, squares, rectangles, triangles, and ovals.\n- To develop children's cognitive and problem-solving skills.\n\nHow to Play:\n1. The game starts with a brief tutorial to introduce children to different shapes and their names.\n2. The game presents a story where a group of explorers is on a quest to find hidden treasures in a forest. The treasures are locked in a chest, and to find the key, the explorers must solve riddles related to shapes.\n3. Each riddle consists of a clue such as \"Find a shape that has four equal sides\" or \"Find a shape with no straight edges\". The riddles appear on the game screen, and children must select the correct shape from a set of options provided.\n4. For every correct answer, the children earn coins, which can be used to unlock virtual rewards like new characters and clothes, among others.\n5. As the children progress in the game, the difficulty level increases, and they can encounter tasks such as sorting, matching, and counting shapes.\n6. At the end of the game, the children find the key to the treasure chest and unlock it to reveal their prize - a certificate of achievement that can be printed and displayed.\n\nThe game can have colorful graphics and engaging animations to keep children entertained, and positive reinforcement through verbal praise and virtual rewards can be used to encourage them to keep playing and learning. Additionally, a progress tracker can be included to allow parents to monitor their child's progress and improvement."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the history of the first use of computers in industry."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The first use of computers in industry dates back to the 1940s, when the first general-purpose electronic digital computer, the Electronic Numerical Integrator And Computer (ENIAC), was developed. ENIAC was created during World War II for the purpose of calculating ballistic trajectories for artillery shells, and its development was primarily funded by the United States Army. After the war, ENIAC was used for a variety of applications, including weather prediction, atomic-energy calculations, and wind-tunnel design.\n\nIn the early 1950s, the first commercially available computer, the UNIVAC I (UNIVersal Automatic Computer I), was developed by J. Presper Eckert and John Mauchly - the designers of the ENIAC. UNIVAC was designed for business and administrative use, and its first customer was the United States Census Bureau, which used it to tabulate the 1950 census. Another early customer was the Prudential Insurance Company, which used UNIVAC to process its actuarial calculations and statistical analysis.\n\nIn the following years, the use of computers in industry continued to expand, as more companies began to recognize the benefits of automating their processes and data management. Industries such as banking, telecommunications, and manufacturing were among the first to adopt computer technology, with applications ranging from payroll processing and inventory management to process control and quality assurance.\n\nThe development of smaller, cheaper and more powerful computers in the 1960s and 70s further accelerated the use of computers in industry, as they became accessible to an even wider range of businesses. The introduction of the personal computer in the 1980s, along with advances in networking and the increasing popularity of the Internet, further revolutionized the use of computers in industry, paving the way for the modern, digital economy."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a question with the keyword “fungi”"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "What role do fungi play in the ecosystem and how do they interact with other organisms?"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the challenges and opportunities of mobile phone use in developing countries"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The use of mobile phones in developing countries presents both challenges and opportunities. \n\nChallenges: \n\n1. Affordability: Although the price of mobile phones has decreased over the years, many people in developing countries still find it difficult to afford a device, particularly a smartphone. \n\n2. Infrastructure: In some areas, network coverage can be inconsistent, and access to electricity for charging can be limited.\n\n3. Digital literacy: Many users in developing countries have low levels of digital literacy, which can limit their ability to make full use of mobile phones and the internet. \n\n4. Language barriers: Most mobile phones use English or other major languages, which can be a challenge for people who do not speak these languages fluently. \n\nOpportunities:\n\n1. Enhancing communication: Mobile phones enable people in developing countries to connect with others more easily, facilitating communication and, in turn, social and economic development.\n\n2. Providing access to information: Mobile phones enable people in developing countries to access information, such as news and weather updates or market prices, that can improve their livelihoods. \n\n3. Expanding financial inclusion: Mobile money services, such as branchless banking and mobile payments, can help people in developing countries access financial services that were previously inaccessible.\n\n4. Improving healthcare: Mobile phones can be used to improve healthcare delivery by facilitating communication between patients and health workers, or by providing access to health information and services.\n\nOverall, despite the challenges, the use of mobile phones in developing countries presents significant opportunities for improving the lives of people living in these areas."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create an employee manual that outlines the roles and responsibilities of a web developer."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Welcome to the team! As a web developer, you play a crucial role in designing, building, and managing our web-based applications, systems and products, ensuring they are user-friendly, effective, and efficient. Here, we have outlined some of the major roles and responsibilities that come with being a web developer:\n\n1. DESIGN: It is important to ensure that web applications are visually appealing and engaging to create an immersive experience for users. This is done through the use of design tools such as Photoshop or Sketch, user research, and wireframing.\n\n2. DEVELOPMENT: Web developers use a variety of programming languages, such as HTML, CSS, JavaScript, and PHP, to write, create, and edit websites and application code. They must be able to develop complex websites from scratch and maintain them over time.\n\n3. TESTING: One of the main responsibilities of a web developer is to test websites and applications, ensuring that they are functioning correctly and are free from bugs or errors. This involves reviewing code, debugging, and performing quality assurance checks.\n\n4. MANAGEMENT: Web developers are responsible for managing the project from conception to delivery, ensuring that it is completed on time and within budget. They must be able to work collaboratively with other departments, such as design, marketing or sales, and communicate effectively to keep the project on track.\n\n5. PROBLEM-SOLVING: Web developers are responsible for troubleshooting and resolving issues that may arise during production or with existing websites or applications. They must be able to think critically and creatively to develop, implement, and maintain solutions.\n\n6. CONTINUING EDUCATION: As technology evolves, it is important for web developers to stay updated on emerging trends and changes in the industry. They must be willing to learn and adapt to new technologies, programming languages, and design techniques.\n\nIn conclusion, web developers are critical players in the creation, development, management, and maintenance of web-based applications and products. Their roles and responsibilities are varied and require a combination of technical, creative, and interpersonal skills. We are excited to have you on board and look forward to the great work that you will do to contribute to our success."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Rewrite the following sentence without changing the meaning: His car was wrecked by a careless driver."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A careless driver wrecked his car."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Change the verb in the sentence to its past tense.He studies hard."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "He studied hard."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Write a short code to print the first 5 elements of a given list.[1, 2, 3, 4, 5, 6, 7, 8]"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here is a short Python code that can do that:\n\nmy_list = [1, 2, 3, 4, 5, 6, 7, 8]\nfor i in range(5):\n print(my_list[i])"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name two countries with a population size larger than the United States."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Two countries with a population size that is larger than the United States are China and India."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a list of goals for this upcoming year."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Develop healthy habits by maintaining a consistent exercise regimen and balanced diet.\n2. Improve productivity by minimizing distractions and increasing focus.\n3. Increase professional skills through continued education and training.\n4. Strengthen personal relationships by spending more time with loved ones.\n5. Save money by creating and sticking to a budget.\n6. Practice stress-reducing techniques such as meditation and journaling.\n7. Keep a positive attitude by focusing on gratitude and positivity.\n8. Work towards a promotion or career advancement.\n9. Pursue a new hobby or passion.\n10. Travel and explore new places and cultures."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the color yellow in 3 words."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Bright, cheerful, warm."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize the following three words.Happiness, Energy, Rainbow"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Category: Abstract Concepts/Nouns.\n\n1. Happiness - Emotion/Feeling\n2. Energy - Physical/Scientific Concept\n3. Rainbow - Natural Phenomenon."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Based on the given title, create a summary of the paper.\"The Use of Machine Learning in Healthcare Services\""}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "This paper explores the application of machine learning techniques in healthcare services. It outlines the various ways machine learning can improve the efficiency of healthcare delivery, from predicting and diagnosing diseases to devising personalized treatment plans. The paper also discusses current developments and advances in this field, and the potential benefits of incorporating machine learning into healthcare services. Ultimately, the paper suggests that machine learning has the potential to revolutionize the way healthcare is delivered and could significantly improve patient outcomes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "You received two pieces of feedback from your supervisor. Combine the two pieces of feedback into one sentence.Your work is organized but you need to be more meticulous."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "My supervisor commended me for my organized work but advised me to be more meticulous in my approach."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the historical context of WWI."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "World War I or the First World War, often referred to as the Great War, was a global war originating in Europe that spanned from 1914 to 1918. More than 70 million military personnel, including 60 million Europeans, were mobilized in one of the largest wars in history. At least nine million combatants and seven million civilians died as a direct result of the war.\n\nThe war was sparked by the assassination of Archduke Franz Ferdinand of Austria by the Yugoslav nationalist Gavrilo Princip. This led to a declaration of war by Austria-Hungary on Serbia. The complex web of alliances and hastily constructed treaties amongst European nations quickly saw the conflict expand to include Germany, Russia, Great Britain, and eventually the United States.\n\nBehind this trigger event, however, lay deeper, long-established tensions between European nations. These included imperialistic competition for territory and resources, economic rivalry, arms races, and the growth of nationalism. European states jostled for advantage, forming two opposing military alliances: the Triple Entente, consisting of France, Russia, and Britain, and the Triple Alliance of Germany, Austria-Hungary, and Italy. These alliances were the result of longstanding relationships between these countries but also served as a means of deterring potential attackers by ensuring that the cost of conflict would be too high.\n\nAnother major contributing factor was the arms race between major European powers, driven by the development of new technology such as machine guns, submarines, and chlorine gas, and the increasing size and power of Europe's armies and navies. The fall of the Ottoman Empire also created instability in the Balkans, where various nationalist movements emerged aiming to realize their goals of independence or expansion.\n\nThese tensions and alliances, coupled with the intense nationalism and militarism of the time, created a complex and volatile situation that erupted into war in 1914."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Transform a positive statement into a negative one.I will succeed."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I will not succeed."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Explain the process of electron transport chain?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The electron transport chain, also known as the respiratory chain, is a series of complex enzymes and proteins located in the inner mitochondrial membrane, responsible for producing the majority of the ATP during cellular respiration. The process is the final step in the aerobic respiration, where energy from the oxidation of nutrients is used to produce ATP, the cell's source of chemical energy.\n\nHere are the main steps of the electron transport chain:\n\n1. Electrons are donated to the chain by the reduced coenzymes NADH and FADH2, which are produced during earlier stages of cellular respiration. These coenzymes deliver high-energy electrons, which will travel through the chain, to complex I and complex II, respectively.\n\n2. As the electrons are passed along the chain through series of redox reactions, they lose energy which is used by the protein complexes to pump protons from the mitochondrial matrix into the intermembrane space, creating an electrochemical gradient. \n\n3. The final electron acceptor is molecular oxygen, which is reduced to water, removing the low-energy electrons from the chain and allowing it to start again.\n\n4. The electrochemical gradient created by the proton pumping generates a force, called the proton motive force, which drives the synthesis of ATP. This process is called chemiosmosis, where the protons flow back into the matrix through an enzyme called ATP synthase, which uses the energy to phosporylate ADP into ATP.\n\nOverall, the electron transport chain uses the energy from the electrons donated by the reduced coenzymes to pump protons and create an electrochemical gradient, which is used to generate ATP through chemiosmosis. 
Through this mechanism, the cell can produce large amounts of ATP from the oxidation of nutrients, enabling it to perform its functions."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given the following passage, summarize itJohn Adams was an American patriot who served as the second president of the United States. He was a statesman, a diplomat, and a leader of the American Revolution. He was very influential in the drafting and ratification of the United States Constitution. He was also a strong leader of the Federalist Party and an advocate of the Bill of Rights."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "John Adams was the second president of the United States, and played a key role as a statesman, diplomat, and leader in the American Revolution. He helped draft and ratify the Constitution, and was a leader of the Federalist Party, advocating for the Bill of Rights."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Provide a list of measures that could help reduce the emission of greenhouse gases."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1) Switching to renewable energy sources: Increasing the use of clean, renewable energy sources such as wind, solar, hydro and geothermal power, can significantly reduce greenhouse gas emissions.\n\n2) Promoting energy efficiency: Upgrading to energy-efficient appliances and buildings can lower energy consumption and reduce greenhouse emissions.\n\n3) Encouraging low-carbon transportation: Encouraging the use of public transportation, carpooling or low-carbon vehicles such as electric or hybrid cars can reduce emissions from fossil fuel use in transportation.\n\n4) Implementing carbon pricing: Governments can put a price on carbon emissions through a carbon tax or an emissions trading scheme, incentivizing companies to reduce their emissions.\n\n5) Promoting sustainable agriculture: Encouraging farmers to use sustainable agricultural practices that reduce chemical inputs and improve soil health, can reduce emissions from agriculture.\n\n6) Reforestation: Planting new trees and preserving existing forests can help absorb and store carbon dioxide from the atmosphere.\n\n7) Reducing waste: Reducing the amount of waste sent to landfills and increasing composting and recycling can decrease methane emissions from waste decomposition.\n\n8) Encouraging conservation: Encouraging conservation and reducing consumption of resources can decrease the demand for energy and reduce emissions from the production and transport of goods.\n\n9) Supporting research and development: Governments and businesses can invest in research and development to create innovative solutions for reducing greenhouse gas emissions.\n\n10) Promoting international cooperation: Greenhouse gas emissions are a global issue and require international cooperation and 
coordination to achieve significant reductions."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Determine the stock market trend of the following stock over the past month.The stock name is ABC Corporation."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm sorry, I'm an AI language model and do not have access to real-time market data. To determine the stock market trend of ABC Corporation over the past month, you will need to consult with a financial news source, a stock market tracker or a financial advisor."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How would you explain the following term in simple language?Artificial Intelligence"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Artificial intelligence, or AI, is a type of computer technology where machines can do tasks that would normally require a human to do them. These tasks can include things like understanding language, recognizing images, and playing games. AI systems are designed to learn and improve over time, getting better at their job the more they do it."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a website that explains the concept of blockchain technology."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Here's an overview of what the website could look like:\n\nHeader: \"Understanding Blockchain Technology: A Comprehensive Guide\"\n\nMain Menu: \n- Introduction to blockchain\n- History of blockchain \n- How does blockchain work\n- Benefits and uses of blockchain\n- Blockchain applications in various industries\n- Future of blockchain technology\n- Resources and further reading\n\nIntroduction to blockchain:\nThis section would introduce the concept of blockchain technology and its significance in the digital era. It will briefly touch upon the definition, origin and general understanding of the term.\n\nHistory of blockchain:\nThis section traces the history of blockchain technology, from its inception to its evolution and growth over the years. It also highlights the major milestones and developments in the field.\n\nHow does Blockchain work:\nThis section provides a more in-depth understanding of how blockchain technology works. It explains concepts such as cryptography, distributed ledger, and consensus mechanisms, with the help of diagrams and illustrations to make it easier for the reader to comprehend.\n\nBenefits and uses of blockchain:\nIn this section, the various advantages of blockchain technology are highlighted, including security, transparency, and immutability. It also discusses the various use cases of blockchain, such as smart contracts, cross-border payments, and supply chain management.\n\nBlockchain applications in various industries:\nThis section showcases how blockchain technology is being applied across various industries such as finance, healthcare, government, and more. 
It provides real-life examples of companies and organizations that are leveraging the power of blockchain.\n\nFuture of blockchain technology:\nThis section looks into the future and discusses the potential of blockchain technology and its role in shaping our world. It also touches upon emerging trends and innovations in the field.\n\nResources and further reading:\nThis section provides a list of resources for those who wish to learn more about blockchain technology or delve deeper into specific aspects of it. This includes books, articles, videos, podcasts, and more.\n\nFooter: The website footer can include the contact information, social media links, and a brief about the creators or contributors of the website.\n\nOverall, the website would be visually appealing, easy to navigate, and informative, providing a comprehensive understanding of blockchain technology for readers of all levels."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Select the noun phrase in the following sentence.The accident caused horrific damage across the town."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The noun phrases in the given sentence are \"The accident\" and \"horrific damage.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Propose an idea for a social media campaign.A campaign for a women's activewear company."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "One potential idea for a social media campaign for a women's activewear company could revolve around the themes of body positivity and empowerment. The campaign could feature diverse, real women of all shapes, sizes, and ages engaging in various physical activities, highlighting the joy of movement, strength, and the importance of inclusivity in fitness.\n\nThe campaign could include a social media hashtag, such as #AllBodiesActive, and encourage people to share their stories and post pictures of themselves engaging in their favorite physical activities while wearing the company's activewear. This could create a sense of community and promote the brand's message of embracing all bodies and celebrating women's achievements.\n\nTo roll out the campaign, the company could collaborate with influencers and ambassadors who are advocates for body positivity and inclusivity in the fitness industry. The campaign could also include user-generated content such as inspiring quotes, workout tips, and self-love affirmations to further engage with the audience.\n\nOverall, the campaign's goal would be to encourage women to feel confident, empowered, and inspired to lead active lifestyles, while promoting the company's activewear as an inclusive and supportive brand."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Suggest a new tagline for the article.CNN's consultant on political television advertising believes that the cost to influence the 2008 election may exceed $3 billion."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "\"Unprecedented spending in the race for 2008: Will $3 billion be enough to sway voters?\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compose a comment for a review.The reviewer says that the product arrived quickly but the color was wrong."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Thank you for taking the time to leave a review. We're glad to hear that the delivery of your product was prompt. However, we're sorry to learn that the color wasn't what you expected. Could you please get in touch with our customer service team so we can assist you in resolving this issue? Thank you."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Describe the textile industry"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The textile industry is involved in the design and production of fabrics and other materials such as yarn and thread, typically for use in clothing, upholstery, and other fabric goods. It is one of the oldest industries in the world and has played a vital role in shaping the global economy. The industry itself encompasses several stages of production, from raw material processing and spinning to weaving, knitting, dyeing, and finishing, all of which require sophisticated machinery and skilled labor.\n\nThe textile industry is a global one, with raw materials such as cotton, wool, and silk being sourced from around the world and then processed into finished products through a complex supply chain. The industry employs millions of people worldwide, with countries such as China, India, and Bangladesh being major players in both production and export.\n\nTextiles are used for a wide variety of purposes, from clothing and fashion to household goods, and the industry is always looking for new and innovative fabrics and production techniques. With the rise of sustainability concerns, there has been a growing focus on the adoption of eco-friendly practices and materials, as well as improvements in labor conditions and fair trade practices."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the poetic device used in the following example: \"The wind whispers secrets to the trees\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The poetic device used in the example: \"The wind whispers secrets to the trees\" is personification. Personification is the attribution of human qualities, behaviors or characteristics to inanimate objects, animals or natural phenomena. In this case, the wind is given the human ability to whisper secrets."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Give the example of a simile for the following phrase.the dark night"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The darkness of the night was as black as coal."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Choose the correct variation of this word:amendable"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Amendable is correctly spelled."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Create a narrative piece from a given situation.Two friends went on a picnic and one dropped a valuable item."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Once upon a summer's day, two friends, Katie and Emily, decided to grab their baskets and blankets and head off on a much-needed picnic. They laughed and joked as they drove to the park, the wind blowing in their hair and the sun shining warmly on their faces. They found the perfect spot under a large oak tree, its branches stretching wide to provide them with shade.\n\nKatie pulled out a checkered blanket and laid it on the grass while Emily rummaged through the basket, bringing out an array of sandwiches, fruits, and pastries. They settled down and enjoyed their feast, discussing their lives and reminiscing about old times. The day was beautiful, and they felt at ease, surrounded by nature, and enjoying each other's company.\n\nAfter a while, they decided to explore the park, and so they packed up their things and set off. They walked alongside babbling brooks and over small wooden bridges, and they came across fields of wildflowers that swayed in the breeze. They stopped to take pictures and laughed as they tried to capture the beauty around them.\n\nIt wasn't until they had returned to their car and were about to drive away that Emily gasped and patted her pockets. \"Oh no!\" she exclaimed, \"I lost my grandmother's necklace!\" \n\nThey both panicked, knowing the necklace was a valuable family heirloom. They retraced their steps, scouring the park for any sign of the lost jewelry. They asked people if they had seen it, and they searched high and low.\n\nJust as they were about to give up hope, a little girl approached them, holding out the necklace in her tiny hand. \"Is this what you're looking for?\" she asked, smiling up at them. 
Tears of relief and gratitude filled Emily's eyes as she thanked the little girl, who simply shrugged and ran off to join her friends.\n\nKatie and Emily returned home that day, exhausted but elated. Emily clutched the necklace tightly in her hand, grateful to have it back. They talked about the little girl who had saved the day and the adventure they had been on. They both agreed - it had been a picnic to remember."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "How many bytes are there in a kilobyte?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "There are 1024 bytes in a kilobyte.\n\nHowever, the term kilobyte has historically been used to refer to either 1024 bytes, or 1000 bytes, depending on the context. In the International System of Units (SI), the prefix \"kilo\" means 1000, so 1 kilobyte (kB) would be 1000 bytes. However, in the context of computing and data storage, where binary data is organized in powers of 2, it has been common to use a different set of binary prefixes, where 1 kibibyte (KiB) = 1024 bytes. \n\nSince approximately 2000 the International Electrotechnical Commission (IEC) has developed an international standard of binary prefixes, they include prefixes: kibi-, mebi-, gibi-, tebi-, pebi-, and exbi-, so growing in factors of (2^10). However, even now kilobyte interchangeably is used to mean 1024 bytes, while in contexts such as data transmission and consumer storage, often is used with it's SI definition of 1000 bytes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Give three examples of big data applications."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. **Predictive Maintenance:** Many industrial and manufacturing companies use big data analytics to anticipate failure in their machines, tools, and systems. They collect real-time data from sensors on machines worldwide, and then use predictive models to identify patterns of failure, provide early warning of issues, and create maintenance schedules that prevent downtime and reduce costs.\n\n2. **Personalized Healthcare:** Big data is increasingly used in healthcare to tailor treatment programs to individual patients. By analyzing data from millions of patient health records, medical sensors, and wearable devices, healthcare providers can predict outcomes, plan treatments, and provide personalized care. This application has the potential to improve patient outcomes, reduce healthcare costs, and enhance patient experiences.\n\n3. **Fraud Detection**: Big data technologies are used in banking and finance to detect and prevent fraud. With the help of big data analytics, it is possible to analyze data from a vast number of transactions and other activities in real-time, which helps to identify suspicious activities or transactions, and then take necessary actions to stop it. This also helps in enhancing system security and customer trust."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Summarize the main idea of the following article in one sentence.The question of when self-driving cars will be available on the market has been up for debate. Some say as soon as next year, while others think that it may be much further off. There are several factors that will affect when self-driving cars will be available, including economic considerations, technological feasibility, and social acceptance."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The availability of self-driving cars on the market is uncertain due to several factors including economic considerations, technological feasibility, and social acceptance."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Find the area of a triangle with base 8cm and height 11cm."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The formula for finding the area of a triangle is (base x height) / 2. \n\nPlugging in the values for base and height, the calculation is as follows: (8 cm x 11 cm) / 2 = 44 cm^2.\n\nTherefore, the area of a triangle with a base of 8 cm and height of 11 cm is 44 cm^2."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a jingle that conveys the idea of safety."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Verse 1:\nStay alert and stay aware,\nSafety first, don't you dare,\nTaking chances and being reckless\nCaution pays and prevents chaos.\n\nChorus:\nSafety first, that's the key,\nFor you and your family,\nStay safe, stay sound,\nPeace of mind will be found.\n\nVerse 2:\nWhen you're driving, buckle up,\nDon't text, don't take a sup,\nKeep your eyes on the road,\nSafety rules must be towed.\n\n(Repeat Chorus)\n\nBridge:\nIn everything, that you do,\nBe careful and avoid what's taboo,\nSafety first, in all you do,\nSo you can thrive and shine through.\n\n(Repeat Chorus x 2)\n\nOutro:\nSafety first, peace of mind,\nKeep yourself and others in line,\nBe cautious, be safe,\nAnd enjoy life's beautiful embrace."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Outline the pros and cons of technology-driven healthcare."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Technology-driven healthcare refers to the use of advanced technology and electronic methods to improve healthcare outcomes and patient care. Like any other field, technology has its advantages and disadvantages. Below is an outline of the pros and cons of technology-driven healthcare.\n\nPros:\n1. Improved patient outcomes: With the help of technology, physicians and other healthcare professionals can access patient's data and medical history, allowing them to make accurate diagnoses and provide personalized treatments, resulting in better patient outcomes.\n\n2. Enhanced efficiency: Technology-driven healthcare systems help to streamline hospital operations by minimizing paperwork and reducing wait times, resulting in an overall increase in efficiency and a decrease in costs.\n\n3. Convenience: Technology such as telemedicine enables patients to consult with their healthcare professionals remotely, eliminating the need for a physical visit to the hospital or clinic. This accessibility and convenience make healthcare more readily available to individuals who may find it difficult to access traditional in-person care.\n\n4. Advanced research: Technology-driven healthcare allows doctors and medical researchers to collect and analyze vast amounts of data, which can drive medical breakthroughs and advances in treatment.\n\nCons:\n1. Increased costs: Implementing advanced technology can be costly, and the expenses may be passed down to patients, resulting in higher healthcare costs.\n\n2. Security and privacy issues: With the storage of patient information electronically, there is a risk of breaches and unauthorized access to sensitive medical data.\n\n3. 
Dependence on technology: Over-reliance on technology could result in the diminishment of human skills and expertise, and technical malfunctions could potentially hinder patient care.\n\n4. Limited access: Not all individuals have access to technology-driven healthcare, particularly those living in developing nations or remote areas. This creates disparities in healthcare access and outcomes."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Classify this sentence as either a declarative or interrogativeDo you agree"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Interrogative."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Given the values for two angles ((9 degree, 48 degree) for representation), output the third angle in the triangle.9 degree, 48 degree"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The sum of the interior angles of a triangle always adds up to 180 degrees. Therefore, for a triangle with two angles measuring 9 degrees and 48 degrees, the third angle would be: \n\n180 - (9 + 48) = 123 degrees."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Edit the sentence to correctly use the idiom \"by the skin of your teeth\".She wanted to pass the test by the skin of her teeth."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "She passed the test by the skin of her teeth."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Analyze the consequences of a virtual learning environment."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Virtual learning environments have become increasingly popular, especially in the wake of the COVID-19 pandemic. While virtual learning offers many benefits, it also presents its own set of challenges and consequences. Here are some of the potential positive and negative consequences of a virtual learning environment:\n\nPositive consequences:\n\n1. Increased flexibility: Virtual learning environments allow for greater flexibility in scheduling and learning pace. Students can learn at their own pace, and that too from remote locations. This is particularly beneficial for students who have work or family commitments.\n\n2. Wider access to education: Virtual learning has made it possible for students from remote or disadvantaged areas to access quality education. This has helped to reduce the education gap and promote equality.\n\n3. Interactive and personalized learning: Virtual learning environments often offer interactive and personalized learning experiences. Students have access to a wide range of multimedia resources such as videos, animations, and simulations that can enhance their understanding of the subject matter.\n\n4. Lower costs: Virtual learning is often more affordable than traditional classroom learning since it eliminates the need for physical infrastructure, transportation, and other associated costs.\n\nNegative consequences:\n\n1. Social isolation: One of the main drawbacks of virtual learning is the potential for social isolation. Students may miss out on the social interactions and support that come with traditional classroom learning. This can impact their overall well-being and affect their learning outcomes.\n\n2. 
Lack of hands-on experience: Virtual learning environments may not offer the same kind of hands-on or experiential learning opportunities that are available in traditional classrooms. This can be a disadvantage for students who learn better through hands-on activities and experiences.\n\n3. Technical issues: Virtual learning often relies on technology, including computers and internet connections. Technical issues such as hardware failures, software glitches, and internet outages can disrupt the learning process and affect student outcomes.\n\n4. Requires discipline and motivation: Virtual learning requires a great deal of discipline and motivation on the part of the student. With increased flexibility and autonomy, students need to be self-motivated and disciplined in managing their time and completing their coursework.\n\nIn conclusion, virtual learning environments present a mix of positive and negative consequences. They offer increased flexibility, wider access to education, interactive learning experiences, and cost savings, but they can also lead to social isolation and a lack of hands-on experience, are prone to technical issues, and demand discipline and motivation from students."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize the below text regarding its genreThe sun was setting in the west, casting its warm rays over the sea."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The genre of the input text appears to be descriptive, fitting in the category of Literature or Creative Writing. It could also be a part of a larger narrative/story belonging to Romanticism, which emphasizes on emotion and individualism as well as glorification of the natural world."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Alter the following sentence by replacing the bold word with an appropriate synonym\n\nThe show was a **surprise** hit"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The show was an unexpected hit."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the similarity between a cat and a dog?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Both cats and dogs are common domestic pets that belong to the mammal family. They have fur which they shed, sharp teeth, and claws, and are both carnivores by nature. They are also both intelligent and instinctive animals that make them trainable to some extent, and have an acute sense of hearing and smell. Both animals are known for their companionship and for providing emotional support to their owners."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Design a logo for a company called \"My First Studio\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "I'm sorry, I am an AI language model and I do not possess the ability to create visual graphics like logos or images. However, I can provide some ideas and concepts that you can use for designing a logo for \"My First Studio\". \n\nHere are a few ideas for designing a logo: \n\n1. Incorporate graphics or symbols related to art, design, or creativity in the logo. This could include the use of an easel, paintbrush, pencil, or camera as part of the logo imagery.\n\n2. Use playful and bright colors to invoke a sense of creativity and exploration. This could include the use of different shades of orange, yellow, red, and green.\n\n3. Design the typography of \"My First Studio\" in a fun and engaging style. You can use a bold and simple font type and combine it with a handwritten or cursive style.\n\n4. Include illustrations or graphics of children or beginners engaging in creative activities like painting, drawing, or taking photographs. This will help emphasize the focus of the company on nurturing and development of creativity from a young age.\n\nBy incorporating these ideas, you should be able to come up with an effective and appealing logo for \"My First Studio\"."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the result of 4 raised to the power of 3?"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The result of 4 raised to the power of 3 is 64."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct a regular expression to identify email addresses."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A regular expression that can be used to identify email addresses is: \n\n`[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\\.[a-zA-Z]{2,}`\n\nExplanation:\n- `[a-zA-Z0-9._%+-]+` represents the username part of an email. This part includes one or more characters that could be alphabets (lowercase or uppercase), digits, dots, underscores, percentages, plus, or hyphen. The `+` at the end indicates that the username must have at least one character.\n- `@` represents the compulsory `@` symbol between the username and the domain name\n- `[a-zA-Z0-9.-]+` represents the domain name part of the email. This part could include alphabets (lowercase or uppercase), digits, dots or hyphens. The + at the end indicates that the domain name must have at least one character.\n- `\\.` represents the compulsory dot `.` symbol between the domain name and the top-level domain\n- `[a-zA-Z]{2,}` represents the top-level domain part of the email. This part must have at least two alphabets \n\nAn example of an email address that matches this regular expression would be `John.Doe-123%+@example-domain.com`."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Construct a poem with 4 stanzas and a minimum of 3 lines in each stanza."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "A gentle breeze, the world at ease,\nA moment's peace, away from disease,\nNature's beauty, a pause to seize.\n\nThe rolling hills, a sight to see,\nGreenery for miles, wild and free,\nA canvas painted, for you and me.\n\nBirds chirp, flowers bloom,\nThe sun shines, away from gloom,\nA ray of hope, a chance to consume.\n\nThis tranquil scene, of calm and bliss,\nIs where the heart, finds pure happiness,\nA moment captured, not to be missed."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "What is the name of the website home page?www.example.com"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The name of the homepage for the website with the URL \"www.example.com\" is typically called \"Home,\" but the specific name of the homepage can vary depending on the website's design and preferences."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Identify the predicate of the given sentence.John ate the apple"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The predicate of the given sentence \"John ate the apple\" is \"ate the apple.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Take the text below and make it spookierThe old house stood shrouded in the darkness of the night."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "In the pitch-black dead of night, the decrepit old house loomed ominously, enshrouded in an eerie veil of impenetrable darkness, emanating a bone-chilling aura of terror and despair."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "From the given options, classify the following word as a noun, verb, adjective, or adverb.\nWord: Cautiously"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Cautiously is an adverb."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Categorize different coral reef habitats."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Coral reef habitats can be categorized into several types based on their location, depth, and structure. These include:\n\n1. Fringing Reefs: These reefs are located close to the shore, and usually grow directly from the coastline. They are the most common type of coral reef habitat.\n\n2. Barrier Reefs: These reefs are separated from land by a lagoon or other body of water. They are generally long and narrow, and can run parallel to the coast for many miles.\n\n3. Atolls: Atolls are ring-shaped coral reefs that surround a central lagoon. They usually form around previously submerged volcanic islands, which have since eroded away, leaving only the coral structure behind.\n\n4. Patch Reefs: These are small, isolated reefs that are found within lagoons or other shallow waters. They can be either circular or elongated in shape.\n\n5. Banks or Platform Reefs: These reefs grow on the continental shelf, in deeper waters that are usually between 30 and 60m deep. They are broad and flat, and can extend for many miles.\n\nEach of these coral reef habitats supports its own unique array of marine life, and plays an important role in the overall health and diversity of the ocean ecosystem."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Convert the number in Fahrenheit to Celsius.210"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The formula for converting Fahrenheit to Celsius is: (°F - 32) × 5/9 = °C.\n\nSo, if the temperature provided in Fahrenheit is 210°F then, using the above formula, we can find its Celsius equivalent as:\n\n(210 - 32) * 5/9 = 98.9°C"}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Would you recommend this product to a friend?This product has been extremely useful for organizing my office."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Based on your positive experience with the product and its usefulness in organizing your office, I would recommend this product to a friend."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Compare and contrast the use of web technologies between web 1.0 and web 2.0"}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Web 1.0 and Web 2.0 are terms used to describe two different generations of the Web. Web 1.0 refers to the initial stage of the World Wide Web, while Web 2.0 describes the evolution of the Web towards more interactive and user-generated content. The main differences between these two generations of the Web lie in the way web technologies are used.\n\nWeb 1.0 was characterized by static HTML pages and the use of web technologies for the sole purpose of presenting information. Websites were purely informational, with minimal user interaction, and were mostly built by professionals. At this time, web technologies were used to display text, images, and links, and the main focus was on delivering content to the user. Some of the common web technologies used in Web 1.0 include HTML, CSS, and JavaScript.\n\nWeb 2.0, on the other hand, brought about a shift towards more dynamic and interactive web pages. With the advent of new web technologies such as AJAX, web developers were able to build more interactive and responsive websites. Web 2.0 also introduced the concept of user-generated content, where users could contribute to the content of websites. This led to the rise of social media platforms, blogs, wikis, and other forms of interactive websites. In addition to HTML, CSS, and JavaScript, web technologies commonly used in Web 2.0 include XML, RSS, and APIs.\n\nIn summary, the main difference between Web 1.0 and Web 2.0 lies in the way web technologies are used. Web 1.0 was focused on presenting information, while Web 2.0 is centered on user interaction and participation. The development of new web technologies has enabled web developers to create more dynamic and responsive websites, allowing for a richer and more interactive user experience."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "You have a broken washing machine and need to fix it. Describe the steps to do so."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "Fixing a washing machine involves several steps and may require the help of a professional depending on the extent of the damage. Below are some general steps that can be taken to fix a broken washing machine.\n\n1. Identify the problem: The first step in fixing a broken washing machine is to identify the source of the problem. You can do this by observing the symptoms of the machine and determining what function is not working properly. The most common washing machine problems include leaks, not spinning, making loud noises, not draining, among others.\n\n2. Unplug the machine: Before attempting any repairs, it is important to disconnect the washing machine from the power source to avoid accidents or electrocution.\n\n3. Consult the manual: Refer to the owner's manual for specific instructions on how to fix the identified problem. Manuals usually have a troubleshooting section that can guide you on how to fix common problems with the machine.\n\n4. Check for simple fixes: Some problems can be easily fixed without the need for technical knowledge. For example, if the machine is not draining, check if the drain hose is clogged or if the filter needs cleaning. If the machine is not spinning, check if the load is unbalanced and rearrange the clothes.\n\n5. Order replacement parts: If the problem cannot be fixed with simple adjustments, then you may need to order replacement parts. You can do this online or at a hardware store, making sure to purchase the correct part for your washing machine model.\n\n6. Repair or seek professional help: If you feel confident enough to make the repairs yourself, carefully follow the instructions in the manual using the necessary tools. If you are not confident in your ability to make the repairs, enlist the help of a professional washing machine repair technician."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Find the Past-perfect tense verb.She had given the money to her teacher."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "The past perfect tense verb in the sentence is \"had given.\""}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Edit the given text so that it sticks to standard English grammar rules.We was going to the store, when she realized she forgot her wallet."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "We were going to the store when she realized she forgot her wallet."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Name two benefits of eating more plant-based proteins."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Improved Health Outcomes: Plant-based proteins are often packed with essential vitamins, minerals, fiber, and antioxidants, that work to improve overall health outcomes. Consuming these proteins can help to reduce the risks of developing chronic conditions such as heart disease, type 2 diabetes, and certain types of cancer.\n\n2. Environmental Sustainability: By choosing plant-based proteins over animal-based options, individuals can reduce their carbon footprint and contribute to a more sustainable food system. Livestock farming is a major driver of deforestation, water pollution, and greenhouse gas emissions. Substituting plant proteins for animal proteins, even a couple of times a week can help reduce the environmental impact of our dietary choices."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "Generate a 5-word sentence using the following words: \"they\",\"happy\",\"had\",\"were\",\"when\"."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "They were happy when had."}], "loss_weight": 1.0}]}
{"messages": [{"role": "user", "content": [{"type": "text", "value": "List three features of the moon."}], "loss_weight": 0.0}, {"role": "assistant", "content": [{"type": "text", "value": "1. Craters: The surface of the moon is covered in impact craters from meteoroids, asteroids, and comets that have collided with the lunar surface over billions of years.\n\n2. Lunar Maria: The dark areas on the moon, visible from Earth, are called lunar maria. These are low-lying areas that were filled with molten lava following impacts with large meteoroids or comets, and then solidified.\n\n3. Lack of Atmosphere: The moon has no significant atmosphere to protect its surface or to retain heat. This results in extreme temperature changes on the lunar surface, ranging from -173°C (-279°F) during its night to 127°C (261°F) during its day."}], "loss_weight": 1.0}]}
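The records above follow the new openai-style schema: one JSON object per line, a `messages` list of role/content pairs, and a per-message `loss_weight` (0.0 masks a message out of the training loss). A minimal sketch of reading such a file — illustrative only, not LLaMA-Factory's actual loader; `load_sft_records` and `trainable_text` are hypothetical helper names:

```python
import json

def load_sft_records(path):
    """Read one JSON object per non-empty line of a JSONL file."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            if line.strip():
                records.append(json.loads(line))
    return records

def trainable_text(record):
    """Concatenate text chunks from messages that contribute to the loss,
    i.e. those with loss_weight > 0 (assistant turns in the samples above)."""
    parts = []
    for message in record["messages"]:
        if message.get("loss_weight", 1.0) > 0.0:
            parts.extend(
                chunk["value"]
                for chunk in message["content"]
                if chunk["type"] == "text"
            )
    return "\n".join(parts)
```

With this masking convention, user turns (`loss_weight: 0.0`) still condition the model but are excluded from the loss, matching the pattern in every record above.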

data/v1_sft_demo.yaml Normal file

@@ -0,0 +1,8 @@
identity:
  file_name: identity.json
  converter: alpaca
alpaca_en_demo:
  file_name: alpaca_en_demo.json
  dataset_dir: ~/data
  converter: alpaca
  num_samples: 500
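Each top-level key in the registry above names a dataset, and its value holds the loader options. A sketch of how such an entry might be expanded into concrete loading parameters once parsed — the `resolve` helper and its defaults are assumptions for illustration, not LLaMA-Factory's actual API:

```python
import os

# Plain-dict mirror of data/v1_sft_demo.yaml after YAML parsing.
registry = {
    "identity": {"file_name": "identity.json", "converter": "alpaca"},
    "alpaca_en_demo": {
        "file_name": "alpaca_en_demo.json",
        "dataset_dir": "~/data",
        "converter": "alpaca",
        "num_samples": 500,
    },
}

def resolve(spec, default_dir="data"):
    """Expand a registry entry: join dataset_dir (with ~ expansion)
    and file_name into a path, and carry the remaining options."""
    base = os.path.expanduser(spec.get("dataset_dir", default_dir))
    return {
        "path": os.path.join(base, spec["file_name"]),
        "converter": spec.get("converter"),
        "num_samples": spec.get("num_samples"),  # None means use all samples
    }

plans = {name: resolve(spec) for name, spec in registry.items()}
```

Note that `num_samples` is optional: the `identity` entry omits it and uses the full dataset, while `alpaca_en_demo` subsamples 500 records.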

File diff suppressed because one or more lines are too long


@@ -1,59 +1,66 @@
# https://hub.docker.com/r/hiyouga/pytorch/tags
ARG BASE_IMAGE=hiyouga/pytorch:th2.6.0-cu124-flashattn2.7.4-cxx11abi0-devel
FROM ${BASE_IMAGE}

# Installation arguments
ARG PIP_INDEX=https://pypi.org/simple
ARG EXTRAS=metrics
ARG INSTALL_FLASHATTN=false
ARG HTTP_PROXY=""

# Define environments
ENV MAX_JOBS=16
ENV FLASH_ATTENTION_FORCE_BUILD=TRUE
ENV VLLM_WORKER_MULTIPROC_METHOD=spawn
ENV DEBIAN_FRONTEND=noninteractive
ENV NODE_OPTIONS=""
ENV PIP_ROOT_USER_ACTION=ignore
ENV http_proxy="${HTTP_PROXY}"
ENV https_proxy="${HTTP_PROXY}"

# Use Bash instead of default /bin/sh
SHELL ["/bin/bash", "-c"]

# Set the working directory
WORKDIR /app

# Change pip source
RUN pip config set global.index-url "${PIP_INDEX}" && \
    pip config set global.extra-index-url "${PIP_INDEX}" && \
    pip install --no-cache-dir --upgrade pip packaging wheel setuptools

# Install the requirements
COPY requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application into the image
COPY . /app

# Install LLaMA Factory
RUN pip install --no-cache-dir -e ".[${EXTRAS}]" --no-build-isolation

# Rebuild flash attention
RUN if [ "${INSTALL_FLASHATTN}" == "true" ]; then \
        pip uninstall -y ninja && \
        pip install --no-cache-dir ninja && \
        pip install --no-cache-dir flash-attn --no-build-isolation; \
    fi

# Set up volumes
# VOLUME [ "/root/.cache/huggingface", "/app/shared_data", "/app/output" ]

# Expose port 7860 for LLaMA Board
ENV GRADIO_SERVER_PORT=7860
EXPOSE 7860

# Expose port 8000 for API service
ENV API_PORT=8000
EXPOSE 8000

# unset proxy
ENV http_proxy=
ENV https_proxy=

# Reset pip config
RUN pip config unset global.index-url && \
    pip config unset global.extra-index-url


@@ -0,0 +1,55 @@
# Start from the pytorch official image (ubuntu-22.04 + cuda-12.4.1 + python-3.11)
# https://hub.docker.com/r/pytorch/pytorch/tags
FROM pytorch/pytorch:2.6.0-cuda12.4-cudnn9-devel
# Define environments
ENV MAX_JOBS=16
ENV VLLM_WORKER_MULTIPROC_METHOD=spawn
ENV DEBIAN_FRONTEND=noninteractive
ENV NODE_OPTIONS=""
ENV PIP_ROOT_USER_ACTION=ignore
# Define installation arguments
ARG APT_SOURCE=https://mirrors.tuna.tsinghua.edu.cn/ubuntu/
ARG PIP_INDEX=https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple
# Set apt source
RUN cp /etc/apt/sources.list /etc/apt/sources.list.bak && \
{ \
echo "deb ${APT_SOURCE} jammy main restricted universe multiverse"; \
echo "deb ${APT_SOURCE} jammy-updates main restricted universe multiverse"; \
echo "deb ${APT_SOURCE} jammy-backports main restricted universe multiverse"; \
echo "deb ${APT_SOURCE} jammy-security main restricted universe multiverse"; \
} > /etc/apt/sources.list
# Install systemctl and wget
RUN apt-get update && \
apt-get install -y -o Dpkg::Options::="--force-confdef" systemd wget && \
apt-get clean
# Install git and vim
RUN apt-get update && \
apt-get install -y git vim && \
apt-get clean
# Install gcc and g++
RUN apt-get update && \
apt-get install -y gcc g++ && \
apt-get clean
# Change pip source
RUN pip config set global.index-url "${PIP_INDEX}" && \
pip config set global.extra-index-url "${PIP_INDEX}" && \
pip install --no-cache-dir --upgrade pip packaging wheel setuptools
# Install flash-attn-2.7.4.post1 (cxx11abi=False)
RUN wget -nv https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl && \
pip install --no-cache-dir flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
# Install flashinfer-0.2.2.post1+cu124 (cxx11abi=False)
RUN wget -nv https://github.com/flashinfer-ai/flashinfer/releases/download/v0.2.2.post1/flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl && \
pip install --no-cache-dir flashinfer_python-0.2.2.post1+cu124torch2.6-cp38-abi3-linux_x86_64.whl
# Reset pip config
RUN pip config unset global.index-url && \
pip config unset global.extra-index-url


@@ -0,0 +1,77 @@
# NVIDIA official image (ubuntu-22.04 + cuda-12.4 + python-3.10)
# https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/rel-24-08.html
FROM nvcr.io/nvidia/pytorch:24.05-py3
ENV DEBIAN_FRONTEND=noninteractive
ENV PIP_ROOT_USER_ACTION=ignore
ENV PYPI_MIRROR=https://mirrors.aliyun.com/pypi/simple/
ENV PYPI_TRUSTED_HOST=mirrors.aliyun.com
ENV APT_MIRROR=https://mirrors.tuna.tsinghua.edu.cn/ubuntu/
RUN pip install --upgrade pip setuptools wheel --trusted-host ${PYPI_TRUSTED_HOST} --index-url ${PYPI_MIRROR}
RUN pip uninstall -y torch torchvision torch-tensorrt \
flash_attn transformer-engine \
cudf dask-cuda cugraph cugraph-service-server cuml raft-dask cugraph-dgl cugraph-pyg dask-cudf
RUN pip install torch==2.6.0 torchvision==0.21.0 torchaudio==2.6.0 --index-url https://download.pytorch.org/whl/cu124
RUN pip uninstall -y opencv opencv-python opencv-python-headless && \
rm -rf /usr/local/lib/python3.10/dist-packages/cv2/ && \
pip install opencv-python-headless==4.11.0.86 --trusted-host ${PYPI_TRUSTED_HOST} --index-url ${PYPI_MIRROR}
RUN pip install "numpy==1.26.4" "optree>=0.13.0" "spacy==3.7.5" "weasel==0.4.1" \
transformer-engine[pytorch]==2.2.0 megatron-core==0.13.0 deepspeed==0.16.4 \
--trusted-host ${PYPI_TRUSTED_HOST} --index-url ${PYPI_MIRROR}
RUN pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.2.post1/flash_attn-2.7.2.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
# RUN pip install vllm==0.8.4 \
# --trusted-host ${PYPI_TRUSTED_HOST} --index-url ${PYPI_MIRROR}
WORKDIR /build
ARG apex_url=git+https://github.com/NVIDIA/apex.git@25.04
RUN pip uninstall -y apex && \
MAX_JOBS=32 NINJA_FLAGS="-j32" NVCC_APPEND_FLAGS="--threads 32" \
pip install -v --disable-pip-version-check --no-cache-dir --no-build-isolation \
--config-settings "--build-option=--cpp_ext --cuda_ext --parallel 32" ${apex_url}
RUN rm -rf /build
WORKDIR /workspace
RUN cp /etc/apt/sources.list /etc/apt/sources.list.bak && \
{ \
echo "deb ${APT_MIRROR} jammy main restricted universe multiverse"; \
echo "deb ${APT_MIRROR} jammy-security main restricted universe multiverse"; \
echo "deb ${APT_MIRROR} jammy-updates main restricted universe multiverse"; \
echo "deb ${APT_MIRROR} jammy-backports main restricted universe multiverse"; \
} > /etc/apt/sources.list
RUN apt-get update && apt-get install -y zip
RUN apt-get install -y openjdk-21-jdk
ENV JAVA_HOME /usr/lib/jvm/java-21-openjdk-amd64
# pip install LLaMA-Factory
WORKDIR /app
COPY requirements.txt /app/
RUN pip install --no-cache-dir -r requirements.txt
RUN pip install "git+https://github.com/alibaba/roll.git#subdirectory=mcore_adapter"
COPY . /app/
RUN pip install -e ".[metrics]" --no-build-isolation
# Expose port 7860 for LLaMA Board
ENV GRADIO_SERVER_PORT=7860
EXPOSE 7860
# Expose port 8000 for API service
ENV API_PORT=8000
EXPOSE 8000
# unset proxy
ENV http_proxy=
ENV https_proxy=


@@ -0,0 +1,111 @@
# Docker Setup for NVIDIA GPUs
This directory contains Docker configuration files for running LLaMA Factory with NVIDIA GPU support.
## Prerequisites
### Linux-specific Requirements
Before running the Docker container with GPU support, you need to install the following packages:
1. **Docker**: The container runtime
```bash
# Ubuntu/Debian
sudo apt-get update
sudo apt-get install docker.io
# Or install Docker Engine from the official repository:
# https://docs.docker.com/engine/install/
```
2. **Docker Compose** (if using the docker-compose method):
```bash
# Ubuntu/Debian
sudo apt-get install docker-compose
# Or install the latest version:
# https://docs.docker.com/compose/install/
```
3. **NVIDIA Container Toolkit** (required for GPU support):
```bash
# Add the NVIDIA GPG key and repository
distribution=$(. /etc/os-release;echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/nvidia-docker/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/nvidia-docker/$distribution/nvidia-docker.list | sudo tee /etc/apt/sources.list.d/nvidia-docker.list
# Install nvidia-container-toolkit
sudo apt-get update
sudo apt-get install -y nvidia-container-toolkit
# Restart Docker to apply changes
sudo systemctl restart docker
```
**Note**: Without `nvidia-container-toolkit`, the Docker container will not be able to access your NVIDIA GPU.
### Verify GPU Access
After installation, verify that Docker can access your GPU:
```bash
sudo docker run --rm --gpus all nvidia/cuda:12.4.0-base-ubuntu22.04 nvidia-smi
```
If successful, you should see your GPU information displayed.
## Usage
### Using Docker Compose (Recommended)
```bash
cd docker/docker-cuda/
docker compose up -d
docker compose exec llamafactory bash
```
### Using Docker Run
```bash
# Build the image
docker build -f ./docker/docker-cuda/Dockerfile \
--build-arg PIP_INDEX=https://pypi.org/simple \
--build-arg EXTRAS=metrics \
-t llamafactory:latest .
# Run the container
docker run -dit --ipc=host --gpus=all \
-p 7860:7860 \
-p 8000:8000 \
--name llamafactory \
llamafactory:latest
# Enter the container
docker exec -it llamafactory bash
```
## Troubleshooting
### GPU Not Detected
If your GPU is not detected inside the container:
1. Ensure `nvidia-container-toolkit` is installed
2. Check that the Docker daemon has been restarted after installation
3. Verify your NVIDIA drivers are properly installed: `nvidia-smi`
4. Check Docker GPU support: `docker run --rm --gpus all ubuntu nvidia-smi`
### Permission Denied
If you get permission errors, ensure your user is in the docker group:
```bash
sudo usermod -aG docker $USER
# Log out and back in for changes to take effect
```
## Additional Notes
- The default image is built on Ubuntu 22.04 (x86_64), CUDA 12.4, Python 3.11, PyTorch 2.6.0, and Flash-attn 2.7.4
- For different CUDA versions, you may need to adjust the base image in the Dockerfile
- Make sure your NVIDIA driver version is compatible with the CUDA version used in the Docker image


@@ -4,22 +4,15 @@ services:
      dockerfile: ./docker/docker-cuda/Dockerfile
      context: ../..
      args:
        PIP_INDEX: https://pypi.org/simple
        EXTRAS: metrics
    container_name: llamafactory
    ports:
      - "7860:7860"
      - "8000:8000"
    ipc: host
    tty: true
    # shm_size: "16gb"  # ipc: host is set
    stdin_open: true
    command: bash
    deploy:
@@ -28,5 +21,5 @@ services:
        devices:
          - driver: nvidia
            count: "all"
            capabilities: [ gpu ]
    restart: unless-stopped


@@ -1,45 +1,67 @@
# https://hub.docker.com/r/ascendai/cann/tags
# default base image build for A2, if build for A3, using this image:
# ARG BASE_IMAGE=ascendai/cann:8.3.rc1-a3-ubuntu22.04-py3.11
ARG BASE_IMAGE=ascendai/cann:8.3.rc1-910b-ubuntu22.04-py3.11
FROM ${BASE_IMAGE}

# Installation arguments
ARG PIP_INDEX=https://pypi.org/simple
ARG EXTRAS=torch-npu,metrics
ARG HTTP_PROXY=""
ARG PYTORCH_INDEX=https://download.pytorch.org/whl/cpu

# Define environments
ENV MAX_JOBS=16
ENV FLASH_ATTENTION_FORCE_BUILD=TRUE
ENV VLLM_WORKER_MULTIPROC_METHOD=spawn
ENV DEBIAN_FRONTEND=noninteractive
ENV NODE_OPTIONS=""
ENV PIP_ROOT_USER_ACTION=ignore
ENV http_proxy="${HTTP_PROXY}"
ENV https_proxy="${HTTP_PROXY}"

# Use Bash instead of default /bin/sh
SHELL ["/bin/bash", "-c"]

# Set the working directory
WORKDIR /app

# Change pip source
RUN pip config set global.index-url "${PIP_INDEX}" && \
    pip config set global.extra-index-url "${PIP_INDEX}" && \
    pip install --no-cache-dir --upgrade pip packaging wheel setuptools

# Install torch-npu
RUN pip uninstall -y torch torchvision torchaudio && \
    pip install --no-cache-dir "torch==2.7.1" "torch-npu==2.7.1" "torchvision==0.22.1" --index-url "${PYTORCH_INDEX}"

# Install the requirements
COPY requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application into the image
COPY . /app

# Install LLaMA Factory
RUN pip install --no-cache-dir -e ".[${EXTRAS}]" --no-build-isolation

# Set up volumes
# VOLUME [ "/root/.cache/huggingface", "/app/shared_data", "/app/output" ]

# Expose port 7860 for LLaMA Board
ENV GRADIO_SERVER_PORT=7860
EXPOSE 7860

# Expose port 8000 for API service
ENV API_PORT=8000
EXPOSE 8000

# unset proxy
ENV http_proxy=
ENV https_proxy=

# Reset pip config
RUN pip config unset global.index-url && \
    pip config unset global.extra-index-url


@@ -4,14 +4,10 @@ services:
      dockerfile: ./docker/docker-npu/Dockerfile
      context: ../..
      args:
        PIP_INDEX: https://pypi.org/simple
        EXTRAS: torch-npu,metrics
    container_name: llamafactory
    volumes:
      - /usr/local/dcmi:/usr/local/dcmi
      - /usr/local/bin/npu-smi:/usr/local/bin/npu-smi
      - /usr/local/Ascend/driver:/usr/local/Ascend/driver
@@ -21,6 +17,7 @@ services:
      - "8000:8000"
    ipc: host
    tty: true
    # shm_size: "16gb"  # ipc: host is set
    stdin_open: true
    command: bash
    devices:


@@ -0,0 +1,77 @@
# https://hub.docker.com/r/rocm/pytorch/tags
ARG BASE_IMAGE=rocm/pytorch:rocm6.4.1_ubuntu22.04_py3.10_pytorch_release_2.6.0
FROM ${BASE_IMAGE}
# Installation arguments
ARG PIP_INDEX=https://pypi.org/simple
ARG EXTRAS=metrics
ARG INSTALL_FLASHATTN=false
ARG HTTP_PROXY=""
ARG PYTORCH_INDEX=https://download.pytorch.org/whl/rocm6.3
# Define environments
ENV MAX_JOBS=16
ENV FLASH_ATTENTION_FORCE_BUILD=TRUE
ENV VLLM_WORKER_MULTIPROC_METHOD=spawn
ENV DEBIAN_FRONTEND=noninteractive
ENV NODE_OPTIONS=""
ENV PIP_ROOT_USER_ACTION=ignore
ENV http_proxy="${HTTP_PROXY}"
ENV https_proxy="${HTTP_PROXY}"
# Use Bash instead of default /bin/sh
SHELL ["/bin/bash", "-c"]
# Set the working directory
WORKDIR /app
# Change pip source
RUN pip config set global.index-url "${PIP_INDEX}" && \
pip config set global.extra-index-url "${PIP_INDEX}" && \
pip install --no-cache-dir --upgrade pip packaging wheel setuptools
# Reinstall pytorch rocm
RUN pip uninstall -y torch torchvision torchaudio && \
pip install --no-cache-dir --pre torch torchvision torchaudio --index-url "${PYTORCH_INDEX}"
# Install the requirements
COPY requirements.txt /app
RUN pip install --no-cache-dir -r requirements.txt
# Copy the rest of the application into the image
COPY . /app
# Install LLaMA Factory
RUN pip install --no-cache-dir -e ".[${EXTRAS}]" --no-build-isolation
# Rebuild flash attention
RUN if [ "${INSTALL_FLASHATTN}" == "true" ]; then \
pip uninstall -y ninja && \
pip install --no-cache-dir ninja && \
pip install --no-cache-dir flash-attn --no-build-isolation; \
fi
# Set up volumes
# VOLUME [ "/root/.cache/huggingface", "/app/shared_data", "/app/output" ]
# Expose port 7860 for LLaMA Board
ENV GRADIO_SERVER_PORT=7860
EXPOSE 7860
# Expose port 8000 for API service
ENV API_PORT=8000
EXPOSE 8000
# unset proxy
ENV http_proxy=
ENV https_proxy=
# Set no_proxy environment variable
ENV no_proxy="localhost, 127.0.0.1, ::1"
# fix pydantic version
RUN pip install pydantic==2.10.6
# Reset pip config
RUN pip config unset global.index-url && \
pip config unset global.extra-index-url


@@ -0,0 +1,21 @@
services:
  llamafactory:
    build:
      dockerfile: ./docker/docker-rocm/Dockerfile
      context: ../..
      args:
        PIP_INDEX: https://pypi.org/simple
        EXTRAS: metrics
    container_name: llamafactory
    ports:
      - "7860:7860"
      - "8000:8000"
    ipc: host
    tty: true
    # shm_size: "16gb"  # ipc: host is set
    stdin_open: true
    command: bash
    devices:
      - /dev/kfd:/dev/kfd
      - /dev/dri:/dev/dri
    restart: unless-stopped

View File

@@ -1,161 +0,0 @@
# Copyright 2020 The HuggingFace Datasets Authors and the current dataset script contributor.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import datasets
import pandas as pd
_CITATION = """\
@article{huang2023ceval,
title={C-Eval: A Multi-Level Multi-Discipline Chinese Evaluation Suite for Foundation Models},
author={Huang, Yuzhen and Bai, Yuzhuo and Zhu, Zhihao and Zhang, Junlei and Zhang, Jinghan and Su, Tangjun and Liu, Junteng and Lv, Chuancheng and Zhang, Yikai and Lei, Jiayi and Fu, Yao and Sun, Maosong and He, Junxian},
journal={arXiv preprint arXiv:2305.08322},
year={2023}
}
"""
_DESCRIPTION = """\
C-Eval is a comprehensive Chinese evaluation suite for foundation models. It consists of 13948 multi-choice questions spanning 52 diverse disciplines and four difficulty levels.
"""
_HOMEPAGE = "https://cevalbenchmark.com"
_LICENSE = "Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License"
_URL = "ceval.zip"
task_list = [
"computer_network",
"operating_system",
"computer_architecture",
"college_programming",
"college_physics",
"college_chemistry",
"advanced_mathematics",
"probability_and_statistics",
"discrete_mathematics",
"electrical_engineer",
"metrology_engineer",
"high_school_mathematics",
"high_school_physics",
"high_school_chemistry",
"high_school_biology",
"middle_school_mathematics",
"middle_school_biology",
"middle_school_physics",
"middle_school_chemistry",
"veterinary_medicine",
"college_economics",
"business_administration",
"marxism",
"mao_zedong_thought",
"education_science",
"teacher_qualification",
"high_school_politics",
"high_school_geography",
"middle_school_politics",
"middle_school_geography",
"modern_chinese_history",
"ideological_and_moral_cultivation",
"logic",
"law",
"chinese_language_and_literature",
"art_studies",
"professional_tour_guide",
"legal_professional",
"high_school_chinese",
"high_school_history",
"middle_school_history",
"civil_servant",
"sports_science",
"plant_protection",
"basic_medicine",
"clinical_medicine",
"urban_and_rural_planner",
"accountant",
"fire_engineer",
"environmental_impact_assessment_engineer",
"tax_accountant",
"physician",
]
class CevalConfig(datasets.BuilderConfig):
def __init__(self, **kwargs):
super().__init__(version=datasets.Version("1.0.0"), **kwargs)
class Ceval(datasets.GeneratorBasedBuilder):
BUILDER_CONFIGS = [
CevalConfig(
name=task_name,
)
for task_name in task_list
]
def _info(self):
features = datasets.Features(
{
"id": datasets.Value("int32"),
"question": datasets.Value("string"),
"A": datasets.Value("string"),
"B": datasets.Value("string"),
"C": datasets.Value("string"),
"D": datasets.Value("string"),
"answer": datasets.Value("string"),
"explanation": datasets.Value("string"),
}
)
return datasets.DatasetInfo(
description=_DESCRIPTION,
features=features,
homepage=_HOMEPAGE,
license=_LICENSE,
citation=_CITATION,
)
def _split_generators(self, dl_manager):
data_dir = dl_manager.download_and_extract(_URL)
task_name = self.config.name
return [
datasets.SplitGenerator(
name=datasets.Split.TEST,
gen_kwargs={
"filepath": os.path.join(data_dir, "test", f"{task_name}_test.csv"),
},
),
datasets.SplitGenerator(
name=datasets.Split.VALIDATION,
gen_kwargs={
"filepath": os.path.join(data_dir, "val", f"{task_name}_val.csv"),
},
),
datasets.SplitGenerator(
name=datasets.Split.TRAIN,
gen_kwargs={
"filepath": os.path.join(data_dir, "dev", f"{task_name}_dev.csv"),
},
),
]
def _generate_examples(self, filepath):
df = pd.read_csv(filepath, encoding="utf-8")
for i, instance in enumerate(df.to_dict(orient="records")):
if "answer" not in instance.keys():
instance["answer"] = ""
if "explanation" not in instance.keys():
instance["explanation"] = ""
yield i, instance
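The C-Eval test split ships without `answer`/`explanation` columns, which is why `_generate_examples` backfills empty strings. A minimal sketch of that fallback on an in-memory CSV (hypothetical data, no download required), assuming only pandas:

```python
import pandas as pd
from io import StringIO

# Mimic a test-split CSV, which lacks the "answer"/"explanation" columns.
csv_text = "id,question,A,B,C,D\n0,1+1=?,1,2,3,4\n"
df = pd.read_csv(StringIO(csv_text), encoding="utf-8")

examples = []
for i, instance in enumerate(df.to_dict(orient="records")):
    # Same fallback as _generate_examples: fill absent fields with "".
    if "answer" not in instance:
        instance["answer"] = ""
    if "explanation" not in instance:
        instance["explanation"] = ""
    examples.append((i, instance))
```

Every yielded example therefore always matches the declared `datasets.Features` schema, even for splits where the labels are withheld.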

View File

@@ -1,168 +0,0 @@
# Copyright 2020 The HuggingFace Datasets Authors and the current dataset script contributor.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import datasets
import pandas as pd
_CITATION = """\
@article{li2023cmmlu,
title={CMMLU: Measuring massive multitask language understanding in Chinese},
author={Haonan Li and Yixuan Zhang and Fajri Koto and Yifei Yang and Hai Zhao and Yeyun Gong and Nan Duan and Timothy Baldwin},
journal={arXiv preprint arXiv:2306.09212},
year={2023}
}
"""
_DESCRIPTION = """\
CMMLU is a comprehensive Chinese assessment suite specifically designed to evaluate the advanced knowledge and reasoning abilities of LLMs within the Chinese language and cultural context.
"""
_HOMEPAGE = "https://github.com/haonan-li/CMMLU"
_LICENSE = "Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License"
_URL = "cmmlu.zip"
task_list = [
"agronomy",
"anatomy",
"ancient_chinese",
"arts",
"astronomy",
"business_ethics",
"chinese_civil_service_exam",
"chinese_driving_rule",
"chinese_food_culture",
"chinese_foreign_policy",
"chinese_history",
"chinese_literature",
"chinese_teacher_qualification",
"clinical_knowledge",
"college_actuarial_science",
"college_education",
"college_engineering_hydrology",
"college_law",
"college_mathematics",
"college_medical_statistics",
"college_medicine",
"computer_science",
"computer_security",
"conceptual_physics",
"construction_project_management",
"economics",
"education",
"electrical_engineering",
"elementary_chinese",
"elementary_commonsense",
"elementary_information_and_technology",
"elementary_mathematics",
"ethnology",
"food_science",
"genetics",
"global_facts",
"high_school_biology",
"high_school_chemistry",
"high_school_geography",
"high_school_mathematics",
"high_school_physics",
"high_school_politics",
"human_sexuality",
"international_law",
"journalism",
"jurisprudence",
"legal_and_moral_basis",
"logical",
"machine_learning",
"management",
"marketing",
"marxist_theory",
"modern_chinese",
"nutrition",
"philosophy",
"professional_accounting",
"professional_law",
"professional_medicine",
"professional_psychology",
"public_relations",
"security_study",
"sociology",
"sports_science",
"traditional_chinese_medicine",
"virology",
"world_history",
"world_religions",
]
class CMMLUConfig(datasets.BuilderConfig):
def __init__(self, **kwargs):
super().__init__(version=datasets.Version("1.0.1"), **kwargs)
class CMMLU(datasets.GeneratorBasedBuilder):
BUILDER_CONFIGS = [
CMMLUConfig(
name=task_name,
)
for task_name in task_list
]
def _info(self):
features = datasets.Features(
{
"question": datasets.Value("string"),
"A": datasets.Value("string"),
"B": datasets.Value("string"),
"C": datasets.Value("string"),
"D": datasets.Value("string"),
"answer": datasets.Value("string"),
}
)
return datasets.DatasetInfo(
description=_DESCRIPTION,
features=features,
homepage=_HOMEPAGE,
license=_LICENSE,
citation=_CITATION,
)
def _split_generators(self, dl_manager):
data_dir = dl_manager.download_and_extract(_URL)
task_name = self.config.name
return [
datasets.SplitGenerator(
name=datasets.Split.TEST,
gen_kwargs={
"filepath": os.path.join(data_dir, f"test/{task_name}.csv"),
},
),
datasets.SplitGenerator(
name=datasets.Split.TRAIN,
gen_kwargs={
"filepath": os.path.join(data_dir, f"dev/{task_name}.csv"),
},
),
]
def _generate_examples(self, filepath):
df = pd.read_csv(filepath, header=0, index_col=0, encoding="utf-8")
for i, instance in enumerate(df.to_dict(orient="records")):
question = instance.pop("Question", "")
answer = instance.pop("Answer", "")
instance["question"] = question
instance["answer"] = answer
yield i, instance
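CMMLU CSVs carry a leading index column and capitalized `Question`/`Answer` headers, which `_generate_examples` normalizes to the lowercase schema. A small sketch of that renaming on hypothetical in-memory data:

```python
import pandas as pd
from io import StringIO

# Mimic a CMMLU CSV: unnamed index column, capitalized headers.
csv_text = ",Question,A,B,C,D,Answer\n0,2+2=?,3,4,5,6,B\n"
df = pd.read_csv(StringIO(csv_text), header=0, index_col=0, encoding="utf-8")

rows = []
for i, instance in enumerate(df.to_dict(orient="records")):
    # Same normalization as _generate_examples: pop the capitalized
    # keys and re-insert them under the lowercase schema names.
    instance["question"] = instance.pop("Question", "")
    instance["answer"] = instance.pop("Answer", "")
    rows.append(instance)
```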

View File

@@ -1,162 +0,0 @@
# Copyright 2020 The HuggingFace Datasets Authors and the current dataset script contributor.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import datasets
import pandas as pd
_CITATION = """\
@article{hendryckstest2021,
title={Measuring Massive Multitask Language Understanding},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
"""
_DESCRIPTION = """\
Measuring Massive Multitask Language Understanding by Dan Hendrycks, Collin Burns, Steven Basart, Andy Zou, Mantas Mazeika, Dawn Song, and Jacob Steinhardt (ICLR 2021).
"""
_HOMEPAGE = "https://github.com/hendrycks/test"
_LICENSE = "MIT"
_URL = "mmlu.zip"
task_list = [
"high_school_european_history",
"business_ethics",
"clinical_knowledge",
"medical_genetics",
"high_school_us_history",
"high_school_physics",
"high_school_world_history",
"virology",
"high_school_microeconomics",
"econometrics",
"college_computer_science",
"high_school_biology",
"abstract_algebra",
"professional_accounting",
"philosophy",
"professional_medicine",
"nutrition",
"global_facts",
"machine_learning",
"security_studies",
"public_relations",
"professional_psychology",
"prehistory",
"anatomy",
"human_sexuality",
"college_medicine",
"high_school_government_and_politics",
"college_chemistry",
"logical_fallacies",
"high_school_geography",
"elementary_mathematics",
"human_aging",
"college_mathematics",
"high_school_psychology",
"formal_logic",
"high_school_statistics",
"international_law",
"high_school_mathematics",
"high_school_computer_science",
"conceptual_physics",
"miscellaneous",
"high_school_chemistry",
"marketing",
"professional_law",
"management",
"college_physics",
"jurisprudence",
"world_religions",
"sociology",
"us_foreign_policy",
"high_school_macroeconomics",
"computer_security",
"moral_scenarios",
"moral_disputes",
"electrical_engineering",
"astronomy",
"college_biology",
]
class MMLUConfig(datasets.BuilderConfig):
def __init__(self, **kwargs):
super().__init__(version=datasets.Version("1.0.0"), **kwargs)
class MMLU(datasets.GeneratorBasedBuilder):
BUILDER_CONFIGS = [
MMLUConfig(
name=task_name,
)
for task_name in task_list
]
def _info(self):
features = datasets.Features(
{
"question": datasets.Value("string"),
"A": datasets.Value("string"),
"B": datasets.Value("string"),
"C": datasets.Value("string"),
"D": datasets.Value("string"),
"answer": datasets.Value("string"),
}
)
return datasets.DatasetInfo(
description=_DESCRIPTION,
features=features,
homepage=_HOMEPAGE,
license=_LICENSE,
citation=_CITATION,
)
def _split_generators(self, dl_manager):
data_dir = dl_manager.download_and_extract(_URL)
task_name = self.config.name
return [
datasets.SplitGenerator(
name=datasets.Split.TEST,
gen_kwargs={
"filepath": os.path.join(data_dir, "data", "test", f"{task_name}_test.csv"),
},
),
datasets.SplitGenerator(
name=datasets.Split.VALIDATION,
gen_kwargs={
"filepath": os.path.join(data_dir, "data", "val", f"{task_name}_val.csv"),
},
),
datasets.SplitGenerator(
name=datasets.Split.TRAIN,
gen_kwargs={
"filepath": os.path.join(data_dir, "data", "dev", f"{task_name}_dev.csv"),
},
),
]
def _generate_examples(self, filepath):
df = pd.read_csv(filepath, header=None)
df.columns = ["question", "A", "B", "C", "D", "answer"]
for i, instance in enumerate(df.to_dict(orient="records")):
yield i, instance
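Unlike the other two suites, MMLU CSVs have no header row at all, so the script reads with `header=None` and assigns the column names itself. A sketch of that step on a hypothetical one-row CSV:

```python
import pandas as pd
from io import StringIO

# Mimic a headerless MMLU CSV row: question, four options, answer letter.
csv_text = "What is 2+2?,3,4,5,6,B\n"
df = pd.read_csv(StringIO(csv_text), header=None)
df.columns = ["question", "A", "B", "C", "D", "answer"]

records = df.to_dict(orient="records")
```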

View File

@@ -13,6 +13,26 @@ Make sure to execute these commands in the `LLaMA-Factory` directory.
Use `CUDA_VISIBLE_DEVICES` (GPU) or `ASCEND_RT_VISIBLE_DEVICES` (NPU) to choose computing devices.
By default, LLaMA-Factory uses all visible computing devices.
Basic usage:
```bash
llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
```
Advanced usage:
```bash
CUDA_VISIBLE_DEVICES=0,1 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml \
    learning_rate=1e-5 \
    logging_steps=1
```
```bash
bash examples/train_lora/llama3_lora_sft.sh
```
## Examples
### LoRA Fine-Tuning
@@ -32,7 +52,19 @@ llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
#### Multimodal Supervised Fine-Tuning
```bash
llamafactory-cli train examples/train_lora/qwen2_5vl_lora_sft.yaml
```
#### DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/llama3_lora_dpo.yaml
```
#### Multimodal DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/qwen2_5vl_lora_dpo.yaml
```
#### Reward Modeling
@@ -47,12 +79,6 @@ llamafactory-cli train examples/train_lora/llama3_lora_reward.yaml
llamafactory-cli train examples/train_lora/llama3_lora_ppo.yaml
```
#### DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/llama3_lora_dpo.yaml
```
#### KTO Training
```bash
@@ -73,17 +99,11 @@ llamafactory-cli train examples/train_lora/llama3_preprocess.yaml
llamafactory-cli eval examples/train_lora/llama3_lora_eval.yaml
```
#### Batch Predicting and Computing BLEU and ROUGE Scores
```bash
llamafactory-cli train examples/train_lora/llama3_lora_predict.yaml
```
#### Supervised Fine-Tuning on Multiple Nodes
```bash
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=0 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
```
#### Supervised Fine-Tuning with DeepSpeed ZeRO-3 (Weight Sharding)
@@ -92,6 +112,12 @@ FORCE_TORCHRUN=1 NNODES=2 RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llama
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ds3.yaml
```
#### Supervised Fine-Tuning with Ray on 4 GPUs
```bash
USE_RAY=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ray.yaml
```
### QLoRA Fine-Tuning
#### Supervised Fine-Tuning with 4/8-bit Bitsandbytes/HQQ/EETQ Quantization (Recommended)
@@ -100,6 +126,12 @@ FORCE_TORCHRUN=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ds3.
llamafactory-cli train examples/train_qlora/llama3_lora_sft_otfq.yaml
```
#### Supervised Fine-Tuning with 4-bit Bitsandbytes Quantization on Ascend NPU
```bash
llamafactory-cli train examples/train_qlora/llama3_lora_sft_bnb_npu.yaml
```
#### Supervised Fine-Tuning with 4/8-bit GPTQ Quantization
```bash
@@ -123,20 +155,28 @@ llamafactory-cli train examples/train_qlora/llama3_lora_sft_aqlm.yaml
#### Supervised Fine-Tuning on Single Node
```bash
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
#### Supervised Fine-Tuning on Multiple Nodes
```bash
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=0 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
### Elastic and Fault-Tolerant Supervised Fine-Tuning on Multiple Nodes
To launch an elastic job with up to `MAX_RESTARTS` restarts on failure, run the following on at least `MIN_NNODES` and at most `MAX_NNODES` nodes. `RDZV_ID` should be set to a unique job ID shared by all nodes participating in the job. See also [torchrun](https://docs.pytorch.org/docs/stable/elastic/run.html).
```bash
FORCE_TORCHRUN=1 MIN_NNODES=1 MAX_NNODES=3 MAX_RESTARTS=3 RDZV_ID=llamafactory MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
#### Multimodal Supervised Fine-Tuning
```bash
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_full/qwen2_5vl_full_sft.yaml
```
### Merging LoRA Adapters and Quantization
@@ -155,15 +195,28 @@ llamafactory-cli export examples/merge_lora/llama3_lora_sft.yaml
llamafactory-cli export examples/merge_lora/llama3_gptq.yaml
```
### Save Ollama modelfile
```bash
llamafactory-cli export examples/merge_lora/llama3_full_sft.yaml
```
### Inferring LoRA Fine-Tuned Models
#### Evaluation using vLLM's Multi-GPU Inference
```
python scripts/vllm_infer.py --model_name_or_path meta-llama/Meta-Llama-3-8B-Instruct --template llama3 --dataset alpaca_en_demo
python scripts/eval_bleu_rouge.py generated_predictions.jsonl
```
#### Use CLI ChatBox
```bash
llamafactory-cli chat examples/inference/llama3_lora_sft.yaml
```
#### Use Web UI ChatBox
```bash
llamafactory-cli webchat examples/inference/llama3_lora_sft.yaml
@@ -183,12 +236,30 @@ llamafactory-cli api examples/inference/llama3_lora_sft.yaml
llamafactory-cli train examples/extras/galore/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using APOLLO
```bash
llamafactory-cli train examples/extras/apollo/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using BAdam
```bash
llamafactory-cli train examples/extras/badam/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using Adam-mini
```bash
llamafactory-cli train examples/extras/adam_mini/qwen2_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using Muon
```bash
llamafactory-cli train examples/extras/muon/qwen2_full_sft.yaml
```
#### LoRA+ Fine-Tuning
```bash
@@ -219,3 +290,15 @@ llamafactory-cli train examples/extras/llama_pro/llama3_freeze_sft.yaml
```bash
bash examples/extras/fsdp_qlora/train.sh
```
#### OFT Fine-Tuning
```bash
llamafactory-cli train examples/extras/oft/llama3_oft_sft.yaml
```
#### QOFT Fine-Tuning
```bash
llamafactory-cli train examples/extras/qoft/llama3_oft_sft_bnb_npu.yaml
```

View File

@@ -13,6 +13,26 @@
Use `CUDA_VISIBLE_DEVICES` (GPU) or `ASCEND_RT_VISIBLE_DEVICES` (NPU) to choose computing devices.
By default, LLaMA-Factory uses all visible computing devices.
Basic usage:
```bash
llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
```
Advanced usage:
```bash
CUDA_VISIBLE_DEVICES=0,1 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml \
    learning_rate=1e-5 \
    logging_steps=1
```
```bash
bash examples/train_lora/llama3_lora_sft.sh
```
## Examples
### LoRA Fine-Tuning
@@ -32,7 +52,19 @@ llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
#### Multimodal Supervised Fine-Tuning
```bash
llamafactory-cli train examples/train_lora/qwen2_5vl_lora_sft.yaml
```
#### DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/llama3_lora_dpo.yaml
```
#### Multimodal DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/qwen2_5vl_lora_dpo.yaml
```
#### Reward Modeling
@@ -47,12 +79,6 @@ llamafactory-cli train examples/train_lora/llama3_lora_reward.yaml
llamafactory-cli train examples/train_lora/llama3_lora_ppo.yaml
```
#### DPO/ORPO/SimPO Training
```bash
llamafactory-cli train examples/train_lora/llama3_lora_dpo.yaml
```
#### KTO Training
```bash
@@ -73,17 +99,19 @@ llamafactory-cli train examples/train_lora/llama3_preprocess.yaml
llamafactory-cli eval examples/train_lora/llama3_lora_eval.yaml
```
#### Batch Predicting and Computing BLEU and ROUGE Scores
```bash
llamafactory-cli train examples/train_lora/llama3_lora_predict.yaml
```
#### Supervised Fine-Tuning on Multiple Nodes
```bash
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=0 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_lora/llama3_lora_sft.yaml
```
### Elastic and Fault-Tolerant Supervised Fine-Tuning on Multiple Nodes
To launch an elastic, fault-tolerant multi-node job, run the following command on every node. The number of elastic nodes ranges over `MIN_NNODES:MAX_NNODES`, and each node may restart on failure at most `MAX_RESTARTS` times. `RDZV_ID` should be set to a unique job ID shared by all nodes participating in the job. For more information, see the official [torchrun](https://docs.pytorch.org/docs/stable/elastic/run.html) documentation.
```bash
FORCE_TORCHRUN=1 MIN_NNODES=1 MAX_NNODES=3 MAX_RESTARTS=3 RDZV_ID=llamafactory MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
#### Supervised Fine-Tuning with DeepSpeed ZeRO-3 (Weight Sharding)
@@ -92,6 +120,12 @@ FORCE_TORCHRUN=1 NNODES=2 RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llama
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ds3.yaml
```
#### Supervised Fine-Tuning with Ray on 4 GPUs
```bash
USE_RAY=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ray.yaml
```
### QLoRA Fine-Tuning
#### Supervised Fine-Tuning with 4/8-bit Bitsandbytes/HQQ/EETQ Quantization (Recommended)
@@ -100,6 +134,12 @@ FORCE_TORCHRUN=1 llamafactory-cli train examples/train_lora/llama3_lora_sft_ds3.
llamafactory-cli train examples/train_qlora/llama3_lora_sft_otfq.yaml
```
#### Supervised Fine-Tuning with 4-bit Bitsandbytes Quantization on Ascend NPU
```bash
llamafactory-cli train examples/train_qlora/llama3_lora_sft_bnb_npu.yaml
```
#### Supervised Fine-Tuning with 4/8-bit GPTQ Quantization
```bash
@@ -123,20 +163,20 @@ llamafactory-cli train examples/train_qlora/llama3_lora_sft_aqlm.yaml
#### Supervised Fine-Tuning on Single Node
```bash
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
#### Supervised Fine-Tuning on Multiple Nodes
```bash
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=0 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
FORCE_TORCHRUN=1 NNODES=2 NODE_RANK=1 MASTER_ADDR=192.168.0.1 MASTER_PORT=29500 llamafactory-cli train examples/train_full/llama3_full_sft.yaml
```
#### Multimodal Supervised Fine-Tuning
```bash
FORCE_TORCHRUN=1 llamafactory-cli train examples/train_full/qwen2_5vl_full_sft.yaml
```
### Merging LoRA Adapters and Quantization
@@ -155,15 +195,28 @@ llamafactory-cli export examples/merge_lora/llama3_lora_sft.yaml
llamafactory-cli export examples/merge_lora/llama3_gptq.yaml
```
### Save Ollama modelfile
```bash
llamafactory-cli export examples/merge_lora/llama3_full_sft.yaml
```
### Inferring LoRA Fine-Tuned Models
#### Evaluation using vLLM's Multi-GPU Inference
```
python scripts/vllm_infer.py --model_name_or_path meta-llama/Meta-Llama-3-8B-Instruct --template llama3 --dataset alpaca_en_demo
python scripts/eval_bleu_rouge.py generated_predictions.jsonl
```
#### Use CLI ChatBox
```bash
llamafactory-cli chat examples/inference/llama3_lora_sft.yaml
```
#### Use Web UI ChatBox
```bash
llamafactory-cli webchat examples/inference/llama3_lora_sft.yaml
@@ -183,12 +236,30 @@ llamafactory-cli api examples/inference/llama3_lora_sft.yaml
llamafactory-cli train examples/extras/galore/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using APOLLO
```bash
llamafactory-cli train examples/extras/apollo/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using BAdam
```bash
llamafactory-cli train examples/extras/badam/llama3_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using Adam-mini
```bash
llamafactory-cli train examples/extras/adam_mini/qwen2_full_sft.yaml
```
#### Full-Parameter Fine-Tuning using Muon
```bash
llamafactory-cli train examples/extras/muon/qwen2_full_sft.yaml
```
#### LoRA+ Fine-Tuning
```bash
@@ -219,3 +290,15 @@ llamafactory-cli train examples/extras/llama_pro/llama3_freeze_sft.yaml
```bash
bash examples/extras/fsdp_qlora/train.sh
```
#### OFT Fine-Tuning
```bash
llamafactory-cli train examples/extras/oft/llama3_oft_sft.yaml
```
#### QOFT Fine-Tuning
```bash
llamafactory-cli train examples/extras/qoft/llama3_oft_sft_bnb_npu.yaml
```

View File

@@ -7,16 +7,16 @@ fsdp_config:
fsdp_backward_prefetch: BACKWARD_PRE
fsdp_forward_prefetch: false
fsdp_cpu_ram_efficient_loading: true
fsdp_offload_params: false
fsdp_sharding_strategy: FULL_SHARD
fsdp_state_dict_type: FULL_STATE_DICT
fsdp_sync_module_states: true
fsdp_use_orig_params: true
machine_rank: 0
main_training_function: main
mixed_precision: bf16 # or fp16
num_machines: 1 # the number of nodes
num_processes: 2 # the number of GPUs in all nodes
rdzv_backend: static
same_network: true
tpu_env: []

View File

@@ -0,0 +1,25 @@
compute_environment: LOCAL_MACHINE
debug: false
distributed_type: FSDP
downcast_bf16: 'no'
fsdp_config:
fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
fsdp_backward_prefetch: BACKWARD_PRE
fsdp_forward_prefetch: false
fsdp_cpu_ram_efficient_loading: true
fsdp_offload_params: true # offload may affect training speed
fsdp_sharding_strategy: FULL_SHARD
fsdp_state_dict_type: FULL_STATE_DICT
fsdp_sync_module_states: true
fsdp_use_orig_params: true
machine_rank: 0
main_training_function: main
mixed_precision: bf16 # or fp16
num_machines: 1 # the number of nodes
num_processes: 2 # the number of GPUs in all nodes
rdzv_backend: static
same_network: true
tpu_env: []
tpu_use_cluster: false
tpu_use_sudo: false
use_cpu: false

View File

@@ -0,0 +1,28 @@
{
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 0,
"allgather_partitions": true,
"allgather_bucket_size": 5e8,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5e8,
"contiguous_gradients": true,
"round_robin_gradients": true
}
}

View File

@@ -0,0 +1,28 @@
{
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 2,
"allgather_partitions": true,
"allgather_bucket_size": 5e8,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5e8,
"contiguous_gradients": true,
"round_robin_gradients": true
}
}

View File

@@ -0,0 +1,32 @@
{
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 2,
"offload_optimizer": {
"device": "cpu",
"pin_memory": true
},
"allgather_partitions": true,
"allgather_bucket_size": 5e8,
"overlap_comm": false,
"reduce_scatter": true,
"reduce_bucket_size": 5e8,
"contiguous_gradients": true,
"round_robin_gradients": true
}
}

View File

@@ -0,0 +1,30 @@
{
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 3,
"overlap_comm": false,
"contiguous_gradients": true,
"sub_group_size": 1e9,
"reduce_bucket_size": "auto",
"stage3_prefetch_bucket_size": "auto",
"stage3_param_persistence_threshold": "auto",
"stage3_max_live_parameters": 1e9,
"stage3_max_reuse_distance": 1e9,
"stage3_gather_16bit_weights_on_model_save": true
}
}
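A detail worth noting about these files: values like `1e9` and the `"auto"` placeholders can coexist because JSON permits scientific-notation numbers (parsed as floats), and DeepSpeed accepts either a number or the `"auto"` string for most of these keys. A small sketch to confirm the parsing behavior:

```python
import json

# A fragment of the ZeRO-3 config above; "auto" stays a string while a
# scientific-notation number such as 1e9 parses as a float.
cfg = json.loads("""
{
  "zero_optimization": {
    "stage": 3,
    "sub_group_size": 1e9,
    "reduce_bucket_size": "auto",
    "stage3_gather_16bit_weights_on_model_save": true
  }
}
""")

print(cfg["zero_optimization"]["stage"])           # 3
print(cfg["zero_optimization"]["sub_group_size"])  # 1000000000.0
print(cfg["zero_optimization"]["reduce_bucket_size"])  # auto
```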


@@ -0,0 +1,45 @@
{
"train_micro_batch_size_per_gpu": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"zero_force_ds_cpu_optimizer": true,
"fp16": {
"enabled": false,
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 3,
"offload_optimizer": {
"device": "cpu",
"pin_memory": false
},
"overlap_comm": false,
"contiguous_gradients": true,
"sub_group_size": 1000000000,
"reduce_bucket_size": 12845056,
"stage3_prefetch_bucket_size": 11560550,
"stage3_param_persistence_threshold": 35840,
"stage3_max_live_parameters": 1000000000,
"stage3_max_reuse_distance": 1000000000,
"stage3_gather_16bit_weights_on_model_save": true
},
"steps_per_print": 10000000,
"gradient_accumulation_steps": "auto",
"comms_config": {
"verbose": false
},
"monitor_config": {
"enabled": true,
"tag": "DeepSpeedMonitor",
"csv_monitor": {
"enabled": false
}
}
}
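Unlike the other ZeRO-3 files, this one pins the bucket sizes to concrete integers instead of "auto". They appear to follow DeepSpeed's usual auto heuristics (hidden², 0.9 × hidden², 10 × hidden) evaluated for a hidden size of 3584 — an assumption on our part; Qwen2.5-7B, for example, uses that width:

```python
# Assumed model width; the constants below reproduce the config's values
# if the standard "auto" heuristics were applied to it.
hidden_size = 3584

reduce_bucket_size = hidden_size * hidden_size               # hidden^2
stage3_prefetch_bucket_size = int(0.9 * hidden_size * hidden_size)  # 0.9 * hidden^2
stage3_param_persistence_threshold = 10 * hidden_size        # 10 * hidden

print(reduce_bucket_size)                  # 12845056
print(stage3_prefetch_bucket_size)         # 11560550
print(stage3_param_persistence_threshold)  # 35840
```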


@@ -0,0 +1,38 @@
{
"train_batch_size": "auto",
"train_micro_batch_size_per_gpu": "auto",
"gradient_accumulation_steps": "auto",
"gradient_clipping": "auto",
"zero_allow_untested_optimizer": true,
"fp16": {
"enabled": "auto",
"loss_scale": 0,
"loss_scale_window": 1000,
"initial_scale_power": 16,
"hysteresis": 2,
"min_loss_scale": 1
},
"bf16": {
"enabled": "auto"
},
"zero_optimization": {
"stage": 3,
"offload_optimizer": {
"device": "cpu",
"pin_memory": true
},
"offload_param": {
"device": "cpu",
"pin_memory": true
},
"overlap_comm": false,
"contiguous_gradients": true,
"sub_group_size": 1e9,
"reduce_bucket_size": "auto",
"stage3_prefetch_bucket_size": "auto",
"stage3_param_persistence_threshold": "auto",
"stage3_max_live_parameters": 1e9,
"stage3_max_reuse_distance": 1e9,
"stage3_gather_16bit_weights_on_model_save": true
}
}


@@ -0,0 +1,43 @@
### model
model_name_or_path: Qwen/Qwen2-1.5B-Instruct
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: full
use_adam_mini: true

### dataset
dataset: identity,alpaca_en_demo
template: qwen
cutoff_len: 2048
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4

### output
output_dir: saves/qwen2-1_5b/full/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000

### eval
# val_size: 0.1
# per_device_eval_batch_size: 1
# eval_strategy: steps
# eval_steps: 500


@@ -0,0 +1,48 @@
### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: full
use_apollo: true
apollo_layerwise: true # choices: [true, false], use false for DDP training
apollo_target: all
apollo_rank: 128
apollo_scale: 32.0
apollo_scale_type: channel

### dataset
dataset: identity,alpaca_en_demo
template: llama3
cutoff_len: 2048
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4

### output
output_dir: saves/llama3-8b/full/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 1 # use 1 for layerwise apollo
learning_rate: 1.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
pure_bf16: true
ddp_timeout: 180000000

### eval
# val_size: 0.1
# per_device_eval_batch_size: 1
# eval_strategy: steps
# eval_steps: 500


@@ -1,5 +1,6 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true
### method ### method
stage: sft stage: sft
@@ -10,14 +11,16 @@ badam_mode: layer
badam_switch_mode: ascending badam_switch_mode: ascending
badam_switch_interval: 50 badam_switch_interval: 50
badam_verbose: 2 badam_verbose: 2
# deepspeed: examples/deepspeed/ds_z3_config.json
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b/full/sft output_dir: saves/llama3-8b/full/sft
@@ -25,17 +28,19 @@ logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
gradient_accumulation_steps: 8 gradient_accumulation_steps: 8
learning_rate: 1.0e-4 learning_rate: 1.0e-5
num_train_epochs: 3.0 num_train_epochs: 3.0
lr_scheduler_type: cosine lr_scheduler_type: cosine
warmup_ratio: 0.1 warmup_ratio: 0.1
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -0,0 +1,43 @@
### model
model_name_or_path: Qwen/Qwen2-1.5B-Instruct
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: full
use_dft_loss: true

### dataset
dataset: identity,alpaca_en_demo
template: qwen
cutoff_len: 2048
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4

### output
output_dir: saves/qwen2-1_5b/full/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000

### eval
# val_size: 0.1
# per_device_eval_batch_size: 1
# eval_strategy: steps
# eval_steps: 500


@@ -0,0 +1,48 @@
# FP8 training example with DeepSpeed ZeRO-3
# This config demonstrates FP8 mixed precision training using HuggingFace Accelerate
# with DeepSpeed providing memory optimization (not FP8 handling)

### Model configuration
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true

### Method configuration
stage: sft
do_train: true
finetuning_type: full

### Dataset configuration
dataset: identity
template: llama3
cutoff_len: 1024
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16

### Output configuration
output_dir: saves/llama3-8b/fp8-deepspeed/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true

### Training configuration
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 5.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true

### FP8 configuration
fp8: true
fp8_backend: torchao # Use TorchAO backend for FP8
fp8_enable_fsdp_float8_all_gather: false # Not used with DeepSpeed

### DeepSpeed configuration
deepspeed: examples/deepspeed/ds_z3_fp8_config.json

### Logging configuration
report_to: wandb
run_name: llama3_fp8_deepspeed_sft


@@ -0,0 +1,51 @@
# FP8 training example with FSDP
# This config demonstrates FP8 mixed precision training using HuggingFace Accelerate
# with FSDP for distributed training and float8 all-gather optimization

### Model configuration
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true

### Method configuration
stage: sft
do_train: true
finetuning_type: full

### Dataset configuration
dataset: identity
template: llama3
cutoff_len: 1024
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16

### Output configuration
output_dir: saves/llama3-8b/fp8-fsdp/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true

### Training configuration
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 5.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true

### FP8 configuration
fp8: true
fp8_backend: torchao # Use TorchAO backend for FP8
fp8_enable_fsdp_float8_all_gather: true # Enable FSDP2 float8 all-gather optimization

### FSDP configuration (using training arguments - no separate FSDP config file)
fsdp:
- full_shard
- auto_wrap
fsdp_transformer_layer_cls_to_wrap: LlamaDecoderLayer

### Logging configuration
report_to: wandb
run_name: llama3_fp8_fsdp_sft
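The `fp8_enable_fsdp_float8_all_gather` flag matters because FSDP re-gathers sharded weights before every forward pass. A back-of-envelope sketch of the payoff, under the assumption that weight all-gathers dominate network traffic: parameters cross the wire as 1-byte float8 values instead of 2-byte bf16, halving all-gather volume.

```python
# Rough communication-volume comparison for FSDP weight all-gather
# (an assumption: weights dominate traffic; activations are ignored).
def allgather_bytes(num_params: int, bytes_per_param: int) -> int:
    return num_params * bytes_per_param

params_8b = 8_000_000_000  # roughly Llama-3-8B
ratio = allgather_bytes(params_8b, 2) // allgather_bytes(params_8b, 1)
print(ratio)  # 2
```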


@@ -1,20 +1,23 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
quantization_bit: 4 quantization_bit: 4
trust_remote_code: true
### method ### method
stage: sft stage: sft
do_train: true do_train: true
finetuning_type: lora finetuning_type: lora
lora_rank: 8
lora_target: all lora_target: all
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b/lora/sft output_dir: saves/llama3-8b/lora/sft
@@ -22,6 +25,8 @@ logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
@@ -34,7 +39,7 @@ bf16: true
ddp_timeout: 180000000 ddp_timeout: 180000000
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -1,23 +1,25 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true
### method ### method
stage: sft stage: sft
do_train: true do_train: true
finetuning_type: full finetuning_type: full
use_galore: true use_galore: true
galore_layerwise: true galore_layerwise: true # choices: [true, false], use false for DDP training
galore_target: mlp,self_attn galore_target: all
galore_rank: 128 galore_rank: 128
galore_scale: 2.0 galore_scale: 2.0
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b/full/sft output_dir: saves/llama3-8b/full/sft
@@ -25,18 +27,21 @@ logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
gradient_accumulation_steps: 1 gradient_accumulation_steps: 1 # use 1 for layerwise galore
learning_rate: 1.0e-4 learning_rate: 1.0e-5
num_train_epochs: 3.0 num_train_epochs: 3.0
lr_scheduler_type: cosine lr_scheduler_type: cosine
warmup_ratio: 0.1 warmup_ratio: 0.1
pure_bf16: true pure_bf16: true
ddp_timeout: 180000000
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -2,5 +2,5 @@
 python scripts/llama_pro.py \
     --model_name_or_path meta-llama/Meta-Llama-3-8B-Instruct \
-    --output_dir models/llama3-8b-instruct-pro \
+    --output_dir models/llama3-8b-pro \
     --num_expand 8
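The script above performs LLaMA Pro-style block expansion: `--num_expand 8` inserts 8 identity-initialized transformer layers into the 32-layer Llama-3-8B, and the companion freeze config then trains only those new layers. A sketch of the layer arithmetic (the helper name is illustrative; the exact interleaving is the script's concern):

```python
# Block-expansion arithmetic: the original depth is split into num_expand
# groups and one identity-initialized layer is appended after each group.
def expanded_layout(num_layers: int, num_expand: int):
    """Return (total layers after expansion, group size between inserts)."""
    assert num_layers % num_expand == 0, "num_expand should divide the depth"
    group = num_layers // num_expand
    return num_layers + num_expand, group

total, group = expanded_layout(32, 8)  # Llama-3-8B has 32 layers
print(total, group)  # 40 4
```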


@@ -1,5 +1,6 @@
### model ### model
model_name_or_path: models/llama3-8b-instruct-pro model_name_or_path: models/llama3-8b-pro
trust_remote_code: true
### method ### method
stage: sft stage: sft
@@ -12,17 +13,20 @@ use_llama_pro: true
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b-instruct-pro/freeze/sft output_dir: saves/llama3-8b-pro/freeze/sft
logging_steps: 10 logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
@@ -35,7 +39,7 @@ bf16: true
ddp_timeout: 180000000 ddp_timeout: 180000000
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -1,20 +1,23 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true
### method ### method
stage: sft stage: sft
do_train: true do_train: true
finetuning_type: lora finetuning_type: lora
lora_rank: 8
lora_target: all lora_target: all
loraplus_lr_ratio: 16.0 loraplus_lr_ratio: 16.0
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b/lora/sft output_dir: saves/llama3-8b/lora/sft
@@ -22,6 +25,8 @@ logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
@@ -34,7 +39,7 @@ bf16: true
ddp_timeout: 180000000 ddp_timeout: 180000000
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -1,5 +1,6 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true
### method ### method
stage: sft stage: sft
@@ -10,10 +11,11 @@ mixture_of_depths: convert
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b-mod/full/sft output_dir: saves/llama3-8b-mod/full/sft
@@ -21,12 +23,14 @@ logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
gradient_accumulation_steps: 8 gradient_accumulation_steps: 8
optim: paged_adamw_8bit optim: paged_adamw_8bit
learning_rate: 1.0e-4 learning_rate: 1.0e-5
num_train_epochs: 3.0 num_train_epochs: 3.0
lr_scheduler_type: cosine lr_scheduler_type: cosine
warmup_ratio: 0.1 warmup_ratio: 0.1
@@ -34,7 +38,7 @@ pure_bf16: true
ddp_timeout: 180000000 ddp_timeout: 180000000
### eval ### eval
val_size: 0.1 # val_size: 0.1
per_device_eval_batch_size: 1 # per_device_eval_batch_size: 1
eval_strategy: steps # eval_strategy: steps
eval_steps: 500 # eval_steps: 500


@@ -0,0 +1,25 @@
# SVG Container Tags
"<|START_OF_SVG|>": "Marks the beginning of an SVG document"
"<|END_OF_SVG|>": "Marks the end of an SVG document"

# SVG Group Tags
"<|start_of_g|>": "Begins a group element in SVG for organizing related shapes"
"<|end_of_g|>": "Ends a group element"

# SVG Shape Tags
"<|start_of_rect|>": "Begins a rectangle shape with width and height attributes"
"<|end_of_rect|>": "Ends a rectangle shape definition"
"<|start_of_circle|>": "Begins a circular shape with radius attribute"
"<|end_of_circle|>": "Ends a circular shape definition"
"<|start_of_path|>": "Begins a path element for drawing custom vector graphics"
"<|end_of_path|>": "Ends a path element definition"
"<|start_of_ellipse|>": "Begins an ellipse shape with x and y radii"
"<|end_of_ellipse|>": "Ends an ellipse shape definition"

# SVG Text Tags
"<|start_of_text|>": "Begins a text element for rendering text content"
"<|end_of_text|>": "Ends a text element"

# SVG Style Tags
"<|start_of_style|>": "Begins a style definition block for CSS styling"
"<|end_of_style|>": "Ends a style definition block"
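This file looks like a mapping from special-token strings to human-readable descriptions (an assumption; the intended consumer is not shown here). A minimal sketch of pulling the token strings out of such a mapping, e.g. to register them as `additional_special_tokens` on a tokenizer, using a two-line excerpt:

```python
# Hypothetical excerpt of the token-description mapping above; the parsing
# keeps only the quoted keys (the special-token strings themselves).
svg_tokens_yaml = '''
"<|START_OF_SVG|>": "Marks the beginning of an SVG document"
"<|END_OF_SVG|>": "Marks the end of an SVG document"
'''

tokens = [line.split(":", 1)[0].strip('" ')
          for line in svg_tokens_yaml.strip().splitlines()]
print(tokens)  # ['<|START_OF_SVG|>', '<|END_OF_SVG|>']
```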


@@ -0,0 +1,43 @@
### model
model_name_or_path: Qwen/Qwen2-1.5B-Instruct
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: full
use_muon: true

### dataset
dataset: identity,alpaca_en_demo
template: qwen
cutoff_len: 2048
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4

### output
output_dir: saves/qwen2-1_5b/full/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-5
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000

### eval
# val_size: 0.1
# per_device_eval_batch_size: 1
# eval_strategy: steps
# eval_steps: 500


@@ -1,6 +1,10 @@
+# The batch generation can be SLOW using this config.
+# For faster inference, we recommend using `scripts/vllm_infer.py`.
+
 ### model
 model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
 adapter_name_or_path: saves/llama3-8b/lora/sft
+trust_remote_code: true
 
 ### method
 stage: sft
@@ -10,14 +14,16 @@ finetuning_type: lora
 ### dataset
 eval_dataset: identity,alpaca_en_demo
 template: llama3
-cutoff_len: 1024
+cutoff_len: 2048
 max_samples: 50
 overwrite_cache: true
 preprocessing_num_workers: 16
+dataloader_num_workers: 4
 
 ### output
 output_dir: saves/llama3-8b/lora/predict
 overwrite_output_dir: true
+report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
 
 ### eval
 per_device_eval_batch_size: 1

@@ -1,31 +1,31 @@
### model ### model
model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct model_name_or_path: meta-llama/Meta-Llama-3-8B-Instruct
trust_remote_code: true
### method ### method
stage: sft stage: sft
do_train: true do_train: true
finetuning_type: full finetuning_type: oft
use_badam: true oft_block_size: 32
badam_mode: layer oft_target: all
badam_switch_mode: ascending
badam_switch_interval: 50
badam_verbose: 2
deepspeed: examples/deepspeed/ds_z3_config.json
### dataset ### dataset
dataset: identity,alpaca_en_demo dataset: identity,alpaca_en_demo
template: llama3 template: llama3
cutoff_len: 1024 cutoff_len: 2048
max_samples: 1000 max_samples: 1000
overwrite_cache: true overwrite_cache: true
preprocessing_num_workers: 16 preprocessing_num_workers: 16
dataloader_num_workers: 4
### output ### output
output_dir: saves/llama3-8b/full/sft output_dir: saves/llama3-8b/oft/sft
logging_steps: 10 logging_steps: 10
save_steps: 500 save_steps: 500
plot_loss: true plot_loss: true
overwrite_output_dir: true overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]
### train ### train
per_device_train_batch_size: 1 per_device_train_batch_size: 1
@@ -34,9 +34,13 @@ learning_rate: 1.0e-4
num_train_epochs: 3.0 num_train_epochs: 3.0
lr_scheduler_type: cosine lr_scheduler_type: cosine
warmup_ratio: 0.1 warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000
resume_from_checkpoint: null
### eval ### eval
val_size: 0.1 # eval_dataset: alpaca_en_demo
per_device_eval_batch_size: 1 # val_size: 0.1
eval_strategy: steps # per_device_eval_batch_size: 1
eval_steps: 500 # eval_strategy: steps
# eval_steps: 500


@@ -0,0 +1,47 @@
### model
model_name_or_path: Qwen/Qwen2.5-VL-7B-Instruct
image_max_pixels: 262144
video_max_pixels: 16384
trust_remote_code: true

### method
stage: sft
do_train: true
finetuning_type: oft
oft_block_size: 32
oft_target: all

### dataset
dataset: mllm_demo,identity,alpaca_en_demo # video: mllm_video_demo
template: qwen2_vl
cutoff_len: 2048
max_samples: 1000
overwrite_cache: true
preprocessing_num_workers: 16
dataloader_num_workers: 4

### output
output_dir: saves/qwen2_5vl-7b/oft/sft
logging_steps: 10
save_steps: 500
plot_loss: true
overwrite_output_dir: true
save_only_model: false
report_to: none # choices: [none, wandb, tensorboard, swanlab, mlflow]

### train
per_device_train_batch_size: 1
gradient_accumulation_steps: 8
learning_rate: 1.0e-4
num_train_epochs: 3.0
lr_scheduler_type: cosine
warmup_ratio: 0.1
bf16: true
ddp_timeout: 180000000
resume_from_checkpoint: null

### eval
# val_size: 0.1
# per_device_eval_batch_size: 1
# eval_strategy: steps
# eval_steps: 500
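The pixel budgets in this config are area caps, not side lengths: `image_max_pixels: 262144` is a 512 x 512-equivalent area (262144 = 512 * 512) and `video_max_pixels: 16384` a 128 x 128-equivalent area per frame. A sketch, under the assumption that the image processor downscales while preserving aspect ratio (the helper name is illustrative):

```python
import math

# Fit an image inside a pixel-area budget, keeping the aspect ratio.
def fit_to_pixel_budget(width: int, height: int, max_pixels: int):
    if width * height <= max_pixels:
        return width, height  # already within budget
    scale = math.sqrt(max_pixels / (width * height))
    return int(width * scale), int(height * scale)

print(fit_to_pixel_budget(1024, 1024, 262144))  # (512, 512)
```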

Some files were not shown because too many files have changed in this diff Show More