This project contains two LoRA fine-tuning setups for elegant Chinese writing. Both aim to generate a fluent, literary paragraph from a short input sentence without requiring extra prompt engineering.
`LoRA-Qwen/` uses a shorter dialogue dataset to improve basic expansion ability. `LoRA-WriterQwen/` uses a longer dialogue dataset to produce richer expansions and incorporate poetic quotes.
Base model: Qwen/Qwen2.5-7B
All fine-tuned adapters are available on GitHub.
Example (LoRA-Qwen):

**You:** 你好 (Hello)

**Qwen:** 两个音节的相遇,是人际关系最小的仪式,却承载着最大的善意与可能。它像一把无形的钥匙,轻轻转动,开启一段对话、一份连接,或一个崭新的故事。

**You:** 在黄昏之下,她吹着海风,望向无尽的海面 (At dusk, she stood in the sea breeze, gazing out at the endless sea)

**Qwen:** 橘红的晚霞为波涛染上温柔的光泽,而她站在陆地的尽头,让身体与风、水彻底融为一体。目光追随着远方的舟影或岛屿,仿佛在眺望内心某个遥远而模糊的远方。
Example (LoRA-WriterQwen):

**You:** 你好,再见 (Hello, goodbye)

**Qwen:** 你好,是相见时的轻启唇角;再见,是离别时含蓄的颔首。'世事一场大梦,人生几度秋凉',有些关系淡如朝露,却也圆满。不必执著于‘后会无期’的怅惘,因为重逢或许已在下一个拐角。你好是开始,再见也是完成——完成一段情缘,开启下一段旅程。愿每一次道别都带着微笑,因为知道明天依旧可期。

**You:** 在黄昏之下,她吹着海风,望向无尽的海面 (At dusk, she stood in the sea breeze, gazing out at the endless sea)

**Qwen:** 夕阳如血,海天一色。她独立海边,长发被晚风吹起,像飘动的旗帜。海浪一次次涌来,又退向远方,发出深沉的、有节奏的呼吸声。这场景,仿佛一首没有歌词的悲壮诗篇,而她,便是那沉默的读者。夕阳将她的侧脸染成金红色,目光却望向更远的地方——或许是一艘即将远航的船,或许只是虚幻的地平线。海风咸涩,却也带着自由的气息,令人心旷神怡。她吹着风,也在吹散那些无法言说的思绪。落日余晖渐渐淡去,夜幕开始垂下,而她,依然伫立。不必问她在想什么,这凝视本身便是一种答案。它是对广阔与永恒的承认,也是对个人渺小命运的温柔接纳。人生如朝露,但在这海风里,在这眺望中,我们都能找到一种超越具体得失的精神高度。
- Download the `Qwen/Qwen2.5-7B-Instruct` model (recommended), or use any compatible base model.
- After downloading, run `python LoRA-Qwen/generate_lora_qwen.py` or `python LoRA-WriterQwen/generate_lora_qwen.py`.
- To fine-tune with your own data, run `python LoRA-WriterQwen/train_lora_qwen.py`.
- With r = 8, fine-tuning a 7B model typically requires about 20 GB of GPU memory. You can reduce memory usage by adjusting the dataset, LoRA rank, or training settings.
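The generation scripts above handle this for you, but as a rough sketch of what loading a LoRA adapter onto the base model looks like with `transformers` and `peft` (the adapter path `"LoRA-Qwen"` below is an assumption; point it at the actual adapter directory in your checkout):

```python
# Minimal inference sketch, assuming the adapter weights live in LoRA-Qwen/.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Qwen/Qwen2.5-7B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
# Attach the fine-tuned LoRA adapter on top of the frozen base weights.
model = PeftModel.from_pretrained(model, "LoRA-Qwen")

messages = [{"role": "user", "content": "你好"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

This is a sketch, not a drop-in replacement for the repository's scripts, which may set their own prompts and generation parameters.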
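To make the memory claim concrete: LoRA with rank r trains two thin factor matrices instead of the full weight, so the trainable-parameter count per layer drops sharply. A small numeric sketch (3584 is Qwen2.5-7B's hidden size; the ~20 GB figure additionally covers the frozen base weights, activations, and optimizer state):

```python
# LoRA replaces an update to the full d_out x d_in weight W with two thin
# factors B (d_out x r) and A (r x d_in), applied as W + B @ A.
d_in = d_out = 3584  # hidden size of Qwen2.5-7B
r = 8                # LoRA rank used in this repo

full_params = d_out * d_in          # trainable params if W itself were tuned
lora_params = d_out * r + r * d_in  # trainable params with a rank-8 adapter

print(full_params)                  # 12845056
print(lora_params)                  # 57344
print(full_params // lora_params)   # 224 -> ~224x fewer trainable params
```

This is why lowering the LoRA rank (or shrinking the dataset and batch size) is the first lever for reducing GPU memory.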