Building accurate translation-tailored large language models with language-aware instruction tuning
Regular Papers | Updated:2025-09-04
    • Researchers have developed a two-stage fine-tuning algorithm that improves the translation accuracy of large language models (LLMs) by reducing off-target translations, i.e., outputs produced in the wrong language when the translation instruction is not followed. The method first fine-tunes the LLM on translation data, then introduces an extra unlikelihood loss that lowers the probability of wrong-language outputs. This improves translation accuracy while preserving the model's performance on other tasks.
    • Frontiers of Information Technology & Electronic Engineering   Vol. 26, Issue 8, Pages: 1341-1355(2025)
    • DOI: 10.1631/FITEE.2400458

      CLC: TP391
    • Received: 2024-03-30

      Revised: 2024-11-27

      Published: 2025-08

  • Changtong ZAN, Liang DING, Li SHEN, et al. Building accurate translation-tailored large language models with language-aware instruction tuning[J]. Frontiers of Information Technology & Electronic Engineering, 2025, 26(8): 1341-1355. DOI: 10.1631/FITEE.2400458.
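The abstract's second stage adds an unlikelihood term that pushes probability mass away from wrong-language tokens while the standard likelihood term keeps it on the gold translation. A minimal plain-Python sketch of that combined objective follows; the toy vocabulary, the `alpha` weight, and the function names are illustrative assumptions, not the authors' implementation.

```python
import math

def mle_loss(probs, target_ids):
    # standard negative log-likelihood on the gold (target-language) tokens
    return -sum(math.log(p[t]) for p, t in zip(probs, target_ids)) / len(target_ids)

def unlikelihood_loss(probs, negative_ids):
    # -log(1 - p(neg)) per position: grows as the model assigns
    # probability to tokens from a wrong-language translation
    return -sum(math.log(1.0 - p[n]) for p, n in zip(probs, negative_ids)) / len(negative_ids)

# toy per-position distributions over a 4-token vocabulary (2 output positions)
probs = [[0.7, 0.1, 0.1, 0.1],
         [0.6, 0.2, 0.1, 0.1]]
gold = [0, 0]        # tokens of the requested target-language translation
off_target = [1, 2]  # tokens sampled from an off-target (wrong-language) output

alpha = 1.0  # assumed weight balancing the two terms
total = mle_loss(probs, gold) + alpha * unlikelihood_loss(probs, off_target)
```

Minimizing `total` both raises the probability of the gold tokens and explicitly lowers the probability of the wrong-language tokens, which is what suppresses off-target translations.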
