Training large-scale language models with limited GPU memory: a survey
Regular Papers | Updated:2025-04-03
    • Frontiers of Information Technology & Electronic Engineering, Vol. 26, Issue 3, Pages: 309-331 (2025)
    • DOI: 10.1631/FITEE.2300710
    • CLC: TP389.1
    • Received: 2023-10-17
    • Revised: 2024-03-31
    • Published Online: 2025-03-17
    • Published: 2025-03


  • Yu TANG, Linbo QIAO, Lujia YIN, et al. Training large-scale language models with limited GPU memory: a survey[J]. Frontiers of Information Technology & Electronic Engineering, 2025, 26(3): 309-331. DOI: 10.1631/FITEE.2300710.
