Training large-scale language models with limited GPU memory: a survey
Regular Papers | Updated: 2025-04-03

    • Frontiers of Information Technology & Electronic Engineering, Vol. 26, Issue 3, Pages 309-331 (2025)
    • DOI: 10.1631/FITEE.2300710
    • CLC: TP389.1
    • Received: 17 October 2023
    • Revised: 31 March 2024
    • Published Online: 17 March 2025
    • Published: 2025-03


  • Yu TANG, Linbo QIAO, Lujia YIN, et al. Training large-scale language models with limited GPU memory: a survey[J]. Frontiers of Information Technology & Electronic Engineering, 2025, 26(3): 309-331. DOI: 10.1631/FITEE.2300710.
