Forget less, count better: a domain-incremental self-distillation learning benchmark for lifelong crowd counting
Regular Papers | Updated: 2023-02-27
    • Cover Article | Enhanced Publication
    • Chinese title: 忘得少,数得好:一种域增量式自蒸馏终身人群计数基准
    • Frontiers of Information Technology & Electronic Engineering, Vol. 24, Issue 2, Pages 187-202 (2023)
    • DOI: 10.1631/FITEE.2200380

      CLC: TP391
    • Published: February 2023

      Received: 07 September 2022

      Accepted: 26 December 2022


  • JIAQI GAO, JINGQI LI, HONGMING SHAN, et al. Forget less, count better: a domain-incremental self-distillation learning benchmark for lifelong crowd counting [J]. Frontiers of Information Technology & Electronic Engineering, 2023, 24(2): 187-202. DOI: 10.1631/FITEE.2200380.


Related Articles

Federated learning on non-IID and long-tailed data via dual-decoupling
Multi-exit self-distillation with appropriate teachers
Federated mutual learning: a collaborative machine learning method for heterogeneous data, models, and objectives
Aggregated context network for crowd counting
A novel convolutional neural network method for crowd counting

Related Authors

Baojin WANG
Renhao HU
Jinguo LI
Hongjiao LI
Zhaohui WANG
Can WANG
Wujie SUN
Defang CHEN

Related Institutions

College of Computer Science and Technology, Shanghai University of Electric Power
College of Computer Science and Technology, Zhejiang University
School of Software Technology, Zhejiang University
School of Public Affairs, Zhejiang University
School of Computer Science and Technology, East China Normal University