
Minimalism Yields Maximum Results: Deep Learning with Limited Resource

Deep learning models have achieved remarkable success across diverse domains, including computer vision and natural language processing. These models, however, rely heavily on resources: annotated data, computational power, and storage. In practice such resources are often scarce; mobile devices offer limited computing power and storage, and in domains such as medicine or low-resource languages, annotating sufficient data is prohibitively expensive. Developing deep learning models under these constraints is a formidable challenge. Our primary goal is to improve the efficiency of state-of-the-art neural network models in resource-limited scenarios by designing algorithms that reduce annotation requirements, computational complexity, and storage demands. This dissertation focuses on two areas: parameter-efficient learning and data-efficient learning. In Part 1, we present our studies on parameter-efficient learning, which aims to build lightweight models for efficient storage and inference; the proposed solutions address diverse tasks, including text generation, text classification, and text/image retrieval. In Part 2, we present our methods for data-efficient learning, focusing on cross-lingual and multilingual text classification.

DOI 10.25394/pgs.26349415.v1
Identifier oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/26349415
Date 22 July 2024
Creators Haoyu Wang (19193416)
Source Sets Purdue University
Detected Language English
Type Text, Thesis
Rights CC BY 4.0
Relation https://figshare.com/articles/thesis/Minimalism_Yields_Maximum_Results_Deep_Learning_with_Limited_Resource/26349415
