
Improving Article Summarization by Fine-tuning GPT-3.5

This thesis project aims to improve text summarization in financial applications by fine-tuning Generative Pre-trained Transformer 3.5 (GPT-3.5). Through careful training and optimization, the model was configured to accurately and efficiently condense complex financial reports into concise, informative summaries, specifically designed to support decision-making in professional business environments. Notable improvements were demonstrated in the model's capacity to retain essential financial details while enhancing the readability and contextual relevance of the text, as evidenced by superior ROUGE and BLEU scores compared to the baseline GPT-3.5 Turbo model. This fine-tuning approach not only underscores GPT-3.5's adaptability to domain-specific challenges but also marks a significant advancement in automated text summarization within the financial sector. The findings from this research highlight the potential of bespoke NLP solutions, offering data-driven industries the tools to rapidly generate precise and actionable business insights and thus facilitating more informed decision-making.
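For orientation, the sketch below illustrates the general workflow the abstract describes: uploading a dataset of report/summary pairs to the OpenAI fine-tuning API for GPT-3.5 Turbo, generating summaries with the resulting model, and scoring them against reference summaries with ROUGE and BLEU. This is a minimal sketch of that workflow, not the thesis's actual code; the file name, dataset format, prompts, and model identifiers are illustrative assumptions.

```python
# Minimal sketch of the workflow described in the abstract, NOT the thesis code.
# Assumes an OpenAI API key in the environment and a JSONL file of chat-format
# examples pairing financial reports with reference summaries (hypothetical file name).
from openai import OpenAI
from rouge_score import rouge_scorer
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

client = OpenAI()

# 1) Upload training data and start a fine-tuning job on GPT-3.5 Turbo.
training_file = client.files.create(
    file=open("financial_summaries_train.jsonl", "rb"),  # hypothetical file name
    purpose="fine-tune",
)
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)
print("Fine-tuning job started:", job.id)

# 2) After the job completes, generate a summary with the fine-tuned model
#    (pass the fine-tuned model ID retrieved from the finished job).
def summarize(report_text: str, model_id: str) -> str:
    response = client.chat.completions.create(
        model=model_id,
        messages=[
            {"role": "system", "content": "Summarize the financial report concisely."},
            {"role": "user", "content": report_text},
        ],
    )
    return response.choices[0].message.content

# 3) Compare a generated summary against a human reference with ROUGE and BLEU.
def evaluate(reference: str, candidate: str) -> dict:
    scorer = rouge_scorer.RougeScorer(["rouge1", "rougeL"], use_stemmer=True)
    rouge = scorer.score(reference, candidate)
    bleu = sentence_bleu(
        [reference.split()],
        candidate.split(),
        smoothing_function=SmoothingFunction().method1,
    )
    return {
        "rouge1_f": rouge["rouge1"].fmeasure,
        "rougeL_f": rouge["rougeL"].fmeasure,
        "bleu": bleu,
    }
```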

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:uu-533460
Date January 2024
Creators Gillgren, Fredrik
Publisher Uppsala universitet, Signaler och system
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess
Relation UPTEC F, 1401-5757 ; 24047