Time to Open the Black Box: Explaining the Predictions of Text Classification

The purpose of this thesis has been to evaluate whether a new instance-based explanation method, called Automatic Instance Text Classification Explanator (AITCE), could provide researchers with insights into the predictions of automatic text classification and with decision support about documents requiring human classification. This would make it possible for researchers who normally use manual classification to save time and money in their research while maintaining quality. In the study, AITCE was implemented and applied to the predictions of a black box classifier. The evaluation was performed at two levels: at the instance level, where a group of 3 senior researchers who use human classification in their research evaluated the results from AITCE from an expert perspective; and at the model level, where a group of 24 non-experts evaluated the characteristics of the classes. The evaluations indicate that AITCE produces insights into which words most strongly affect the prediction. The research also suggests that the quality of an automatic text classification may increase through interaction between the user and the classifier in situations with uncertain predictions.

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:hb-14194
Date: January 2018
Creators: Löfström, Helena
Publisher: Högskolan i Borås, Akademin för bibliotek, information, pedagogik och IT
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess