Molecular property prediction has the potential to improve many processes in the molecular chemistry industry. One important application is drug development, where molecular property prediction can reduce both the cost and the time of finding new drugs. The current trend is to use graph neural networks or transformers, which tend to require moderate and large amounts of data, respectively, to perform well. Because of the scarcity of molecular property data, it is of great interest to find effective methods for transferring learning from other, more data-abundant problems. In this thesis I present an approach to pre-training transformer encoders on reaction prediction in order to improve performance on downstream molecular property prediction tasks. I have built a model based on the full transformer architecture but modified it for the purpose of pre-training the encoder. Model performance, and specifically the effect of pre-training, is tested by predicting lipophilicity, HIV inhibition and hERG channel blocking using both pre-trained models and models without any pre-training. The results show a tendency toward improved performance on all molecular property prediction tasks with the suggested pre-training, but this improvement is not statistically significant. The main limitation of the evaluation stems from the limited number of simulations that the computational constraints allowed.
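The transfer-learning setup described in the abstract — pre-train a full transformer on reaction prediction, then reuse only its encoder with a small prediction head for a property task — can be sketched as below. This is a minimal illustrative sketch, not the thesis's actual implementation: the vocabulary size, model dimensions, pooling choice, and regression head are all assumptions for the example.

```python
import torch
import torch.nn as nn

class ReactionSeq2Seq(nn.Module):
    """Full transformer used for pre-training on reaction prediction
    (reactant tokens -> product tokens); only the encoder is kept afterwards.
    All hyperparameters here are illustrative, not the thesis's values."""
    def __init__(self, vocab_size=64, d_model=32, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_tokens, tgt_tokens):
        # Standard seq2seq forward: encode reactants, decode products.
        src = self.embed(src_tokens)
        tgt = self.embed(tgt_tokens)
        return self.out(self.transformer(src, tgt))

class PropertyPredictor(nn.Module):
    """Downstream model: the pre-trained encoder plus a small head mapping
    a pooled sequence representation to one property value
    (e.g. a lipophilicity regression target)."""
    def __init__(self, pretrained: ReactionSeq2Seq):
        super().__init__()
        self.embed = pretrained.embed
        self.encoder = pretrained.transformer.encoder  # transferred weights
        self.head = nn.Linear(pretrained.embed.embedding_dim, 1)

    def forward(self, tokens):
        h = self.encoder(self.embed(tokens))   # (batch, seq, d_model)
        return self.head(h.mean(dim=1))        # mean-pool, then predict

# Toy usage: 2 "molecules" of 10 tokens each, random token ids.
pretrained = ReactionSeq2Seq()
model = PropertyPredictor(pretrained)
pred = model(torch.randint(0, 64, (2, 10)))
print(pred.shape)  # torch.Size([2, 1])
```

A model without pre-training, as used in the thesis's comparison, would simply start `PropertyPredictor` from a randomly initialised `ReactionSeq2Seq` instead of one trained on reactions.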
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:kth-309332 |
Date | January 2022 |
Creators | Broberg, Johan |
Publisher | KTH, Skolan för elektroteknik och datavetenskap (EECS) |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |
Relation | TRITA-EECS-EX ; 2022:14 |