Paralelní trénování hlubokých neuronových sítí / Parallel Deep Learning

The aim of this thesis is to propose how to evaluate the benefit of parallel deep learning. I analyze parallel deep learning with a focus on its duration, taking into account gradient computation time and weight transfer time. The result of the thesis is a set of proposed equations that estimate the speedup achievable with multiple workers. These equations can be used to determine the ideal number of workers for training.
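
The abstract does not reproduce the proposed equations. The following Python sketch only illustrates the kind of estimate they describe, under the simplifying assumption that one training step on n workers costs the gradient computation time divided among the workers plus a fixed weight transfer time; the function and parameter names (estimated_speedup, ideal_worker_count, t_grad, t_comm) are illustrative, not the thesis's own.

# A minimal sketch (assumed model, not the thesis's actual equations): per-step
# time on n workers is the gradient computation time shared among the workers
# plus a constant weight transfer time.

def estimated_speedup(t_grad: float, t_comm: float, n_workers: int) -> float:
    """Speedup of data-parallel training on n_workers under the simple model.

    t_grad -- time one worker needs to compute gradients for a full batch
    t_comm -- time to transfer weights/gradients once per training step
    """
    single_step = t_grad                          # one worker, no transfer cost
    parallel_step = t_grad / n_workers + t_comm   # shared compute + transfer
    return single_step / parallel_step

def ideal_worker_count(t_grad: float, t_comm: float, max_workers: int = 64) -> int:
    """Largest worker count whose marginal speedup gain still exceeds 5 %."""
    best_n, prev = 1, 1.0
    for n in range(2, max_workers + 1):
        s = estimated_speedup(t_grad, t_comm, n)
        if s / prev < 1.05:   # diminishing returns: more workers stop paying off
            return best_n
        best_n, prev = n, s
    return best_n

if __name__ == "__main__":
    # Example: 200 ms gradient computation, 20 ms weight transfer per step.
    print(estimated_speedup(0.200, 0.020, n_workers=8))   # ~4.4x
    print(ideal_worker_count(0.200, 0.020))                # 10 under the 5 % threshold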

Identifier: oai:union.ndltd.org:nusl.cz/oai:invenio.nusl.cz:363904
Date: January 2017
Creators: Šlampa, Ondřej
Contributors: Sochor, Jakub; Hradiš, Michal
Publisher: Vysoké učení technické v Brně. Fakulta informačních technologií
Source Sets: Czech ETDs
Language: Czech
Detected Language: English
Type: info:eu-repo/semantics/masterThesis
Rights: info:eu-repo/semantics/restrictedAccess