11

Malicious user attacks in decentralised cognitive radio networks

Sivakumaran, Arun January 2020 (has links)
Cognitive radio networks (CRNs) have emerged as a solution for the looming spectrum crunch caused by the rapid adoption of wireless devices over the previous decade. This technology enables efficient spectrum utility by dynamically reusing existing spectral bands. A CRN achieves this by requiring its users – called secondary users (SUs) – to measure and opportunistically utilise the band of a legacy broadcaster – called a primary user (PU) – in a process called spectrum sensing. Sensing requires the distribution and fusion of measurements from all SUs, which is facilitated by a variety of architectures and topologies. CRNs possessing a central computation node are called centralised networks, while CRNs composed of multiple computation nodes are called decentralised networks. While simpler to implement, centralised networks are reliant on the central node – the entire network fails if this node is compromised. In contrast, decentralised networks require more sophisticated protocols to implement, while offering greater robustness to node failure. Relay-based networks, a subset of decentralised networks, distribute the computation over a number of specialised relay nodes – little research exists on spectrum sensing using these networks. CRNs are vulnerable to unique physical layer attacks targeted at their spectrum sensing functionality. One such attack is the Byzantine attack; these attacks occur when malicious SUs (MUs) alter their sensing reports to achieve some goal (e.g. exploitation of the CRN’s resources, reduction of the CRN’s sensing performance, etc.). Mitigation strategies for Byzantine attacks vary based on the CRN’s network architecture, requiring defence algorithms to be explored for all architectures. Because of the sparse literature regarding relay-based networks, a novel algorithm – suitable for relay-based networks – is proposed in this work. 
The proposed algorithm performs joint MU detection and secure sensing by large-scale probabilistic inference of a statistical model. The proposed algorithm's development is separated into the following two parts.
• The first part involves the construction of a probabilistic graphical model representing the likelihood of all possible outcomes in the sensing process of a relay-based network. This is done by discovering the conditional dependencies present between the variables of the model. Various candidate graphical models are explored, and the mathematical description of the chosen graphical model is determined.
• The second part involves the extraction of information from the graphical model to provide utility for sensing. Marginal inference is used to enable this information extraction. Belief propagation is used to infer the developed graphical model efficiently. Sensing is performed by exchanging the intermediate belief propagation computations between the relays of the CRN.
Through a performance evaluation, the proposed algorithm was found to be resistant to probabilistic MU attacks of all frequencies and proportions. The sensing performance was highly sensitive to the placement of the relays and honest SUs, with the performance improving when the number of relays was increased. The transient behaviour of the proposed algorithm was evaluated in terms of its dynamics and computational complexity, with the algorithm's results deemed satisfactory in this regard. Finally, an analysis of the effectiveness of the graphical model's components was conducted, with a few model components accounting for most of the performance, implying that further simplifications to the proposed algorithm are possible. / Dissertation (MEng)--University of Pretoria, 2020. / Electrical, Electronic and Computer Engineering / MEng / Unrestricted
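The marginal-inference step at the heart of the second part can be illustrated with a toy model. The priors and likelihoods below are illustrative assumptions, not the dissertation's actual graphical model: a binary channel state is inferred from SU reports while summing over each SU's unknown honesty.

```python
import itertools

# Toy model (illustrative assumptions, not the dissertation's model):
# H: channel occupied (1) or free (0), prior P(H=1) = 0.3.
# Each SU is honest with prior 0.8; an honest report matches H with
# probability 0.9, while a malicious report is flipped with probability 0.9.
P_H1 = 0.3
P_HONEST = 0.8
P_CORRECT = 0.9

def report_likelihood(report, h, honest):
    p_match = P_CORRECT if honest else 1.0 - P_CORRECT
    return p_match if report == h else 1.0 - p_match

def posterior_occupied(reports):
    """Exact marginal P(H=1 | reports), summing over all honesty flags."""
    weights = {0: 0.0, 1: 0.0}
    for h in (0, 1):
        prior_h = P_H1 if h == 1 else 1.0 - P_H1
        for flags in itertools.product((0, 1), repeat=len(reports)):
            w = prior_h
            for r, f in zip(reports, flags):
                w *= P_HONEST if f else 1.0 - P_HONEST
                w *= report_likelihood(r, h, bool(f))
            weights[h] += w
    return weights[1] / (weights[0] + weights[1])

# Three SUs agree the band is busy; one dissents (possibly malicious).
print(round(posterior_occupied([1, 1, 1, 0]), 3))
```

Because the honesty flags are conditionally independent given the channel state, this sum factorises, which is exactly why belief propagation can compute the same marginal by message passing instead of enumerating all 2^n flag combinations.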
12

Pushing the Limits of Gossip-Based Decentralised Machine Learning

Giaretta, Lodovico January 2019 (has links)
Recent years have seen a sharp increase in the ubiquity and power of connected devices, such as smartphones, smart appliances and smart sensors. These devices produce large amounts of data that can be extremely precious for training larger, more advanced machine learning models. Unfortunately, it is sometimes not possible to collect and process these datasets on a central system, due either to their size or to the growing privacy requirements of digital data handling. To overcome this limit, researchers developed protocols to train global models in a decentralised fashion, exploiting the computational power of these edge devices. These protocols do not require any of the data on the device to be shared, relying instead on communicating partially-trained models. Unfortunately, real-world systems are notoriously hard to control, and may present a wide range of challenges that are easily overlooked in academic studies and simulations. This research analyses the gossip learning protocol, one of the main results in the area of decentralised machine learning, to assess its applicability to real-world scenarios. Specifically, this work identifies the main assumptions built into the protocol, and performs carefully-crafted simulations in order to test its behaviour when these assumptions are lifted. The results show that the protocol can already be applied to certain environments, but that it fails when exposed to certain conditions that appear in some real-world scenarios. In particular, the models trained by the protocol may be biased towards the data stored in nodes with faster communication speeds or a higher number of neighbours. Furthermore, certain communication topologies can have a strong negative impact on the convergence speed of the models. While this study also suggests effective mitigations for some of these issues, it appears that the gossip learning protocol requires further research efforts in order to ensure a wider industrial applicability.
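The merge step at the core of gossip learning — nodes repeatedly exchanging and averaging partially trained models — can be sketched in a few lines. This is a minimal consensus-style sketch under simplifying assumptions (a static, fully connected topology and no local training steps, which real gossip learning interleaves between merges):

```python
import random

def gossip_round(models, neighbours, rng):
    """One asynchronous gossip step: a random node pushes its model to a
    random neighbour, which averages it into its own (the merge step)."""
    sender = rng.randrange(len(models))
    receiver = rng.choice(neighbours[sender])
    models[receiver] = [(a + b) / 2
                        for a, b in zip(models[receiver], models[sender])]

rng = random.Random(0)
n = 8
# Fully connected topology; each node starts from a different "model".
neighbours = [[j for j in range(n) if j != i] for i in range(n)]
models = [[float(i)] for i in range(n)]

for _ in range(2000):
    gossip_round(models, neighbours, rng)

spread = max(m[0] for m in models) - min(m[0] for m in models)
print(round(spread, 6))  # nodes end up holding (almost) the same model
```

The thesis's findings map directly onto this sketch: skewing which `sender` is drawn (faster nodes gossip more often) or restricting `neighbours` (sparse topologies) biases the common model and slows its convergence.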
13

Le "système de coopération" entre les collectivités locales sud-américaines : un modèle en construction au service de l'intégration régionale / The system of cooperation between South American local governments : a model under construction to the benefit of regional integration

Albujar Carbajal, Sergio 21 September 2016 (has links)
Throughout South America, a political and legal system that recognizes local governments as actors of regional integration is under construction. This system mainly emerges from a spontaneous and coordinated convergence. Decentralization processes allow local governments to act externally, internationally. Inter-State integration initiatives support the cooperation between local authorities. Norms and rules for decentralization and for South American integration converge to set a minimum framework for the collaboration between local governments of different countries. An increase in decentralised cooperation is thus observed. Exchanges between local authorities have become natural on the political level, and legitimate on the legal level. Thanks to a circular dynamic, local governments adapt their administrations to the « territorial cooperation system » designed by States, and at the same time perfect it through their practices. At least two consequences flow from the system. Local authorities have the competence of free association throughout South America. They support the emergence or consolidation of transnational spaces. A new territorial division thus emerges. Bi-oceanic and cross-border spaces coexist with more traditional territories, such as municipalities, regions or States. Local authorities thereby contribute to regional integration in South America.
14

Renewable energy technologies assessment in providing sustainable electricity to Nigerian rural areas

Garba, Abdulhakeem January 2017 (has links)
The research work that underpins this thesis aims to investigate the viability of renewable energy technologies (RETs) and to develop a RETs implementation framework for providing sustainable electricity to Nigeria's rural areas. As a result of electricity supply deficiency in Nigeria, rural communities have been negatively affected in their socio-economic activities. A strength, weakness, opportunity and threat (SWOT) analysis, in combination with an assessment of sustainability indicators of RETs, identified the most appropriate technology for providing sustainable electricity in Nigeria's rural areas. Biomass energy technologies (BETs) are the most appropriate RET given significant resource availability. However, cost has been identified as the major barrier to adopting BETs. Both BETs and grid extension (GE) systems have been assessed: Whole Life Costing (WLC) was used to evaluate the economics of various capacities of BETs and GE systems, while interviews assessed the suitability of BETs. Typical findings revealed that all the BETs capacities evaluated, other than a 50kW direct combustion system, are currently cost-competitive with existing fossil fuel (FF) sources used in generating electricity in Nigeria (US$0.13/kWh without incentives). BETs are identified as preferable to GE systems for electricity provision to communities with a demand capacity of less than 50kW and a distance of less than five kilometres from load centres. Similarly, the interviews confirmed that BETs utilisation in the country's rural areas is suitable and desirable. For implementation, all the identified drivers and enablers of BETs should be considered, along with the identified constraints to the adoption and development of BETs, some of which should be addressed before implementation.
Further, a BETs implementation framework for sustainable electricity provision in rural areas has been developed through the selection of appropriate biomass feedstock and conversion technologies, supported by suitable incentive strategies. The framework was then evaluated and validated using six villages as case studies. The benefit of the framework is ensuring successful electricity provision in rural areas. Thus, this study recommends that the existing rural energy policies be reviewed to include incentive strategies, such as economic subsidies, in order to encourage investors' participation given the lack of energy infrastructure in rural areas.
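The whole-life-costing comparison against the US$0.13/kWh fossil-fuel benchmark amounts to discounting lifetime costs and lifetime output to a cost per kWh. A minimal sketch with hypothetical plant figures (illustrative numbers, not the thesis's data):

```python
def levelised_cost_per_kwh(capex, annual_opex, annual_kwh,
                           lifetime_years, discount_rate):
    """Whole-life cost per kWh: discounted lifetime costs divided by
    discounted lifetime energy output."""
    cost = float(capex)          # capital cost incurred up front
    energy = 0.0
    for year in range(1, lifetime_years + 1):
        df = (1 + discount_rate) ** -year   # discount factor for this year
        cost += annual_opex * df
        energy += annual_kwh * df
    return cost / energy

# Hypothetical 100 kW biomass plant running ~6,000 h/yr (assumed figures).
cost = levelised_cost_per_kwh(capex=250_000, annual_opex=20_000,
                              annual_kwh=600_000, lifetime_years=20,
                              discount_rate=0.10)
print(f"US${cost:.3f}/kWh vs fossil-fuel benchmark US$0.130/kWh")
```

A capacity is "cost-competitive" in the thesis's sense when this per-kWh figure falls below the US$0.13/kWh benchmark for existing fossil-fuel generation.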
15

Modelling and optimisation of a decentralised heat network and energy centre in London Docklands

Janjua, Azeem January 2018 (has links)
The following project aims to create a decentralised heat network development methodology which makes best use of heat sources and loads and can be widely applied to evaluate the energy economics of a heat network scheme and energy centre. As the energy transition takes shape, the key is connectivity and the potential, now or in the future, to aid the progressive development of energy systems and technologies, rather than traditional models that consider schemes individually, in isolation and not holistically; the former approach is more likely to yield robust, future-proof solutions. A methodology was formulated which encompassed various elements of decentralised energy masterplanning approaches and enabled heat demand loads and associated profiles to be simulated. The development of an optimisation model enabled strategies to be devised (maximisation of energy generation and of revenue, independently) over a set technology lifetime for the energy centre. The results conclude that the revenue-maximisation strategy is the most economically viable. An energy generation optimisation for the energy centre produced optimal results in terms of its heat generation profile; however, the scheme was not economically viable due to the significantly high capital costs associated with piping connections to multiple clusters. A CO2 emission analysis was carried out for a selection of energy technologies (CHP, heat pumps and gas boilers) for the heat network energy centre. An evaluation of the results concluded that the optimal technology selection for the energy centre for the minimisation of CO2 emissions is heat pumps. When selecting combinations of technologies for peak and base loads within the energy centre, heat pumps (base load) and gas boilers (peak load) are optimal when aiming to maximise revenue generation whilst minimising CO2 emissions.
In this case, reductions in associated CO2 emissions have been calculated to reach up to 89.07% when compared to a base case scenario with gas boiler technology (energy centre) alone. The methodology and models developed in this project can be widely applied to decentralised heat network projects in London in order to identify optimal development and expansion strategies and evaluate the energy economics of schemes.
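The heat-pump-versus-gas-boiler comparison reduces to dividing the input energy's carbon intensity by the conversion efficiency (or COP for a heat pump). A sketch with illustrative intensity figures — these are assumed values, not the project's data, so the resulting percentage differs from the 89.07% reported for the modelled scheme:

```python
def heat_emissions_g_per_kwh(fuel_intensity_g_per_kwh, efficiency):
    """CO2 per kWh of delivered heat = input-energy carbon intensity
    divided by the conversion efficiency (COP for a heat pump)."""
    return fuel_intensity_g_per_kwh / efficiency

# Illustrative carbon intensities (gCO2/kWh of input energy, assumed):
gas_boiler = heat_emissions_g_per_kwh(184, 0.90)  # natural gas, 90% boiler
heat_pump = heat_emissions_g_per_kwh(233, 3.5)    # grid electricity, COP 3.5

reduction = 100 * (1 - heat_pump / gas_boiler)
print(f"heat pump saves {reduction:.1f}% CO2 per kWh of heat vs gas boiler")
```

The same arithmetic explains the peak/base pairing: a high-COP heat pump carries the (large) base load at low emissions, while cheap gas boilers cover only the (small) peaks.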
16

Service recommendation and selection in centralized and decentralized environments

Ahmed, Mariwan January 2017 (has links)
With the increasing use of web services in everyday tasks we are entering an era of the Internet of Services (IoS). Service discovery and selection in both centralized and decentralized environments have become a critical issue in the area of web services, in particular when services have similar functionality but different Quality of Service (QoS). As a result, selecting a high-quality service that best suits consumer requirements from a large list of functionally equivalent services is a challenging task. In response to the increasing number of services in the discovery and selection process, there is a corresponding increase in service consumers and a consequent diversity in the Quality of Service (QoS) available. Growth on both sides leads to diversity in the demand and supply of services, which often results in only a partial match between requirements and offers. Furthermore, it is challenging for customers to select suitable services from a large number of services that satisfy their functional requirements. Therefore, web service recommendation becomes an attractive solution for providing consumers with recommended services that satisfy their requirements. In this thesis, a service ranking and selection algorithm is first proposed that considers multiple QoS requirements and allows partially matched services to be counted as candidates for the selection process. Starting from the initial list of available services, the approach retains those services that partially match consumer requirements and ranks them based on the QoS parameters, allowing the consumer to select a suitable service. In addition, since providing weight values for QoS parameters might not be an easy or intuitive task for consumers, an automatic weight calculation method for consumer requirements has been included, utilizing distance correlation between QoS parameters. The second aspect of the work in the thesis is the process of QoS-based web service recommendation.
With an increasing number of web services having similar functionality, it is challenging for service consumers to find suitable web services that meet their requirements. We propose a personalised service recommendation method using the LDA topic model, which extracts latent interests of consumers and latent topics of services in the form of probability distributions. In addition, the proposed method improves the accuracy of QoS property prediction by considering the correlation between neighbouring services, and returns a list of recommended services that best satisfy consumer requirements. The third part of the thesis concerns service discovery and selection in a decentralized environment. Service discovery approaches are often supported by centralized repositories that can suffer from single points of failure, performance bottlenecks, and scalability issues in large-scale systems. To address these issues, we propose a context-aware service discovery and selection approach for a decentralized peer-to-peer environment. In this approach, homophily similarity is used for bootstrapping and distribution of nodes. The discovery process is based on the similarity of nodes and on their previous interactions and behaviour, which aids discovery in a dynamic environment. Our approach considers not only service discovery but also the selection of suitable web services by taking into account their QoS properties. The major contribution of the thesis is a comprehensive QoS-based approach to service recommendation and selection in centralized and decentralized environments, with which consumers are able to select suitable services based on their requirements. Experimental results on real-world service datasets showed that the proposed approaches achieve better performance and efficiency in the recommendation and selection process.
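The partial-match ranking idea — keeping services that meet only some QoS constraints as scored candidates rather than discarding them — can be sketched as follows. The scoring scheme, attribute names and weights below are hypothetical illustrations, not the thesis's exact algorithm (which also derives the weights automatically via distance correlation):

```python
def rank_services(services, requirements, weights):
    """Score each service by weighted closeness of its QoS offer to the
    requirements; partially matching services stay in the candidate list
    with a lower score instead of being filtered out."""
    ranked = []
    for name, qos in services.items():
        score = 0.0
        for attr, required in requirements.items():
            offered = qos[attr]
            if attr in ("reliability", "availability"):  # higher is better
                ratio = offered / required
            else:                                        # e.g. latency: lower is better
                ratio = required / offered
            score += weights[attr] * min(ratio, 1.0)     # cap full matches at 1
        ranked.append((score, name))
    return [name for score, name in sorted(ranked, reverse=True)]

services = {
    "s1": {"latency_ms": 120, "reliability": 0.99},  # partial match on latency
    "s2": {"latency_ms": 80,  "reliability": 0.93},  # partial match on reliability
    "s3": {"latency_ms": 300, "reliability": 0.95},  # weak on both
}
requirements = {"latency_ms": 100, "reliability": 0.98}
weights = {"latency_ms": 0.5, "reliability": 0.5}
print(rank_services(services, requirements, weights))
```

Note that no service fully satisfies both constraints, yet all three remain ranked candidates — the behaviour a strict filter would lose.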
17

Urban Growth and Energy Supply in Africa: The Case of Ethiopia

Hoeltl, Andrea, Brandtweiner, Roman, Berger, Tania, Bates, Romana January 2018 (has links) (PDF)
Ethiopia is rapidly urbanising. As in other urban areas in developing countries, major issues in Ethiopia include a high level of income inequality, a lack of formal employment opportunities, deeply rooted poverty, tenure insecurity, poor infrastructure, and limited access to electricity and energy. Settlers frequently end up in impoverished urban squatter settlements and slums which do not offer them even the most basic infrastructure and hence fail to provide them with the prospects they came for. Onward migration to farther-off destinations, such as EU member states, thus often remains the sole option for those caught in such urban poverty traps. Although the issue of informal urban settlements is not new to the context of Ethiopian cities, the current rapid urban growth rates are putting urban rental markets, as well as infrastructure and energy supply, under considerable pressure. The paper investigates the respective situation in Ethiopia and presents some best-practice examples. In the context of Ethiopian cities, energy production and distribution have been highly centralised under state entities, and the scope for exploring locally and business-driven, decentralised systems has been limited. Transitions towards sustainability and the United Nations' Sustainable Development Goals can be implemented if collective identification and structuring of issues, along with collective envisioning of the future, is provoked or facilitated.
18

mHealth-supported hearing and vision services for preschool children in low-income communities

Eksteen, Susan January 2021 (has links)
Sensory inputs of hearing and vision during early childhood development support the achievement of language, speech and educational milestones. The early detection of sensory impairment is essential for facilitating early childhood development, socio-emotional well-being and academic success, in addition to obtaining sustainable educational development goals. The majority of children with sensory impairment live in low- and middle-income countries where services are often unavailable or inaccessible, because of the absence of systematic screening programmes for children, prohibitive equipment cost, a shortage of trained personnel and centralised service-delivery models. Therefore, research is needed to investigate whether a community-based mobile health (mHealth) supported service-delivery model for hearing and vision screening can increase access to hearing and vision services for children in resource-constrained settings. This study aimed to describe an implemented hearing and vision screening programme and evaluate its success in terms of acceptability (consent return numbers), coverage (number of eligible children screened), referral rates and quality indicators (duration of tests and number of hearing tests conducted under conditions of excessive noise levels). The study also explored the challenges faced during a community-based screening programme and the strategies developed to address these. Four non-professionals were appointed and trained as community health workers (CHWs) to conduct combined sensory screening using mHealth technology (hearScreen application, hearXGroup, South Africa and Peek Acuity application, Peek Vision, United Kingdom) on smartphones at preschools in low-income communities in Cape Town, South Africa. The consent form return rate was 82.0%, and the coverage rate was 94.4%. An average of 501 children were screened each month, at a cost of US$5.63 per child. 
The number of children who failed hearing and vision screening was 435 (5.4%) and 170 (2.1%), respectively. Failing of hearing tests was associated with longer test times (odds ratio [OR]: 1.022; 95% confidence interval [CI]: 1.021–1.024) and excessive background noise levels at 1 kHz (e.g. OR for left ear: 1.688; 95% CI: 1.198–2.377). Failing of visual screening tests was associated with longer test duration (OR: 1.003; 95% CI: 1.002–1.005) and younger age (OR: 0.629; 95% CI: 0.520–0.761). The study also aimed to describe and compare the performance of two screening protocols that were used in this preschool hearing screening programme to determine optimal referral criteria that are responsive to available resources. A secondary data analysis was done to compare a protocol using a single-frequency fail criterion (with which 2,147 children were screened between 1 October 2017 and 25 February 2018) with a screening protocol using a two-frequency fail criterion (with which 5,782 children were screened between 26 February 2018 and 30 November 2018). For both protocols, screening was done at a 25 dB hearing level (HL) at 1000, 2000 and 4000 Hz. Both protocols included an immediate rescreen at the frequencies that were failed. The referral rate was 8.7% (n = 186) for the one-frequency fail protocol and 4.3% (n = 250) for the two-frequency fail protocol. Compared to the one-frequency fail protocol, children screened with the two-frequency fail protocol were 52.9% less likely to fail (OR: 0.471; 95% CI: 0.385–0.575). Gender (OR: 0.807; 95% CI: 0.531–1.225) and age (OR: 0.996; 95% CI: 0.708–1.402) had no significant effect on screening outcomes. Maximum permissible ambient noise levels (MPANLs) were exceeded in 44.7% of cases in at least one ear at 1000 Hz across both protocols. There was no significant difference between the protocols for either true positive or false positive cases.
Protocol (OR: 1.338; 95% CI: 0.854–2.098), gender (OR: 0.807; 95% CI: 0.531–1.225) and age (OR: 0.996; 95% CI: 0.708–1.402) demonstrated no significant effect on the odds of producing true positive results. Average time for conducting the screening was 72.8 s (78.66 SD) for the one-frequency fail protocol and 64.9 s (55.78 SD) for the two-frequency fail protocol. Estimating the prevalence and describing the characteristics of sensory loss in a preschool population in low-income communities are important steps to ensure adequate planning and successful implementation of community-based hearing and vision care in this context. The study therefore also investigated the prevalence and characteristics of hearing and vision loss among preschool children (4 to 7 years) in an underserved South African community after implementing mHealth-supported community-based hearing and vision services. Children who failed hearing and vision screening were seen for follow-up assessments at their preschools. Follow-up assessments were also performed with smartphones and hearing and vision testing applications (hearTest application, hearX Group, South Africa and PeekAcuity app, Peek Vision, United Kingdom). A total of 10,390 children were screened at 298 preschools over 22 months. Of the children screened, 5.6% and 4.4% of children failed hearing and vision screening, respectively. Community-based follow-up hearing tests were done at the preschools on 88.5% (514) of the children, of whom 240 children (54.2% female) presented with hearing loss. A preschool-based follow-up vision test was conducted on 400 children (88.1%). A total of 232 children (46.1% female) had a vision impairment, and a further 32 children passed the test but had obvious signs of ocular morbidity. Logistic regression analysis found that age was a significant predictor of vision loss (p < 0.001): with every 1-year increase in age, participants were 51.4% less likely to have vision loss (OR: 0.49, 95% CI: 0.39–0.60). 
Age was not a significant predictor for hearing loss (OR: 0.821; 95% CI: 0.667–1.011). Gender was not a significant predictor of hearing loss (OR: 0.850; 95% CI: 0.658–1.099) or vision loss (OR: 1.185; 95% CI: 0.912–1.540). The prevalence of hearing loss at a pure tone average (PTA) of 25 dB HL ranged between 2.3% (240 out of 10,390; assuming none of the non-attenders and children who were unable to be tested had hearing loss) and 3.1% (321 out of 10,390; assuming all the non-attenders and children who were unable to be tested presented with hearing loss). The prevalence of vision loss ranged between 2.2% (232 out of 10,390; assuming none of the non-attenders had vision loss) and 2.8% (286 out of 10,390; assuming all the non-attenders presented with vision loss). Findings of this research project indicate that mHealth-supported CHW-delivered hearing and vision screening in preschools provide a low-cost, efficient and accessible service that can improve the provision of affordable hearing and vision care. This service-delivery model is affordable and scalable, because the same staff, needing minimal training, and the same equipment are used to screen for both vision and hearing. Timely identification of sensory losses is essential to ensure optimal outcomes and can be facilitated through community-based hearing and vision services by trained CHWs using mHealth technology. Future studies should aim to report on outcomes and the uptake and impact of interventions on the children diagnosed with sensory impairments following identification through a decentralised screening programme. / Thesis (PhD (Audiology))--University of Pretoria, 2021. / Sonova AG / Hear the World Foundation / Speech-Language Pathology and Audiology / PhD (Audiology) / Unrestricted
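The difference between the one-frequency and two-frequency fail protocols comes down to how many failed frequencies (after the immediate rescreen) trigger a referral. A sketch of that decision rule — the per-ear reading of the criterion is an assumption for illustration, not taken from the programme's implementation:

```python
FREQS_HZ = (1000, 2000, 4000)  # screened at 25 dB HL in both protocols

def refer(ear_results, min_fails):
    """Refer a child if at least `min_fails` of the screened frequencies
    were failed in either ear (assumed per-ear interpretation).
    `ear_results` maps ear -> set of failed frequencies after rescreen."""
    return any(len(failed) >= min_fails for failed in ear_results.values())

# A child failing only 1 kHz in the left ear (the frequency most affected
# by ambient noise) is referred under one protocol but not the other.
child = {"left": {1000}, "right": set()}
print(refer(child, min_fails=1), refer(child, min_fails=2))  # → True False
```

This illustrates why the stricter two-frequency criterion roughly halved the referral rate: isolated single-frequency failures, often noise-related, no longer trigger a refer.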
19

Decentralised Multi-agent Search, Track and Defence Coordination using a PMBM filter and Data-driven Robust Optimisation

Söderberg, Anton, Vines, Jesper January 2023 (has links)
In an air defence scenario, decisions need to be taken with extreme precision and under high pressure. These decisions become even more challenging when the aircraft in question need to function as a team and coordinate their efforts. Because of the difficulty of the task, and the amount of information that must be processed rapidly, fighter pilots can benefit greatly from computer-assisted decision making. In this thesis, this kind of decentralised multi-agent coordination problem is studied, and mission assignment models based on robust and stochastic optimisation are evaluated. Since the information obtained by aircraft sensors often suffers from a notable amount of noise, and the scenario state is therefore uncertain, a Poisson multi-Bernoulli mixture filter is implemented to model these noisy measurements and keep track of potential adversaries. The study finds that the filter was more than capable of handling the scenario uncertainties and provided valuable task information to the mission assignment models. However, the preliminary robust optimisation models, based entirely on the positional uncertainty of the adversaries, were not sophisticated enough for such a complex coordination problem, indicating that further research is needed in this area.
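A full PMBM filter is beyond a short sketch, but its per-track building block — a Gaussian (Kalman) measurement update that tightens a target estimate as noisy sensor reports arrive — can be illustrated in one dimension. All numbers are illustrative; a PMBM filter additionally maintains existence probabilities and data-association hypotheses over many such tracks:

```python
def kalman_update(mean, var, z, meas_var):
    """One 1-D Kalman measurement update: fuse a noisy measurement z
    (variance meas_var) into a Gaussian estimate (mean, var)."""
    k = var / (var + meas_var)          # Kalman gain: trust in the measurement
    return mean + k * (z - mean), (1 - k) * var

mean, var = 0.0, 100.0                  # vague prior on a target's position
for z in (10.2, 9.7, 10.1, 9.9):        # noisy position measurements
    mean, var = kalman_update(mean, var, z, meas_var=1.0)

print(round(mean, 2), round(var, 3))    # estimate converges near 10, var shrinks
```

The mission assignment models in the thesis consume exactly this kind of output: a position estimate together with its remaining uncertainty (`var`), which the robust optimisation treats as the adversary's positional uncertainty.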
20

A Comprehensive study on Federated Learning frameworks : Assessing Performance, Scalability, and Benchmarking with Deep Learning Model

Hamsath Mohammed Khan, Riyas January 2023 (has links)
Federated Learning has nowadays emerged as a promising standard for machine learning model training that can be executed collaboratively on decentralized data sources. As the adoption of Federated Learning grows, selecting the right framework for a given use case has become more important. This study concentrates on a comprehensive overview of three prominent Federated Learning frameworks: Flower, FedN, and FedML. The performance, scalability, and usability of these frameworks are assessed on the basis of an NLP use case. The study commences with an overview of Federated Learning and its significance in distributed learning scenarios. We then examine the Flower framework in depth, covering its structure, communication methods, and interaction with deep learning libraries. The performance of Flower is evaluated by conducting experiments on a standard benchmark dataset, with metrics measuring accuracy, speed, and scalability. Tests are also conducted to assess Flower's ability to handle large-scale Federated Learning setups. The same is carried out for the other two frameworks, FedN and FedML. To gain better insight into the strengths, limitations, and suitability of Flower, FedN, and FedML for different Federated Learning scenarios, the study applies this comparative analysis to a real-world use case. The possibilities for integrating these frameworks with current machine learning workflows are discussed. The final results and conclusions may help researchers and practitioners make informed decisions regarding framework selection for their Federated Learning applications. / There is other digital material (e.g. film, image or audio files) or models/artifacts belonging to the thesis that needs to be archived.
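The server-side aggregation that all three frameworks implement is, in its simplest form, federated averaging: client model parameters averaged with weights proportional to local dataset size. A framework-agnostic sketch (not the API of Flower, FedN or FedML — each wraps this step in its own strategy/aggregator abstraction):

```python
def fedavg(client_weights, client_sizes):
    """One FedAvg aggregation round: combine locally trained parameter
    vectors, weighting each client by its local dataset size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += w[i] * n / total
    return agg

clients = [[0.2, 1.0], [0.4, 0.0], [0.3, 0.5]]  # locally trained parameters
sizes = [100, 300, 600]                          # local dataset sizes
print(fedavg(clients, sizes))                    # → [0.32, 0.4]
```

In practice no raw data leaves the clients — only these parameter vectors are communicated, which is the property that motivates comparing the frameworks on scalability and communication overhead.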
