About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Optimizing YOLOv5 Deployment: A Comparative Study of In-Node and Remote Processing on Edge Devices

Wijitchakhorn, Alice January 2024 (has links)
Artificial intelligence is advancing quickly, and object detection has become a key part of this field. Object detection enables automated systems to identify and localise objects in images with high accuracy. One of the most prominent methods for this purpose is YOLOv5 (You Only Look Once, version 5), known for its speed and efficiency in real-time applications. However, deploying such sophisticated technology on smaller devices like the Raspberry Pi 4 is challenging, mainly because of the limited processing power and energy available on such small devices. This thesis explores the best way to use the YOLOv5 model with respect to energy efficiency and communication latency. These aspects are crucial for devices that need to be efficient, such as smartphones, drones, and other portable devices. The study compares two main ways to set up the system: processing directly on the device (in-node) and processing remotely on a server or in the cloud. Choosing where to process the data affects the effectiveness of object detection in real-world applications. Processing on the device can be better for privacy and response time, since it does not need to send data over a network, but it may use more energy and put more strain on the device. Remote processing, on the other hand, can use powerful computers to improve performance and reduce the load on the device, but it may increase latency and raise privacy issues. By using remote server resources, the workload on single-board devices like the Raspberry Pi is drastically reduced, which yields better energy efficiency and latency across all tested resolutions.
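For readers who want to see how such an in-node versus remote comparison can be instrumented, the sketch below times local YOLOv5 inference against offloading a frame to a remote inference server. It is an illustrative harness only, not the thesis's code: the server address, its upload API, and the test image names are assumptions.

```python
# Minimal sketch (not the thesis code): timing in-node YOLOv5 inference on a
# device such as a Raspberry Pi versus offloading the same frame to a remote
# inference server. The server URL and its upload API are hypothetical.
import time

import requests
import torch
from PIL import Image

REMOTE_URL = "http://192.168.1.10:5000/detect"  # hypothetical server endpoint

model = torch.hub.load("ultralytics/yolov5", "yolov5s")  # in-node model

def in_node_latency(image_path: str) -> float:
    img = Image.open(image_path)
    start = time.perf_counter()
    _ = model(img)                       # local inference on the device
    return time.perf_counter() - start

def remote_latency(image_path: str) -> float:
    with open(image_path, "rb") as f:
        payload = f.read()
    start = time.perf_counter()
    _ = requests.post(REMOTE_URL, files={"image": payload}, timeout=30)
    return time.perf_counter() - start   # includes the network round trip

if __name__ == "__main__":
    for resolution in ("320.jpg", "640.jpg", "1280.jpg"):  # pre-resized test images
        print(resolution, in_node_latency(resolution), remote_latency(resolution))
```

Repeating the loop over many frames and logging power draw alongside the timings would give the kind of energy-versus-latency comparison the abstract reports.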
32

Fog Computing with Go: A Comparative Study

Butterfield, Ellis H 01 January 2016 (has links)
The Internet of Things is a recent computing paradigm, defined by networks of highly connected things (sensors, actuators and smart objects) communicating across networks of homes, buildings, vehicles, and even people. The Internet of Things brings with it a host of new problems, from managing security on constrained devices to processing never-before-seen amounts of data. While cloud computing might be able to keep up with current data processing and computational demands, it is unclear whether it can be extended to the requirements brought forth by the Internet of Things. Fog computing provides an architectural solution to some of these problems by providing a layer of intermediary nodes within what is called an edge network, separating the local object networks from the Cloud. These edge nodes provide interoperability, real-time interaction, routing, and, if necessary, computational delegation to the Cloud. This paper attempts to evaluate Go, a distributed systems language developed by Google, in the context of the requirements set forth by Fog computing. Methodologies similar to those of previous literature are simulated and benchmarked against in order to assess the viability of Go in the edge nodes of a Fog computing architecture.
33

Privacy Protection and Mobility Enhancement in Internet

Ping Zhang (6595925) 10 June 2019 (has links)
The Internet has substantially embraced mobility over the last decade. Cellular data networks carry the majority of mobile Internet access traffic and have become the de facto way of accessing the Internet in a mobile fashion, while many clean-slate Internet mobility solutions have been proposed but none of them has been widely deployed. Mobile Internet users are increasingly concerned about their privacy, as both research and real-world incidents show that leaks of communication and location privacy can lead to serious consequences. The communication itself between a mobile user and their peer users or websites can leak considerable private information, such as location history, to other parties. Additionally, compared to ordinary Internet access, connecting through a cellular network does not yet provide equivalent connection stability or longevity.

In this research we proposed a novel paradigm that leverages concurrent far-side proxies to maximize network location privacy protection and minimize the interruption and performance penalty brought by mobility. To avoid the deployment-feasibility hurdle, we also investigated the root causes impeding the adoption of existing Internet mobility proposals and proposed guidelines on how to create an economically feasible solution for this goal. Based on these findings we designed a mobility support system, offered as a value-added service by mobility service providers and built on elastic infrastructure that leverages various cloud-aided designs, to satisfy economic feasibility and explore the architectural trade-offs among service QoS, economic viability, security and privacy.
34

Fog and edge computing: a hybrid architecture in an Internet of Things environment

Schenfeld, Matheus Crespi 23 March 2017 (has links)
Internet of Things (IoT) is considered a computational evolution that advocates the existence of a large number of physical objects embedded with sensors and actuators, connected by wireless networks and communicating through the Internet. From the emergence of the concept to the present day, IoT has been widely used in various sectors of industry and also in academia. One of the needs encountered in these areas is to stay connected to IoT devices or subsystems spread throughout the world. Thus, cloud computing gains ground in these scenarios, where there is a need to be connected to and communicating with a middleware that performs the processing of device data. The concept of cloud computing refers to the use of memory, storage and processing of shared resources, interconnected by the Internet. However, IoT applications that are sensitive to communication latency, such as medical emergency applications, military applications and critical security applications, are not feasible with cloud computing alone, since executing every calculation and action requires messaging between the devices and the cloud. To overcome this limitation of cloud computing, the concept of fog computing arises, whose main idea is to create a federated processing layer within the local network of the computing devices at the edges of the network. In addition to fog computing, there is also edge computing, which operates directly in the device layer, performing some processing, even of low computational complexity, in order to further decrease the volume of communication and to help provide autonomy for decision making in the Things layer itself. A major challenge for both fog and edge computing within the IoT scenario is the definition of a system architecture that can be used in different application domains, such as health, smart cities and others. This work presents a system architecture for IoT devices capable of enabling data processing in the devices themselves, or as close to them as possible, creating edge computing and fog computing layers that can be applied in different domains, improving Quality of Service (QoS) and autonomy in decision making, even if the devices are temporarily disconnected from the network (offline). The validation of this architecture was carried out in two application scenarios, one involving public lighting in a smart-city environment and another simulating an intelligent agricultural greenhouse. The main objective of the tests was to verify whether the use of the edge and fog computing concepts improves system efficiency compared with traditional IoT architectures. The tests revealed satisfactory results, improving connection, processing and information-delivery times to applications, reducing the volume of communication between devices and the core middleware, and improving communication security. The work also presents a review of related work from both academia and industry.
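The local-first behaviour this hybrid edge/fog architecture describes, processing on the device when possible, delegating heavier work to a fog node on the local network, and continuing to operate while offline, can be pictured with the sketch below. The class, threshold, and fog-node interface are illustrative assumptions rather than the architecture proposed in the dissertation.

```python
# Illustrative sketch (not the dissertation's architecture): an edge device that
# handles simple readings locally, delegates heavier work to a fog node on the
# local network, and keeps working while temporarily offline. The fog_client
# interface and the threshold rule are assumptions for illustration.
import queue

class EdgeDevice:
    def __init__(self, fog_client, threshold: float = 30.0):
        self.fog_client = fog_client          # wraps access to the local fog layer
        self.threshold = threshold            # decision rule simple enough to run on-device
        self.offline_buffer = queue.Queue()   # retains readings while disconnected

    def handle_reading(self, sensor_id: str, value: float) -> str:
        # Edge layer: low-complexity decisions stay on the device, which preserves
        # autonomy even when the network is down.
        if value <= self.threshold:
            return "ok-local"
        # Fog layer: heavier processing is delegated to a node in the local network.
        try:
            return self.fog_client.process(sensor_id, value)
        except ConnectionError:
            self.offline_buffer.put((sensor_id, value))   # replay once reconnected
            return "queued-offline"
```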
35

An Approach to QoS-based Task Distribution in Edge Computing Networks for IoT Applications

January 2018 (has links)
Internet of Things (IoT) is emerging as part of the infrastructures for advancing a large variety of applications involving connections of many intelligent devices, leading to smart communities. Due to the severe limitation of the computing resources of IoT devices, it is common to offload tasks of various applications requiring substantial computing resources to computing systems with sufficient computing resources, such as servers, cloud systems, and/or data centers, for processing. However, this offloading method suffers from both high latency and network congestion in the IoT infrastructures. Recently, edge computing has emerged to reduce the negative impacts of offloading tasks to remote computing systems. As edge computing is in close proximity to IoT devices, it can reduce the latency of task offloading and reduce network congestion. Yet, edge computing has its drawbacks, such as the limited computing resources of some edge computing devices and the unbalanced loads among these devices. In order to effectively explore the potential of edge computing to support IoT applications, it is necessary to have efficient task management and load balancing in edge computing networks. In this dissertation research, an approach is presented for periodically distributing tasks within the edge computing network while satisfying the quality-of-service (QoS) requirements of tasks. The QoS requirements include the task completion deadline and a security requirement. The approach aims to maximize the number of tasks that can be accommodated in the edge computing network, with consideration of tasks' priorities. The goal is achieved through the joint optimization of computing resource allocation and network bandwidth provisioning. Evaluation results show the improvement of the approach in increasing the number of tasks that can be accommodated in the edge computing network and its efficiency in resource utilization. / Dissertation/Thesis / Doctoral Dissertation Computer Engineering 2018
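As a rough illustration of the kind of QoS-aware task distribution described above, the sketch below greedily admits tasks in priority order onto edge nodes whose remaining CPU capacity and link bandwidth can still meet each task's deadline. It is a toy approximation under assumed resource models, not the joint optimization formulated in the dissertation.

```python
# Toy sketch of priority-aware, deadline-constrained task admission in an edge
# network. Greedy approximation for illustration only; the dissertation instead
# jointly optimizes computing resource allocation and bandwidth provisioning.
from dataclasses import dataclass

@dataclass
class Task:
    priority: int        # higher = more important
    cycles: float        # CPU cycles required
    data_bits: float     # input data to transfer to the node
    deadline_s: float    # completion deadline in seconds

@dataclass
class EdgeNode:
    cpu_hz: float        # remaining computing capacity
    link_bps: float      # remaining bandwidth to the node

def admit(tasks: list[Task], nodes: list[EdgeNode]) -> int:
    admitted = 0
    for task in sorted(tasks, key=lambda t: -t.priority):
        for node in nodes:
            if node.cpu_hz <= 0 or node.link_bps <= 0:
                continue
            # transfer time plus execution time must fit inside the deadline
            finish = task.data_bits / node.link_bps + task.cycles / node.cpu_hz
            if finish <= task.deadline_s:
                # reserve a proportional share of the node's resources (simplified)
                node.cpu_hz -= task.cycles / task.deadline_s
                node.link_bps -= task.data_bits / task.deadline_s
                admitted += 1
                break
    return admitted
```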
36

Planning of Mobile Edge Computing Resources in 5G Based on Uplink Energy Efficiency

Singh, Navjot 19 November 2018 (has links)
The increasing number of devices demanding low latency and high-speed data transmission requires that computation resources be moved closer to users. The emerging Mobile Edge Computing (MEC) technology aims to bring the advantages of cloud computing, namely computation, storage and networking capabilities, into close proximity to the user. MEC servers are also integrated with cloud servers, which gives them the flexibility of reaching vast computational power whenever needed. In this thesis, leveraging the idea of Mobile Edge Computing, we propose algorithms for cost-efficient and energy-efficient placement of Mobile Edge nodes. We focus on uplink energy efficiency, which is essential for certain applications, including augmented reality and connected vehicles, and which also extends the battery life of user equipment, a benefit for all applications. The experimental results show that our proposed schemes significantly reduce the uplink energy of devices and minimize the number of edge nodes required in the network.
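One simple way to picture the placement problem, using few edge nodes while keeping every user within a short (and therefore low-energy) uplink range, is a greedy coverage heuristic like the sketch below. The coverage radius, coordinates, and the heuristic itself are assumptions for illustration, not the algorithms proposed in the thesis.

```python
# Hedged sketch: a greedy coverage heuristic for placing MEC nodes so that every
# user is within range of some node, which keeps uplink transmit distance (and
# hence uplink energy) low while using few nodes. Generic illustration only.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def place_mec_nodes(users, candidate_sites, radius):
    """Return the subset of candidate sites chosen to cover all users."""
    uncovered = set(range(len(users)))
    chosen = []
    while uncovered:
        # pick the site that covers the most still-uncovered users
        best_site, best_cover = None, set()
        for site in candidate_sites:
            cover = {i for i in uncovered if distance(users[i], site) <= radius}
            if len(cover) > len(best_cover):
                best_site, best_cover = site, cover
        if not best_cover:      # remaining users are unreachable at this radius
            break
        chosen.append(best_site)
        uncovered -= best_cover
    return chosen

# Example: users and candidate sites as (x, y) metres, 300 m serving radius.
print(place_mec_nodes([(0, 0), (100, 50), (800, 800)],
                      [(50, 25), (750, 790)], radius=300))
```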
37

Study of Knowledge Transfer Techniques For Deep Learning on Edge Devices

January 2018 (has links)
With the emergence of the edge computing paradigm, many applications such as image recognition and augmented reality need to perform machine learning (ML) and artificial intelligence (AI) tasks on edge devices. Most AI and ML models are large and computationally heavy, whereas edge devices are usually equipped with limited computational and storage resources. Such models can be compressed and reduced in order to be placed on edge devices, but they may lose their capability and may not generalize and perform well compared to large models. Recent works have used knowledge transfer techniques to transfer information from a large network (termed the teacher) to a small one (termed the student) in order to improve the performance of the latter. This approach seems promising for learning on edge devices, but a thorough investigation of its effectiveness is lacking. The purpose of this work is to provide an extensive study of the performance (both in terms of accuracy and convergence speed) of knowledge transfer, considering different student-teacher architectures, datasets and different techniques for transferring knowledge from teacher to student. A good performance improvement is obtained by transferring knowledge from both the intermediate layers and the last layer of the teacher to a shallower student. But other architectures and transfer techniques do not fare so well, and some of them even lead to negative performance impact. For example, a smaller and shorter network trained with knowledge transfer on Caltech 101 achieved a significant improvement of 7.36% in accuracy and converged 16 times faster compared to the same network trained without knowledge transfer. On the other hand, a smaller network that is thinner than the teacher network performed worse, with an accuracy drop of 9.48% on Caltech 101, even with knowledge transfer. / Dissertation/Thesis / Masters Thesis Computer Science 2018
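The teacher-to-student transfer studied above is commonly implemented as knowledge distillation, blending a softened-teacher loss with the ordinary label loss. The sketch below shows that idea in PyTorch; the temperature, loss weighting, and random stand-in tensors are assumptions, and the intermediate-layer transfer mentioned in the abstract is omitted for brevity.

```python
# Minimal knowledge-distillation sketch in PyTorch, illustrating last-layer
# teacher-to-student transfer. Temperature, weighting and tensor shapes are
# assumptions for illustration, not the thesis setup.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 4.0, alpha: float = 0.7):
    """Blend soft-target loss (from the teacher) with ordinary cross-entropy."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # KL divergence on softened distributions, scaled by T^2 as is conventional
    kd = F.kl_div(soft_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce

# Example with random tensors standing in for a batch of 8 samples and 101
# classes (Caltech 101 has 101 object categories).
student = torch.randn(8, 101, requires_grad=True)
teacher = torch.randn(8, 101)
labels = torch.randint(0, 101, (8,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```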
38

Distributed Intelligence-Assisted Autonomic Context-Information Management : A context-based approach to handling vast amounts of heterogeneous IoT data

Rahman, Hasibur January 2018 (has links)
As an implication of the rapid growth in Internet-of-Things (IoT) data, the current focus has shifted towards utilizing and analysing the data in order to make sense of it. The aim is to make instantaneous, automated, and informed decisions that will drive the future IoT. This corresponds to extracting and applying knowledge from IoT data, which brings both a substantial challenge and high value. Context plays an important role in reaping value from data, and is capable of countering the IoT data challenges. The management of heterogeneous contextualized data is infeasible with existing solutions, which mandates new ones. Research until now has mostly concentrated on providing cloud-based IoT solutions; among other issues, this raises problems for real-time and fast decision-making. In view of this, this dissertation undertakes a study of a context-based approach entitled Distributed intelligence-assisted Autonomic Context Information Management (DACIM), the purpose of which is to efficiently (i) utilize and (ii) analyse IoT data. To address the challenges and solutions with respect to enabling DACIM, the dissertation starts by proposing a logical-clustering approach for proper IoT data utilization. The environment in which the Things are immersed changes rapidly and becomes dynamic. To this end, self-organization has been supported by proposing self-* algorithms, which resulted in 10 organized Things per second and a high accuracy rate for Things joining. Contextualized IoT data further requires scalable dissemination, which has been addressed by a Publish/Subscribe model, and it has been shown that a high publication rate and fast subscription matching are realisable. The dissertation ends with the proposal of a new approach that assists the distribution of intelligence for analysing context information, so as to support the intelligence of Things. The approach brings some of the application of knowledge from the cloud to the edge, where the edge-based solution is equipped with intelligence that enables faster responses and reduced dependency on rules by leveraging artificial intelligence techniques. To infer knowledge for different IoT applications closer to the Things, a multi-modal reasoner has been proposed which demonstrates faster response. The evaluations of the designed and developed DACIM give promising results, which are distributed over seven publications; from these, it can be concluded that it is feasible to realize a distributed intelligence-assisted context-based approach that contributes towards autonomic context information management in the ever-expanding IoT realm. / At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 7: Submitted.
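The scalable dissemination step can be pictured with a minimal topic-based Publish/Subscribe broker: producers publish contextualized readings to a topic, and subscription matching delivers them to every registered consumer. The topic naming and callback interface below are illustrative assumptions, not DACIM's actual model.

```python
# Toy topic-based Publish/Subscribe broker sketching scalable dissemination of
# contextualized IoT data. Topic scheme and callbacks are illustrative only.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscriptions = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic: str, callback):
        self.subscriptions[topic].append(callback)

    def publish(self, topic: str, payload: dict):
        # subscription matching: deliver to every callback registered on the topic
        for callback in self.subscriptions.get(topic, []):
            callback(topic, payload)

broker = Broker()
broker.subscribe("room42/temperature",
                 lambda t, p: print(f"{t}: {p['value']} degC"))
broker.publish("room42/temperature", {"value": 21.5, "source": "sensor-7"})
```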
39

An evaluation of how edge computing is enabling the opportunities for Industry 4.0

Svensson, Wictor January 2020 (has links)
Connecting factories to the Internet and enabling them to talk to each other autonomously is called the Industrial Internet of Things (IIoT), and is referred to as Industry 4.0 in terms of the industrial revolutions. The machines collect data through a great many different sensors and need to share these values with each other and with the cloud. This creates a large load on the cloud and the Internet, and the latency becomes high. To evaluate how the workload and the latency can be reduced while still getting the same result as when using the cloud, two different systems are implemented: one that uses the cloud and one that uses edge computing. Edge computing is when the processing of the data is decentralized to the edge of the network. This thesis aims to find out when it is more favorable to use an edge solution and when a cloud solution is preferable. The first system is implemented with an edge platform, Crosser; the second system is implemented with a cloud platform, Azure. Both implementations give the same outputs, but they differ in where the data is processed. The systems are measured in terms of latency, bandwidth, and CPU usage. The results of the measurements show that the Crosser system has lower latency and uses less bandwidth, but needs more computational power on the device at the edge of the network. The conclusion is that the choice depends on the demands of the system. If the demands are low latency and low bandwidth usage, Crosser is preferable. But if a very heavy machine-learning algorithm is to be executed in the system and latency and bandwidth are not a problem, then the Cloud Reference System is preferable.
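A comparison like the one above can be instrumented with a small harness that sends the same payload to an edge endpoint and a cloud endpoint and records round-trip latency and payload size. The sketch below is illustrative only; both URLs and the JSON payload are placeholders, not the thesis's actual Crosser flow or Azure service.

```python
# Illustrative measurement harness (not the thesis code) for comparing an edge
# endpoint with a cloud endpoint on round-trip latency and payload size.
import json
import statistics
import time

import requests

ENDPOINTS = {
    "edge":  "http://192.168.0.20:8080/process",            # placeholder edge flow
    "cloud": "https://example.azurewebsites.net/process",   # placeholder cloud service
}

def measure(url: str, samples: int = 50):
    payload = json.dumps({"sensor": "vibration", "values": [0.1] * 100}).encode()
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.post(url, data=payload,
                      headers={"Content-Type": "application/json"}, timeout=10)
        latencies.append(time.perf_counter() - start)
    return {"bytes_per_request": len(payload),
            "median_latency_s": statistics.median(latencies)}

for name, url in ENDPOINTS.items():
    print(name, measure(url))
```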
40

Kriging Methods to Exploit Spatial Correlations of EEG Signals for Fast and Accurate Seizure Detection in the IoMT

Olokodana, Ibrahim Latunde 08 1900 (has links)
An epileptic seizure presents a formidable threat to the life of its sufferer, leaving them unconscious within seconds of its onset. With a mortality rate that is at least twice that of the general population, epilepsy is a true cause for concern and has gained ample attention from various research communities. About 800 million people in the world will have at least one seizure experience in their lifespan. Injuries sustained during a seizure crisis are one of the leading causes of death in epilepsy. These can be prevented by early detection of a seizure accompanied by a timely intervention mechanism. The research presented in this dissertation explores Kriging methods to exploit the spatial correlations of electroencephalogram (EEG) signals from the brain for fast and accurate seizure detection in the Internet of Medical Things (IoMT) using edge computing paradigms, by modeling the brain as a three-dimensional spatial object, similar to a geographical panorama. This dissertation proposes basic, hierarchical and distributed Kriging models, with a deep neural network (DNN) wrapper in some instances. Experimental results from the models are highly promising for real-time seizure detection, with excellent performance in seizure detection latency and training time, as well as accuracy, sensitivity and specificity, which compare well with other notable seizure detection research projects.
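At its core, the Kriging step interpolates the signal at an unsampled location as a weighted sum of readings from nearby electrodes, with the weights obtained by solving a small linear system built from a variogram of inter-electrode distances. The sketch below shows ordinary Kriging with NumPy; the exponential variogram, its parameters, and the electrode coordinates are assumptions for illustration, not the models developed in the dissertation.

```python
# Ordinary Kriging sketch with NumPy: interpolate an EEG-like value at an
# unsampled scalp location from surrounding electrodes, exploiting spatial
# correlation. Variogram model and coordinates are illustrative assumptions.
import numpy as np

def variogram(h, sill=1.0, rng=3.0, nugget=0.0):
    """Exponential variogram model gamma(h)."""
    return nugget + sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(coords, values, target):
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    # Kriging system: [[Gamma, 1], [1^T, 0]] [w; mu] = [gamma_0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(A, b)[:n]        # Lagrange multiplier discarded
    return float(w @ values)             # weights sum to 1 by construction

# Electrode positions (arbitrary head coordinates, in cm) and signal amplitudes.
coords = np.array([[0.0, 0.0, 9.0], [4.0, 0.0, 8.0],
                   [0.0, 4.0, 8.0], [-4.0, 0.0, 8.0]])
values = np.array([12.0, 9.5, 10.2, 11.1])
print(ordinary_kriging(coords, values, target=np.array([1.0, 1.0, 8.8])))
```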
