51

Bezdrátová čidla pro měření hladiny vody / Wireless water level sensors

Pospíšil, Jakub January 2010 (has links)
The thesis deals with the design and implementation of a water-level metering apparatus. The measured data are sent wirelessly to a station 500 m away. Potential solutions are studied step by step and a final design is proposed; the detailed implementation is described in the following section. Ultrasonic sensors are employed for level measurement, the controlling element is an ATmega162 processor, and data are transmitted by an RC1280HP transceiver. The apparatus is designed for the lowest possible power consumption, since it will be supplied only from an accumulator. The design of the receiving station is not part of the thesis. A functional, tested sample is presented in the implementation section.
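The abstract gives no implementation details, so the following is only an illustrative sketch of the underlying measurement principle: an ultrasonic sensor mounted above the water converts echo round-trip time into a level reading. The sensor height and speed-of-sound constant are assumed values, not taken from the thesis.

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 °C (assumed)

def water_level_m(echo_time_s: float, sensor_height_m: float) -> float:
    """Water level from an ultrasonic echo: the pulse travels to the
    surface and back, so the one-way distance is (speed * time) / 2."""
    distance_to_surface = SPEED_OF_SOUND_M_S * echo_time_s / 2.0
    return sensor_height_m - distance_to_surface

# Example: a 10 ms echo with the sensor mounted 3 m above the tank bottom
level = water_level_m(0.01, 3.0)
```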
52

Bezdrátový monitoring tlaku pneumatik / Wireless Tire Pressure Monitoring

Hampl, Tomáš January 2016 (has links)
This master's thesis describes the design of a system for wireless measurement of tire pressure. The goal is a theoretical acquaintance with wireless tire-pressure measurement, followed by a block-level design of the device. Specific circuits were chosen for the individual blocks: the SP100-7AT pressure sensor, ATmega8 and ATmega16 microcontrollers, and the RFM22 wireless module. The result of the work is a functional prototype of the device, including software for the microcontrollers. The final solution was verified by experimental measurement.
53

Popularizing implants : Exploring conditions for eliciting user adoption of digital implants through developers, enthusiasts and users

Ericsson Duffy, Mikael January 2020 (has links)
Digital implants have become a new frontier for body hackers, technology enthusiasts and disruptive-innovation developers, who seek to bring this technology to themselves and to new users. This thesis explored conditions for future user adoption of human body augmentation with digital implants, mainly self-beneficial health optimization through technology, self-quantification, and convenience scenarios. Diffusion of Innovations theory, the Value-based Acceptance Model, and research-through-design methods were applied. The process consisted of quantitative and qualitative data gathering and analysis, using interviews, surveys and iterative prototyping with evaluation. The results show mixed user attitudes towards implant usage, depending mainly on users' need for added benefits: whether the user is a technology enthusiast actively using technology for self-beneficial gain or a casual everyday consumer of technology. Certain conditions could affect adoption of implants into mainstream usage, mainly data privacy, regulation, convenience, self-quantification and health management. For implants to succeed as a mainstream technology, there needs to be a proper secure infrastructure, easy installation, and coordinated services that offer individual health or convenience benefits, with high consumer confidence in the supported services, installation and removal, and the devices themselves. Several companies are working on offering such a service. To evaluate such a proposition, iterative prototypes were created for a health-management scenario as a streamlined consumer service, using a service-design blueprint and a related interactive smartphone application prototype.
54

Design Guidelines of A Low Power Communication Protocol for Zero Energy Devices

Zhang, Jiayue January 2023 (has links)
Low-power communication protocols such as 6LoWPAN have been widely used in applications that require low energy consumption for short-range wireless communication, for example Internet of Things (IoT) devices. As the number of these devices escalates, it becomes increasingly important to consider ambient energy harvesting (EH) as an energy source to power them. This induces a need to reconsider the design of an energy-efficient data transfer protocol that enables sensors and actuators to utilize the harvested energy for computing and communication. As the harvested energy from an energy source is limited and it takes time for a device to accumulate enough energy for data processing and communication, there is a need to investigate the energy budget and determine the critical parameters that affect the energy consumption of wireless communication. An energy consumption analysis was performed by adapting a Python model, and simulations were carried out to help understand the impact of key parameters on energy consumption while considering a suitable range for radio frequency (RF) energy harvesting for "zero" energy devices. The thesis proposes design considerations for a new low-power communication protocol for "zero" energy devices. The results showed that adaptive data rate (ADR) makes a major contribution to energy saving. With suitable transmission parameters set, the energy waste of retransmissions and collisions could be reduced. It is also possible to introduce a scheduling algorithm into the communication process for improved collision avoidance. The proposed design considerations can be applied in future work to improve short-range communication protocols for zero-energy devices.
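The thesis adapts a Python energy model, but its actual parameters are not reproduced in this abstract. As a hedged sketch of the kind of budgeting involved, the fragment below estimates transmit energy from airtime and shows why a higher (adaptive) data rate saves energy; all function names and numbers are invented for illustration.

```python
def tx_energy_j(payload_bits: int, data_rate_bps: float, tx_power_w: float) -> float:
    """Energy of one transmission: airtime multiplied by radio power."""
    airtime_s = payload_bits / data_rate_bps
    return airtime_s * tx_power_w

def harvest_time_s(energy_j: float, harvested_power_w: float) -> float:
    """Time an energy-harvesting device must accumulate charge before it
    has enough energy to send one frame."""
    return energy_j / harvested_power_w

# A faster data rate shortens airtime, cutting per-frame energy and the
# wait between transmissions for the same harvested power.
slow = tx_energy_j(256, 250e3, 0.05)   # 256-bit frame at 250 kbit/s
fast = tx_energy_j(256, 1e6, 0.05)     # same frame at 1 Mbit/s
```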
55

Automatic Data Allocation, Buffer Management And Data Movement For Multi-GPU Machines

Ramashekar, Thejas 10 1900 (has links) (PDF)
Multi-GPU machines are being increasingly used in high performance computing. These machines are used both as standalone workstations to run computations on medium to large data sizes (tens of gigabytes) and as nodes in a CPU–multi-GPU cluster handling very large data sizes (hundreds of gigabytes to a few terabytes). Each GPU in such a machine has its own memory and does not share an address space with either the host CPU or the other GPUs. Hence, applications utilizing multiple GPUs have to manually allocate and manage data on each GPU. A significant body of scientific applications that utilize multi-GPU machines contain computations inside affine loop nests, i.e., loop nests that have affine bounds and affine array access functions. These include stencils, linear-algebra kernels, dynamic programming codes and data-mining applications. Data allocation, buffer management, and coherency handling are critical steps that need to be performed to run affine applications on multi-GPU machines. Existing works that propose to automate these steps have limitations and inefficiencies in terms of allocation sizes, exploiting reuse, transfer costs and scalability. An automatic multi-GPU memory manager that can overcome these limitations and enable applications to achieve scalable performance is highly desired. One technique that has been used in certain memory management contexts in the literature is that of bounding boxes. The bounding box of an array, for a given tile, is the smallest hyper-rectangle that encapsulates all the array elements accessed by that tile. In this thesis, we exploit the potential of bounding boxes for memory management far beyond their current usage in the literature. We propose a scalable and fully automatic data allocation and buffer management scheme for affine loop nests on multi-GPU machines, called the Bounding Box based Memory Manager (BBMM). BBMM is a compiler-assisted runtime memory manager. At compile time, it uses static analysis techniques to identify the set of bounding boxes accessed by a computation tile. At run time, it uses bounding-box set operations such as union, intersection, difference, and subset/superset tests to compute a set of disjoint bounding boxes from the set identified at compile time. It also exploits the architectural capability of GPUs to perform fast transfers of rectangular (strided) regions of memory, and hence performs all data transfers in terms of bounding boxes. BBMM uses these techniques to automatically allocate and manage the data required by applications (suitably tiled and parallelized for GPUs). This allows it to (1) allocate only as much data as (or close to what) is required by the computations running on each GPU, (2) efficiently track buffer allocations and hence maximize data reuse across tiles and minimize data transfer overhead, and (3) as a result, enable applications to maximize utilization of the combined memory on multi-GPU machines. BBMM can work with any choice of parallelizing transformations, computation placement, and scheduling schemes, whether static or dynamic. Experiments on a system with four GPUs with various scientific programs showed that BBMM reduces data allocations on each GPU by up to 75% compared to current allocation schemes, yields at least 88% of the performance of hand-optimized OpenCL codes, and allows excellent weak scaling.
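The bounding-box set operations are described above only at a high level. A minimal sketch of axis-aligned hyper-rectangles with intersection and a bounding union follows; the representation and names are assumptions for illustration, not BBMM's actual interface.

```python
from typing import Optional, Tuple

# A bounding box: one (lower, upper) index pair per dimension, inclusive.
Box = Tuple[Tuple[int, int], ...]

def intersect(a: Box, b: Box) -> Optional[Box]:
    """Largest hyper-rectangle contained in both boxes, or None if they
    are disjoint in some dimension."""
    out = []
    for (alo, ahi), (blo, bhi) in zip(a, b):
        lo, hi = max(alo, blo), min(ahi, bhi)
        if lo > hi:
            return None
        out.append((lo, hi))
    return tuple(out)

def bounding_union(a: Box, b: Box) -> Box:
    """Smallest hyper-rectangle covering both boxes; note this can
    over-approximate the true union of the two regions."""
    return tuple((min(alo, blo), max(ahi, bhi))
                 for (alo, ahi), (blo, bhi) in zip(a, b))
```

Because boxes stay rectangular under these operations, every transfer they describe maps onto the strided rectangular copies GPUs handle efficiently, which is the property the abstract highlights.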
56

Klient-server mobilní aplikace se zpracováním obrazu / Client-Server Mobile Application with Image Processing

Černošek, Bedřich January 2018 (has links)
The main goal of this work is to create a client–server application with image processing and cryptographic verification of the image source and creation time. The work focuses on creating a mobile client application on the Android platform that securely takes photos with the mobile device's camera, processes the captured images, and attaches a digital signature, timestamp and GPS location. The main part of the work concerns secure key exchange, encrypted communication, and the data and energy efficiency of the client–server application. The server application is implemented on the Java EE platform; it processes the received image, performs object detection and object recognition in the image, and obtains a timestamp from a trusted server. The taken photo can then be considered a trusted electronic document usable as valid evidence in judicial or administrative proceedings.
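The abstract names digital signatures and trusted timestamps without detail. As a deliberately simplified stand-in (a real deployment like the one described would use asymmetric signatures and an external timestamp authority; this sketch substitutes an HMAC over the record and a local clock, purely to illustrate binding an image hash to capture metadata):

```python
import hashlib
import hmac
import json
import time

def seal_photo(image_bytes: bytes, key: bytes, lat: float, lon: float) -> dict:
    """Bind an image hash to its capture time and GPS position, then tag
    the record with an HMAC so any later tampering is detectable."""
    record = {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "timestamp": int(time.time()),
        "gps": [lat, lon],
    }
    payload = json.dumps(record, sort_keys=True).encode()  # canonical form
    record["tag"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record

def verify_photo(record: dict, key: bytes) -> bool:
    """Recompute the tag over everything except the tag itself."""
    body = {k: v for k, v in record.items() if k != "tag"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["tag"], expected)
```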
57

Interaktivní segmentace 3D CT dat s využitím hlubokého učení / Interactive 3D CT Data Segmentation Based on Deep Learning

Trávníčková, Kateřina January 2020 (has links)
This thesis deals with CT data segmentation using convolutional neural networks and addresses the problem of training with limited training sets. User interaction is suggested as a means of improving segmentation quality for models trained on small training sets, and the possibility of using transfer learning is also considered. All of the chosen methods help improve segmentation quality compared with the baseline, an automatic data-specific segmentation model. With very small training datasets, segmentation improved by tens of percentage points in Dice score. These methods can be used, for example, to simplify the creation of a new segmentation dataset.
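The Dice score used above as the quality metric measures overlap between a predicted mask and the ground truth. A minimal sketch on binary masks given as flat 0/1 sequences (the edge-case convention for two empty masks is an assumption):

```python
def dice_score(pred, truth) -> float:
    """Dice coefficient 2*|A∩B| / (|A| + |B|) for binary masks;
    1.0 means a perfect match, 0.0 means no overlap."""
    inter = sum(p & t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0  # both empty -> perfect
```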
58

Protection of Personal Data, a Power Struggle between the EU and the US: What implications might be facing the transfer of personal data from the EU to the US after the CJEU’s Safe Harbour ruling?

Strindberg, Mona January 2016 (has links)
Since the US National Security Agency’s former contractor Edward Snowden exposed the Agency’s mass surveillance, the EU has been making a series of attempts toward a more safeguarded and stricter path concerning its data privacy protection. On 8 April 2014, the Court of Justice of the European Union (the CJEU) invalidated the EU Data Retention Directive 2006/24/EC on the basis of incompatibility with the Charter of Fundamental Rights of the European Union (the Charter). After this judgment, the CJEU examined the legality of the Safe Harbour Agreement, which had been the main legal basis for transfers of personal data from the EU to the US under Decision 2000/520/EC. Subsequently, on 6 October 2015, in the case of Schrems v Data Protection Commissioner, the CJEU declared the Safe Harbour Decision invalid. The ground for the Court’s judgment was the fact that the Decision enabled interference, by US public authorities, with the fundamental rights to privacy and personal data protection under Article 7 and 8 of the Charter, when processing the personal data of EU citizens. According to the judgment, this interference has been beyond what is strictly necessary and proportionate to the protection of national security and the persons concerned were not offered any administrative or judicial means of redress enabling the data relating to them to be accessed, rectified or erased. The Court’s analysis of the Safe Harbour was borne out of the EU Commission’s own previous assessments. Consequently, since the transfers of personal data between the EU and the US can no longer be carried out through the Safe Harbour, the EU legislature is left with the task to create a safer option, which will guarantee that the fundamental rights to privacy and protection of personal data of the EU citizens will be respected. 
However, although the EU is the party dictating the terms for these transatlantic transfers of personal data, the current provisions of US law allow derogations from any renewed agreement unless they are made compatible with EU data privacy law. Moreover, as much business is at stake and prominent US companies are involved in this battle, pressure on the US is coming not only from the EU; some American companies are also taking up the fight for EU citizens' rights to privacy and the protection of their personal data.
59

Streamlining Certification Management with Automation and Certification Retrieval : System development using ABP Framework, Angular, and MongoDB / Effektivisering av certifikathantering med automatisering och certifikathämtning : Systemutveckling med ABP Framework, Angular och MongoDB

Hassan, Nour Al Dine January 2024 (has links)
This thesis examines the certification management challenge faced by Integrity360. The decentralized approach, characterized by manual processes and disparate data sources, leads to inefficient tracking of certification status and study progress. The main objective of this project was to construct a system that automates data retrieval, ensures a complete audit trail, and increases security and privacy. Leveraging the ABP Framework, Angular, and MongoDB, an efficient and scalable system was designed and built on domain-driven design (DDD) principles for a modular and maintainable architecture. The implemented system automates data retrieval from the Credly API, tracks exam information, manages exam vouchers, and implements a credible authentication system with role-based access control. Although time constraints prevented full-scale implementation of all planned features, such as a dashboard with aggregated charts and automatic report generation, the platform significantly increases the efficiency and precision of employee certification management. Future work will add these advanced functionalities and integrations with external platforms to improve the system and increase its impact on operations at Integrity360.
