  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Realizace Revenue Assurance kontroly ve společnosti Vodafone CZ / Implementation of Revenue Assurance control in the Vodafone CZ inc.

Zapletal, Jakub January 2010
This thesis deals with the control environment in the Revenue Assurance team of the telecommunications company Vodafone CZ. Its main goal is the design, realization and implementation of a new control process in Revenue Assurance. The individual goals of the thesis are to introduce the term Revenue Assurance, so that the reader understands the basic context, and to explain how the Revenue Assurance concept is understood in other companies. The term is further compared with related control concepts such as controlling, risk management and internal audit, and the main differences among them are stated. Additionally, the thesis explains the connection between the Revenue Assurance concept and the Sarbanes-Oxley Act. This part is followed by an explanation of how the Revenue Assurance concept is understood in the telecommunications company Vodafone CZ. The key part of the thesis is the description of the design, realization and implementation of the control into the company's processes. This part is preceded by a descriptive part covering the chosen method and an analysis of the telecommunications environment. The final part then addresses whether the control was implemented successfully and how it performed during its existence. Finally, I state the benefit for the company in terms of the number of cases and the amount of revenue leakage revealed.
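A Revenue Assurance control of the kind described above typically reconciles rated usage against billed amounts to surface leakage cases. The sketch below is a minimal illustration of that idea; the record format and field names are hypothetical placeholders, not Vodafone's actual systems or the thesis's implemented control.

```python
def revenue_leakage(usage_records, billed):
    """Compare rated usage against billed amounts per customer and
    flag discrepancies (candidate revenue leakage cases).

    usage_records: iterable of (customer_id, rated_amount) pairs,
    e.g. from a mediation feed (hypothetical simplification).
    billed: dict mapping customer_id to the invoiced total.
    Returns a list of (customer_id, leaked_amount) cases.
    """
    expected = {}
    for customer, amount in usage_records:
        expected[customer] = expected.get(customer, 0.0) + amount
    cases = []
    for customer, exp in expected.items():
        gap = exp - billed.get(customer, 0.0)
        if gap > 0:  # usage was rated but not (fully) invoiced
            cases.append((customer, round(gap, 2)))
    return cases
```

The "benefit for the company" metric mentioned in the abstract then falls out directly: the number of cases is `len(cases)` and the revealed leakage is the sum of the flagged gaps.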
192

Simulations and electronics development for the LHAASO experiment / Simulations et développement d’électronique pour l’expérience LHAASO

Chen, Yingtao 23 July 2015
Le travail de thèse porte sur l'étude de l'électronique front-end pour le télescope WFCTA (Wide Field of View Cherenkov Telescope Array), qui est l'un des détecteurs de l'observatoire LHAASO (Large High Altitude Air Shower Observatory). Le manuscrit de thèse couvre six thèmes principaux allant de la simulation physique au développement d'un nouveau système d'acquisition de données. Tout d'abord, les principes de la physique des rayons cosmiques et de l'expérience LHAASO sont présentés, donnant ainsi une introduction aux sujets discutés dans la thèse. Des simulations ont été faites dans le but de comprendre la propagation des rayons cosmiques dans l'atmosphère et d'en déduire les caractéristiques du signal d'entrée de l'électronique. Ces simulations ont également été utilisées pour approfondir la compréhension des spécifications du télescope et les vérifier. Un nouveau modèle de PMT a été élaboré pour être utilisé dans les simulations ; ce nouveau modèle est comparé aux autres modèles de PMT. Des modèles d'électronique pour les conceptions basées sur les composants électroniques classiques et sur l'ASIC (Application-Specific Integrated Circuit) sont construits et étudiés. Ces deux solutions remplissent les spécifications du télescope WFCTA. Néanmoins, compte tenu du développement de la micro-électronique, il est proposé que l'électronique des télescopes de haute performance soit basée sur l'ASIC. L'ASIC sélectionné, PARISROC 2, est évalué en utilisant des bancs de tests existants. Les résultats montrent que ces bancs de tests ne peuvent pas démontrer pleinement la véritable performance de l'ASIC. Par conséquent, une carte électronique front-end prototype basée sur cet ASIC a été conçue et fabriquée. Plusieurs modifications ont été apportées pour améliorer la performance de la nouvelle carte. Une description détaillée de ce développement est présentée dans la thèse. Un nouveau système d'acquisition de données a également été conçu pour améliorer la capacité de lecture de données dans le banc de tests de la carte front-end. Enfin, une série de tests a été effectuée pour vérifier le concept de design et pour évaluer la performance de la carte front-end. Ces résultats montrent la bonne performance générale de l'ASIC PARISROC 2 et que la carte front-end répond globalement aux spécifications du WFCTA. Sur la base des résultats de ce travail de thèse, un nouvel ASIC, mieux adapté aux télescopes de type WFCTA, a été conçu et est actuellement en cours de fabrication. / This thesis is focused on the study of the front-end electronics for the Wide Field of View Cherenkov Telescope Array (WFCTA), which is one of the Large High Altitude Air Shower Observatory (LHAASO) detectors. The thesis manuscript covers six main topics, going from the physics simulations to the implementation of a new data acquisition system. The physics of cosmic rays and the LHAASO experiment is presented, giving a foundation for the discussion of the main topics of the thesis. Simulations were performed to understand the propagation of cosmic rays in the atmosphere and to determine the characteristics of the input signal of the electronics. These simulations also helped to understand the specifications of the telescope and to verify them. A new PMT model was successfully built for both physical and electronic simulations. This new model is compared to other models and its performance is evaluated. Behavioural models for the designs based on classical electronics and on an application-specific integrated circuit (ASIC) were built and studied. It is shown that both solutions fit the requirements of the telescope. However, considering the development of micro-electronics, it is proposed that the electronics of high-performance telescopes should be based on an ASIC. The selected ASIC, PARISROC 2, was evaluated using the existing application boards. The results showed that the designs considered could not fully demonstrate the real performance of the chip. Therefore, a prototype front-end electronics board, based on PARISROC 2, was designed, implemented and fabricated. Several modifications and enhancements were made to improve the performance of the new design. A detailed description of the development is presented and discussed in the manuscript. Furthermore, a new data acquisition system was developed to enhance the readout capabilities of the front-end test bench. Finally, a series of tests were performed to verify the concept of the design and to evaluate the front-end board. The results show the good general performance of PARISROC 2 and that this design globally meets the specifications of the WFCTA. Based on the results of this thesis work, a new ASIC chip, better adapted for telescopes such as WFCTA, has been designed and is currently being fabricated.
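As a rough illustration of the kind of signal such simulations feed into the front-end electronics, the sketch below superposes single-photoelectron (SPE) pulses into a PMT waveform. The difference-of-exponentials pulse shape and all parameter values are generic textbook assumptions, not the PMT model actually developed in the thesis.

```python
import math

def spe_pulse(t, t0=0.0, amplitude=1.0, rise=2.0, fall=8.0):
    """Single-photoelectron PMT pulse modelled as a difference of
    exponentials (a common simplified shape; illustrative only).
    Times in ns, amplitude in arbitrary units."""
    if t < t0:
        return 0.0
    dt = t - t0
    return amplitude * (math.exp(-dt / fall) - math.exp(-dt / rise))

def pmt_waveform(photon_times, samples, dt=1.0):
    """Superpose one SPE pulse per detected photon to build the
    analogue waveform presented to the front-end electronics."""
    return [sum(spe_pulse(i * dt, t0) for t0 in photon_times)
            for i in range(samples)]
```

With `fall > rise` the pulse is non-negative, starts at zero at the photon arrival time, and decays back toward zero, which is the qualitative behaviour the electronics specification has to accommodate.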
193

Návrh, tvorba a implementace softwarové aplikace ve firemním prostředí / Design, Creation and Implementation of Software Applications in the Corporate Environment

Zsiga, Juraj January 2021
The diploma thesis deals with the design, creation and implementation of a software application in the corporate environment of Velká Pecka s.r.o., better known as Rohlík. The first goal is to analyze the given company and find its shortcomings. The second, and the main one, is to create and implement software that would eliminate them. The resulting solution is a mobile application that improves issue reporting in the company's warehouses, thus saving resources overall.
194

Untersuchung von AGE und RAGE im proximalen Aortenaneurysma von Patienten mit bikuspider oder trikuspider Aortenklappe / Investigation of AGE and RAGE in the proximal aortic aneurysm of patients with a bicuspid or tricuspid aortic valve

Heiser, Linda 04 March 2020
In the present work, aneurysmal aortic tissue from patients with a bicuspid or tricuspid aortic valve was examined. Over the course of life, the bicuspid aortic valve, the most common congenital anomaly of the heart, is associated with numerous potentially life-threatening complications. Affected patients show an earlier development and more rapid progression of dilatations and, in the worst case, dissections of the ascending aorta. The etiology of this has not yet been sufficiently clarified. The background of the study was an investigation by Branchetti et al., in which an elevation of RAGE in the plasma of patients with a bicuspid valve was demonstrated. From this, the hypothesis was developed that an increased expression of RAGE and its ligand AGE in the aortic tissue itself could be causally linked to aortic dilatation. The expression of RAGE and AGE was examined in samples from 93 patients by means of Western blot, ELISA and immunohistochemistry. A significant increase in the expression of both proteins was found in aortic aneurysms of patients with a bicuspid valve compared to patients with a tricuspid aortic valve. The exemplary immunohistological stainings also support these results. Possible consequences include increased stiffness of the aortic wall, activation of matrix metalloproteinases, and increased oxidative stress. In addition to expression in the aneurysmal aortic tissue, plasma samples were also analyzed for AGE and RAGE, whereby no elevation could be detected. The results of the study that had detected a RAGE elevation in plasma, and thus discussed it as a potential biomarker for a bicuspid valve, could not be confirmed in the present examination of a smaller sample. The establishment of such a biomarker likewise proves to be a demanding task. The suitability of RAGE as a biomarker for identifying patients with a bicuspid valve must therefore be viewed critically.
195

TCP with Adaptive Pacing for Multihop Wireless Networks

ElRakabawy, Sherif M., Klemm, Alexander, Lindemann, Christoph 17 December 2018
In this paper, we introduce a novel congestion control algorithm for TCP over multihop IEEE 802.11 wireless networks that implements rate-based scheduling of transmissions within the TCP congestion window. We show how a TCP sender can adapt its transmission rate close to the optimum using an estimate of the current 4-hop propagation delay and the coefficient of variation of recently measured round-trip times. The novel TCP variant is denoted TCP with Adaptive Pacing (TCP-AP). In contrast to previous proposals for improving TCP over multihop IEEE 802.11 networks, TCP-AP retains the end-to-end semantics of TCP and neither relies on modifications to the routing or link layer nor requires cross-layer information from intermediate nodes along the path. A comprehensive simulation study using ns-2 shows that TCP-AP achieves up to 84% more goodput than TCP NewReno, provides excellent fairness in almost all scenarios, and is highly responsive to changing traffic conditions.
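The rate adaptation described above can be sketched as follows. How the 4-hop propagation delay is derived from the mean RTT here is a hypothetical simplification, and the back-off term is illustrative; the paper's actual estimator and constants may differ.

```python
import statistics

def pacing_interval(rtt_samples, hops_spaced=4):
    """Estimate the inter-packet transmission interval for rate-based
    pacing in the spirit of TCP-AP: space transmissions by an estimate
    of the 4-hop propagation delay, inflated by the coefficient of
    variation of recent RTTs to back off when RTTs fluctuate
    (a sign of contention). Constants are illustrative assumptions."""
    mean_rtt = statistics.mean(rtt_samples)
    stdev_rtt = statistics.stdev(rtt_samples)
    cov = stdev_rtt / mean_rtt  # coefficient of variation of RTTs
    # Hypothetical simplification: treat the RTT as the out-and-back
    # traversal of the path, so hops_spaced hops take this fraction.
    four_hop_delay = mean_rtt * (hops_spaced / 8)
    return four_hop_delay * (1 + cov)
```

With steady RTTs the interval collapses to the bare 4-hop delay estimate; jittery RTTs stretch it, slowing the sender down, which is the qualitative behaviour the abstract describes.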
196

Contextual Outlier Detection from Heterogeneous Data Sources

Yan, Yizhou 17 May 2020
The dissertation focuses on detecting contextual outliers from heterogeneous data sources. Modern sensor-based applications such as Internet of Things (IoT) applications and autonomous vehicles generate a huge amount of heterogeneous data, including not only structured multi-variate data points but also other complex types of data such as time-stamped sequence data and image data. Detecting outliers from such data sources is critical to diagnose and fix malfunctioning systems, prevent cyber attacks, and save human lives. The outlier detection techniques in the literature are typically unsupervised algorithms with a pre-defined logic, for example, leveraging the probability density at each point to detect outliers. Our analysis of modern applications reveals that this rigid probability-density-based methodology has severe drawbacks: low-probability-density objects are not necessarily outliers, while objects with relatively high probability densities might in fact be abnormal. In many cases, determining the outlierness of an object has to take into consideration the context in which the object occurs. Within this scope, the dissertation makes four research contributions: techniques and a system for scalable contextual outlier detection from multi-dimensional data points, contextual outlier pattern detection from sequence data, contextual outlier image detection from image data sets, and an integrative end-to-end outlier detection system capable of automatic outlier detection, outlier summarization and outlier explanation.

1. Scalable Contextual Outlier Detection from Multi-dimensional Data. Mining contextual outliers from big datasets is computationally expensive because of the complex recursive kNN search used to define the context of each point. In this research, leveraging the power of distributed compute clusters, we design distributed contextual outlier detection strategies that optimize the key factors determining the efficiency of local outlier detection, namely localizing the kNN search while still ensuring load balancing.

2. Contextual Outlier Detection from Sequence Data. For big sequence data, such as messages exchanged between devices and servers and log files measuring complex system behaviors over time, outliers typically occur as a subsequence of symbolic values (a sequential pattern) in which each individual value itself may be completely normal. However, existing sequential pattern mining semantics tend to mis-classify outlier patterns as typical patterns because they ignore the context in which the pattern occurs. In this dissertation, we present new context-aware pattern mining semantics and design efficient mining strategies to support them. In addition, we develop methodologies that continuously extract these outlier patterns from sequence streams.

3. Contextual Outlier Detection from Image Data. An image classification system not only needs to accurately classify objects from target classes, but should also safely reject unknown objects that belong to classes not present in the training data. Here, the training data defines the context of the classifier, and unknown objects then correspond to contextual image outliers. Although existing Convolutional Neural Networks (CNNs) achieve high accuracy when classifying known objects, the sum operation on the multiple features produced by the convolutional layers causes an unknown object to be classified to a target class with high confidence even if it matches some key features of that class only by chance. In this research, we design an Unknown-aware Deep Neural Network (UDN for short) to detect contextual image outliers. The key idea of UDN is to enhance the CNN architecture with a product operation that models the product relationship among the features produced by the convolutional layers. This way, missing a single key feature of a target class greatly reduces the probability of assigning an object to this class. To further improve the performance of UDN at detecting contextual outliers, we propose an information-theoretic regularization strategy that incorporates the objective of rejecting unknowns into the learning process of UDN.

4. An End-to-end Integrated Outlier Detection System. Although numerous detection algorithms have been proposed in the literature, no single approach brings the wealth of these alternate algorithms to bear in an integrated infrastructure that supports versatile outlier discovery. In this work, we design the first end-to-end outlier detection service that integrates outlier-related services, including automatic outlier detection, outlier summarization and explanation, and human-guided outlier detector refinement, within one integrated outlier discovery paradigm. Experimental studies, including performance evaluations and user studies conducted on benchmark outlier detection datasets and real-world datasets including Geolocation, Lighting, MNIST, CIFAR and log file datasets, confirm both the effectiveness and efficiency of the proposed approaches and systems.
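The kNN-based notion of local context underpinning contribution 1 can be illustrated with a minimal distance-ratio score in the spirit of the Local Outlier Factor: a point is suspicious when it sits much farther from its neighbours than those neighbours sit from theirs. This is a simplified single-machine stand-in, not the dissertation's distributed algorithm.

```python
import math

def knn_outlier_scores(points, k=2):
    """Score each point by its average distance to its k nearest
    neighbours, normalised by the same quantity for those neighbours
    (a simplified local-density ratio; scores near 1 are normal,
    large scores indicate contextual outliers)."""
    def knn(i):
        order = sorted(range(len(points)),
                       key=lambda j: math.dist(points[i], points[j]))
        return [j for j in order if j != i][:k]

    def avg_knn_dist(i):
        return sum(math.dist(points[i], points[j]) for j in knn(i)) / k

    scores = []
    for i in range(len(points)):
        own = avg_knn_dist(i)
        neigh = sum(avg_knn_dist(j) for j in knn(i)) / k
        scores.append(own / neigh if neigh > 0 else 0.0)
    return scores
```

The recursive structure the abstract mentions is visible here: scoring one point requires the kNN distances of its neighbours too, which is exactly what makes naive distributed execution expensive and motivates localizing the kNN search.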
197

A scenario study on end-of-life tyre management in 2020

Lin, Hong-Mao January 2011
With a large number of tyres being discarded every year, the question of how to manage end-of-life tyres (ELTs) has become a serious issue. This study therefore identifies different driving forces for this management and the most probable scenarios for the future management of ELTs. The study also compares the business-as-usual model with a waste hierarchy model to explore the possibilities for optimizing the management of ELTs through cascading. Opinions about the driving forces of ELT management were collected from 29 experts working in the area. Important driving forces identified were: the price of substitute products, the market for recycled materials, environmental legislation, and technology. The study also surveyed 23 experts in the tyre area about the most probable scenarios for ELTs in 2020. One of the more widely believed futures was: "Due to increasingly limited fossil fuels and a rise in sustainability awareness, applications for ELTs are growing in both material and energy recycling." This suggests that a shift toward an equal split of ELT recycling between material and energy might happen by 2020. Based on the most probable scenario for ELTs in 2020, a comparison between the waste hierarchy model and the business-as-usual model was performed. The result shows that the (cascading) waste hierarchy model would likely create more environmental benefits than the business-as-usual model. This is achieved through saving and cycling more materials from energy recovery into material recycling.
198

Knowledge Base : Back-end interface and possible uses

Liliequist, Erik, Jonsson, Martin January 2016
This paper addresses two different aspects of the subject known as knowledge bases, or knowledge graphs. A knowledge base is defined as a comprehensive, semantically organized, machine-readable collection of universally relevant or domain-specific entities, classes, and facts. The objective of this paper is to explore how a knowledge base can be used to gain information about an entity. First we present one way to access information from the knowledge base using a back-end interface. This back-end interface takes simple parameters as input, which are used to query the knowledge base. The main objective here is to access the right entity, in order to answer the questions correctly. After that follows a discussion about the need for knowledge bases and their possible uses. The discussion is partly based on results from our implementation, but also considers other similar implementations as well as interviews with possible users in business and society. We conclude that the back-end interface developed performs well enough, with high precision, to be run in an unsupervised system. Furthermore, we see that the interface can be improved in several ways by focusing on smaller domains of information. Several different possible uses have been identified. From these uses a market analysis has been done, from which we conclude that the market possibilities are good. Some of the key problems with implementing the interface concern the credibility of the information in the knowledge base. This is one of the main problems that needs to be solved before knowledge bases can be fully implemented in business and society. / Den här rapporten tar upp två olika områden som berör knowledge bases. En knowledge base definieras som en omfattande, semantiskt organiserad, maskinläsbar samling av universellt relevanta eller domänspecifika entiteter, klasser och fakta. Målet med rapporten är att undersöka hur en knowledge base kan användas för att få fram information om en entitet.
Först presenteras ett tillvägagångssätt för kommunikation mot en knowledge base med hjälp av ett back-end-gränssnitt. Back-end-gränssnittet tar enkla parametrar som input och använder dessa för att köra en query mot en knowledge base. Huvudfokus i denna del ligger på att få rätt svar på frågorna, och delen utvärderas därmed utifrån det. Det andra området som arbetet berör är en diskussion kring hur knowledge bases kan integreras i samhället och näringslivet för att ge ökad nytta. Diskussionerna baseras till viss del på resultaten från den första delen av arbetet, men även andra liknande studier vägs in för att ge ett bredare diskussionsunderlag. Utöver detta baseras diskussionen också på intervjuer med möjliga intressenter inom näringsliv och samhälle. Det utvecklade gränssnittet presterar på en nivå, med hög precision, som vi bedömer tillräcklig för implementering i oövervakade system. Dessutom har flera förbättringsområden identifierats; huvudsakligen handlar dessa om att mer specifika implementationer kan nå högre precision då mer specifika kontroller kan genomföras. Ett flertal möjliga användningsområden har identifierats. Med dessa som grund har en marknadsanalys genomförts som pekar på goda förutsättningar för tekniken. Ett av de största problemen berör trovärdigheten i informationen i knowledge basen. Det är ett problem som måste lösas innan tekniken kan implementeras fullt ut i näringsliv och samhälle.
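A back-end interface of the kind described, taking simple parameters and returning facts about an entity, can be sketched as a lookup over subject-predicate-object triples. The triples, names, and function below are illustrative placeholders, not the paper's actual implementation or data.

```python
# A toy in-memory knowledge base of (subject, predicate, object)
# triples. Entities and facts here are illustrative placeholders.
TRIPLES = [
    ("Stockholm", "instance_of", "city"),
    ("Stockholm", "capital_of", "Sweden"),
    ("Sweden", "instance_of", "country"),
]

def query_entity(name, predicate=None):
    """Back-end style lookup: given an entity name (and optionally a
    predicate to narrow the question), return the matching facts."""
    return [(s, p, o) for (s, p, o) in TRIPLES
            if s == name and (predicate is None or p == predicate)]
```

A real deployment would replace the in-memory list with queries against a knowledge base such as Wikidata or DBpedia, and the precision question the paper evaluates then becomes one of entity disambiguation: making sure `name` resolves to the right entity before the facts are returned.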
199

A Visual Focus on Form Understanding

Davis, Brian Lafayette 19 May 2022
Paper forms are a commonly used format for collecting information, including information that will ultimately be added to a digital database. This work focuses on the automatic extraction of information from form images. It examines what can be achieved in parsing forms without any textual information. The resulting model, FUDGE, shows that computer vision alone is reasonably successful at the problem. Drawing from the strengths and weaknesses of FUDGE, this work also introduces a novel model, Dessurt, for end-to-end document understanding. Dessurt performs text recognition implicitly and is capable of outputting arbitrary text, making it a more flexible document processing model than prior methods. Dessurt can parse the entire contents of a form image directly into a structured format, achieving better performance than FUDGE at this task. Also included is a technique for generating synthetic handwriting, which provides synthetic training data for Dessurt.
200

Evaluation Procedure for QoS of Short Message Service : International SMS Route Analysis

Mulkijanyan, Nina January 2011
Due to its ubiquitous availability, the Short Message Service (SMS), first introduced in the 1980s, not only became the most popular way of communication but also stimulated the development of SMS-based value-added services. This application-to-person traffic is delivered to end users through SMS aggregators, who provide the link between service providers and mobile carriers. In order to perform optimal traffic routing, the aggregators need to estimate the quality of each potential international route to the specified destination. The evaluation criteria include end-to-end delivery time, as well as correct verification of the delivered data. This thesis suggests a method of quality of service (QoS) assessment for international SMS service that combines two types of tests: end-to-end delay measurements and various verification tests. A prototype of the testing system for international SMS service was developed to generate SMS traffic, collect and analyze results, and evaluate the experienced QoS of the SMS route used, in accordance with the proposed approach. As part of the end-to-end delay measurement tests, SMS traffic was sent to the Singtel network in Singapore along two routes. The verification tests were executed via different routes to two mobile networks: Singtel and Tele2 (Sweden). The results of the performed measurements determined the route with the highest QoS, i.e. the one with the greater bottleneck bandwidth and the lower data-loss rate. The prototype of the SMS testing system can be used by SMS aggregators to verify delivery of an SMS message, check the integrity of the message, determine the interconnection type between the route supplier and the destination carrier, and identify the presence of load balancers in the path. The prototype also makes it possible to compare end-to-end delay times of several routes and compute bottleneck values for each of the tested routes.
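The route comparison described above can be sketched as follows. The ranking rule (loss rate first, then mean delay) is an illustrative assumption for the sketch, not necessarily the thesis's exact weighting of the QoS criteria.

```python
import statistics

def route_quality(delays, delivered, sent):
    """Summarise a route's measured QoS: end-to-end delay statistics
    from the delay tests and the data-loss rate from the verification
    tests. This only reports raw metrics; thresholds are policy."""
    return {
        "mean_delay": statistics.mean(delays),
        "max_delay": max(delays),
        "loss_rate": 1 - delivered / sent,
    }

def better_route(a, b):
    """Pick the route with the lower loss rate, breaking ties on
    mean end-to-end delay (an illustrative ranking)."""
    key_a = (a["loss_rate"], a["mean_delay"])
    key_b = (b["loss_rate"], b["mean_delay"])
    return a if key_a <= key_b else b
```

An aggregator running the thesis's two test types per candidate route could feed each route's measurements through `route_quality` and fold the results with `better_route` to select where to send production traffic.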
