41

Mathematical theory of the Flutter Shutter: its paradoxes and their solution

Tendero, Yohann 22 June 2012 (has links) (PDF)
This thesis provides theoretical and practical solutions to two problems raised by digital photography of moving scenes and by infrared photography. Until recently, photographing moving objects could only be done with short exposure times. Two recent groundbreaking works, however, have proposed camera designs that allow arbitrary exposure times. The flutter shutter of Agrawal et al. creates an invertible motion blur by using a clever shutter technique that interrupts the photon flux during the exposure time according to a well-chosen binary sequence. The motion-invariant photography of Levin et al. obtains the same result by accelerating the camera at a constant rate. Both methods follow the new paradigm of computational photography: the design of the camera is rethought to include sophisticated digital processing. This thesis proposes a method for evaluating the image quality of these new cameras. The leitmotiv of the analysis is the SNR (signal-to-noise ratio) of the image after deconvolution, which measures the efficiency of these new camera designs in terms of image quality. The theory provides explicit formulas for the SNR, reveals two paradoxes of these cameras, and resolves them. It also provides the underlying motion model of each flutter shutter, including patented ones. A shorter second part addresses the main quality problem in infrared video imaging: non-uniformity. This perturbation is a time-dependent noise caused by the infrared sensor, structured in columns. The conclusion of this work is that it is not only possible but also efficient and robust to perform the correction on a single image. This guarantees the absence of "ghost artifacts", a classic problem in the literature on the subject, which stem from processing that is inadequate for the acquisition model.
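A minimal numerical sketch of the flutter-shutter principle described above (illustrative code, not the thesis's formulas): motion blur acts as a convolution with the shutter kernel; a plain open shutter yields a box kernel whose Fourier transform vanishes at some frequencies, while a binary flutter code can keep all Fourier coefficients away from zero, which is what makes the blur invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64                                       # exposure split into n time slots
box = np.ones(n) / n                         # ordinary shutter: open throughout
code = rng.integers(0, 2, n).astype(float)   # hypothetical binary flutter code
code /= code.sum()

def min_gain(kernel, size=256):
    """Smallest |DFT| magnitude of the blur kernel: 0 means deconvolution
    is ill-posed at that frequency."""
    return np.abs(np.fft.fft(kernel, size)).min()

print("box kernel min gain:  ", min_gain(box))   # ~0: blur not invertible
print("flutter code min gain:", min_gain(code))  # bounded away from 0
```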
42

Imagerie multispectrale, vers une conception adaptée à la détection de cibles / Multispectral imaging: towards a design suited to target detection

Minet, Jean 01 December 2011 (has links) (PDF)
Hyperspectral imaging, which consists in acquiring the image of a scene in a large number of spectral bands, can detect targets where conventional color imaging would be inconclusive. Hyperspectral imagers based on sequential acquisition are unsuitable for real-time detection applications. In this thesis, we propose to use a snapshot multispectral imager, capable of simultaneously acquiring a reduced number of spectral bands on a single matrix detector. Since the sensor offers a limited number of pixels, a trade-off is necessary: the number of filters and their spectral profiles must be chosen carefully to optimize detection performance. To this end, we developed a band-selection method that can be used to design multispectral imagers based on an array of fixed or tunable filters. Using hyperspectral images from several measurement campaigns, we show that selecting the spectral bands to acquire can lead to multispectral imagers able to detect targets or anomalies with a detection efficiency close to that obtained at hyperspectral resolution. In parallel, we developed a demonstrator consisting of an array of 4 electronically tunable Fabry-Perot filters, intended for integration into an agile snapshot multispectral imager. These filters are built with MOEMS (micro-opto-electro-mechanical systems) technology in partnership with the Institut d'Electronique Fondamentale. We present the optical design of the device as well as a tolerancing study that validated its feasibility.
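As a hedged illustration of the band-selection idea (a generic greedy forward selection with an illustrative separability score, not the method developed in the thesis):

```python
import numpy as np

def separability(X_target, X_bg, bands):
    """Illustrative score: Mahalanobis-like distance between target and
    background mean spectra, restricted to the chosen bands."""
    d = X_target[:, bands].mean(axis=0) - X_bg[:, bands].mean(axis=0)
    cov = np.atleast_2d(np.cov(X_bg[:, bands], rowvar=False))
    cov = cov + 1e-6 * np.eye(len(bands))     # regularize for stability
    return float(d @ np.linalg.solve(cov, d))

def greedy_band_selection(X_target, X_bg, n_bands):
    """Pick n_bands spectral bands by greedy forward selection: at each
    step keep the band that most improves the detection score."""
    selected = []
    remaining = list(range(X_target.shape[1]))
    for _ in range(n_bands):
        best = max(remaining,
                   key=lambda b: separability(X_target, X_bg, selected + [b]))
        selected.append(best)
        remaining.remove(best)
    return selected
```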
43

Fernand Braun, photographe et éditeur à Royan (1895-1920) / Fernand Braun, photographer and publisher in Royan (1895-1920)

Caillaud, Benjamin 13 December 2012 (has links)
The French photographer Fernand Braun (1852-1948) left the Alsatian workshops of his uncle Adolphe Braun (1812-1877) in 1878 and settled in Angoulême, in Charente. Taking advantage of the thriving market for portraits, the young entrepreneur took an active part in research on the gelatin silver bromide process, a technique that opened the era of the snapshot in photography. Joining the republican elite within the town's gymnastics society, the young man expressed his desire for revenge on the Prussia victorious in 1870. An accomplished and influential gymnast, his patriotic feelings led him to join the Ligue des patriotes in 1884. A regular visitor to Royan from 1893, Fernand Braun settled in Charente-Inférieure in 1895 to relaunch his business. Connected to the network of elites driving the town's seaside-resort project and the republican endeavor of educating the young, he began a career as a postcard publisher. From 1900, the photographer built a catalogue of thousands of pictures, named the Grande Série, showcasing the landscapes of the Charentes and the daily life of their inhabitants. A phenomenal success, the postcard established itself as the medium of choice of the early twentieth century. The publisher's output bears witness to the transformation of the regional landscapes and to the social changes on which the photographer cast the eye of a concerned and committed man. The momentum of the Grande Série slowed in the early 1910s, however, and its author sought new outlets for his postcard imagery. The year 1913 marked a break in his editorial policy: increasingly influenced by the tourism industry, he renewed his way of picturing the region. The Great War remobilized the publisher from 1914, and he set out to show his contemporaries the consequences of the conflict. Producing little more than tourist views after 1918, the Grande Série came to an end in 1920.
44

Využití metody reálného potenciálu zlepšení pro zvýšení efektů investic do informačních technologií / Using the Real Potential of Improvement Method to Increase the Effects of Investments in Information Technology

Blahanravau, Yauheni January 2009 (has links)
This thesis deals with the relationship between information technology and business, business-IT alignment, and the role of the CIO in these processes. A further topic is the evaluation of investment projects in information technology; in this connection, various financial methods for evaluating IT investments are analyzed. The main goals of this work are: to analyze and describe in detail modern trends in IT management and in business-IT alignment; to compare the financial methods and frameworks that help measure the efficiency of IT investments and show how these methods can be applied; and to analyze in detail the real potential of improvement method, create a new set of tools (spreadsheets) that fully support it, and show how it can be applied in practice. The theoretical part focuses on the role of information technology and the CIO in modern organizations and on ways of effectively communicating added value to the organization's top management. The analysis is based on a literature review and aims to identify the skills and knowledge a modern, successful CIO must have. In this context, popular financial methods were analyzed, compared, and illustrated with application examples. The practical part offers a detailed analysis of the real potential of improvement method and its use in practice: it describes the basic principles of the method and introduces a set of MS Excel tools that make its application more efficient. The tools are then demonstrated on the improvement of a business process in an organization.
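As a hedged illustration of the kind of financial evaluation compared in the thesis (a generic NPV/ROI computation with made-up figures, not the thesis's spreadsheet tools):

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial outlay (negative),
    later entries are yearly net benefits."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical IT project: 100k initial cost, 35k yearly benefit for 4 years.
flows = [-100_000, 35_000, 35_000, 35_000, 35_000]
print(f"NPV at 8%: {npv(0.08, flows):,.0f}")    # ~15,900 -> investment pays off
print(f"Simple ROI: {(sum(flows[1:]) + flows[0]) / -flows[0]:.0%}")  # 40%
```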
45

Studie efektivnosti využití pracovišť ve vybraném provozu / A Study of the Efficiency of Workplace Utilization in a Selected Operation

Milota, Tomáš January 2017 (has links)
This thesis analyzes data obtained by observing and measuring the workplaces of assembly lines, focusing on the efficiency of their utilization by means of selected industrial-engineering techniques, which the reader is introduced to in the theoretical part. The thesis analyzes line productivity during shifts, defines an ideal line-balancing condition, and proposes a change to the line's standard time based on the measured line cycle time. Building on a complete analysis of the workplaces, improvement proposals are drawn up together with an economic evaluation. The author sees the main benefit of this work in the increased productivity of the analyzed line if the proposed solutions are implemented.
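For context, line balancing is commonly scored with the standard balance-efficiency formula sketched below (the station times are hypothetical, not the measured data from the thesis):

```python
# Balance efficiency = total work content / (number of stations x cycle time).
station_times = [42, 38, 45, 40]        # seconds of work content per station
cycle_time = max(station_times)         # the slowest station paces the line

efficiency = sum(station_times) / (len(station_times) * cycle_time)
print(f"cycle time: {cycle_time} s, balance efficiency: {efficiency:.1%}")
# 165 / (4 * 45) -> 91.7%; the remaining ~8% is balancing loss (idle time)
```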
46

Le maintien de la cohérence dans les systèmes de stockage partiellement répliqués / Ensuring consistency in partially replicated data stores

Saeida Ardekani, Masoud 16 September 2014 (has links)
In the first part, we study consistency in transactional systems, focusing on reconciling scalability with strong transactional guarantees. We identify four scalability properties and show that none of the existing strong consistency criteria ensures all four. We define a new scalable consistency criterion called Non-Monotonic Snapshot Isolation (NMSI), which is the first to be compatible with all four properties. We also present a practical implementation of NMSI, called Jessy, which we compare experimentally against a number of well-known criteria. Another contribution is a framework for performing fair comparisons among different transactional protocols. Our insight is that a large family of distributed transactional protocols share a common structure, Deferred Update Replication (DUR); protocols of the DUR family differ only in the behavior of a few generic functions. We present a generic DUR framework, called G-DUR, and use it to implement and compare several transactional protocols. In the second part, we focus on ensuring consistency in non-transactional data stores. We introduce Tuba, a replicated key-value store that dynamically selects its replicas in order to maximize the utility delivered to read operations under a consistency objective defined by the application. Unlike current systems, it automatically reconfigures its set of replicas while respecting application-defined constraints, so that it adapts to changes in client locations or request rates. Compared with a statically configured system, our evaluation shows that Tuba increases the fraction of reads that return strongly consistent data by 63%.
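A hedged sketch of the Deferred Update Replication structure mentioned above (names and hooks are illustrative, not G-DUR's actual API): a transaction executes optimistically against a snapshot and is certified at commit time; concrete protocols differ only in the pluggable functions.

```python
from dataclasses import dataclass, field

@dataclass
class Transaction:
    read_set: dict = field(default_factory=dict)    # key -> version read
    write_set: dict = field(default_factory=dict)   # key -> new value

class DURProtocol:
    """Skeleton of the DUR family: variants (serializability, SI, NMSI, ...)
    would override only choose_snapshot/certify; the flow stays the same."""

    def __init__(self):
        self.store = {}    # key -> (value, version)
        self.clock = 0

    # --- generic functions that distinguish DUR protocols ---
    def choose_snapshot(self, tx):
        return dict(self.store)           # which versions reads may observe

    def certify(self, tx):
        # Illustrative check: abort if any key read was overwritten since.
        return all(self.store.get(k, (None, 0))[1] == v
                   for k, v in tx.read_set.items())

    # --- common DUR flow ---
    def read(self, tx, key):
        value, version = self.choose_snapshot(tx).get(key, (None, 0))
        tx.read_set[key] = version
        return value

    def commit(self, tx):
        if not self.certify(tx):
            return False                  # abort on conflict
        self.clock += 1
        for k, v in tx.write_set.items():
            self.store[k] = (v, self.clock)
        return True

p = DURProtocol()
t = Transaction()
p.read(t, "x")                 # records that version 0 of "x" was read
t.write_set["x"] = 42
assert p.commit(t)             # certification passes, write applied
```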
47

Podpora snapshotu a rollbacku pro konfigurační soubory v distribuci Fedora / Snapshot and Rollback Support for Configuration Files in Fedora

Ježek, Michal January 2008 (has links)
The purpose of this thesis is to design and implement tools supporting snapshots and rollbacks of configuration files on GNU/Linux distributions. The tool set enables automatic, periodic saving of configuration files to a chosen location. Backups are also created in reaction to file events, detected by watching for changes with the kernel's inotify subsystem. The tools make it possible to return to a selected backup, and the backup policy is configurable. They can compare data from selected backups, show the differences between configurations, and merge the current state with a selected backup. They also allow comparing the configurations of a single client or configurations across clients, displaying the mutual differences and, where needed, merging them.
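A minimal sketch of the event-driven backup step (assuming the third-party pyinotify bindings for the kernel's inotify API; the backup location is hypothetical and the thesis's actual diff/merge logic is not reproduced here):

```python
import os
import shutil
import time

import pyinotify  # third-party bindings for the Linux inotify subsystem

BACKUP_ROOT = "/var/backups/conf-snapshots"   # hypothetical destination

def snapshot(path):
    """Copy a changed configuration file into a timestamped backup tree."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = os.path.join(BACKUP_ROOT, stamp + path)  # .../<stamp>/etc/fstab
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copy2(path, dest)                        # preserve metadata

class ConfigHandler(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):        # file written and closed
        snapshot(event.pathname)

wm = pyinotify.WatchManager()
wm.add_watch("/etc", pyinotify.IN_CLOSE_WRITE, rec=True)
pyinotify.Notifier(wm, ConfigHandler()).loop()      # block, react to events
```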
48

Viability of Scrum Elements in Mechatronic System Testing: An Exploratory Case Study at Husqvarna AB

Axelsson, Arvid, Ossiansson, Viktor January 2023 (has links)
Mechatronics is an interdisciplinary field of engineering involving components from software engineering as well as physical components from mechanics and electronics. Development of a mechatronic system requires specialised interdisciplinary knowledge from multiple fields of engineering. This becomes especially relevant when a mechatronic system is integrated for full system testing, where all the components are combined and the client-specified use-case requirements are tested. Agile project methods, originally intended only for software engineering, are spreading in the field of mechatronics with positive effect, though there are concerns that such methods cannot be broadly applied to all sub-disciplines of the field or stages of development. A major Swedish mechatronic development firm has begun experimenting with Scrum elements in the system testing department for its products. The purpose of this thesis was to investigate the viability of Scrum elements for system testing of mechatronic products through an exploratory, qualitative case study. Specifically, we wanted to understand how the use of Scrum elements can benefit mechatronic system testing, and what challenges can emerge. Data collection for our study consisted of semi-structured interviews with members of the chosen system test department, as well as with system test representatives from other mechatronic development firms. The data was transcribed and then analysed using inductive thematic content analysis. The results consisted of four distinct themes describing the benefits and challenges of using Scrum elements in mechatronic system testing: 1: The Importance and Challenges of Planning, 2: How to Handle Task Prioritisation and Estimation, 3: The Benefits and Challenges of Inter-Departmental Collaboration, and 4: Adjustment of the Scrum Framework. The most significant benefits were found in adopting the incremental Sprint cycle structure, with its corresponding Sprint Planning meetings, as well as Sprint Retrospectives. The Daily Scrum meeting was also identified as a useful "impediment bulldozer", letting the team refocus efforts on the tasks requiring the most attention. An important challenge identified was learning how to collaborate effectively with other departments in the company that may not be using an Agile framework. A comparison was made between different approaches to adapting the Scrum framework to a team's needs, concluding that simply picking individual Scrum elements to include in the team's work structure may yield underwhelming results compared to the combined effect of a complete Scrum framework.
49

Exploring Techniques for Providing Privacy in Location-Based Services Nearest Neighbor Query

Asanya, John-Charles 01 January 2015 (has links)
Increasing numbers of people are subscribing to location-based services, but as their popularity grows, so do the privacy concerns. A variety of research exists to address these concerns, each technique addressing a different model by which location-based services respond to subscribers. In this work, we present ideas to address privacy concerns for the two main models: the snapshot nearest neighbor query model and the continuous nearest neighbor query model. First, we address the snapshot nearest neighbor query model, where the service's response represents a snapshot at a point in time. In this model, we introduce a novel idea based on the concept of an open set in a topological space, where points belong to a subset called the neighborhood of a point. We extend this concept to provide anonymity to real objects, where each object belongs to a disjoint neighborhood such that each neighborhood contains a single object. To help identify the objects, we implement a database that scales dynamically in direct proportion to the size of the neighborhood. To retrieve information secretly and allow the database to expose only the requested information, private information retrieval protocols are executed twice on the data. Our study of the implementation shows that the concept of a single-object neighborhood is able to scale the database efficiently with the objects in the area. The size of the database grows with the size of the grid and the objects covered by the location-based service. Typically, creating neighborhoods, computing distances between objects in the area, and running private information retrieval protocols cause the CPU to respond slowly as the database grows. In order to handle a large number of objects, we explore the concepts of kernels and parallel computing on the GPU, and develop a GPU-parallel implementation of the snapshot query. In our experiments we exploit parameter tuning; the results show that with parameter tuning and the parallel computing power of the GPU, we are able to significantly reduce the response time as the number of objects increases. To determine the response time of an application without knowledge of the intricacies of the GPU architecture, we extend our analysis to predict GPU execution time: we develop a run-time equation for an operation, extrapolate the run time for a problem set based on that equation, and provide a model to predict GPU response time. As an alternative, the snapshot nearest neighbor query privacy problem can be addressed using secure hardware, which can eliminate the need to protect the rest of the subsystem and minimize resource usage and network transmission time. In this approach, a secure coprocessor provides privacy: we process all information inside the coprocessor to deny adversaries access to any private information, and we access the server with an oblivious random access memory methodology to obfuscate the access pattern to external memory. Experimental evaluation shows that using a secure coprocessor reduces resource usage and query response time as the coverage area and the number of objects increase. Second, we address privacy concerns in the continuous nearest neighbor query model, where the location-based service automatically responds to a change in the object's location. In this model, we present solutions for two different types, known as moving query/static object and moving query/moving object.
For the solutions, we propose plane partitioning using a Voronoi diagram, and a continuous fractal space-filling curve using a Hilbert curve ordering to create a continuous nearest neighbor relationship between the points of interest along a path. Specifically, the space-filling curve yields a multi-dimensional to one-dimensional object mapping in which values are assigned to objects based on proximity. To prevent subscribers from issuing a query each time their location changes, and to reduce the response time, we introduce the concepts of transition and update time to indicate where and when the nearest neighbor changes. We also introduce a database that scales dynamically with the number of objects along a path to help obscure and relate objects. By executing the private information retrieval protocol twice on the data, the user secretly retrieves the requested information from the database. The results of our experiments show that using plane partitioning and a fractal space-filling curve to create nearest neighbor relationships with transition times between objects reduces the total response time.
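As a concrete illustration of the Hilbert-curve ordering used in the continuous model (this is the standard xy-to-index conversion; the surrounding PIR and transition-time machinery is not shown):

```python
def hilbert_index(order, x, y):
    """Map cell (x, y) on a 2**order x 2**order grid to its position along
    the Hilbert curve, so nearby cells tend to get nearby 1-D indices."""
    d = 0
    s = 2 ** (order - 1)
    while s > 0:
        rx = 1 if x & s else 0
        ry = 1 if y & s else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate quadrant to keep continuity
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        s //= 2
    return d

# Adjacent cells map to nearby curve positions on an 8x8 grid:
print(hilbert_index(3, 0, 0), hilbert_index(3, 0, 1), hilbert_index(3, 1, 1))
# -> 0 1 2
```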
50

Underwater Use of a Hyperspectral Camera to Estimate Optically Active Substances in the Water Column of Freshwater Lakes

Seidel, Michael, Hutengs, Christopher, Oertel, Felix, Schwefel, Daniel, Jung, András, Vohland, Michael 21 April 2023 (has links)
Freshwater lakes provide many important ecosystem functions and services to support biodiversity and human well-being. Proximal and remote sensing methods represent an efficient approach to derive water quality indicators such as optically active substances (OAS). Measurements of above-ground remote and in situ proximal sensors, however, are limited to observations of the uppermost water layer. We tested a hyperspectral imaging system, customized for underwater applications, with the aim to assess concentrations of chlorophyll a (CHLa) and colored dissolved organic matter (CDOM) in the water columns of four freshwater lakes with different trophic conditions in Central Germany. We established a measurement protocol that allowed consistent reflectance retrievals at multiple depths within the water column independent of ambient illumination conditions. Imaging information from the camera proved beneficial for an optimized extraction of spectral information since low signal areas in the sensor's field of view, e.g., due to non-uniform illumination, and other interfering elements could be removed from the measured reflectance signal for each layer. Predictive hyperspectral models, based on the 470 nm–850 nm reflectance signal, yielded estimates of both water quality parameters (R² = 0.94, RMSE = 8.9 µg L⁻¹ for CHLa; R² = 0.75, RMSE = 0.22 m⁻¹ for CDOM) that were more accurate than commonly applied waveband indices (R² = 0.83, RMSE = 13.2 µg L⁻¹ for CHLa; R² = 0.66, RMSE = 0.25 m⁻¹ for CDOM). Underwater hyperspectral imaging could thus facilitate future water monitoring efforts through the acquisition of consistent spectral reflectance measurements or derived water quality parameters along the water column, which has the potential to improve the link between above-surface proximal and remote sensing observations and in situ point-based water probe measurements for ground truthing, or to resolve the vertical distribution of OAS.
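A hedged sketch of the kind of full-spectrum predictive model contrasted with waveband indices above (partial least squares regression is one common choice for such spectra, though the paper's exact model is not specified here; data shapes and values are placeholders):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Placeholder data: rows are water-column measurements, columns are
# reflectance values sampled on a 470-850 nm grid; y is measured CHLa.
rng = np.random.default_rng(1)
X = rng.random((120, 190))                     # fake reflectance spectra
y = 60 * X[:, 40] + rng.normal(0, 2, 120)      # fake CHLa (ug/L)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = PLSRegression(n_components=8).fit(X_tr, y_tr)
pred = model.predict(X_te).ravel()
print(f"R2 = {r2_score(y_te, pred):.2f}, "
      f"RMSE = {mean_squared_error(y_te, pred) ** 0.5:.1f} ug/L")
```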
