381

An Approach to Utilize a No-Reference Image Quality Metric and Fusion Technique for the Enhancement of Color Images

de Silva, Manawaduge Supun Samudika 09 September 2016 (has links)
No description available.
382

A comparison between on-premise and cloud environments in terms of security : With an emphasis on Software-as-a-Service & Platform-as-a-Service

Byström, Oliver January 2022 (has links)
Background: Cloud- and on-premise environments have been compared in terms of security several times. Many of these comparisons based their assessments on qualitative data rather than quantitative metrics. Some recent articles have considered comparing environments by using quantitative data. These methodologies are often complicated and based on incident simulations that might not be relevant in a real-life scenario. Therefore it could be troublesome for a company to evaluate and compare two environments before deciding which environment they would prefer in terms of security. Before an environment migration, it is decisive to know if that environment has been a target for recent cyberattacks. Unfortunately, this data is not available to the public. Objectives: This study aims to provide the reader with an overview of the environmental aspects of the victims of recent cyberattacks. It will reveal what environment cybercriminals have targeted the most. The study will also propose a methodology to compare two environments to each other based on quantitative measurements. The measurements were based on cybersecurity metrics that quantified the threats in each environment. Methods: A structured literature- and dataset review was conducted to find how much each environment had been exposed to cybersecurity incidents. Several expert interviews were held to help explain the findings made in the reviews. A threat analysis was used as the foundation for the proposed comparison methodology. A case study of a recent environment migration was used to test the proposed comparison methodology. Results: The results show that on-premise environments have been more exposed to cybersecurity incidents during recent years than cloud environments. The proposed methodology showed that the cloud environment was the preferred choice in the conducted case study. 
Conclusions: In recent years, cloud environments have been the preferred choice in terms of security, as long as the cloud consumer adheres to best practices. There is a knowledge gap when it comes to cloud environments, affecting cloud consumers and cybercriminals alike. However, according to recent threat reports, cybercriminals have begun to catch up, so more cloud-related incidents are likely in the future. It was determined that the proposed methodology could represent the security posture of each environment. However, a decision should not be based entirely on this methodology, because it has not been tested at scale.
383

The Persistent Topology of Geometric Filtrations

Wang, Qingsong 06 September 2022 (has links)
No description available.
384

Contributions to measure-valued diffusion processes arising in statistical mechanics and population genetics

Lehmann, Tobias 19 September 2022 (has links)
The present work is about measure-valued diffusion processes, which are aligned with two distinct geometries on the set of probability measures. In the first part we focus on a stochastic partial differential equation, the Dean-Kawasaki equation, which can be considered as a natural candidate for a Langevin equation on probability measures, when equipped with the Wasserstein distance. Apart from that, the dynamic in question appears frequently as a model for fluctuating density fields in non-equilibrium statistical mechanics. Yet, we prove that the Dean-Kawasaki equation admits a solution only in integer parameter regimes, in which case the solution is given by a particle system of finite size with mean field interaction. For the second part we restrict ourselves to positive probability measures on a finite set, which we identify with the open standard unit simplex. We show that Brownian motion on the simplex equipped with the Aitchison geometry, can be interpreted as a replicator dynamic in a white noise fitness landscape. We infer three approximation results for this Aitchison diffusion. Finally, invoking Fokker-Planck equations and Wasserstein contraction estimates, we study the long time behavior of the stochastic replicator equation, as an example of a non-gradient drift diffusion on the Aitchison simplex.
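For orientation, the Dean-Kawasaki equation discussed above is usually written (in one common normalization; the exact scaling and parameterization in the thesis may differ) as an SPDE for a measure-valued process:

```latex
\partial_t \mu_t \;=\; \frac{\alpha}{2}\,\Delta \mu_t \;+\; \nabla \cdot \bigl(\sqrt{\mu_t}\,\xi_t\bigr)
```

where $\xi_t$ is vector-valued space-time white noise and $\alpha > 0$ is an intensity parameter. The rigidity result mentioned above says that solutions exist only in integer parameter regimes, in which case $\mu_t$ is the empirical measure of a finite mean-field particle system.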
385

Selected Topics in Analysis in Metric Measure Spaces

Capolli, Marco 02 February 2021 (has links)
The thesis is composed of three sections, each devoted to the study of a specific problem in the setting of PI spaces. The problems analyzed are: a C^m Lusin approximation result for horizontal curves in the Heisenberg group, a limit result in the spirit of Bourgain-Brezis-Mironescu for Orlicz-Sobolev spaces in Carnot groups, and the differentiability of Lipschitz functions in Laakso spaces.
386

High Frequency Modeling and Experimental Analysis for Implementation of Impedance-based Structural Health Monitoring

Peairs, Daniel Marsden 23 June 2006 (has links)
A promising structural health monitoring (SHM) method for implementation on real world structures is impedance-based health monitoring. An in-service system is envisioned to include on board processing and perhaps wireless transfer of data. Ideally, a system could be produced as a slap-on or automatically installed addition to a structure. The research presented in this dissertation addresses issues that will help make such a system a reality. Although impedance-based SHM does not typically use an analytical model for basic damage identification, a model is necessary for more advanced features of SHM, such as damage prognosis, and to evaluate system parameters when installing on various structures. A model was developed based on circuit analysis of the previously proposed low-cost circuit for impedance-based SHM in combination with spectral elements. When a three-layer spectral element representing a piezoceramic bonded to a base beam is used, the model can predict the large peaks in the impedance response due to resonances of the bonded active sensor. Parallel and series connections of distributed sensor systems are investigated both experimentally and with the developed model. Additionally, the distribution of baseline damage metrics is determined to assess how the large quantities of data produced by a monitoring system can be handled statistically. A modification of the RMSD damage metric has also been proposed that is essentially the squared sum of the Z-statistic for each frequency point. Preferred excitation frequencies for macro-fiber composite (MFC) active sensors are statistically determined for a long composite boom under development for use in rigidizable inflatable space structures. / Ph. D.
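As a rough illustration of the modified damage metric described above (the squared sum of the Z-statistic at each frequency point), the following sketch compares a measured impedance spectrum against per-frequency baseline statistics. The function name, the simulated spectra, and the baseline sample size are illustrative assumptions, not taken from the dissertation:

```python
import numpy as np

def z_metric(measured, baseline_mean, baseline_std):
    """Damage metric: squared sum of the Z-statistic at each
    frequency point, comparing a measured impedance spectrum
    to baseline statistics (mean and std per frequency)."""
    z = (measured - baseline_mean) / baseline_std
    return float(np.sum(z ** 2))

# Toy data: 5 baseline impedance spectra, then one healthy and one
# "damaged" measurement (the damage is a uniform shift, for clarity).
rng = np.random.default_rng(0)
n_freqs = 200
baseline = rng.normal(100.0, 1.0, size=(5, n_freqs))
mu = baseline.mean(axis=0)
sigma = baseline.std(axis=0, ddof=1)

healthy = rng.normal(100.0, 1.0, size=n_freqs)
damaged = healthy + 2.0  # shifted impedance response

score_healthy = z_metric(healthy, mu, sigma)
score_damaged = z_metric(damaged, mu, sigma)
```

Because each frequency point is standardized by its own baseline spread, the metric naturally supports the kind of statistical thresholding of large monitoring datasets that the abstract describes.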
387

Demand Estimation with Differentiated Products: An Application to Price Competition in the U.S. Brewing Industry

Rojas, Christian Andres 23 September 2005 (has links)
A large part of the empirical work on differentiated products markets has focused on demand estimation and the pricing behavior of firms. These two themes are key inputs in important applications such as the merging of two firms or the introduction of new products. The validity of inferences, therefore, depends on accurate demand estimates and sound assumptions about the pricing behavior of firms. This dissertation makes a contribution to this literature in two ways. First, it adds to previous techniques of estimating demand for differentiated products. Second, it extends previous analyses of pricing behavior to models of price leadership that, while important, have received limited attention. The investigation focuses on the U.S. brewing industry, where price leadership appears to be an important type of firm behavior. The analysis is conducted in two stages. In the first stage, the recent Distance Metric (DM) method devised by Pinkse, Slade and Brett is used to estimate the demand for 64 brands of beer in 58 major metropolitan areas of the United States. This study adds to previous applications of the DM method (Pinkse and Slade; Slade 2004) by employing a demand specification that is more flexible and also by estimating advertising substitution coefficients for numerous beer brands. In the second stage, different pricing models are compared and ranked by exploiting the exogenous change in the federal excise tax of 1991. Demand estimates of the first stage are used to compute the implied marginal costs for the different models of pricing behavior prior to the tax increase. Then, the tax increase is added to these pre-tax marginal costs, and equilibrium prices for all brands are simulated for each model of pricing behavior. These "predicted" prices are then compared to actual prices for model assessment.
Results indicate that Bertrand-Nash predicts the pricing behavior of firms more closely than other models, although Stackelberg leadership yields results that are not substantially different from the Bertrand-Nash model. Nevertheless, Bertrand-Nash tends to under-predict prices of more price-elastic brands and to over-predict prices of less price-elastic brands. An implication of this result is that Anheuser-Busch could exert more market power by increasing the price of its highly inelastic brands, especially Budweiser. Overall, actual price movements as a result of the tax increase tend to be more similar across brands than predicted by any of the models considered. While this pattern is not inconsistent with leadership behavior, leadership models considered in this dissertation do not conform to this pattern. / Ph. D.
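The second-stage logic (back out implied marginal costs from observed prices, add the tax, and simulate predicted equilibrium prices) can be sketched under a deliberately simplified demand system. The dissertation uses the Distance Metric demand method; the single-product logit demand below is a stand-in for illustration only, and all parameter values are invented:

```python
import numpy as np

def logit_shares(prices, delta, alpha):
    """Market shares under simple logit demand with an outside good."""
    utilities = delta - alpha * prices
    e = np.exp(utilities)
    return e / (1.0 + e.sum())

def bertrand_prices(costs, delta, alpha, iters=500):
    """Solve single-product Bertrand-Nash prices by fixed-point
    iteration on the logit first-order condition
    p_j = c_j + 1 / (alpha * (1 - s_j))."""
    p = costs + 1.0 / alpha
    for _ in range(iters):
        s = logit_shares(p, delta, alpha)
        p = costs + 1.0 / (alpha * (1.0 - s))
    return p

# Illustrative numbers: three brands, observed pre-tax prices.
alpha, delta = 2.0, np.array([1.0, 0.5, 0.0])
p_obs = np.array([1.2, 1.0, 0.9])

# Step 1: implied marginal cost under Bertrand-Nash at observed prices.
s_obs = logit_shares(p_obs, delta, alpha)
mc = p_obs - 1.0 / (alpha * (1.0 - s_obs))

# Step 2: add the excise tax to costs and simulate "predicted" prices.
tax = 0.1
p_pred = bertrand_prices(mc + tax, delta, alpha)
```

In the dissertation's procedure, the same two steps are repeated under each candidate pricing model (Bertrand-Nash, Stackelberg leadership, and others), and the predicted post-tax prices are compared against the actual ones.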
388

Capacity Metric for Chip Heterogeneous Multiprocessors

Otoom, Mwaffaq Naif 05 March 2012 (has links)
The primary contribution of this thesis is the development of a new performance metric, Capacity, which evaluates the performance of Chip Heterogeneous Multiprocessors (CHMs) that process multiple heterogeneous channels. Performance metrics are required in order to evaluate any system, including computer systems. A lack of appropriate metrics can lead to ambiguous or incorrect results, something discovered while developing the secondary contribution of this thesis, that of workload modes for CHMs — or Workload Specific Processors (WSPs). For many decades, computer architects and designers have focused on techniques that reduce latency and increase throughput. The change in modern computer systems built around CHMs that process multi-channel communications in the service of single users calls this focus into question. Modern computer systems are expected to integrate tens to hundreds of processor cores onto single chips, often used in the service of single users, potentially as a way to access the Internet. Here, the design goal is to integrate as much functionality as possible during a given time window. Without the ability to correctly identify optimal designs, not only will the best performing designs not be found, but resources will be wasted and there will be a lack of insight to what leads to better performing designs. To address performance evaluation challenges of the next generation of computer systems, such as multicore computers inside of cell phones, we found that a structurally different metric is needed and proceeded to develop such a metric. In contrast to single-valued metrics, Capacity is a surface with dimensionality related to the number of input streams, or channels, processed by the CHM. We develop some fundamental Capacity curves in two dimensions and show how Capacity shapes reveal interaction of not only programs and data, but the interaction of multiple data streams as they compete for access to resources on a CHM as well. 
For the analysis of Capacity surface shapes, we propose a demand characterization method whose output is itself a surface. By overlaying demand surfaces on Capacity surfaces, we are able to identify when a system meets its demands and by how much. Using the Capacity metric, computer performance optimization is evaluated against workloads in the service of individual users instead of individual applications, aggregate applications, or parallel applications. Because throughput was originally derived by drawing analogies between processor design and pipelines in the automobile industry, we introduce our Capacity metric for CHMs by drawing an analogy to automobile production, signifying that Capacity is the successor to throughput. By developing our Capacity metric, we illustrate how and why different processor organizations cannot be understood as being better performers without both magnitude and shape analysis, in contrast to other metrics, such as throughput, that consider only magnitude. In this work, we make the following major contributions:
• Definition and development of the Capacity metric as a surface with dimensionality related to the number of input streams, or channels, processed by the CHM.
• Techniques for analysis of the Capacity metric.
Since the Capacity metric was developed out of necessity while pursuing the development of WSPs, this work also makes the following minor contributions:
• Definition and development of three foundations that establish an experimental basis — a CHM model, a multimedia cell phone example, and a Workload Specific Processor (WSP).
• Definition of Workload Modes, which was the original objective of this thesis.
• Definition and comparison of two approaches to workload-mode identification at run time: the Workload Classification Model (WCM) and a model based on Hidden Markov Models (HMMs).
• Development of a foundation for analysis of the Capacity metric, so that the impact of architectural features in a CHM may be better understood. To do this, we develop a Demand Characterization Method (DCM) that characterizes the demand of a specific usage pattern as a curve (or, in general, a surface). This lets us overlay demand curves on the Capacity curves of different architectures to compare their performance and identify optimal designs. / Ph. D.
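The overlay idea above can be sketched numerically: build a Capacity surface over a grid of per-channel offered rates, build a demand surface over the same grid, and compare them pointwise. The saturation model (a shared budget of work units) and all numbers are toy assumptions, not the dissertation's CHM model:

```python
import numpy as np

def capacity(r1, r2, budget=10.0):
    """Toy Capacity surface for a two-channel CHM: delivered aggregate
    rate, which saturates once the combined offered rate exceeds a
    shared processing budget (illustrative model only)."""
    total = r1 + r2
    scale = np.minimum(1.0, budget / np.maximum(total, 1e-9))
    return (r1 + r2) * scale

# Grid of offered rates on each channel (two dimensions = two streams).
r = np.linspace(0.0, 10.0, 21)
R1, R2 = np.meshgrid(r, r)
cap = capacity(R1, R2)

# Demand surface: the rate this usage pattern requires at each point.
demand = 0.8 * (R1 + R2)

# Overlay: where does the system meet demand, and by what margin?
met = cap >= demand
margin = cap - demand
```

Comparing `met` and `margin` across architectures (each with its own `capacity` surface) is the kind of shape-aware comparison a single-valued metric like throughput cannot express.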
389

Framework for Concentrated Strain Deployable Trusses

Mejia-Ariza, Juan Manuel 25 June 2008 (has links)
This research presents a simplified framework for the analysis of deployable trusses using the concentrated strain approach and uses it to provide key insights into the many design decisions to be made in the development of concentrated strain architectures. The framework uses Euler column theory to derive closed-form solutions that estimate truss performance. The results are compared to a classical solution and shown to give similar results. A range of strut and hinge hierarchy choices are considered. Trusses composed of solid rods with rectangular flexures are shown to have significant axial and bending stiffness reductions due to the smaller cross-sectional areas and lower modulus of the flexures. Trusses composed of tubes are less sensitive to this because the flexure cross-sectional area does not dramatically change from that of the tube. A hinge material metric that properly weights flexure strain and modulus is presented to provide a basis for the comparison and selection of proper hinge materials. However, based on this metric, new materials with higher folding failure strain and higher modulus are needed. Finally, a concentrated strain deployable truss of solid rods was designed, manufactured, and tested. A truss performance index for column loading was used to compare this system with a distributed strain ATK-ABLE GR1 coilable boom system and an articulated ATK-ABLE SRTM boom system. It was demonstrated that the concentrated strain approach has the potential to achieve a higher linear compaction ratio and truss performance index for mass efficient deployable trusses than the distributed strain approach and the articulated approach. / Ph. D.
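The Euler column theory underlying such closed-form estimates reduces to the critical buckling load of an ideal column. A minimal sketch, with invented strut dimensions and a carbon-fiber-like modulus (none of these values come from the dissertation):

```python
import math

def euler_buckling_load(E, I, L, k=1.0):
    """Critical axial load of an ideal column (k = 1 for pin-pin ends):
    P_cr = pi^2 * E * I / (k * L)^2."""
    return math.pi ** 2 * E * I / (k * L) ** 2

# Illustrative strut: solid circular rod, 5 mm diameter, 0.5 m long.
E = 120e9                      # elastic modulus, Pa (assumed)
d = 5e-3                       # rod diameter, m (assumed)
I = math.pi * d ** 4 / 64.0    # second moment of area of a solid rod
P_cr = euler_buckling_load(E, I, 0.5)
```

The inverse-square dependence on length is what makes strut hierarchy choices so consequential: halving the unsupported length of a strut quadruples its buckling load for the same mass.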
390

Geometry of Spaces of Planar Quadrilaterals

StClair, Jessica Lindsey 04 May 2011 (has links)
The purpose of this dissertation is to investigate the geometry of spaces of planar quadrilaterals. The topology of moduli spaces of planar quadrilaterals (the set of all distinct planar quadrilaterals with fixed side lengths) has been well-studied [5], [8], [10]. The symplectic geometry of these spaces has been studied by Kapovich and Millson [6], but the Riemannian geometry of these spaces has not been thoroughly examined. We study paths in the moduli space and the pre-moduli space. We compare intraplanar paths between points in the moduli space to extraplanar paths between those same points. We give conditions on side lengths to guarantee that intraplanar motion is shorter between some points. Direct applications of this result could be applied to motion-planning of a robot arm. We show that horizontal lifts to the pre-moduli space of paths in the moduli space can exhibit holonomy. We determine exactly which collections of side lengths allow holonomy. / Ph. D.