  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
371

Robustesse des arbres phylogénétiques (Robustness of Phylogenetic Trees)

Mariadassou, Mahendra 27 November 2009
The modern synthesis of evolutionary theory has spread widely through every field of biology, in large part thanks to phylogenetic trees. While their usefulness in comparative genomics is obvious, they are also used in many other areas, from biodiversity studies to epidemiology by way of forensic science. Phylogenetic trees are not only an effective characterization of evolution but also a powerful tool for studying it. However, any study that uses a tree assumes the tree has been correctly estimated, in its topology as well as in its other parameters, whereas this estimation is a difficult and still largely open statistical problem. It is generally accepted that no good estimate can be obtained without four prerequisites: (1) the choice of one or more genes relevant to the question under study, (2) enough data to ensure good estimation accuracy, (3) an effective reconstruction method, relying on a fine-grained model of evolution to minimize reconstruction bias, and (4) a good sampling of taxa. In this thesis we address four topics, each closely tied to one of these prerequisites. In the first part, we use concentration inequalities to study the link between estimation accuracy and the amount of data. We then propose a method based on Edgeworth expansions to test the phylogenetic congruence of a new gene with its predecessors. In the second part, we propose two methods, inspired by sensitivity analysis, for detecting aberrant sites and taxa. Such outliers can undermine the robustness of the estimators, and we show through examples how only a few aberrant observations are enough to change the estimators drastically.
We discuss the implications of these results and show how to make the tree estimator more robust in the presence of aberrant observations.
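As a loose, hypothetical illustration of the sensitivity-analysis idea behind outlier detection (not the thesis's actual estimator or influence measure), a leave-one-out influence score can flag observations whose removal shifts a simple estimate disproportionately:

```python
# Hypothetical sketch: leave-one-out influence to flag aberrant observations.
# The estimator (a plain mean), threshold rule, and data are illustrative only.

def loo_influence(values):
    """For each observation, the shift in the mean when it is left out."""
    n = len(values)
    total = sum(values)
    mean_all = total / n
    return [abs((total - v) / (n - 1) - mean_all) for v in values]

def flag_outliers(values, factor=3.0):
    """Flag observations whose leave-one-out influence exceeds
    `factor` times the median influence."""
    infl = loo_influence(values)
    med = sorted(infl)[len(infl) // 2]
    return [i for i, x in enumerate(infl) if x > factor * med]

data = [0.1, 0.12, 0.11, 0.09, 0.1, 2.5]  # last value is aberrant
print(flag_outliers(data))  # → [5]
```

The point mirrors the abstract: a single aberrant observation dominates the estimate, and comparing estimates with and without each observation exposes it.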
372

Perception and re-synchronization issues for the watermarking of 3D shapes

Rondao Alface, Patrice 26 October 2006
Digital watermarking is the art of embedding secret messages in multimedia content in order to protect its intellectual property. While the watermarking of images, audio and video is reaching maturity, the watermarking of 3D virtual objects is still a technology in its infancy. In this thesis, we focus on two main issues. The first is the perception of the distortions caused by the watermarking process, or by attacks, on the surface of a 3D model. The second concerns the development of techniques able to retrieve a watermark without the original data and after common manipulations and attacks. Since imperceptibility is a strong requirement, assessing the visual perception of the distortions that a 3D model undergoes in the watermarking pipeline is a key issue. In this thesis, we propose an image-based metric that relies on the comparison of 2D views with a Mutual Information criterion. A psychovisual experiment has validated the results of this metric for the most common watermarking attacks. The other issue this thesis deals with is the blind and robust watermarking of 3D shapes. In this context, three different watermarking schemes are proposed. These schemes differ in the classes of 3D watermarking attacks they are able to resist. The first scheme is based on the extension of spectral decomposition to 3D models. This approach is robust against imperceptible geometric deformations; its weakness lies mainly in resampling and cropping attacks. The second scheme extends the first to resist resampling by making use of the automatic multiscale detection of robust umbilical points. The third scheme then addresses the cropping attack by detecting robust prong feature points to locally embed a watermark in the spatial domain.
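A minimal sketch of a histogram-based mutual-information criterion between two rendered 2D views, with invented intensity data; the thesis's actual metric, binning, and view-rendering pipeline are not reproduced here:

```python
# Sketch: mutual information (in bits) between two equal-length sequences of
# pixel intensities in [0, 1), via joint and marginal histograms.
from math import log2
from collections import Counter

def mutual_information(a, b, bins=8):
    """Histogram-based MI between two intensity sequences of equal length."""
    assert len(a) == len(b)
    n = len(a)
    qa = [min(int(x * bins), bins - 1) for x in a]  # quantize to bin indices
    qb = [min(int(x * bins), bins - 1) for x in b]
    pa, pb, pab = Counter(qa), Counter(qb), Counter(zip(qa, qb))
    mi = 0.0
    for (i, j), c in pab.items():
        pij = c / n
        mi += pij * log2(pij / ((pa[i] / n) * (pb[j] / n)))
    return mi

identical = [i / 100 for i in range(100)]
print(mutual_information(identical, identical))    # high: views agree
print(mutual_information(identical, [0.5] * 100))  # 0.0: no shared information
```

Two views of a distorted model that still "look the same" share high mutual information, which is the intuition behind using MI as a perceptual comparison criterion.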
373

Landscape, Kitchen, Table: Compressing the Food Axis to Serve a Food Desert

Elliott, Shannon Brooke 01 December 2010
In the past, cities and their food systems were spatially interwoven. However, rapid urbanization and the creation of industrialized agriculture have physically isolated and psychologically disconnected urban residents from the landscape that sustains them. Cities can no longer feed themselves and must rely on a global hinterland. Vital growing, preserving, and cooking knowledge has been lost, while negative health, economic, and environmental effects continue to develop from this separation. Low-income neighborhoods have been significantly affected, where a lack of income and mobility poses barriers to adequate food access. Architects have addressed food issues individually, but have yet to take an integrative approach that meaningfully engages urban citizens with all processes of the food system. Urban planners have recently taken a holistic design approach to food issues through the development of the community food system concept. By applying this idea to an architectural program, I have designed a Community Food Center for the Five Points Neighborhood in East Knoxville, TN. Spatially compressing and layering food activity spaces preserves the majority of the landscape on site for food production. The kitchen, dining room, market, and garden increase access to healthy food while serving as community gathering spaces, and the business incubator kitchens provide economic opportunities. The whole facility acts to educate and engage people in the growing, harvesting, preserving, cooking, sharing, and composting of food. Cities cannot sustain themselves by only providing spaces for consumption. Architects must challenge the accepted relationships between food system spaces and strive to reincorporate productive landscapes and spaces dedicated to transforming raw ingredients into a variety of architectural programs.
Although the Five Points Community Food Center is site specific, the concept of integrating multiple food activities into a single architectural entity can be used as a tool for place making by expressing a local identity through food culture while improving the social and economic fabric.
374

Integrated Layout Design of Multi-component Systems

Zhu, Jihong 09 December 2008
A new integrated layout optimization method is proposed here for the design of multi-component systems. By introducing movable components into the design domain, the component layout and the supporting structural topology are optimized simultaneously. The developed design procedure consists of three main parts: (i) Introduction of non-overlap constraints between components. The Finite Circle Method (FCM) is used to avoid overlaps between components, and between components and the design domain boundaries. It proceeds by approximating the geometries of the components and the design domain with sets of circles; distance constraints between the circles of different components are then imposed as non-overlap constraints. (ii) Layout optimization of the components and supporting structure. The locations and orientations of the components are treated as geometric design variables for the optimal placement, while topology design variables of the supporting structure are defined by density points. Embedded meshing techniques are developed to take into account the finite element mesh changes caused by component movements. Moreover, to meet the complicated requirements of aerospace structural system design, design-dependent loads related to the inertial load or the structural self-weight, and a design constraint on the position of the system's gravity center, are taken into account in the problem formulation. (iii) A consistent material interpolation scheme between element stiffness and inertial load. The common SIMP material interpolation model is improved to avoid the singularity of localized deformation that arises, in the presence of design-dependent loading, when the element stiffness and the associated inertial load are weakened by element material removal.
Finally, to validate the proposed design procedure, a variety of multi-component system layout design problems are formulated and solved, taking into account inertial loads and the gravity center position constraint.
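The Finite Circle Method's non-overlap constraints can be sketched as follows; the circle approximations and coordinates below are invented for illustration and are not taken from the thesis:

```python
# Sketch of FCM-style non-overlap constraints: each component is approximated
# by a set of circles (x, y, r); a pair of circles does not overlap when the
# sum of radii minus the center distance is <= 0.
from math import hypot

def fcm_constraints(circles_a, circles_b):
    """Constraint values for every circle pair across two components.
    A value <= 0 means that pair of circles does not overlap."""
    return [(ra + rb) - hypot(xa - xb, ya - yb)
            for (xa, ya, ra) in circles_a
            for (xb, yb, rb) in circles_b]

# Two illustrative components, each approximated by two circles
comp_a = [(0.0, 0.0, 1.0), (2.0, 0.0, 1.0)]
comp_b = [(5.0, 0.0, 1.0), (7.0, 0.0, 1.0)]
violations = [g for g in fcm_constraints(comp_a, comp_b) if g > 0]
print(len(violations))  # → 0: the components do not overlap
```

In the optimization, these pairwise values would enter the problem as inequality constraints that the optimizer must keep non-positive while moving components.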
375

Local Hospital's Strategy Management Under National Health Insurance Policy

Lin, Chin-hsing 20 August 2007
Since the National Health Insurance Bureau implemented the global budget system in 2003, together with hospital self-management and peer review by specialist physicians, local hospitals have faced critical impacts. The floating-point reimbursement scheme not only caps the reimbursement hospitals receive but also creates management difficulties for local hospitals, as the value of each point shrinks year by year. The number of Western-medicine hospitals is decreasing: there were 575 in 2000, but only about 500 local hospitals now operate in Taiwan. To survive, local hospitals have to establish sound financial systems in order to cope with changing national health insurance policies; they are also encouraged to use strategic management theories to strengthen their competitiveness, control their expenditures, and create income. This study analyzed statistical data from the Statistical Office of the Department of Health and from the National Health Insurance Bureau, and integrated the related literature, to understand the management strategies and responses of local hospitals under national health insurance policy and the surrounding economic, demographic, and political environment. Results of the study are provided for reference. Besides statistical data and the literature, strategic management concepts and theories were adopted to clarify current health insurance policy, the reimbursement system, and the situation of local hospitals, and to probe their difficulties and possible solutions.
Local hospitals were chosen as the study subjects. Through SWOT analysis, Porter's Five Forces analysis, Blue Ocean Strategy, and the findings of the literature review, we found that (1) the financial gap of health insurance has been transferred to medical organizations, especially local hospitals; (2) under the global budget system, local hospitals have to increase income and decrease expenditures through transformation, self-pay services, or joint outpatient services; (3) cost management is critical for local hospitals to build internal strength; and (4) following Blue Ocean Strategy to develop distinctive businesses and differentiated products can create a niche that lets local hospitals break through the bottleneck.
376

Computer-aided detection and novel mammography imaging techniques

Bornefalk, Hans January 2006
This thesis presents techniques constructed to aid the radiologists in detecting breast cancer, the second largest cause of cancer deaths among Western women. In the first part of the thesis, a computer-aided detection (CAD) system constructed for the detection of stellate lesions is presented. Different segmentation methods and an attempt to incorporate contra-lateral information are evaluated. In the second part, a new method for evaluating such CAD systems is presented, based on constructing credible regions for the number of false positive marks per image at a certain desired target sensitivity. This method shows that the resulting regions are rather wide, which explains some of the difficulties encountered by other researchers when trying to compare CAD algorithms on different data sets. This part also attempts to model the clinical use of CAD as a second look, and shows that applying CAD in sequence after the radiologist in a routine manner, without duly altering the radiologist's decision criterion, may well result in suboptimal operating points. Finally, in the third part, two dual-energy imaging methods optimized for contrast-enhanced imaging of breast tumors are presented. The first is based on applying an electronic threshold to a photon-counting digital detector to discriminate between high- and low-energy photons. This allows simultaneous acquisition of the high- and low-energy images. The second method is based on the geometry of a scanned multi-slit system and also allows single-shot contrast-enhanced dual-energy mammography, by filtering the x-ray beam that reaches different detector lines differently.
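As a hedged sketch of a credible region for false-positive marks per image: the Poisson count model, Jeffreys prior, and numbers below are assumptions made for illustration, not the thesis's actual construction:

```python
# Sketch: equal-tailed credible region for the expected number of
# false-positive marks per image. Assumes FP counts are Poisson with a
# Jeffreys Gamma(1/2, 0) prior, so the posterior rate per image is
# Gamma(total_fp + 1/2, scale = 1/n_images); approximated by Monte Carlo.
import random

def fp_rate_credible_region(total_fp, n_images, level=0.95, draws=20000, seed=0):
    rng = random.Random(seed)
    samples = sorted(rng.gammavariate(total_fp + 0.5, 1.0 / n_images)
                     for _ in range(draws))
    lo = samples[int((1 - level) / 2 * draws)]
    hi = samples[int((1 + level) / 2 * draws)]
    return lo, hi

# Illustrative numbers: 120 false-positive marks over 100 images
lo, hi = fp_rate_credible_region(total_fp=120, n_images=100)
print(round(lo, 2), round(hi, 2))  # region around 1.2 FP/image
```

Even with 100 images the region spans several tenths of a mark per image, which echoes the abstract's point that such regions are rather wide and make cross-dataset CAD comparisons delicate.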
377

VÄG, VAL OCH VILLKOR (Change, Choice and Conditions): individuals who have previously committed criminal acts tell their stories

Skantze, Lina, Zandén, Bianca January 2008
This thesis, titled “Change, choice and conditions”, is written by Lina Skantze and Bianca Zandén. The study explores the process in which individuals attempt to end their criminal careers, focusing on the interplay between path of life, choices, and conditions. The method is qualitative, and the empirical material consists of interviews with four young adults who all have experience of criminality. The material is analyzed within a theoretical framework based on social constructionism, Antonovsky's sense of coherence (SOC), Giddens's structuration theory, and existential philosophy. The authors suggest a theoretically and empirically based model illustrating the change process. The model, developed through abduction, suggests that the process of changing one's life radically includes a number of steps, such as: distancing oneself from everyday life and its habits; existential choices; new conditions; reflection on former situations and experiences; formulating a life story; new habits and routines; new and/or re-established social relationships; orientation towards new goals and a sense of meaning in life; and hopes and ideas about the future. The authors conclude that there are no absolute turning points in the lives of the interviewees. Instead, change happens in a complex process best described as incremental, consisting of small, and sometimes incoherent, steps. However, certain situations during the process are crucial and offer opportunities for fundamental existential choices.
378

Benchmarking Points-to Analysis

Gutzmann, Tobias January 2013
Points-to analysis is a static program analysis that, simply put, computes which objects created at certain points of a given program might show up at which other points of the same program. In particular, it computes possible targets of a call and possible objects referenced by a field. Such information is essential input to many client applications in optimizing compilers and software engineering tools. Comparing experimental results with respect to accuracy and performance is required in order to distinguish the promising from the less promising approaches to points-to analysis. Unfortunately, comparing the accuracy of two different points-to analysis implementations is difficult, as there are many pitfalls in the details. In particular, there are no standardized means to perform such a comparison, i.e., no benchmark suite - a set of programs with well-defined rules for how to compare different points-to analysis results - exists. Therefore, different researchers use their own means to evaluate their approaches to points-to analysis. To complicate matters, even the same researchers do not stick to the same evaluation methods, which often makes it impossible to take two research publications and reliably tell which one describes the more accurate points-to analysis. In this thesis, we define a methodology for benchmarking points-to analysis. We create a benchmark suite, compare three different points-to analysis implementations with each other based on this methodology, and explain differences in analysis accuracy. We also argue for the need of a Gold Standard, i.e., a set of benchmark programs with exact analysis results. Such a Gold Standard is often required to compare points-to analysis results, and it also makes it possible to assess the exact accuracy of points-to analysis results. Since such a Gold Standard cannot be computed automatically, it needs to be created semi-automatically by the research community.
We propose a process for creating a Gold Standard based on under-approximating it through optimistic (dynamic) analysis and over-approximating it through conservative (static) analysis. With the help of improved static and dynamic points-to analysis and expert knowledge about benchmark programs, we present a first attempt towards a Gold Standard. We also provide a Web-based benchmarking platform, through which researchers can compare their own experimental results with those of other researchers, and can contribute towards the creation of a Gold Standard.
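The under-/over-approximation idea can be sketched with plain sets; the object names and points-to sets below are invented, and real analyses would of course operate on program representations, not string labels:

```python
# Sketch: bounding a Gold Standard points-to set between a dynamic
# under-approximation (targets actually observed at run time) and a static
# over-approximation (targets a conservative analysis cannot rule out).

def accuracy_bounds(static_pts, dynamic_pts):
    """Split the static result into targets certainly in the Gold Standard
    (observed dynamically) and targets whose membership is still unknown."""
    assert dynamic_pts <= static_pts, "dynamic must be a subset of static"
    certain = dynamic_pts                 # definitely real targets
    unknown = static_pts - dynamic_pts    # needs expert/manual resolution
    return certain, unknown

static = {"o1", "o2", "o3", "o4"}   # conservative static points-to set
dynamic = {"o1", "o3"}              # targets seen during execution
certain, unknown = accuracy_bounds(static, dynamic)
print(sorted(certain), sorted(unknown))  # → ['o1', 'o3'] ['o2', 'o4']
```

Shrinking the `unknown` set, by improving either analysis or by expert inspection, is exactly the semi-automatic refinement the abstract describes for converging on a Gold Standard.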
379

Qualité géométrique & aspect des surfaces : approches locales et globales (Geometric quality and surface appearance: local and global approaches)

Le Goïc, Gaëtan 01 October 2012
Among all the levers available to companies, taking customer perception into account, from the product design stage onward, is now central. Consumers today are better informed and more attentive to the perceived quality of a product, and this perception lets them form an esteem value for the aesthetic quality of products as well as for their technical functionality. A methodology for analyzing the visual quality of surfaces is therefore an essential issue for industry. Two approaches to surface functionality are proposed, in order to formalize the detection methodology and to give experts objective criteria for evaluating anomalies. The first approach is based on surface metrology. It consists in analyzing measured topographies so as to relate the appearance function to the extracted geometric characteristics. A multiscale approach based on Discrete Modal Decomposition is implemented to efficiently separate the different orders of geometric variation of a surface and thus isolate appearance anomalies. This method also provides a simple and robust way to compute curvatures on a surface, and we show that this geometric attribute carries additional, relevant information related to the appearance function. Finally, this work highlighted the importance of the quality of the source data for analyzing appearance, and in particular two metrological difficulties, linked to the presence of outliers (high frequencies) and of geometric variations that are not intrinsic to the surfaces but generated by the measurement device (low frequencies). An innovative outlier-identification method dedicated to surface metrology, based on a multiscale statistical approach, is proposed.
The problem of geometric variations caused by the positioning tables of the measurement device is handled with the Modal Decomposition, and a protocol for correcting these variations is presented. The second, more global, approach is based on the interaction between surfaces and the lighting environment. Its purpose is to help experts detect anomalies more reliably. The work presented is based on the Polynomial Texture Mapping technique and consists in modeling the reflectance at each point of the surface in order to simulate its visual rendering under arbitrary lighting, much as operators do in sensory analysis to ease detection. A surface-inspection aid based on this principle is presented. Finally, an industrial approach is proposed to show how these two research directions can complement each other within a global, industrial methodology for analyzing the visual quality of surfaces.
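Polynomial Texture Mapping stores, for each pixel, the six coefficients of a biquadratic in the projected light-direction components, which is what makes relighting under arbitrary illumination cheap. The formula below is the standard PTM form; the per-pixel coefficients are invented for illustration, as real ones are fitted from photographs taken under known light directions:

```python
# Sketch: evaluating the classic 6-coefficient PTM biquadratic for one pixel:
#   L = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5
# where (lu, lv) are the projected components of the light direction.

def ptm_luminance(coeffs, lu, lv):
    """Luminance of one pixel under light direction (lu, lv)."""
    a0, a1, a2, a3, a4, a5 = coeffs
    return a0 * lu * lu + a1 * lv * lv + a2 * lu * lv + a3 * lu + a4 * lv + a5

# Hypothetical per-pixel coefficients (illustrative values only)
pix = (-0.2, -0.1, 0.05, 0.3, 0.1, 0.6)
print(ptm_luminance(pix, 0.0, 0.0))  # → 0.6, luminance under head-on light
print(ptm_luminance(pix, 0.5, 0.0))  # → 0.7, light tilted along u
```

Sweeping (lu, lv) over many directions simulates the raking-light inspection that operators perform in sensory analysis, which is how such a model can aid anomaly detection.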
380

802.11 Fingerprinting to Detect Wireless Stealth Attacks

Venkataraman, Aravind 20 November 2008
We propose a simple, passive and deployable approach for fingerprinting traffic on the wired side as a solution for three critical stealth attacks in wireless networks. We focus on extracting traces of the 802.11 medium access control (MAC) protocol from the temporal arrival patterns of incoming traffic streams as seen on the wired side, to identify attacker behavior. Attacks addressed include unauthorized access points, selfish behavior at the MAC layer and MAC layer covert timing channels. We employ the Bayesian binning technique as a means of classifying between delay distributions. The scheme requires no change to the 802.11 nodes or protocol, exhibits minimal computational overhead and offers a single point of discovery. We evaluate our model using experiments and simulations.
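A rough sketch of classifying an inter-arrival delay trace against binned reference distributions; the bin probabilities and delay values are invented, and this naive maximum-likelihood choice over fixed bins stands in for the Bayesian binning technique used in the thesis:

```python
# Sketch: decide which traffic profile best explains a trace of inter-arrival
# delays, by comparing log-likelihoods under binned reference distributions.
from math import log

def binned_loglik(delays, ref_hist, bin_width, eps=1e-6):
    """Log-likelihood of the delays under a binned distribution
    (ref_hist is a list of per-bin probabilities)."""
    ll = 0.0
    for d in delays:
        b = min(int(d / bin_width), len(ref_hist) - 1)
        ll += log(ref_hist[b] + eps)  # eps guards against empty bins
    return ll

def classify(delays, profiles, bin_width=0.001):
    """Pick the profile name that maximizes the binned likelihood."""
    return max(profiles,
               key=lambda name: binned_loglik(delays, profiles[name], bin_width))

# Illustrative bin probabilities for two delay profiles (made up): 802.11
# contention tends to stretch inter-arrival times seen on the wired side.
profiles = {
    "wired-only":   [0.70, 0.20, 0.05, 0.05],
    "behind-80211": [0.10, 0.30, 0.40, 0.20],
}
trace = [0.0025, 0.0031, 0.0022, 0.0014, 0.0028]  # mostly larger delays
print(classify(trace, profiles))  # → behind-80211
```

The wired-side-only vantage point is what keeps the approach passive and deployable: only arrival timestamps are needed, with no change to the 802.11 nodes or protocol.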
