281

Vizualizace černoděrových prostoročasů / Visualization of black hole spacetimes

Maixner, Michal January 2018 (has links)
This work focuses on the visualization of the Schwarzschild, Reissner-Nordström and Kerr black holes. Two-dimensional conformal diagrams were constructed. In the case of the Kerr black hole, the causal structure was visualized by intersecting the chronological future of a given point in spacetime with hypersurfaces of constant Boyer-Lindquist coordinate t; the conformal diagram for the Kerr black hole was constructed only in the neighbourhood of the outer event horizon. A causal diagram analogous to the conformal diagram was then constructed for the Reissner-Nordström black hole. In all cases, two-dimensional spacelike hypersurfaces were chosen and embedded into Euclidean space, and a sequence of such embedded hypersurfaces was interpreted as the time evolution of the black hole universe. For the Kerr black hole, embeddings of the outer ergosphere and the outer event horizon were also constructed.
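The thesis's own constructions are not reproduced in the abstract; as a standard illustration of embedding a spacelike slice into Euclidean space, the equatorial plane of a constant-time Schwarzschild slice embeds as Flamm's paraboloid, z(r) = sqrt(8M(r − 2M)) in units G = c = 1. A minimal Python sketch (the mass value and radial grid are arbitrary illustrative choices):

```python
import numpy as np

def flamm_paraboloid(r, M=1.0):
    """Embedding height z(r) of the Schwarzschild equatorial slice
    (t = const, theta = pi/2) in Euclidean 3-space, with G = c = 1.
    Valid for r >= 2M, i.e., outside the horizon."""
    return np.sqrt(8.0 * M * (r - 2.0 * M))

M = 1.0                                  # illustrative mass
r = np.linspace(2.0 * M, 10.0 * M, 200)  # radial grid from the horizon outward
z = flamm_paraboloid(r, M)
# Revolving the curve (r, z(r)) around the z-axis gives the familiar
# "funnel" picture of the spatial geometry outside the horizon.
print(z[:5])
```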
282

"Avaliação crítica do uso da reação em cadeia da polimerase e exames complementares no diagnóstico da tuberculose cutânea e micobacteriose atípica" / The role of polymerase chain reaction and panel exams in the diagnosis of cutaneous tuberculosis and atypical mycobacteria skin infection compared to clinical evaluation

Abdalla, Cristina Martinez Zugaib 30 November 2005 (has links)
A study was performed comparing the polymerase chain reaction (PCR) against clinical evolution and a traditional panel of diagnostic exams in patients with clinical suspicion of cutaneous tuberculosis or atypical mycobacterial skin infection. PCR showed a sensitivity of 88%, specificity of 83%, positive predictive value of 82%, negative predictive value of 88% and accuracy of 85%, with concordance by the McNemar test (p=0.655). After PCR, the panel exams with the highest accuracy were the tuberculin test (79%) and the finding of chronic granulomatous dermatitis together with a positive PCR (also 79%), both with concordance by the McNemar test (p=0.179 and p=0.655, respectively).
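The abstract reports summary statistics only; the underlying 2×2 counts are not given. A minimal sketch of how such metrics are computed from a confusion table, using hypothetical counts chosen only to illustrate the formulas (not the thesis's data):

```python
# Hypothetical counts (NOT from the thesis): PCR result vs. the
# clinical reference standard.
TP, FP, FN, TN = 44, 10, 6, 44

sensitivity = TP / (TP + FN)                 # P(test+ | disease+)
specificity = TN / (TN + FP)                 # P(test- | disease-)
ppv = TP / (TP + FP)                         # positive predictive value
npv = TN / (TN + FN)                         # negative predictive value
accuracy = (TP + TN) / (TP + FP + FN + TN)   # overall agreement

print(f"sens={sensitivity:.0%} spec={specificity:.0%} "
      f"PPV={ppv:.0%} NPV={npv:.0%} acc={accuracy:.0%}")
```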
283

Incorporation biologique de l'adversité sociale précoce : le rôle de la charge allostatique dans une perspective biographique / Embodiment of early social adversity : the role of allostatic load in a life course perspective

Barboza Solís, Cristina 16 September 2016 (has links)
Introduction. The notion of embodiment proposes that every human being is both a social and a biological organism that incorporates the world in which (s)he lives. It has been hypothesized that early-life socioeconomic position (SEP) can be biologically embedded, potentially leading to the production of health inequalities across population groups. Allostatic load (AL) is a concept that intends to capture the overall physiological wear and tear of the body triggered by the repeated activation of compensatory physiological mechanisms in response to chronic stress. AL could allow a better understanding of the biological pathways potentially at play in the construction of the social gradient in adult health. Objective. To explore the biological embedding hypothesis, we examined the mediating pathways between early SEP and early adverse psychosocial experiences and higher AL at age 44. We also confronted the AL index with a latent, multidimensional and integrative measure of health status at age 50. Methods. Data are from the 1958 British birth cohort (n=18,000), followed up to age 50. AL was operationalized as a synthetic, multi-system physiological measure using data from the biomedical survey collected at age 44 on 14 biomarkers representing the neuroendocrine, metabolic, immune-inflammatory and cardiovascular systems. Results. Overall, our results suggest that AL could be a suitable index to partially capture the biological dimension of embodiment processes. Discussion. Understanding how human environments affect our health by 'getting under the skin' and penetrating the cells, organs and physiological systems of our bodies is a key tenet of public health research. Promoting the collection of biological markers in large, representative prospective studies is crucial to continue investigating this topic. Replication studies comparing populations with different cultural contexts could be part of future research perspectives, to observe whether an AL index can be considered 'universal'.
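The abstract does not spell out the scoring rule; a common operationalization of AL (an assumption here, not necessarily the thesis's exact method) counts, per subject, how many biomarkers fall in the high-risk quartile of their distribution. A minimal sketch on synthetic stand-in data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 1000 subjects x 14 biomarkers (NOT cohort data).
biomarkers = rng.normal(size=(1000, 14))

# High risk = above the 75th percentile of each biomarker's distribution
# (for markers where high values are protective, the scale would be flipped).
q75 = np.percentile(biomarkers, 75, axis=0)
allostatic_load = (biomarkers > q75).sum(axis=1)  # integer score in 0..14

print(allostatic_load[:10])
```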
284

Fault Tolerance in Cryptographic Applications Using Cover-Free Families

Bardini Idalino, Thais 27 September 2019 (has links)
Cryptography is of fundamental importance to guarantee the security of communications, from the point-to-point transmission of messages and documents to the storage of big sets of data on cloud providers. As an example, we can use encryption, message authentication codes, and digital signatures to help us guarantee the privacy, integrity, and authenticity of data that is stored and/or transmitted. Many cryptographic solutions have a Boolean outcome: for example, the message is either authentic and accepted as it is, or it is not and needs to be rejected/re-transmitted. This outcome may be acceptable in scenarios where we can easily re-transmit the message, but it can pose a challenge when we deal with a large amount of data or more sensitive content in which changes need to be further explored. In this context, this thesis proposes solutions to provide fault tolerance to certain cryptographic problems that traditionally have an all-or-nothing outcome. Fault tolerance is application dependent. In the case of a digital signature of a document that has later been modified, a fault-tolerant scheme can ensure authenticity and further identify which parts of the document were modified. This approach can be used in data forensics to investigate cybercrime, or to create redactable signatures for the purpose of privacy. In the case of aggregation of signatures, we consider an aggregation of a set of signatures containing a few invalid signatures (in the traditional sense). A fault-tolerant scheme is able to identify which signatures are valid and which are invalid, instead of rejecting the whole set. Digital signatures and aggregation of digital signatures require fault tolerance to be ensured at the origin (the signer and aggregation algorithms, respectively), rather than just at the destination (the verification algorithm). For this reason, we focus on techniques from combinatorial group testing that are nonadaptive rather than adaptive. This is in contrast with other applications of group testing, such as batch verification of signatures, employed at the verifier's end, which allow both adaptive and nonadaptive solutions. In this work, we explore solutions for fault tolerance using techniques for identifying defective elements from nonadaptive combinatorial group testing. More specifically, we use the well-studied cover-free families (CFFs). A d-cover-free family d-CFF(t, n) is a set system with n subsets of a t-set, where the union of any d subsets does not contain any other subset. A d-CFF(t, n) allows for the identification of up to d defective elements in a set of n elements by performing only t tests (typically t ≪ n). In the literature, CFFs are used to solve many problems in cryptography. In this work, we explore different aspects of cover-free families in order to better approach fault-tolerance problems. The problems we investigate can be divided into two categories: static problems (fixed size) and dynamic problems (increasing size). In the context of static problems, we consider modification-tolerant digital signature schemes, which allow the identification of modifications in signed data using a d-CFF, and in some cases the correction of such modifications in order to retrieve the originally signed data. We also propose a generalization of the classical definition of a d-CFF to support a variable cover-free property, followed by some constructions and potential applications in cryptography.
For dynamic problems, we consider the application of fault-tolerant aggregation of signatures. This problem requires an infinite sequence of CFFs with increasing n, consequently increasing t, and potentially increasing d. In this context, we investigate monotone, nested, and embedding sequences of CFFs, and propose constructions using techniques from combinatorial design theory and finite fields. In constructing these families, we are concerned with their compression ratio. The compression ratio of a sequence of CFFs measures how slowly the number of tests grows with respect to the number of elements to be tested, which affects the overall efficiency of the method. We provide constructions of CFF sequences with different compression ratios, which depend on relations among the CFF parameters in the sequence and the type of sequence. Some of these sequences achieve the best possible compression ratio, meeting the best known upper bound. Monotone, nested and embedding sequences of CFFs can be used in any group testing problem that is dynamic in nature. We discuss various cryptographic applications that can benefit from these infinite sequences of CFFs.
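As a concrete illustration of the identification step (a sketch of standard nonadaptive group-testing decoding, not of the thesis's signature schemes): view a d-CFF(t, n) as t tests over n items, run the tests, and declare item j defective exactly when every test containing j is positive; with at most d defectives, the cover-free property guarantees correctness. Below, a 1-CFF(4, 6) built from all 2-subsets of a 4-set:

```python
from itertools import combinations

# 1-CFF(4, 6): blocks are all 2-subsets of {0,1,2,3}; no block contains
# another, so up to d=1 defective among n=6 items is found with t=4 tests.
t = 4
blocks = list(combinations(range(t), 2))   # block j = set of tests item j is in
n = len(blocks)                            # n = 6

def run_tests(defective):
    """Test i is positive iff it contains some defective item (OR outcome)."""
    return [any(i in blocks[j] for j in defective) for i in range(t)]

def decode(outcomes):
    """Declare item j defective iff all of its tests came back positive."""
    return [j for j in range(n) if all(outcomes[i] for i in blocks[j])]

for truth in ([], [3]):                    # at most d = 1 defective
    assert decode(run_tests(truth)) == truth
print("defectives identified correctly")
```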
285

Information extraction and mapping for KG construction with learned concepts from scientific documents : Experimentation with relations data for development of concept learner

Malik, Muhammad Hamza January 2020 (has links)
Systematic review of research manuscripts is a common procedure in which research studies pertaining to a particular field or domain are classified and structured in a methodological way. This process involves, among other steps, an extensive review and consolidation of scientific metrics and attributes of the manuscripts, such as citations and the type or venue of publication. The extraction and mapping of relevant publication data is evidently a very laborious task if performed manually. Automating such systematic mapping steps intends to reduce the human effort required and can therefore potentially reduce the time the process takes. The objective of this thesis is to automate the data extraction and mapping steps when systematically reviewing studies. The manual process is replaced by novel graph modelling techniques for effective knowledge representation, as well as novel machine learning techniques that aim to learn these representations. This automates the process by characterizing the publications on the basis of certain sub-properties and qualities that give the reviewer a quick, high-level overview of each research study. The final model is a concept learner that predicts these sub-properties and, in addition, addresses the inherent concept drift of novel manuscripts over time. Different models were developed and explored in this research study for the development of the concept learner. Results show that: (1) Graph reasoning techniques that leverage the expressive power of modern graph databases are very effective in capturing the extracted knowledge in a so-called knowledge graph, which allows us to form concepts that can be learned using standard machine learning techniques such as logistic regression, decision trees and neural networks. (2) Neural network models and ensemble models outperformed other standard machine learning techniques such as logistic regression and decision trees on the evaluation metrics. (3) The concept learner is able to detect and avoid concept drift, based on F1 scores, by retraining the model.
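The abstract does not detail the drift-handling rule; a minimal sketch of the general pattern (assumed here: retrain whenever the F1 score on a new batch drops below a threshold — the data, threshold and model are illustrative, not the thesis's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

rng = np.random.default_rng(42)

def make_batch(shift):
    """Synthetic 2-feature batch whose decision boundary drifts with `shift`."""
    X = rng.normal(size=(300, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
    return X, y

THRESHOLD = 0.9
X, y = make_batch(shift=0.0)
model = LogisticRegression().fit(X, y)

for step, shift in enumerate([0.1, 0.3, 0.8, 1.5]):  # gradually drifting concept
    X_new, y_new = make_batch(shift)
    f1 = f1_score(y_new, model.predict(X_new))
    if f1 < THRESHOLD:                                  # drift detected
        model = LogisticRegression().fit(X_new, y_new)  # retrain on recent data
        print(f"step {step}: F1={f1:.2f} -> retrained")
    else:
        print(f"step {step}: F1={f1:.2f} ok")
```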
286

Physics from Wholeness : Dynamical Totality as a Conceptual Foundation for Physical Theories

Piechocinska, Barbara January 2005 (has links)
Motivated by reductionism's current inability to encompass the quantum theory, we explore an indivisible and dynamical wholeness as an underlying foundation for physics. After reviewing the role of wholeness in the quantum theory, we set a philosophical background aiming at introducing an ontology based on a dynamical wholeness. Equipped with this background, we then propose a mathematical realization by representing the dynamics with a non-trivial elementary embedding from the mathematical universe to itself. By letting the embedding interact with itself through application, we obtain a left-distributive universal algebra that is isomorphic to special braids. Via the connection between braids and quantum and statistical physics, we show that the mathematical structure obtained from wholeness yields known physics in a special case. In particular, we point out the connections to algebras of observables, spin networks, and statistical mechanical models used in solid state physics, such as the Potts model. Furthermore, we discuss the general case and the possibility of interpreting the mathematical structure as a dynamics beyond unitary evolution, where entropy increase is involved.
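For reference (the standard definition, not specific to this thesis), the left self-distributive law satisfied by an elementary embedding applied to itself is:

```latex
% Left self-distributivity: application distributes over itself from the left.
a \triangleright (b \triangleright c) = (a \triangleright b) \triangleright (a \triangleright c)
```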
287

Information-Theoretically Secure Communication Under Channel Uncertainty

Ly, Hung Dinh May 2012 (has links)
Secure communication under channel uncertainty is an important and challenging problem in physical-layer security and cryptography. In this dissertation, we take a fundamental information-theoretic view of three concrete settings and use them to shed insight into efficient secure communication techniques for different scenarios under channel uncertainty. First, we consider a multi-input multi-output (MIMO) Gaussian broadcast channel with two receivers and two messages: a common message intended for both receivers (i.e., channel uncertainty for decoding the common message at the receivers), and a confidential message intended for one of the receivers that must be kept asymptotically perfectly secret from the other. A matrix characterization of the secrecy capacity region is established via a channel-enhancement argument and an extremal entropy inequality previously established for characterizing the capacity region of a degraded compound MIMO Gaussian broadcast channel. Second, we consider a multilevel security wiretap channel where there is one possible realization of the legitimate receiver's channel but multiple possible realizations of the eavesdropper's channel (i.e., channel uncertainty at the eavesdropper). A coding scheme is designed such that the number of secure bits delivered to the legitimate receiver depends on the actual realization of the eavesdropper channel: when the eavesdropper channel realization is weak, all bits delivered to the legitimate receiver need to be secure, and when it is strong, a prescribed part of the bits needs to remain secure. We call such codes security embedding codes, referring to the fact that high-security bits are embedded into the low-security ones. We show that the key to achieving efficient security embedding is to jointly encode the low-security and high-security bits; in particular, the low-security bits can be used as (part of) the transmitter randomness to protect the high-security ones. Finally, motivated by the recent interest in building secure, robust and efficient distributed information storage systems, we consider the problem of secure symmetrical multilevel diversity coding (S-SMDC), a setting with channel uncertainties at both the legitimate receiver and the eavesdropper. The problem of encoding individual sources is studied first: a precise characterization of the entire admissible rate region is established via a connection to the problem of secure coding over a three-layer wiretap network, utilizing some basic polyhedral structure of the admissible rate region. Building on this result, it is then shown that the simple strategy of separately encoding individual sources at the encoders achieves the minimum sum rate for the general S-SMDC problem.
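The dissertation's matrix characterizations are not reproduced in the abstract; as a baseline for orientation (a classical result, not this thesis's contribution), the secrecy capacity of the scalar Gaussian wiretap channel is the nonnegative gap between the two channels' rates:

```latex
% Scalar Gaussian wiretap channel (Leung-Yan-Cheong & Hellman, 1978):
C_s = \left[\tfrac{1}{2}\log_2\!\left(1 + \mathrm{SNR}_{\mathrm{legitimate}}\right)
      - \tfrac{1}{2}\log_2\!\left(1 + \mathrm{SNR}_{\mathrm{eavesdropper}}\right)\right]^{+}
```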
288

On Graph Embeddings and a new Minor Monotone Graph Parameter associated with the Algebraic Connectivity of a Graph

Wappler, Markus 07 June 2013 (has links) (PDF)
We consider the problem of maximizing the second smallest eigenvalue of the weighted Laplacian of a (simple) graph over all nonnegative edge weightings with bounded total weight. We generalize this problem by introducing node significances and edge lengths, and give a formulation of the generalized problem as a semidefinite program. The dual program can be equivalently written as an embedding problem: find an embedding of the n nodes of the graph in n-space so that their barycenter is at the origin, the distance between adjacent nodes is bounded by the respective edge length, and the embedded nodes are spread as much as possible (the sum of the squared norms is maximized). We prove the following necessary condition for optimal embeddings: for any separator of the graph, at least one of the components fulfills the property that each straight-line segment between the origin and an embedded node of the component intersects the convex hull of the embedded nodes of the separator. There always exists an optimal embedding of the graph whose dimension is bounded by the tree-width of the graph plus one. We define the rotational dimension of a graph: the minimal dimension k such that for all choices of the node significances and edge lengths an optimal embedding of the graph can be found in k-space. The rotational dimension is a minor monotone graph parameter, and we characterize the graphs with rotational dimension up to two.
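A minimal sketch of the embedding problem in Gram-matrix form (an illustration under simplifying assumptions — unit edge lengths and node significances, a small cycle graph, and the `cvxpy` modelling package — not the thesis's formulation verbatim): maximize tr(X) over PSD Gram matrices X subject to a zero barycenter and the edge-length bounds.

```python
import cvxpy as cp
import numpy as np

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]  # 4-cycle, all edge lengths 1

X = cp.Variable((n, n), PSD=True)         # Gram matrix of the embedded nodes
constraints = [cp.sum(X) == 0]            # barycenter at the origin
for i, j in edges:                        # ||x_i - x_j||^2 <= 1
    constraints.append(X[i, i] - 2 * X[i, j] + X[j, j] <= 1)

prob = cp.Problem(cp.Maximize(cp.trace(X)), constraints)  # spread the nodes
prob.solve()

# Recover coordinates from the Gram matrix via eigendecomposition.
w, V = np.linalg.eigh(X.value)
coords = V @ np.diag(np.sqrt(np.clip(w, 0, None)))
print(prob.value, coords.round(3))
```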
289

Financial time series analysis : Chaos and neurodynamics approach

Sawaya, Antonio January 2010 (has links)
This work aims at combining the postulates of Chaos theory with the classification and predictive capability of Artificial Neural Networks in the field of financial time series prediction. Chaos theory provides valuable qualitative and quantitative tools for deciding on the predictability of a chaotic system. Quantitative measurements based on Chaos theory are used to decide a priori whether a time series, or a portion of one, is predictable, while the qualitative tools are used to provide further observations and analysis on predictability in cases where the measurements give negative answers. Phase space reconstruction is achieved by time-delay embedding, resulting in multiple embedded vectors. The cognitive approach suggested is inspired by the capability of some chartists to predict the direction of an index by looking at the price time series. Thus, in this work, the calculation of the embedding dimension and the separation in Takens' embedding theorem for phase space reconstruction is not limited to False Nearest Neighbours, Differential Entropy or any other specific method; rather, this work considers all embedding dimensions and separations, regarded as different ways in which different chartists look at a time series based on their expectations. Prior to prediction, the embedded vectors of the phase space are classified with Fuzzy-ART; then, for each class, a back-propagation Neural Network is trained to predict the last element of each vector, with all previous elements of the vector used as features.
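A minimal sketch of the time-delay embedding step (standard Takens-style reconstruction; the series, dimension m and delay tau below are illustrative choices, not the thesis's):

```python
import numpy as np

def delay_embed(series, m, tau):
    """Return the matrix of delay vectors
    (s_t, s_{t+tau}, ..., s_{t+(m-1)tau}) for all valid t."""
    series = np.asarray(series)
    n_vectors = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n_vectors]
                            for i in range(m)])

s = np.sin(np.linspace(0, 20, 500))      # illustrative series, not market data
vectors = delay_embed(s, m=3, tau=5)
X, y = vectors[:, :-1], vectors[:, -1]   # previous elements as features,
print(X.shape, y.shape)                  # last element as prediction target
```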
290

Control Of Hexapedal Pronking Through A Dynamically Embedded Spring Loaded Inverted Pendulum Template

Ankarali, Mustafa Mert 01 February 2010 (has links) (PDF)
Pronking is a legged locomotory gait in which all legs are used in synchrony, usually resulting in slow speeds but long flight phases and large jumping heights that may potentially be useful for mobile robots locomoting in cluttered natural environments. Instantiations of this gait for robotic systems suffer from severe pitch instability, either due to underactuated leg designs or the open-loop nature of proposed controllers. Nevertheless, both the kinematic simplicity of this gait and its dynamic nature suggest that the Spring-Loaded Inverted Pendulum model (SLIP), a very successful predictive model for both natural and robotic runners, would be a good basis for more robust and maneuverable robotic pronking. In the scope of this thesis, we describe a novel controller to achieve stable and controllable pronking for a planar, underactuated hexapod model, based on the idea of "template-based control": a controller structure based on the embedding of a simple dynamical template within a more complex anchor system. In this context, high-level control of the gait is regulated through speed and height commands to the SLIP template, while the embedding controller, based on approximate inverse dynamics and carefully designed passive robot morphology, ensures the stability of the remaining degrees of freedom. We show through extensive simulation experiments that, unlike existing open-loop alternatives, the resulting control structure provides stability, explicit maneuverability and significant robustness against sensor noise.
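The thesis's embedding controller is not given in the abstract; for orientation, a minimal sketch of the standard SLIP stance-phase dynamics it builds on (polar coordinates about the foot; mass, stiffness, rest length and initial state are illustrative values):

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k, r0, g = 1.0, 200.0, 1.0, 9.81  # illustrative mass, stiffness, rest length

def slip_stance(t, state):
    """SLIP stance dynamics: leg from foot to hip has length r;
    theta is the leg angle measured from the vertical."""
    r, dr, th, dth = state
    ddr = r * dth**2 - g * np.cos(th) + (k / m) * (r0 - r)  # radial equation
    ddth = (g * np.sin(th) - 2 * dr * dth) / r              # angular equation
    return [dr, ddr, dth, ddth]

# Touch down slightly compressed and tilted; integrate until liftoff (r = r0).
liftoff = lambda t, s: s[0] - r0
liftoff.terminal, liftoff.direction = True, 1.0
sol = solve_ivp(slip_stance, (0.0, 1.0), [0.99 * r0, -0.5, 0.1, 0.0],
                events=liftoff, max_step=1e-3)
print(f"liftoff at t = {sol.t[-1]:.3f} s")
```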
