  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Factores determinantes de enfermedad diarreica aguda en menores de 05 años en el Hospital Emergencia de Vitarte en el periodo Enero 2013 - Diciembre 2015 / Determining factors of acute diarrheal disease in children under five years of age at the Hospital Emergencia de Vitarte, January 2013 - December 2015

Rivas Quique, Jersson Samuel January 2017 (has links)
Introduction: A study was carried out on risk factors for acute diarrheal disease (EDA, for its Spanish acronym) in children under five years of age, a disease that remains a major public health problem worldwide. It affects all age groups, but children under five are the most vulnerable and the most exposed to its serious complications and consequences. Main objective: To identify the determining factors of acute diarrheal disease in children under five years of age at the Hospital Emergencia de Vitarte during the period January 2013 - December 2015. Materials and methods: This was an observational, analytical, cross-sectional and retrospective study carried out on a sample of 180 patients, 90 cases and 90 controls; data were collected with a questionnaire designed by the author. Results were analysed with descriptive statistics (percentages) and, for the analytical statistics, the chi-square test at a 95% confidence level for hypothesis testing; the odds ratio was used to measure the strength of association between variables. Results: Lack of exclusive breastfeeding and age over one year were identified as risk factors for EDA (p-values of 0.03 and 0.01, respectively). Children older than one year had about four times the risk of EDA compared with children under one year (OR: 4.29; 95% CI: 2.26-8.16), and children who did not receive exclusive breastfeeding had about 2.5 times the risk compared with those who did (OR: 2.56; 95% CI: 1.05-6.24). Conclusions: Lack of exclusive breastfeeding is a risk factor that increases the odds of EDA 2.56-fold compared with exclusively breastfed children, and age over one year increases them 4-fold.
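As an aside for readers unfamiliar with the statistics named above, the following is a minimal sketch of how an odds ratio with a Wald 95% confidence interval and a chi-square p-value can be computed from a 2x2 case-control table in Python; the counts are hypothetical and are not the thesis data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (not the thesis data):
# rows = exposed / unexposed (e.g. no exclusive breastfeeding vs. exclusive breastfeeding)
# cols = cases (acute diarrheal disease) / controls
table = np.array([[35.0, 20.0],
                  [55.0, 70.0]])

a, b = table[0]   # exposed cases, exposed controls
c, d = table[1]   # unexposed cases, unexposed controls

# Odds ratio and Wald 95% confidence interval computed on the log scale
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

# Chi-square test of association for hypothesis testing
chi2, p_value, dof, expected = chi2_contingency(table, correction=False)

print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), "
      f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```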
2

How hardwired are we against threats? : An EDA study comparing modern and evolutionary fear-relevant stimuli.

Isaacs, Sofie January 2016 (has links)
The threat superiority effect refers to the ability to quickly and efficiently detect threatening cues in one's environment; the ensuing, appropriate behavioral defense responses give an organism a greater chance of survival. Some researchers argue that natural selection has led us to automatically prioritize threats that would have been salient during the period of evolutionary adaptation, such as snakes. Others argue that activation of our defense response system is more flexible and can also be triggered by dangers of more recent origin, such as guns or airplane crashes. The present study sought to contribute to this debate by measuring the electrodermal activity (EDA), more specifically the skin conductance responses (SCRs), of subjects who were visually presented with both evolutionary (snakes and spiders) and modern (guns and knives) fear-relevant stimuli. The results showed no significant within-subject difference between the two categories, suggesting that evolutionary and modern threatening cues activate the defense response system in a similar manner. Although the results are preliminary and need further verification in higher-powered studies, they favor the view that our defense response system adapts flexibly regardless of the age of a given threat.
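To make the within-subject comparison concrete, here is a minimal sketch, assuming hypothetical per-subject mean SCR amplitudes for the two stimulus categories, of how such a paired comparison could be run in Python; it is an illustration, not the study's actual analysis pipeline.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects = 30

# Hypothetical mean SCR amplitudes (microsiemens) per subject and stimulus category
scr_evolutionary = rng.gamma(shape=2.0, scale=0.15, size=n_subjects)          # snakes, spiders
scr_modern = scr_evolutionary + rng.normal(0.0, 0.05, size=n_subjects)        # guns, knives

# Paired (within-subject) comparison of the two stimulus categories
t_stat, p_value = ttest_rel(scr_evolutionary, scr_modern)
print(f"t({n_subjects - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```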
3

"Säg, är det konstigt att man längtar bort nån' gång?" : En studie av avgångsstudenter i Eda kommun, Värmland. / "Say, is it strange to long somewhere else?" : A study of graduates in Eda count, Wermland.

Fredriksson, Elin January 2015 (has links)
Against the background of the urbanization problem facing Sweden's rural municipalities, a case study of Eda municipality was carried out. The aim of this quantitative study was to examine young people's perceptions of Eda municipality and to understand why young people migrate to metropolitan municipalities. The theoretical framework comprises previous studies and theories on migration, place, young people's mobility patterns, and notions of the countryside. Students registered in Eda municipality who were in their final year of upper secondary school and graduating in 2015 answered an open-ended questionnaire. Their responses were compiled by dividing the respondents into four groups: vocational-track men/women and university-preparatory men/women. In the analysis, their answers are related to the study's theoretical framework, which shows that university-preparatory men and women want to migrate, whereas vocational-track men and women would rather stay in the municipality. The reasons for migrating concern restricted personal development and poorer accessibility. The conclusion is that, by engaging the young people who stay in the area through social capital in the form of networks, the municipality can increase young people's involvement and thereby persuade others to stay and spread a positive spirit.
4

Incremental Power Grid Verification

Abhishek 20 November 2012 (has links)
Verification of the on-die power grid is a key step in the design of complex high performance integrated circuits. For the very large grids in modern designs, incremental verification is highly desirable, because it allows one to skip the verification of a certain section of the grid (internal nodes) and instead verify only the rest of the grid (external nodes). The focus of this work is to develop efficient techniques for incremental verification in the context of vectorless constraint-based grid verification, under dynamic conditions. The traditional difficulty is that the dynamic case requires iterative analysis of both the internal and the external sections. A solution in the transient case is provided through two key contributions: 1) a bound on the internal nodes' voltages is developed that eliminates the need for iterative analysis, and 2) a multi-port Norton approach is used to construct a reduced macromodel for the internal section.
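For intuition about the second contribution, the sketch below shows the standard Schur-complement (Kron) reduction that underlies a multi-port Norton macromodel: internal nodes of a small hypothetical DC grid are eliminated, leaving an equivalent system on the external ports. It is a generic illustration, not the verification algorithm developed in the thesis.

```python
import numpy as np

# Hypothetical DC grid model G v = i, partitioned into external (port) and internal nodes.
G = np.array([[ 3.0, -1.0, -1.0,  0.0],
              [-1.0,  3.0,  0.0, -1.0],
              [-1.0,  0.0,  3.0, -1.0],
              [ 0.0, -1.0, -1.0,  3.0]])
i_src = np.array([0.0, 0.0, 0.2, 0.1])   # current injected at each node

ext = [0, 1]    # external (port) nodes kept in the macromodel
inte = [2, 3]   # internal nodes to be eliminated

G_ee = G[np.ix_(ext, ext)]
G_ei = G[np.ix_(ext, inte)]
G_ie = G[np.ix_(inte, ext)]
G_ii = G[np.ix_(inte, inte)]

# Schur complement: reduced port conductance and equivalent Norton current sources
G_red = G_ee - G_ei @ np.linalg.solve(G_ii, G_ie)
i_red = i_src[ext] - G_ei @ np.linalg.solve(G_ii, i_src[inte])

# Solving the reduced system reproduces the port voltages of the full system
v_ports_reduced = np.linalg.solve(G_red, i_red)
v_full = np.linalg.solve(G, i_src)
assert np.allclose(v_ports_reduced, v_full[ext])
```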
5

World Wide Web based layout synthesis for analogue modules

Nalbantis, Dimitris January 2001 (has links)
No description available.
6

Electromigration modeling and layout optimization for advanced VLSI

Pak, Jiwoo 04 September 2015 (has links)
Electromigration (EM) is a critical problem for interconnect reliability in advanced VLSI design. Because EM is a strong function of current density, a smaller cross-sectional area of interconnects can degrade the EM-related lifetime of ICs, and this is expected to become more severe in future technology nodes. Moreover, as EM is governed by various factors such as temperature, material properties, geometrical shape, and mechanical stress, different interconnect structures can have distinct EM issues and distinct solutions to mitigate them. For example, one of the most prominent technologies, the die-stacking technology of three-dimensional (3D) ICs, can have EM problems different from those of planar ICs, due to its unique interconnects such as through-silicon vias (TSVs). This dissertation investigates EM in various interconnect structures and applies the EM models to optimize IC layout. First, EM models are developed for chip-level interconnects, such as wires, local vias, TSVs, and multi-scale vias (MSVs). Based on these models, fast and accurate EM-prediction methods are proposed for chip-level designs. Building on the prediction methods, layout optimization methods are then proposed, such as EM-aware routing for 3D ICs and EM-aware redundant via insertion for future VLSI technology nodes. Experimental results show that the proposed EM modeling approaches enable fast and accurate EM evaluation for chip design, and that the EM-aware layout optimization methods improve the EM-robustness of advanced VLSI designs.
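One common way to reason about the current-density dependence described above is Black's equation for interconnect mean time to failure; the sketch below evaluates it with hypothetical parameter values, which are not taken from the dissertation.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j, temp_k, a=1.0, n=2.0, ea=0.9):
    """Black's equation: MTTF = A * j**(-n) * exp(Ea / (k_B * T)).

    j: current density (A/cm^2), temp_k: metal temperature (K).
    a, n, ea are hypothetical technology-dependent constants.
    """
    return a * j**(-n) * np.exp(ea / (K_B * temp_k))

# Halving the wire cross-section doubles the current density for the same current,
# so with n = 2 the expected lifetime drops by roughly a factor of four.
base = black_mttf(j=1.0e6, temp_k=378.0)
scaled = black_mttf(j=2.0e6, temp_k=378.0)
print(f"lifetime ratio after doubling current density: {scaled / base:.2f}")
```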
7

Recovering skin conductance responses in under-sampled electrodermal activity data from wearables

Mukherjee, Abhishek 05 September 2019 (has links)
No description available.
8

Analyse et reconnaissance de signaux vibratoires : contribution au traitement et à l'analyse de signaux cardiaques pour la télémédecine / Analysis and recognition of vibratory signals : contribution to the processing and analysis of cardiac signals for telemedicine

Beya, Ouadi 15 May 2014 (has links)
The heart is a muscle. Mechanically, it works as a pump that distributes and retrieves blood through the lungs and the cardiovascular system. Electrically, it is regulated by the sinus node, a natural pacemaker responsible for triggering the heartbeats that set the rhythm of the body. Physicians monitor this electromechanical activity by recording an electrical signal, the electrocardiogram (ECG), or an acoustic signal, the phonocardiogram (PCG). The analysis and processing of these two signals are essential for diagnosis and for detecting cardiac anomalies and pathologies.

The objective of this thesis is to develop signal processing tools for the ECG and especially the PCG in order to assist the cardiologist in analysing these signals. The underlying idea is to design algorithms of low complexity and low computational cost. The first benefit is that they can easily be embedded in a mobile cardiac monitoring system for use by the physician or even the patient. The second is the possibility of automatic, real-time analysis of the signals on the mobile device, so that transmission of the signals can be reserved for resolving doubtful cases. Numerous studies have led to significant advances in ECG analysis and in the automatic recognition of cardiac pathologies, and annotated databases of real or synthetic signals make it possible to evaluate the performance of any new method. PCG signals are much less studied and are difficult to analyse and interpret; although the main families of methods (Fourier, Wigner-Ville and wavelets) have been tested, they do not allow automatic recognition of signatures or a sufficiently fine analysis and understanding of their content.

The wavelet transform (WT) applied to cardiac signals has proven effective for filtering and locating useful information, but it relies on an external analysing function (the mother wavelet) whose choice depends on prior knowledge of the signal to be processed, which is not always suitable for cardiac signals. Moreover, the wavelet transform generally introduces inaccuracies in localization, due to the external function and possibly to the sub-sampling of the signatures. The non-stationary nature of the ECG and PCG and their sensitivity to noise make it difficult to separate an informative transition from a transition caused by measurement noise. The processing tool must therefore allow denoising and analysis of these signals without delocalizing their singularities or altering their characteristics.

In response to these objectives and problems, we rely primarily on empirical mode decomposition (EMD) and the Hilbert-Huang transform (HHT) to develop solutions. EMD is a non-linear approach that decomposes the signal under study into intrinsic mode functions (IMFs), AM-FM-type oscillations, giving a time/scale representation of the signal. Combined with the Hilbert transform (HT), the HHT determines the instantaneous amplitude (IA) and instantaneous frequency (IF) of each mode, leading to a time/frequency representation of the ECG and PCG. Without involving an external function, this approach can denoise, analyse and reconstruct the signal without relocating its singularities. It makes it possible to locate the R peaks of the ECG, determine the heart rate, study heart rate variability (HRV), and locate and analyse the B1 and B2 sound components of the PCG. Among the developments carried out, we present in particular a new method, EDA (empirical denoising approach), inspired by EMD, for denoising cardiac signals. We also describe two approaches for locating ECG signatures (QRS complex, T and P waves): the first is based on the detection of local maxima, using modulus maxima and the Lipschitz exponent followed by a classifier; the second uses NFLS, a non-linear approach for detecting and locating singular transitions in the discrete domain.
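To illustrate the Hilbert step of the HHT described above, the following sketch extracts instantaneous amplitude and frequency from a synthetic mono-component signal standing in for one IMF; it is a minimal example, not the thesis's processing chain.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                        # sampling frequency (Hz)
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic mono-component signal standing in for one IMF of an ECG/PCG decomposition:
# an amplitude-modulated oscillation whose instantaneous frequency is 5 + 5*t Hz.
phase = 2 * np.pi * (5.0 * t + 2.5 * t**2)
imf = (1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(phase)

analytic = hilbert(imf)                                   # analytic signal x + j*H{x}
inst_amplitude = np.abs(analytic)                         # instantaneous amplitude (envelope)
inst_phase = np.unwrap(np.angle(analytic))                # unwrapped instantaneous phase
inst_frequency = np.diff(inst_phase) / (2 * np.pi) * fs   # instantaneous frequency (Hz)

print(f"mean instantaneous frequency: {inst_frequency.mean():.1f} Hz")
```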
9

Prise en charge de l'agilité de workflows collaboratifs par une approche dirigée par les événements / Ensuring the agility of collaborative workflows through an event-driven approach

Barthe, Anne-Marie 22 November 2013 (has links)
Organizations engage in collaborations in order to cope with a constantly changing environment (globalization, crises, etc.). The unstable nature of a collaboration and of its environment, however, can undermine the relevance of the processes defined to reach the collective objectives. Two issues follow from this observation: how can we detect that the executed processes no longer match the objectives pursued at time t, and what are the causes? And how can we redefine the best possible response at time t, in near real time, taking into account the collaborative situation, the progress of process execution, and the state of the actors and resources? The scientific problem is thus to bring agility to collaborative processes. This thesis addresses these questions by proposing (i) a definition of collaborative process agility, (ii) an event-driven service-oriented architecture (ED-SOA) layer on top of the SOA architecture of the collaboration's information system, so that events emitted by the collaboration and its environment can be taken into account and the situation models kept up to date, (iii) an algorithm that measures the distance between the model of the collaborative situation as it should be and the model of the situation as it really is (detection), and (iv) an algorithm that recommends adaptations of the collaborative processes. The implementation of these agility mechanisms resulted in an open-source prototype. This work is also part of the ANR SocEDA project and the EU FP7 PLAY project, each of which addresses the management of collaborative processes on an event-driven platform.
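As a purely illustrative sketch of the detection step, and not of the algorithm developed in the thesis, the following Python snippet measures a simple Jaccard-style distance between the expected model of the collaborative situation and the observed one, each flattened to a set of facts, and flags the divergence when it exceeds a hypothetical threshold.

```python
# Hypothetical situation models flattened to sets of facts (e.g. "partner:available");
# this representation and the threshold are assumptions for illustration only.
expected_model = {"partnerA:available", "taskReview:done", "resourceTruck:assigned"}
observed_model = {"partnerA:unavailable", "taskReview:done", "resourceTruck:assigned",
                  "eventRoadClosed:received"}

def jaccard_distance(expected: set, observed: set) -> float:
    """1 - |intersection| / |union|: 0 means identical models, 1 means disjoint."""
    union = expected | observed
    if not union:
        return 0.0
    return 1.0 - len(expected & observed) / len(union)

distance = jaccard_distance(expected_model, observed_model)
ADAPTATION_THRESHOLD = 0.3  # hypothetical tolerance before recommending adaptation

if distance > ADAPTATION_THRESHOLD:
    print(f"distance {distance:.2f} exceeds threshold: recommend process adaptation")
```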
