1031

Förändrat omhändertagande av patienter med uretärsten: Lärdomar från ett förbättringsarbete / Changed management of patients with ureteral stones: Lessons from a quality improvement project

Khatami, Annelie January 2014 (has links)
Bakgrund: Omkring 10-15 % av befolkningen, oftast i arbetsför ålder, riskerar att någon gång drabbas av njursten. Nationella riktlinjer för njurstensbehandling saknas, men studier stödjer behandling inom 48 timmar för snabb symtomlindring och minskade besvär för patienten. Inom studerad verksamhet var tiden från diagnos till behandling lång och återinläggningarna var många, varför ett förbättringsarbete initierades. Syfte: Syftet med förbättringsarbetet var att halvera tiden från diagnos till behandling för patienter med akut behandlingskrävande uretärsten, samt minska negativa effekter relaterade till obehandlad uretärsten. Vidare syftade studien av förbättringsarbetet till att beskriva ett tvärprofessionellt teams erfarenheter av aktuellt förbättringsarbete gällande patienter med uretärsten. Metod: Ett tvärprofessionellt team bedrev förbättringsarbetet med stöd av Nolans modell för förbättringsarbete, vilket studerades genom en deskriptiv fallstudie med induktiv ansats. Effekterna av förbättringsarbetet utvärderades med Statistical Process Control (SPC). Vidare studerades teamets erfarenheter genom gruppintervjuer och skriftliga berättelser, vars data analyserades och sammanställdes genom kvalitativ innehållsanalys. Resultat: Målet med behandlingstiden uppnåddes inte, men positiva effekter för patienterna uppmättes. ESWL-behandling inom 48 timmar minskade tiden från diagnos till sista behandling. Planering, samarbete och information var nödvändiga för att lyckas med ett förbättringsarbete, men i kontexten fanns motsättningar som försvårade arbetet, såsom hög arbetsbelastning och bristande rutiner. Vidare beskrevs en bristande helhetssyn inom verksamheterna kring patienter med njursten, vilket ledde till varierande drivkrafter hos medarbetarna. Slutsatser: ESWL inom 48 timmar förkortade tiden från diagnos till behandling, även hos de patienter som behövde ombehandlas. Utmaningarna i ett förbättringsarbete finns på olika nivåer inom en komplex organisation.
Riktlinjer och en gemensam målsättning är viktiga för att erbjuda patienterna ändamålsenlig vård i rätt tid. Kommunikation är grundläggande för att lyckas med ett förbättringsarbete. / Background: About 10-15% of the population, mostly of working age, are at risk of being affected by kidney stones at some point. National guidelines for kidney stone treatment are lacking, but several studies support starting treatment within 48 hours for rapid symptom relief and reduced discomfort for the patient. Within the studied setting, the time from diagnosis to final treatment was long and the readmission rate was high, which is why a quality improvement project was initiated. Purpose: The aim of the quality improvement project was to halve the time from diagnosis to final treatment for patients suffering from ureteral calculi, and to reduce the negative effects of untreated ureteral stones. A further aim was to describe a multi-professional team's experiences of the quality improvement project. Method: The team used Nolan's Model for Improvement. The effects of the improvement work were evaluated with Statistical Process Control (SPC). The project was studied as a descriptive case study with an inductive approach. The team's experiences were explored through group interviews and written narratives, and the data were analyzed using qualitative content analysis. Results: The goal for time to final treatment was not achieved, but positive effects for the patients were noted. Extracorporeal Shock Wave Lithotripsy (ESWL) within 48 hours reduced the time from diagnosis to final treatment. Planning, cooperation and communication were key success factors for the improvement work, while barriers identified in the context, such as high workload and unclear routines, complicated it. Furthermore, a lack of a holistic view of patients with kidney stones was described, leading to varying driving forces among the employees. Conclusions: ESWL within 48 hours shortened the time from diagnosis to final treatment, even when retreatment was necessary. In a complex organization, the challenges of conducting a quality improvement project arise on several levels. Well-known guidelines and a shared goal for the entire process are important for offering patients appropriate care at the right time. Communication is fundamental to success.
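The SPC evaluation mentioned in the abstract typically relies on control charts. Below is a minimal sketch of individuals-chart (XmR) limits; the times are made-up hours from diagnosis to treatment, not the thesis's data, and the thesis may have used a different chart type:

```python
# Sketch of the XmR (individuals / moving range) control chart used in SPC
# to judge whether a change signals real improvement rather than noise.
# The data are hypothetical, not from the study.

def xmr_limits(values):
    """Return (mean, lower, upper) control limits for an individuals chart,
    using the average moving range and the standard XmR constant 2.66."""
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

times = [96, 80, 110, 72, 88, 64, 70, 58, 52, 60]   # hypothetical hours
mean, lcl, ucl = xmr_limits(times)
# Points outside the limits indicate special-cause variation.
outside = [x for x in times if x < lcl or x > ucl]
```

Plotting successive measurements against these limits is how an improvement team distinguishes a genuine shift in time-to-treatment from routine variation.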
1032

The Approximability of Learning and Constraint Satisfaction Problems

Wu, Yi 07 October 2010 (has links)
An α-approximation algorithm is an algorithm guaranteed to output a solution that is within an α ratio of the optimal solution. We are interested in the following question: given an NP-hard optimization problem, what is the best approximation guarantee that any polynomial-time algorithm could achieve? We mostly focus on studying the approximability of two classes of NP-hard problems: Constraint Satisfaction Problems (CSPs) and computational learning problems. For CSPs, we mainly study the approximability of MAX CUT, MAX 3-CSP, MAX 2-LIN(Z), VERTEX-PRICING, as well as several variants of UNIQUE GAMES.
• The problem of MAX CUT is to find a partition of a graph so as to maximize the number of edges between the two parts. Assuming the Unique Games Conjecture, we give a complete characterization of the approximation curve of the MAX CUT problem: for every optimum value of the instance, we show that a certain SDP algorithm with RPR2 rounding always achieves the optimal approximation curve.
• The input to a 3-CSP is a set of Boolean constraints such that each constraint contains at most 3 Boolean variables. The goal is to find an assignment to these variables that maximizes the number of satisfied constraints. We are interested in the case when a 3-CSP is satisfiable, i.e., there exists an assignment that satisfies every constraint. Assuming the d-to-1 conjecture (a variant of the Unique Games Conjecture), we prove that it is NP-hard to give a better than 5/8-approximation for the problem. This result matches an SDP algorithm by Zwick which gives a 5/8-approximation for satisfiable 3-CSPs. In addition, our result also conditionally resolves a fundamental open problem in PCP theory on the optimal soundness of a 3-query nonadaptive PCP system for NP with perfect completeness.
• The problem of MAX 2-LIN(Z) involves a linear system of integer equations; these equations are so simple that each contains at most 2 variables. The goal is to find an assignment to the variables that maximizes the total number of satisfied equations. It is a natural generalization of the Unique Games problem, which addresses the hardness of the same equation systems over finite fields. We show that, assuming the Unique Games Conjecture, for a MAX 2-LIN(Z) instance, even if there exists a solution that satisfies 1−ε of the equations, it is NP-hard to find one that satisfies ε of the equations, for any ε > 0.
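As a baseline for the approximation guarantees discussed above, a simple greedy algorithm already achieves a 1/2-approximation for MAX CUT; the UGC-optimal algorithm the abstract refers to uses an SDP with RPR2 rounding, so this elementary sketch is only for orientation:

```python
# Greedy 1/2-approximation for MAX CUT: place each vertex opposite the
# majority of its already-placed neighbours. Every placement cuts at least
# half of the edges to placed neighbours, so at least |E|/2 edges are cut.

def greedy_max_cut(n, edges):
    """Return a dict mapping each vertex in 0..n-1 to side 0 or 1."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    side = {}
    for v in range(n):
        placed = [u for u in adj[v] if u in side]
        on_zero = sum(1 for u in placed if side[u] == 0)
        # Join the minority side among placed neighbours.
        side[v] = 1 if on_zero > len(placed) - on_zero else 0
    return side

def cut_value(side, edges):
    return sum(1 for u, v in edges if side[u] != side[v])

triangle = [(0, 1), (1, 2), (0, 2)]
cut = cut_value(greedy_max_cut(3, triangle), triangle)  # cuts 2 of 3 edges
```

On a triangle no partition cuts all three edges, so the greedy answer of 2 is in fact optimal here; in general only the |E|/2 guarantee holds.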
1033

Securing the Northern Region of Ghana? Development Aid and Security Interventions

Torto, Eric Obodai January 2013 (has links)
This dissertation offers a perspective through which we can explore the processes of joint development and security interventions in conflict-prone regions. In employing the experiences of the Northern Region of Ghana as my case study, this thesis examines the ways that the rationales of both development and security interventions are articulated in the field of practice. The central argument of the thesis is that most analyses of aid interventions, particularly those stemming from mainstream development literature, rarely interrogate the underlying rationales and assumptions behind the ideas, strategies and discourses employed in aid intervention. Notably, these rationales and assumptions tend to reduce the complexity of development and security challenges, and, as an end result, facilitate the implementation of technical solutions. The translation of development and security discourses and strategies into programmable practices as they encounter a local population is characterized by complex processes. Following the central argument of the thesis, the key research question interrogates the way that the rationales behind development aid and security interventions have been articulated in the conflict-prone Northern Region and how they have been received by the local population. With the overarching aim of understanding the complexities associated with the joint articulation of development and security programmes, this study provides a unique and critical analysis of international development and security practices. The study also provides deeper understanding of the broad socio-economic and political contexts for the delivery of aid interventions. I scrutinize the rationales behind these interventions through the critical examination of colonial practices and three contemporary interventions: 1) Region-wide interventions, 2) the UN Human Security Program, and 3) Post-liberal interventions used as a panacea to prevailing implementation challenges.
Based on the analysis of archival documents, alongside policy, program, and interview documents, my study reveals the ways that the development-security nexus perpetuates liberal practices in the declared conflict-prone Northern Region of Ghana. I also evaluate the way that the development-security nexus reconstitutes individuals as resilient subjects through practices of empowerment and entrepreneurialism, and demonstrate the contestations, contradictions, and colonial features that characterize interventions in the field of articulation.
1034

Communication Complexity of Remote State Preparation

Bab Hadiashar, Shima 24 September 2014 (has links)
Superdense coding and quantum teleportation are two phenomena that would not be possible without prior entanglement. In superdense coding, one sends n bits of information using n/2 qubits in the presence of shared entanglement. In contrast, we show that n bits of information cannot be sent with less than n bits of communication in LOCC protocols, even in the presence of prior entanglement; this result will be used in the rest of this thesis. Quantum teleportation uses prior entanglement and classical communication to send an unknown quantum state. Remote state preparation (RSP) is the same distributed task, but in the case that the sender knows a complete description of the state to be sent. We study the communication complexity of approximate remote state preparation, in which the goal is to prepare an approximation of the desired quantum state. Jain showed that the worst-case error communication complexity of RSP can be bounded from above in terms of the maximum possible information in an encoding [18]. He also showed that this quantity is a lower bound on the communication complexity of exact remote state preparation [18]. In this thesis, we characterize the worst-case error and average-case error communication complexity of remote state preparation in terms of non-asymptotic information-theoretic quantities. We also use the bound derived for LOCC protocols in the first part of the thesis to show that the average-case error communication complexity of RSP can be much smaller than the worst-case one.
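The n-bits-per-(n/2)-qubits rate of superdense coding mentioned above can be illustrated with a small state-vector simulation. This is the standard textbook scheme, not material from the thesis itself:

```python
import numpy as np

# Superdense coding: two classical bits per transmitted qubit, given a
# pre-shared Bell pair. State vectors live on 2 qubits (Alice's, Bob's).
I = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
# Alice encodes two bits by applying I, X, Z or ZX to her half of the pair.
encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

def send(bits):
    """Alice applies the Pauli for `bits` to her qubit and sends it; Bob,
    now holding both qubits, identifies which of the four orthonormal Bell
    states he has, recovering two bits from the single qubit transmitted."""
    state = np.kron(encodings[bits], I) @ bell
    bell_states = {b: np.kron(p, I) @ bell for b, p in encodings.items()}
    return max(bell_states, key=lambda b: abs(bell_states[b] @ state))
```

Because the four encoded states are mutually orthogonal, Bob's Bell-basis measurement recovers the two bits perfectly, which is exactly the factor-of-two saving the thesis contrasts with the LOCC lower bound.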
1035

Topics in discrete optimization: models, complexity and algorithms

He, Qie 13 January 2014 (has links)
In this dissertation we examine several discrete optimization problems through the perspectives of modeling, complexity and algorithms. We first provide a probabilistic comparison of split and type 1 triangle cuts for mixed-integer programs with two rows and two integer variables, in terms of cut coefficients and volume cutoff. Under a specific probabilistic model of the problem parameters, we show that for the above measure, the probability that a split cut is better than a type 1 triangle cut is higher than the probability that a type 1 triangle cut is better than a split cut. The analysis also suggests guidelines on when type 1 triangle cuts are likely to be more effective than split cuts and vice versa. We next study a minimum concave-cost network flow problem over a grid network. We give a polynomial-time algorithm to solve this problem when the number of echelons is fixed, and show that the problem is NP-hard when the number of echelons is an input parameter. We also extend our results to grid networks with backward and upward arcs. Our results unify the complexity results for several models in production planning and green recycling, including the lot-sizing model, and give the first polynomial-time algorithms for some problems whose complexity was previously unknown. Finally, we examine how much complexity randomness brings to a simple combinatorial optimization problem. We study the sell or hold problem (SHP): sell k out of n indivisible assets over two stages, with known first-stage prices and random second-stage prices, to maximize the total expected revenue. Although the deterministic version of SHP is trivial to solve, we show that SHP is NP-hard when the second-stage prices are realized as a finite set of scenarios, and polynomially solvable when the number of scenarios in the second stage is constant. A max{1/2, k/n}-approximation algorithm is presented for the scenario-based SHP.
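The claim that deterministic SHP is trivial can be made concrete: total revenue equals the sum of all second-stage prices plus the stage-1 premium (p1[i] − p2[i]) of every asset sold now, so selling the k largest-premium assets is optimal. A sketch with made-up prices (this is the easy deterministic case, not the scenario-based approximation algorithm from the thesis):

```python
# Exact solution of the deterministic sell-or-hold problem: choose which
# k assets to sell in stage 1 when both stage prices are known.

def deterministic_shp(p1, p2, k):
    """Sell the k assets with the largest stage-1 premium p1[i] - p2[i];
    everything else is sold at its (known) stage-2 price."""
    order = sorted(range(len(p1)), key=lambda i: p1[i] - p2[i], reverse=True)
    sell_now = set(order[:k])
    return sum(p1[i] if i in sell_now else p2[i] for i in range(len(p1)))

# Hypothetical prices: asset 0 is worth more now, asset 1 later, asset 2 equal.
revenue = deterministic_shp([10, 5, 8], [7, 9, 8], k=1)  # sell asset 0: 10 + 9 + 8 = 27
```

Randomness is what breaks this simplicity: once the second-stage prices become a finite set of scenarios, the same selection problem is NP-hard, as the abstract states.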
1036

Complexities of Order-Related Formal Language Extensions / Komplexiteter hos ordnings-relaterade utökningar av formella språk

Berglund, Martin January 2014 (has links)
The work presented in this thesis discusses various formal language formalisms that extend classical formalisms like regular expressions and context-free grammars with additional abilities, most relating to order. This is done while focusing on the impact these extensions have on the efficiency of parsing the languages generated. That is, rather than taking a step up on the Chomsky hierarchy to the context-sensitive languages, which makes parsing very difficult, a smaller step is taken, adding some mechanisms which permit interesting spatial (in)dependencies to be modeled. The most immediate example is shuffle formalisms, where existing language formalisms are extended by introducing operators which generate arbitrary interleavings of argument languages. For example, introducing a shuffle operator to the regular expressions does not make it possible to recognize context-free languages like a^n b^n, but it does capture some non-context-free languages, like the language of all strings containing the same number of a's, b's and c's. The impact these additions have on parsing has many facets. Other than shuffle operators, we also consider formalisms enforcing repeated substrings, formalisms moving substrings around, and formalisms that restrict which substrings may be concatenated. The formalisms studied here all have a number of properties in common. They are closely related to existing regular and context-free formalisms. They operate in a step-wise fashion, deriving strings by sequences of rule applications of individually limited power. Each step generates a constant number of symbols and does not modify parts that have already been generated. That is, strings are built in an additive fashion that does not explode in size (in contrast to e.g. Lindenmayer systems), and all languages here have a semi-linear Parikh image. They feature some interesting characteristic involving order or other spatial constraints. In the shuffle example, multiple derivations are in a sense interspersed in a way that each is unaware of. All of the formalisms are intended to be limited enough to make an efficient parsing algorithm, at least for some cases, a reasonable goal. This thesis gives intuitive explanations of a number of formalisms fulfilling these requirements, and sketches some results relating to the parsing problem for them. This should all be viewed as preparation for the more complete results and explanations featured in the papers given in the appendices. / Denna avhandling diskuterar utökningar av klassiska formalismer inom formella språk, till exempel reguljära uttryck och kontextfria grammatiker. Utökningarna handlar på ett eller annat sätt om ordning, och ett särskilt fokus ligger på att göra utökningarna på ett sätt som dels har intressanta spatiala/ordningsrelaterade effekter och som dels bevarar den effektiva parsningen som är möjlig för de ursprungliga klassiska formalismerna. Detta står i kontrast till att ta det större steget upp i Chomsky-hierarkin till de kontextkänsliga språken, vilket medför ett svårt parsningsproblem. Ett omedelbart exempel på en sådan utökning är s.k. shuffle-formalismer. Dessa utökar existerande formalismer genom att introducera operatorer som godtyckligt sammanflätar strängar från argumentspråk. Om en shuffle-operator introduceras till de reguljära uttrycken ger det inte förmågan att känna igen t.ex. det kontextfria språket a^n b^n, men det fångar istället vissa språk som inte är kontextfria, till exempel språket som består av alla strängar som innehåller lika många a:n, b:n och c:n. Sättet på vilket dessa utökningar påverkar parsningsproblemet är mångfacetterat. Utöver dessa shuffle-operatorer tas också formalismer där delsträngar kan upprepas, formalismer där delsträngar flyttas runt, och formalismer som begränsar hur delsträngar får konkateneras upp. Formalismerna som tas upp här har dock vissa egenskaper gemensamma. De är nära besläktade med de klassiska reguljära och kontextfria formalismerna. De arbetar stegvis, och konstruerar strängar genom successiva applikationer av individuellt enkla regler. Varje steg genererar ett konstant antal symboler och modifierar inte det som redan genererats. Det vill säga, strängar byggs additivt och längden på dem kan inte explodera (i kontrast till t.ex. Lindenmayer-system). Alla språk som tas upp kommer att ha en semi-linjär Parikh-avbildning. De har någon intressant spatial/ordningsrelaterad egenskap, exempelvis sättet på vilket shuffle-operatorer sammanflätar annars oberoende deriveringar. Alla formalismerna är tänkta att vara begränsade nog att det är rimligt att ha effektiv parsning som mål. Denna avhandling ger intuitiva förklaringar av ett antal formalismer som uppfyller ovanstående krav, och skissar en blandning av resultat relaterade till parsningsproblemet för dem. Detta bör ses som förberedande inför läsning av de mer djupgående och komplexa resultaten och förklaringarna i de artiklar som finns inkluderade som appendix.
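The shuffle operator described in the abstract can be sketched directly by naive enumeration. This is illustrative only; the set of interleavings of u and v can be as large as C(|u|+|v|, |u|), which is precisely why parsing shuffle formalisms efficiently is delicate:

```python
# Shuffle of two strings: all interleavings preserving the internal order
# of each argument. Exponential in general; fine for short examples.

def shuffle(u, v):
    """Return the set of all interleavings of u and v."""
    if not u:
        return {v}
    if not v:
        return {u}
    return ({u[0] + w for w in shuffle(u[1:], v)} |
            {v[0] + w for w in shuffle(u, v[1:])})
```

For instance, every string in shuffle("abc", "abc") contains exactly two of each letter, in line with the equal-counts language (same number of a's, b's and c's) that the shuffle extension captures although it is not context-free.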
1037

Students' Experiences During Democratic Activities at a Canadian Free School: A Case Study

Prud'homme, Marc-Alexandre 09 February 2011 (has links)
While the challenge of improving young North Americans' civic engagement seems to lie in the hands of schools, studying alternative ways of teaching citizenship education could benefit the current educational system. In this context, free schools (i.e., schools run democratically by students and teachers), guided by a philosophy that aims at engaging students civically through the democratic activities they support, offer relatively unexplored ground for research. The present inquiry is a case study using tools of ethnography and drawing upon some principles of complexity thinking. It aims at understanding students' citizenship education experiences during democratic activities in a Canadian free school, and it describes many experiences that can arise from these activities. These occurred within a school that operated democratically based on a consensus model. More precisely, they took place during two kinds of democratic activities: class meetings, which regulated the social life of the school, and judicial committees, whose function was to resolve conflicts at the school. During these activities, students mostly experienced a combination of feelings of appreciation, concern and empowerment. While experiencing these feelings, they predominantly engaged in decision-making and conflict resolution processes. During these processes, students modified their conflict resolution skills, various conceptions, and their participation in democratic activities and in the school. Based on these findings, the study concludes that students can develop certain skills and attitudes associated with citizenship education during these activities and become active from a citizenship perspective. Hence, these democratic activities represent alternative strategies that can assist educators in teaching about citizenship.
1038

A Comparative Study of Habitat Complexity, Neuroanatomy, and Cognitive Behavior in Anolis Lizards

Powell, Brian James January 2012 (has links)
Changing environmental conditions may present substantial challenges to organisms experiencing them. In animals, the fastest way to respond to these changes is often by altering behavior. This ability, called behavioral flexibility, varies among species and can be studied on several levels. First, the extent of behavioral flexibility exhibited by a species can be determined by observation of that species' behavior, either in nature or in experimental settings. Second, because the central nervous system is the substrate determining behavior, neuroanatomy can be studied as the proximate cause of behavioral flexibility. Finally, the ultimate causation can be examined by studying ecological factors that favor the evolution of behavioral flexibility. In this dissertation, I investigate behavioral flexibility across all three levels by examining the relationship between habitat structure, the size of different structures within the brain and total brain size, and behavioral flexibility in six closely related species of Puerto Rican Anolis lizards. Anolis lizards provide an excellent taxon for this study, as certain species, including those used here, are classified as belonging to different ecomorphs and are morphologically and behaviorally specialized to distinct structural habitat types.

To determine the presence of behavioral flexibility in Anolis, I first presented Anolis evermanni with a series of tasks requiring motor learning and a single instance of reversal learning. Anolis evermanni demonstrated high levels of behavioral flexibility in both tasks.

To address the pattern of brain evolution in the Anolis brain, I used a histological approach to measure the volume of the whole brain, telencephalon, dorsal cortex, dorsomedial cortex, medial cortex, dorsal ventricular ridge, cerebellum, and medulla in six closely related species of Puerto Rican Anolis lizards belonging to three ecomorphs. These data were analyzed to determine the relative contributions of concerted and mosaic brain evolution to Anolis brain evolution. The cerebellum showed a trend toward mosaic evolution, while the remaining brain structures matched the predictions of concerted brain evolution.

I then examined the relationship between the complexity of the structural habitat occupied by each species and brain size, to determine whether complex habitats are associated with relatively large brains. I measured brain volume using histological methods and directly measured habitat complexity in all six species. Using Principal Component Analysis, I condensed the measures of habitat structure to a single variable and corrected it for the scale of each lizard species' movement, calling the resulting measurement relevant habitat complexity. I tested the relationship between the relative volume of the telencephalon, dorsal cortex, dorsomedial cortex, and whole brain against both relevant habitat complexity and ecomorph classification. There was no relationship between the relative volume of any brain structure examined and either relevant habitat complexity or ecomorph. However, the relevant habitat complexities for each species did not completely match their ecomorph classifications.

Finally, I tested the levels of behavioral flexibility of three species of Anolis (A. evermanni, A. pulchellus, and A. cristatellus) belonging to three distinct ecomorphs by presenting them with tasks requiring motor and reversal learning. Anolis evermanni performed well in both tasks, while A. pulchellus required more trials to learn the motor task. Only a single A. cristatellus was able to perform either task. Anolis evermanni displayed lower levels of neophobia than the other species, which may be related to its superior performance.

In combination, this research suggests that Anolis of different ecomorphs display different levels of behavioral flexibility. At the proximate level, this difference in behavioral flexibility cannot be explained by changes in the relative size of the total brain or of brain structures associated with cognitive abilities in other taxa. At the ultimate level, the size of the brain and several constituent structures cannot be predicted by habitat complexity. However, behavioral flexibility in certain tasks may be favored by the utilization of complex habitats. Flexibility in different tasks is not correlated, rendering broad comparisons to habitat complexity problematic. / Dissertation
1039

Entwicklung eines Modelica Compiler BackEnds für große Modelle / Development of a Modelica Compiler BackEnd for Large Models

Frenkel, Jens 03 February 2014 (has links) (PDF)
Die symbolische Aufbereitung mathematischer Modelle ist eine wesentliche Voraussetzung, um die Dynamik komplexer physikalischer Systeme mit Hilfe numerischer Simulationen zu untersuchen. Deren Modellierung mit gleichungsbasierten objektorientierten Sprachen bietet gegenüber der signalflussbasierenden Modellierung den Vorteil, dass die sich aus der Mathematik und Physik ergebenden Gleichungen in ihrer ursprünglichen Form zur Modellbeschreibung verwendet werden können. Die akausale Beschreibung mit Gleichungen erhöht die Wiederverwendbarkeit der Modelle und reduziert den Aufwand bei der Modellierung. Die automatisierte Überführung der Gleichungen in Zuweisungen lässt sich theoretisch auf Modelle beliebiger Größe und Komplexität anwenden. In der praktischen Anwendung ergeben sich jedoch, insbesondere bei der automatisierten Überführung großer Modelle, mathematische Systeme mit sehr vielen Gleichungen und zeitabhängigen Variablen. Die daraus resultierenden langen Ausführungszeiten schränken die Anwendbarkeit der symbolischen Aufbereitung erheblich ein. Die vorliegende Arbeit beschreibt den Prozess der automatisierten Überführung eines Modells bis zur numerischen Simulation. Alle Teilschritte werden detailliert untersucht und bezüglich ihrer theoretischen Komplexität bewertet. Geeignete Algorithmen werden im OpenModelica Compiler umgesetzt und bezüglich ihrer Laufzeit anhand praxisrelevanter Modelle miteinander verglichen und für jeden Teilschritt die beste Implementierung ausgewählt. Dadurch konnte ein nahezu linearer Zusammenhang zwischen Modellgröße und benötigter Zeit zur Überführung erreicht werden. Zusätzlich bietet die Arbeit eine umfassende Dokumentation mit zahlreichen Empfehlungen für die Implementierung eines BackEnds eines Modelica Compilers. Dies erleichtert den Einstieg für neue Entwickler sowie die Weiterentwicklung bestehender Algorithmen. Letzteres wird durch ein modulares Konzept einer BackEnd-Pipeline unterstützt. 
Außerdem werden Methoden diskutiert, wie ein neues Modul innerhalb dieser Pipeline effizient getestet werden kann. / The symbolic treatment of mathematical models is essential for studying the dynamics of complex physical systems by means of numerical simulation. In contrast to signal-flow-based approaches, modeling with equation-based, object-oriented languages has the advantage that the equations arising from mathematics and physics can be used directly in their original form to describe the model. The acausal description with equations increases the reusability of models and reduces the modeling effort. The automated transformation of equations into assignments can in theory be applied to models of any size and complexity. In practice, however, problems arise when large models, i.e. mathematical systems with many equations and time-dependent variables, are to be transformed: the resulting long execution times considerably limit the applicability of symbolic processing. The present work describes the process of automated transformation from a model to program code that can be simulated numerically. All sub-steps are examined in detail and evaluated in terms of their theoretical complexity. Suitable algorithms are implemented in the OpenModelica Compiler; their execution times are compared on models relevant to engineering practice, and the best implementation for each sub-step is selected and combined into a Modelica Compiler BackEnd. In this way, a nearly linear relationship between model size and transformation time has been achieved. In addition, the work offers comprehensive documentation with numerous recommendations for implementing the BackEnd of a Modelica Compiler. This eases the entry of new developers and facilitates the further development of existing algorithms; the latter is supported by a modular concept of a BackEnd pipeline. Moreover, methods are discussed for efficiently testing a new module within this pipeline.
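The transformation of acausal equations into assignments that the abstract describes hinges on matching each equation with the variable it will be solved for. A minimal sketch of that causalization step, using standard augmenting-path bipartite matching on a made-up incidence structure (this is the generic algorithm, not the OpenModelica implementation):

```python
# Equation-to-variable matching, the step before BLT sorting when turning
# an acausal equation system into a sequence of assignments.

def match_equations(incidence):
    """incidence[i] lists the variables occurring in equation i. Returns a
    dict mapping each matchable equation to the variable it solves for,
    computed by augmenting-path maximum bipartite matching."""
    var_of_eq = {}   # equation -> matched variable
    eq_of_var = {}   # variable -> matched equation

    def augment(eq, seen):
        for v in incidence[eq]:
            if v in seen:
                continue
            seen.add(v)
            # Take v if it is free, or try to rematch its current equation.
            if v not in eq_of_var or augment(eq_of_var[v], seen):
                var_of_eq[eq] = v
                eq_of_var[v] = eq
                return True
        return False

    for eq in range(len(incidence)):
        augment(eq, set())
    return var_of_eq
```

A perfect matching means every variable can be computed from exactly one equation; equations left unmatched indicate structural singularity, which a compiler BackEnd must detect and report.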
1040

Independent component analysis for maternal-fetal electrocardiography

Marcynuk, Kathryn L. 09 January 2015 (has links)
Separating unknown signal mixtures into their constituent parts is a difficult problem in signal processing called blind source separation. One of the benchmark problems in this area is the extraction of the fetal heartbeat from an electrocardiogram (ECG) in which it is overshadowed by the strong maternal heartbeat. This thesis presents a study of a signal separation technique called independent component analysis (ICA) in order to assess its suitability for the maternal-fetal ECG separation problem. This includes an analysis of ICA on deterministic, stochastic, simulated and recorded ECG signals. The experiments presented in this thesis demonstrate that ICA is effective on linear mixtures of known simulated or recorded ECGs. The performance of ICA was measured using visual comparison, heart rate extraction, and energy-based, information-theoretic, and fractal-based measures. ICA extraction of clinically recorded maternal-fetal ECG mixtures, in which the source signals were unknown, successfully recovered the fetal heart rate.
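The core ICA idea can be sketched on toy signals: whiten the mixtures, then rotate to maximize non-Gaussianity of the outputs. The waveforms below are stand-ins for a "maternal" and a "fetal" signal, not clinical data, and production ICA algorithms such as FastICA use a different optimization than this brute-force angle scan:

```python
import numpy as np

# Two-channel ICA sketch: after whitening, the independent sources differ
# from the whitened mixtures only by a rotation, so we scan rotation angles
# for the one maximizing |excess kurtosis| of an output channel.
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 1.2 * t))   # dominant square-ish source
s2 = np.sin(2 * np.pi * 7 * t) ** 3         # weaker spiky source
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.4, 1.0]])      # "unknown" mixing matrix
X = A @ S                                   # observed mixtures only

# Whitening: zero mean and (sample) identity covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

def kurt(y):
    return np.mean(y ** 4) - 3 * np.mean(y ** 2) ** 2

best = max(np.linspace(0, np.pi, 360),
           key=lambda a: abs(kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1])))
y1 = np.cos(best) * Z[0] + np.sin(best) * Z[1]   # one recovered source
y2 = -np.sin(best) * Z[0] + np.cos(best) * Z[1]  # the orthogonal one
```

Up to sign and ordering, y1 and y2 match the original sources, which mirrors the known-mixture experiments described above; the clinical case is harder because the "true" sources are never available for comparison.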
