  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
81

Integration of Variants Handling in M-System-NT

Zeeshan, Ahmed January 2006 (has links)
This Master's thesis proposes a solution for managing the variabilities of software product line applications. The objective of the research is to support software decision makers in handling the additional complexity introduced by product line architectures. To fulfill this objective, an approach to analyze, visualize, and measure product-line-specific characteristics of C/C++ source code is proposed. The approach is validated in an empirical experiment using an open source software system. For that purpose the approach is first implemented in M-System-NT, an existing software measurement tool developed at the Fraunhofer Institute for Experimental Software Engineering. The thesis then performs static analysis of C/C++ source code and measures both traditional and product line measures, in order to identify correlations between the measures and to indicate fault proneness. / Official Address: Researcher Zeeshan Ahmed, Mechanical Engineering Informatics and Virtual Product Development Division (MIVP), TU Vienna, Austria. Permanent Address: Zeeshan Ahmed, Humdered Street, Mohala Garhi Shadula Sahib, Gujrat, Pakistan
82

The neural mechanisms of attention : exploring threat-related suppression and enhancement using ERPs

Bretherton, Paul January 2016 (has links)
The capacity of the visual system to process information about multiple objects at any given moment in time is limited. This is because not all information can be processed equally or in parallel and subsequently reach consciousness. Previous research has utilized behavioural experiments to explore visual attention. More recently, however, research has used electroencephalography (EEG) to measure the electrical brain activity over the posterior scalp. By time-locking visual stimulus events to fluctuations in scalp activity, researchers have been able to estimate the time course of attentional changes by measuring changes in these event-related potentials (ERPs). One component in particular (N2pc) has been a reliable tool for measuring either the suppression of ignored items in the visual scene or the shift of attention to attended ones. The N2pc is measured by comparing the ERP activity contralateral and ipsilateral to the visual field of interest. More recently, evidence has been presented that the mechanisms of attention thought to be represented by the N2pc (suppression and attentional selection) could be separated into different ERP components (Pd: indexing attentional suppression of an ignored item; and Nt: indexing attentional selection of the target) and measured independently. In six experiments, using ERPs, this thesis employs these components to explore the mechanisms and strategies of the human attentional system. Additionally, this thesis focuses on the impact of different types of simultaneous processing load on the attentional system and how the mechanisms of this system are influenced. Experiment 1 explores the idea that the type or valence of information to be ignored may influence the ability to suppress it. Results of this experiment show that neither the type nor the valence of the irrelevant information modulated the amplitude of the distractor positivity (Pd), indicating that suppression of the irrelevant distractor was not altered.
Also noted in experiment 1 was the presence of an early negativity (Ne) that appeared to represent attentional capture by the ignored lateral stimulus. Experiment 2 demonstrated that the valence of the lateral target did not alter the target negativity (Nt), indicating a different pattern of results between the Nt and the N2pc reported in previous studies (e.g. Eimer & Kiss, 2007; Feldmann-Wüstefeld et al., 2010). Experiment 2 also showed a similarity of the target negativity (Nt) to the early negativity (Ne; the N2pc-like component observed in experiment 1) toward face and non-face stimuli. This comparison supported the idea that the early negativity (Ne) reflected attentional capture of the ignored lateral distractor, and as a result it was relabelled the distractor negativity (Nd) in subsequent experiments. Experiment 3 showed that the salience of the lateral image did not modulate the Pd, as would be expected if the Pd reflected sensory-level processing. An early contralateral negativity (similar to the Nd observed in experiment 1) was altered by the salience of the distractor, which added support to the hypothesis that this component reflects attentional capture of the lateral ignored image. Experiment 4 attempted to manipulate working memory (WM) to assess the effect of WM load on attentional capture and suppression. While the results did indicate modulation of suppression under WM load, the limitations of the design of experiment 4 made any definitive interpretation of the results unreliable. The results of experiment 5 showed that suppression, as indexed by the Pd, was not altered by cognitive load. However, reductions in attentional capture under high cognitive load, as indexed by the distractor negativity (Nd), were observed, contradicting the results of previous experiments (cf. Lavie & De Fockert, 2005), where cognitive load resulted in an increase in attentional capture.
There appears, however, to be some issue with the author's interpretation of the results of those experiments (see chapter 6 for discussion). The results of Experiment 6 show the opposite effect, with a significant increase in the laterality of the Pd under high perceptual load. A similar increase in the laterality of the Pd was not reflected in terms of valence, though: suppression of threat-related distractors was not altered under high perceptual load. The hypothesis that an increase in perceptual load will result in a decrease in attentional capture was generally supported by the results of experiment 6. Under high perceptual load, angry face distractors captured attention, as indexed by the laterality of the Nd, while neutral face distractors showed a reduction in attentional capture. Under low perceptual load, both angry and neutral face distractors resulted in a significant (and similar) laterality of the Nd. The thesis concludes by discussing issues concerning Lavie's Load Theory of attention and outlines some potential misinterpretations of previous data that have led to the proposal that cognitive load results in a decrease in attentional resources and therefore a decrease in attentional capture of ignored stimuli. It is argued in this thesis that the results of Lavie and de Fockert (2005), which concluded that an increase in cognitive load resulted in a decrease in attentional capture, are more likely explained by the changes in attentional capture (i.e. a reduction) and the changes in RT (i.e. an increase) under cognitive load being separate responses to the availability of resources: one focusses attention on the goal-directed task, and the other extends processing time to carry out the more difficult task. In this case both 'changes' appear to work to prioritise resources in favour of the goal-directed task.
83

Val av systemutvecklingmetod utifrån ett strategiskt perspektiv : En fallstudie på SPV

Stenberg, Gustaf, Nordin, Lina January 2014 (has links)
De flesta metoder, verktyg och arbetssätt som används vid traditionell projektledning har använts under en lång tid trots att de yttre förutsättningarna har förändrats. Systemutvecklingsbranschen upplevde de traditionella metoderna att driva projekt som oerhört statiska och tröga, därför utvecklades mer lättrörliga metoder som kom att kallas agila systemutvecklingsmetoder. Fram till år 2007 fanns inte en strukturerad process för hur valet och implementeringen av en agil systemutvecklingsmetod bör gå till. Det finns idag några ramverk som beskriver hur val och implementering av en agil metod kan bedrivas. Inget av dessa ramverk tar dock i beaktande eller utgår ifrån en organisations strategi eller vikten av strategisk passform. Syftet med denna rapport har varit att undersöka hur en agil systemutvecklingsmetod kan väljas utifrån strategisk passform. Undersökningen har även syftat till att studera de förändringar som kan krävas i en organisations projektstyrningsmodell för att införa en agil systemutvecklingsmetod. För att besvara syftet har en fallstudie utförts på Statens tjänstepensionsverk. På SPV har en utvärdering med TS-metoden genomförts för att bedöma olika agila systemutvecklingsmetoders strategiska passform. Med hjälp av TS-metoden har agila systemutvecklingsmetoder bedömts mot en organisations strategiska mål. Resultatet från utvärderingen var att de agila systemutvecklingsmetoderna DSDM och Scrum ansågs ha den bästa strategiska passformen. Då DSDM till stor del är ett ramverk är en kombination av dessa båda systemutvecklingsmetoder möjlig. TS-metoden har resulterat i en relativ bedömning mellan de agila systemutvecklingsmetoderna. Utifrån denna bedömning har de agila systemutvecklingsmetoderna kunnat rangordnas utifrån ett strategiskt perspektiv.
Införandet av en kombination av metoderna har inneburit att arbetet kommer att utföras iterativt istället för sekventiellt och detta har inneburit att beslutspunkterna i projektstyrningsmodellen har förändrats. Även styrning efter TKQ har försvunnit. / Most of the methods, tools and approaches used in traditional project management have been in use for a long time even though the external conditions have changed. The system development sector experienced the traditional methods of managing projects as extremely static and slow, and therefore more agile methods, the agile system development methods, were developed. Until 2007 there was no structured process for the selection and implementation of an agile method. There are currently some frameworks that describe how the selection and implementation of an agile method can be conducted. None of these frameworks, however, takes into account or is based on an organization's strategy or the importance of strategic fit. The purpose of this report was to investigate how an agile software development method can be chosen based on strategic fit. The study also aimed to examine the changes that would be required in an organization's project management model in order to introduce an agile system development methodology. To answer this question, a case study was performed at the Swedish Government Employee Pensions Agency (SPV), where an evaluation with the TS-method was carried out. With the TS-method, agile software development methodologies were assessed against an organization's strategic goals. The result of the evaluation was that the agile software development methods DSDM and Scrum were considered to have the best strategic fit. Since DSDM is largely a framework, a combination of these two agile software development methods is possible. The TS-method produced a relative assessment of the agile software development methods. Based on this assessment, the agile software development methods could be ranked from a strategic perspective.
The introduction of a combination of the methods has meant that the work will be performed iteratively rather than sequentially, and this has meant that the decision points in the project management model have changed. Control by TKQ has also disappeared.
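The kind of relative assessment the TS-method produced (candidate methods scored against strategic goals, then ranked) can be illustrated with a simple weighted-scoring sketch. The goals, weights, and scores below are invented for illustration and do not reproduce the actual TS-method evaluation at SPV:

```python
# Hypothetical weighted scoring of agile methods against strategic goals.
# Goal names, weights, and fit scores are invented; the real TS-method
# procedure and SPV's strategic goals are not reproduced here.
weights = {"time_to_market": 0.4, "quality": 0.35, "flexibility": 0.25}

scores = {  # method -> goal -> fit score (1-5)
    "Scrum": {"time_to_market": 5, "quality": 4, "flexibility": 4},
    "DSDM":  {"time_to_market": 4, "quality": 5, "flexibility": 4},
    "XP":    {"time_to_market": 3, "quality": 4, "flexibility": 3},
}

def strategic_fit(method_scores, weights):
    # Weighted sum of per-goal fit scores.
    return sum(weights[g] * s for g, s in method_scores.items())

ranking = sorted(scores, key=lambda m: strategic_fit(scores[m], weights),
                 reverse=True)
```

The output is exactly the kind of relative ordering the abstract describes: methods are not judged absolutely, only ranked against each other for one organization's weighted goals.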
84

Unrecognized myocardial infarction and cardiac biochemical markers in patients with stable coronary artery disease

Nordenskjöld, Anna January 2016 (has links)
Aim: The overarching aim of the thesis was to explore the occurrence and clinical importance of two manifestations of myocardial injury: unrecognized myocardial injury (UMI) and altered levels of cardiac biochemical markers in patients with stable coronary artery disease (CAD). Methods: A prospective multicenter cohort study investigated the prevalence, localization, size, and prognostic implication of UMI in 235 patients with stable CAD. Late gadolinium enhancement cardiovascular magnetic resonance (LGE-CMR) imaging and coronary angiography were used. The relationship between UMI and severe CAD and cardiac biochemical markers was explored. In a substudy the short- and long-term individual variation in cardiac troponins I and T (cTnI, cTnT) and N-terminal pro-B-type natriuretic peptide (NT-proBNP) was investigated. Results: The prevalence of UMI was 25%. Subjects with severe CAD were significantly more likely to exhibit UMI than subjects without CAD. There was a strong association between stenosis ≥70% and the presence of UMI in the myocardial segments downstream. The presence of UMI was associated with a significant threefold risk of adverse events during follow-up. After adjustments UMI was associated with a nonsignificant, numerically doubled risk. The levels of cTnI, NT-proBNP, and Galectin-3 were associated with the presence of UMI in univariate analyses. The association between levels of cTnI and presence of UMI remained significant after adjustment. The individual variation in cTnI, cTnT, and NT-proBNP in subjects with stable CAD appeared similar to the biological variation in healthy individuals. Conclusions: UMI is common and is associated with significant CAD, levels of biochemical markers, and an increased risk for adverse events. A change of >50% is required for a reliable short-term change in cardiac troponins, and a rise of >76% or a fall of >43% is required to detect a reliable long-term change in NT-proBNP.
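The reference-change thresholds in the conclusions lend themselves to a direct expression: a short-term troponin change is reliable only if it exceeds 50%, and a long-term NT-proBNP change only if it is a rise of more than 76% or a fall of more than 43%. A sketch of that rule (illustrative only, not a clinical tool; function names are invented):

```python
# Reference-change check based on the thesis conclusions: >50% change for a
# reliable short-term troponin change; >76% rise or >43% fall for a reliable
# long-term NT-proBNP change. Illustrative only, not a clinical tool.
def reliable_troponin_change(baseline: float, followup: float) -> bool:
    return abs(followup - baseline) / baseline > 0.50

def reliable_ntprobnp_change(baseline: float, followup: float) -> bool:
    change = (followup - baseline) / baseline
    return change > 0.76 or change < -0.43

# Example: a 60% troponin rise clears the threshold; a 40% rise does not.
# For NT-proBNP the rule is asymmetric: an 80% rise counts, a 40% fall
# does not, but a 50% fall does.
```

The asymmetry of the NT-proBNP thresholds follows from percent changes being relative to baseline: a 76% rise and a 43% fall are roughly the same factor in opposite directions.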
85

Jesus - en riktig man / Jesus - a real man

Elenäs, Arvid January 2019 (has links)
The objective of this study is to investigate how the authors of the Gospel of Luke and the Gospel of Mark use understandings of masculinities when portraying the character of Jesus. The study presents a survey of Greco-Roman hegemonic masculinity, with a focus on the free male’s relation to children, celibacy, bodies, good character and the household. The analysis of the gospel narratives focuses on two themes. The first one is how Jesus’ relation to his household was portrayed in masculine terms. The second theme is how Jesus uses children as an example for adult men. The study shows that it is reasonable to suggest that Jesus is described in the narratives as someone who had a complex relationship to the standards of Greco-Roman hegemonic masculinity. Jesus is sometimes portrayed as an odd man with low masculine status and sometimes portrayed as a man with honor and high masculinity. The question about Jesus’ masculinity depends on the characters’ ability to perceive Jesus’ theological standpoints in the textual world. If they understand Jesus’ theological standpoints they think of Jesus as a man with high masculine status. But if they don’t understand Jesus’ theological standpoints they think of Jesus as a man with low masculine status.
86

Such a Deal of Wonder: Structures of Feeling and Performances of The Winter's Tale from 1981 to 2002

Burt, Elizabeth Marie 11 July 2005 (has links) (PDF)
Structures of feeling represent the interaction between personal lived experience and fixed social values and meanings, which are found in interpretations of works of art. Studying various interpretations of any play in performance can provide a point of access into a culture because the choices made in the production can be compared to each other and to the written text and then reveal how the theatrical company views particular issues within their own time period. This study looks at productions of The Winter's Tale between 1981 and 2002 at the National Theatre and the Royal Shakespeare Company. Using numerous versions of this play not only increases the depth of our understanding of the play but also reveals how the actors and directors interact with British culture. Each production reveals a director's vision for the production as well as his or her own experience within the culture. Some issues and ideas that are reflected in these interpretations include both optimism and cynicism with regard to the political situation and public figures, an increase in spectacle, and secularization.
87

Restrictive modification: relative clauses and adverbs.

Larson, Richard K. January 1983 (has links)
Thesis (Ph.D.)--University of Wisconsin-Madison, Madison, Wis. / bibl.; diags.; Typescript (processed). Includes bibliographical references (leaves 442-447).
88

THE FUTURE OF DATA ACQUISITION

Wexler, Marty October 1998 (has links)
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / The necessity to acquire and analyze data dates back to the beginning of science itself. Long ago, a scientist may have run experiments and noted the results on a piece of paper. These notes became the data. The method was crude, but effective. As experiments grew more complex, the need for better methodologies arose. Scientists began using computers to gather, analyze, and store the data. This method worked well for most types of data acquisition. As the amount of data being collected increased, larger computers, faster processors, and faster storage devices were used in order to keep up with the demand. This method was more refined, but still did not meet the needs of the scientific community. Requirements began to change in the data acquisition arena. More people wanted access to the data in real time. Companies producing large data acquisition systems began to move toward a network-based solution. This architecture featured a specialized computer called the server, which contained all of the data acquisition hardware. The server handled requests from multiple clients and managed the flow of data to the network, to data displays, and to the archive medium. While this solution worked well to satisfy most requirements, it fell short in meeting others. The ability to have multiple computers working together across a local or wide area network (LAN or WAN) was not addressed. In addition, this architecture inherently had a single point of failure. If the server machine went down, all data from all sources was lost. Today, we see that the requirements for data acquisition systems include features only dreamed of five years ago. These new systems are linked around the world by wide area networks. They may include code to command satellites or handle 250 Mbps download rates.
They must produce data for dozens of users at once, be customizable by the end user, and they must run on personal computers (PCs)! Systems like these cannot work using the traditional client/server model of the past. The data acquisition industry demands systems with far more features than were traditionally available. These systems must provide more reliability and interoperability, and be available at a fraction of the cost. To this end, we must use commercial-off-the-shelf (COTS) computers that operate faster than the mainframe computers of only a decade ago. These computers must run software that is smart, reliable, scalable, and easy to use. All of these requirements can be met by a network of PCs running the Windows NT operating system.
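The server-centric architecture described above (one server owning the acquisition hardware, many clients requesting data) can be sketched with a small threaded TCP server. The sample payload and plain-text wire format below are invented; a real system would read from acquisition hardware and use a defined telemetry protocol:

```python
import socket
import socketserver
import threading

# Sketch of the server-centric acquisition architecture: one server owns the
# (here simulated) acquisition hardware and fans samples out to multiple
# clients. The payload and newline-delimited wire format are invented.
SAMPLES = b"17,42,99\n"

class DataHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # Every connecting client receives the latest acquired samples.
        self.wfile.write(SAMPLES)

def start_server():
    # Port 0 lets the OS pick a free port; one thread per client connection.
    server = socketserver.ThreadingTCPServer(("127.0.0.1", 0), DataHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def fetch(port):
    with socket.create_connection(("127.0.0.1", port)) as conn:
        return conn.makefile("rb").readline()

server = start_server()
port = server.server_address[1]
results = [fetch(port) for _ in range(2)]  # two clients, one server
server.shutdown()
server.server_close()
```

The single point of failure the paper criticizes is visible in the sketch: every client depends on this one server process, which is precisely what the distributed, multi-computer designs discussed next were meant to avoid.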
89

SATELLITE PAYLOAD CONTROL AND MONITORING USING PERSONAL COMPUTERS

Willis, James October 1998 (has links)
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / Universal acceptance of the Windows NT operating system has made utilization of the personal computer (PC) platform for critical space operations a reality. The software attributes of the operating system allow PC products to attain the reliability necessary for secure control of on-orbit assets. Not only is the software more reliable, it also supports better networking interfaces at higher speeds. The software upgrades that the Microsoft Corporation generates on a regular basis allow PCs to offer capabilities previously available only with UNIX-based solutions. As technology matures, PCs will operate faster, offer richer graphical user interfaces, and give customers a better cost-versus-performance choice. These reasons, and others to be discussed further, clearly demonstrate that PCs will soon take their place at the forefront of mission-critical ground station applications.
90

Estudo da retração em fibrocimento reforçado com fibra polimérica. / Shrinkage in cement based composite reinforced with synthetic fibers.

Souza, Rui Barbosa de 07 November 2013 (has links)
O objetivo do trabalho é analisar e compreender os efeitos das características do fibrocimento reforçado com fibras sintéticas na retração por secagem deste compósito. A importância social do fibrocimento como material de construção justifica a realização do presente trabalho, de modo a expor como acontece a retração por secagem neste compósito, e desta forma auxiliar o meio técnico na mitigação das manifestações patológicas relacionadas à retração e contribuindo para a redução do importante problema da indústria produtora de fibrocimento, que é a fissuração de borda nas telhas onduladas. Para tal buscou-se demostrar o efeito da porosidade e distribuição de tamanho de poros na retração por secagem. Além disso, foram realizados experimentos onde se variou o tipo de cimento utilizado, as quantidades de todas as matérias-primas envolvidas, além da aplicação de tratamentos modificadores da movimentação higroscópica, manipulando diretamente a tensão capilar causadora da retração. Concluiu-se que a porosidade elevada do fibrocimento faz com que a retração por secagem deste compósito seja fortemente influenciada pela porosidade total do mesmo, com correlação direta com a quantidade de mesoporos. Além disso, no estudo de caso realizado concluiu-se que a redução da retração devido à redução da tensão superficial da água do poro, se refletiu em menor fissuração de borda nas telhas onduladas. / The objective of this work is to analyze and understand the effects of the properties of a cement-based composite reinforced with synthetic fibers on the drying shrinkage of this composite. The social importance of fiber cement as a construction material justifies the present work, which sets out to show how drying shrinkage occurs in this composite and thereby to assist the technical community in mitigating pathological manifestations related to shrinkage, thus contributing to the reduction of an important problem in the fiber cement industry: edge cracking of corrugated sheets.
To achieve this goal, the work sought to demonstrate the effect of porosity and pore size distribution on drying shrinkage. Moreover, experiments were performed varying the type of cement used and the quantities of all materials involved, in addition to applying treatments that modify hygroscopic movement, directly manipulating the capillary tension that causes shrinkage. It was concluded that the high porosity of fiber cement causes the drying shrinkage of this composite to be strongly influenced by its total porosity, with a direct correlation to the quantity of mesopores. Furthermore, in the case study performed it was concluded that the shrinkage reduction due to the reduction of pore water surface tension was reflected in reduced edge cracking of the corrugated sheets.
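The causal chain this abstract relies on (capillary tension in mesopores drives drying shrinkage, so lowering the pore water's surface tension lowers that tension) can be illustrated with the Young-Laplace relation. The pore radius below is an illustrative mesopore size, not a measured value from the thesis:

```python
from math import cos, radians

# Young-Laplace estimate of the capillary tension driving drying shrinkage:
# smaller pores and higher surface tension mean larger tension. The 10 nm
# radius is an illustrative mesopore size, not data from the thesis.
def capillary_tension(surface_tension, pore_radius, contact_angle_deg=0.0):
    """Capillary pressure in Pa: 2 * gamma * cos(theta) / r."""
    return 2.0 * surface_tension * cos(radians(contact_angle_deg)) / pore_radius

GAMMA_WATER = 0.072  # N/m, water at ~25 C

# A fully wetted 10 nm mesopore sustains on the order of 14 MPa of tension...
p_mesopore = capillary_tension(GAMMA_WATER, 10e-9)
# ...and halving the pore water's surface tension (e.g. with a surface-active
# admixture, the kind of treatment the case study manipulated) halves the
# tension, hence the reduced shrinkage and edge cracking reported above.
p_reduced = capillary_tension(GAMMA_WATER / 2, 10e-9)
```

The inverse dependence on pore radius is also why the abstract singles out mesopores: at micrometer pore sizes the same formula gives tensions hundreds of times smaller.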
