  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
61

Automatic Markov Chain Monte Carlo Procedures for Sampling from Multivariate Distributions

Karawatzki, Roman, Leydold, Josef January 2005 (has links) (PDF)
Generating samples from multivariate distributions efficiently is an important task in Monte Carlo integration and many other stochastic simulation problems. Markov chain Monte Carlo has been shown to be very efficient compared to "conventional methods", especially when many dimensions are involved. In this article we propose a Hit-and-Run sampler in combination with the Ratio-of-Uniforms method. We show that this combination yields an algorithm that generates points from quite arbitrary distributions, including all log-concave distributions. The algorithm works automatically in the sense that only the mode (or an approximation of it) and an oracle are required, i.e., a subroutine that returns the value of the density function at any point x. We show that the number of evaluations of the density increases only slowly with the dimension. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
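The combination described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it samples uniformly over the Ratio-of-Uniforms region of an assumed standard bivariate normal (standing in for the density oracle) using Hit-and-Run with slice-sampler-style interval shrinking along each chord, then maps points back via x = v/u.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_density(x):
    # the "oracle": unnormalized log-density of a standard bivariate normal
    return -0.5 * np.dot(x, x)

def in_rou_region(u, v):
    # Ratio-of-Uniforms region: A = {(u, v) : 0 < u <= f(v/u)^(1/(d+1))},
    # checked in log form for numerical stability
    if u <= 0:
        return False
    return np.log(u) <= log_density(v / u) / (len(v) + 1)

def hit_and_run(n_samples, d=2, burn_in=500):
    z = np.array([0.8] + [0.0] * d)   # (u, v): start strictly inside, near the mode
    samples = []
    for i in range(n_samples + burn_in):
        direction = rng.standard_normal(d + 1)
        direction /= np.linalg.norm(direction)
        # sample uniformly on the chord through z by shrinking a bracket around t = 0
        lo, hi = -4.0, 4.0
        while True:
            t = rng.uniform(lo, hi)
            cand = z + t * direction
            if in_rou_region(cand[0], cand[1:]):
                z = cand
                break
            if t < 0:
                lo = t   # shrink toward the current (interior) point
            else:
                hi = t
        if i >= burn_in:
            samples.append(z[1:] / z[0])   # transform back: x = v / u
    return np.array(samples)
```

For this target the RoU region is bounded (it fits inside [0, 1] x [-1.1, 1.1]^2), so the fixed bracket always covers the chord; a production sampler would derive the bracket from the mode, as the abstract's automatic setup suggests.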
62

Performance Analysis of TCAMs in Switches

Tawakol, Abdel Maguid 25 April 2012 (has links)
The Catalyst 6500 is a modern commercial switch, capable of processing millions of packets per second through the use of specialized hardware. One of the main hardware components aiding the switch in this task is the Ternary Content Addressable Memory (TCAM). TCAMs update themselves with routing and switching data based on the traffic flowing through the switch, which enables it to forward future packets destined for a previously discovered location at very high speed. The problem is that TCAMs have a limited size, and once they reach capacity the switch has to fall back on software to perform the switching and routing, a much slower process than hardware switching through the TCAM. A framework has been developed to analyze the switch's performance once the TCAM has reached its capacity, and to measure the penalty associated with a cache miss. This thesis concludes with some recommendations and future work.
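The hardware-hit versus software-miss behaviour described above can be captured in a toy model. The capacity and per-packet costs below are illustrative assumptions, not Catalyst 6500 figures: entries are learned until the TCAM fills, after which unlearned destinations stay on the slow software path forever.

```python
class Switch:
    """Toy model of a switch with a fixed-capacity TCAM and a software fallback."""
    TCAM_CAPACITY = 4
    HW_COST_NS = 50      # assumed per-packet cost on the hardware path
    SW_COST_NS = 5000    # assumed per-packet cost on the software path

    def __init__(self):
        self.tcam = {}   # learned entries: destination -> egress port
        self.hits = 0
        self.misses = 0
        self.total_ns = 0

    def forward(self, dest):
        if dest in self.tcam:
            self.hits += 1               # hardware switching: TCAM hit
            self.total_ns += self.HW_COST_NS
        else:
            self.misses += 1             # cache miss: software path
            self.total_ns += self.SW_COST_NS
            if len(self.tcam) < self.TCAM_CAPACITY:
                self.tcam[dest] = hash(dest) % 48   # learn while space remains
```

Driving this model with a traffic trace makes the capacity penalty visible: once the table is full, every packet to a new destination pays the software cost on every arrival, which is exactly the regime the thesis's framework measures.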
64

Reciprocity : where art meets the community : action research in response to artistic encounters and relationships

Filardo, Giuseppe January 2009 (has links)
This practice-led research project examines some of the factors and issues facing artists working in the public domain who wish to engage with the community as audience. Using the methodology of action research, the three major creative projects in this study use art as a socio-political tool with the aim of providing an effective vehicle for broadening awareness, understanding forms of social protest and increasing tolerance for diversity. The three projects (Floodline November 7, 2004; Look in, Look out; and The Urban Terrorist Project) dealt with the marginalisation of communities, audiences and graffiti artists respectively. The artist/researcher is outlined as both creator and collaborator in the work. Processes included ephemeral elements, such as temporary installation and performance, as well as interactive elements that encouraged direct audience involvement as part of the work. In addition to the roles of creator and collaborator, both of which included audience as well as artist, the presence of an outside entity was evident. Whether local legal authorities or prevailing attitudes, outside entities had an unavoidable impact on the processes and outcomes of the work. Each project elicited a range of responses from its respective audience; however, the overarching concept of reciprocity was seen to be the crucial factor in conception, artistic methods and outcomes.
65

Antibody and Antigen in Heparin-Induced Thrombocytopenia

Newman, Peter Michael, Pathology, UNSW January 2000 (has links)
Immune heparin-induced thrombocytopenia (HIT) is a potentially serious complication of heparin therapy and is associated with antibodies directed against a complex of platelet factor 4 (PF4) and heparin. Early diagnosis of HIT is important to reduce morbidity and mortality. I developed an enzyme immunoassay that detects the binding of HIT IgG to PF4-heparin in the fluid phase. This required techniques to purify and biotinylate PF4. The fluid phase assay produces consistently low background and can detect low levels of anti-PF4-heparin. It is suited to testing alternative anticoagulants because, unlike in an ELISA, a clearly defined amount of antigen is available for antibody binding. I was able to detect anti-PF4-heparin IgG in 93% of HIT patients. I also investigated cross-reactivity of anti-PF4-heparin antibodies with PF4 complexed to alternative heparin-like anticoagulants. Low molecular weight heparins cross-reacted with 88% of the sera from HIT patients while half of the HIT sera weakly cross-reacted with PF4-danaparoid (Orgaran). The thrombocytopenia and thrombosis of most of these patients resolved during danaparoid therapy, indicating that detection of low affinity antibodies to PF4-danaparoid by immunoassay may not be an absolute contraindication for danaparoid administration. While HIT patients possess antibodies to PF4-heparin, I observed that HIT antibodies will also bind to PF4 alone adsorbed on polystyrene ELISA wells but not to soluble PF4 in the absence of heparin. Having developed a technique to affinity-purify anti-PF4-heparin HIT IgG, I provide the first estimates of the avidity of HIT IgG. HIT IgG displayed relatively high functional affinity for both PF4-heparin (Kd=7-30nM) and polystyrene adsorbed PF4 alone (Kd=20-70nM). Furthermore, agarose beads coated with PF4 alone were almost as effective as beads coated with PF4 plus heparin in depleting HIT plasmas of anti-PF4-heparin antibodies. 
I conclude that the HIT antibodies which bind to polystyrene adsorbed PF4 without heparin are largely the same IgG molecules that bind PF4-heparin and thus most HIT antibodies bind epitope(s) on PF4 and not epitope(s) formed by part of a PF4 molecule and part of a heparin molecule. Binding of PF4 to heparin (optimal) or polystyrene/agarose (sub-optimal) promotes recognition of this epitope. Under conditions that are more physiological and sensitive than previous studies, I observed that affinity-purified HIT IgG will cause platelet aggregation upon the addition of heparin. Platelets activated with HIT IgG increased their release and surface expression of PF4. I quantitated the binding of affinity-purified HIT 125I-IgG to platelets as they activate in a plasma milieu. Binding of the HIT IgG was dependent upon heparin and some degree of platelet activation. Blocking the platelet Fcγ receptor-II with the monoclonal antibody IV.3 did not prevent HIT IgG binding to activated platelets. I conclude that anti-PF4-heparin IgG is the only component specific to HIT plasma that is required to induce platelet aggregation. The Fab region of HIT IgG binds to PF4-heparin that is on the surface of activated platelets. I propose that only then does the Fc portion of the bound IgG activate other platelets via the Fc receptor. My data support a dynamic model of platelet activation where released PF4 enhances further antibody binding and more release.
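The reported dissociation constants (Kd = 7-30 nM for PF4-heparin) can be put into perspective with the standard single-site binding law, where fractional occupancy is [L] / ([L] + Kd). The sketch below uses illustrative concentrations, not data from the study:

```python
def occupancy(conc_nM, kd_nM):
    """Fraction of sites bound at free ligand concentration conc_nM,
    assuming a simple single-site equilibrium binding model."""
    return conc_nM / (conc_nM + kd_nM)

# At a ligand concentration equal to Kd, half the sites are bound.
# For the reported Kd range of 7-30 nM, an (assumed) 100 nM IgG
# concentration would give roughly 77-93% occupancy.
```

This is why affinities in the tens of nanomolar, as measured here, are consistent with substantial antibody binding at plasma IgG concentrations.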
66

Överraskning - Vilka indikatorer påverkar? : En studie av de två fallen Pearl Harbor och Sexdagarskriget / Surprise - What indicators affect? : The cases of Pearl Harbor and the Six Days War

Axelsson, Lucas January 2013 (has links)
The principle of surprise is one of the oldest principles of battle. It is regarded as fundamental to combat and to the methods that can be used to succeed in it. But what does surprise really mean, and how has it been used? This essay problematizes the principle of surprise and connects the research on surprise to Pearl Harbor and the Six-Day War, two attacks widely regarded as archetypal surprise attacks. Did the attacks look the same, and which indicators from the literature show that each was a surprise? The essay first describes surprise on the basis of the selected literature in order to clarify its meaning. The literature is then analyzed and forms the basis for the later parts of the essay, where the indicators identified in the theory are applied to and examined against the two attacks, asking whether the same indicators were present in each. Finally, the results are discussed and the author suggests new research related to the topic. The purpose of the essay is to problematize the principle of surprise and to identify the indicators that affect it. The comparison of Pearl Harbor and the Six-Day War follows a comparative model, carried out as a qualitative text analysis with the principle of surprise as the analytical framework. The theoretical starting points are the theory of surprise and the literature on Pearl Harbor and the Six-Day War.
67

Métodos para detecção de outliers em séries de preços do índice de preços ao consumidor / Methods for detecting outliers in price series of the consumer price index

Lyra, Taíse Ferraz 24 February 2014 (has links)
Outliers are observations that appear inconsistent with the others. Also called atypical, extreme, or aberrant values, these inconsistencies can be caused, for instance, by policy changes or economic crises, unexpected cold or heat waves, and measurement or typing errors. Although outliers are not necessarily incorrect values, when they stem from measurement or typing errors they can distort the results of an analysis and lead researchers to erroneous conclusions. The objective of this research is to study and compare different methods for detecting anomalies in the price series of the Consumer Price Index (Índice de Preços ao Consumidor, IPC), calculated by the Brazilian Institute of Economics (Instituto Brasileiro de Economia, IBRE) of the Getulio Vargas Foundation (Fundação Getulio Vargas, FGV). 
The IPC measures the price variation of a fixed basket of goods and services that make up the customary expenses of families with income between 1 and 33 monthly minimum wages, and it is mainly used as a reference index for evaluating consumer purchasing power. In addition to the method currently used by the price analysts at IBRE, the study considers variations of the IBRE method, the Boxplot method, the SIQR Boxplot method, the Adjusted Boxplot method, the Resistant Fences method, the Quartile method, the Modified Quartile method, the Median Absolute Deviation method, and the Tukey algorithm. These methods were applied to data from the municipalities of Rio de Janeiro and São Paulo. To analyze the performance of each method, the true extreme values must be known in advance; in this study it was therefore assumed that prices discarded or changed by the analysts during the critique process are the true outliers. Since the IBRE method is strongly correlated with the prices altered or discarded by the analysts, this assumption may favor the IBRE method over the others. It nonetheless makes it possible to compute two measures by which the methods are evaluated. The first is the method's accuracy, the proportion of true outliers it detects. The second is the number of false positives it produces, which indicates how many values had to be flagged for a true outlier to be detected. The higher the accuracy and the lower the number of false positives, the better the method's performance. On this basis it was possible to construct a ranking of the methods and identify the best among those analyzed. 
For the municipality of Rio de Janeiro, some variations of the IBRE method performed as well as or better than the original method. For São Paulo, the IBRE method showed the best performance: a method is considered to detect an outlier correctly when it flags a true outlier as an extreme value, and the IBRE method had the highest accuracy and the smallest number of false positives. Future work is expected to test the methods on simulated data and on widely used benchmark data sets, so that the assumption about prices discarded or changed during the critique process no longer influences the results.
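Two of the simpler methods compared in the study, the Boxplot method (Tukey fences) and the Median Absolute Deviation method, can be sketched generically. This is a textbook illustration with made-up prices, not IBRE's actual critique procedure:

```python
import statistics

def boxplot_outliers(prices, k=1.5):
    """Flag prices outside [Q1 - k*IQR, Q3 + k*IQR], the classic Tukey fences."""
    q1, _, q3 = statistics.quantiles(prices, n=4)  # exclusive method by default
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [p for p in prices if p < lo or p > hi]

def mad_outliers(prices, k=3.0):
    """Flag prices more than k median-absolute-deviations from the median."""
    med = statistics.median(prices)
    mad = statistics.median([abs(p - med) for p in prices])
    return [p for p in prices if abs(p - med) > k * mad] if mad else []
```

Both are resistant to the outliers they are trying to detect, since quartiles and medians are barely moved by a few extreme prices; the variants compared in the study (SIQR, adjusted fences, modified quartiles) adjust these cutoffs for skewed price distributions.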
68

Consumer Adoption of Personal Health Records

Majedi, Armin January 2014 (has links)
Health information technology (HIT) aims to improve healthcare services by means of technological tools. Patient-centered technologies such as personal health records are relatively new HIT tools that enable individuals to get involved in their health management activities. These tools transform health consumers from passive consumers of health information into active managers of it. This new role is more interactive and engaged: with such tools, patients can better navigate their lives and exercise more control over their treatments, potentially also improving the quality of health services. Despite the benefits of using personal health record systems, their adoption rate remains low; many free and paid services have not received the uptake anticipated when they were first introduced. This study investigates some factors that affect the adoption of these systems and may shed light on potential reasons for the low adoption rates. The theoretical model of this study draws on social cognitive theory (SCT) and the technology acceptance model (TAM). The model was validated through a quantitative survey-based methodology, and the results were derived using structural equation modeling techniques. The key findings highlight the role of individual and environmental factors as determinants of end-user behavior in the adoption of personal health records. The results show that in addition to perceptions of usefulness and ease of use, factors such as social norms and technology awareness are significantly associated with factors that directly and indirectly affect intention to use PHRs. Based on these results, recommendations are offered for technology providers, and possible directions are proposed for academic researchers.
69

Implementation And Evaluation Of Hit Registration In Networked First Person Shooters

Jonathan, Lundgren January 2021 (has links)
Hit registration algorithms in First-Person Shooter games define how the server processes gunfire from clients. Network conditions, such as latency, cause a mismatch between the game worlds observed at the client and the server. To improve the experience for clients when authoritative servers are used, the server attempts to reconcile the differing views when performing hit registration through techniques known as lag compensation. This thesis surveys recent hit registration techniques and discusses how they can be implemented and evaluated with the use of a modern game engine. To this end, a lag compensation model based on animation pose rewind is implemented in Unreal Engine 4. Several programming models described in industry and research are used in the implementation, and experiences from further integrating the techniques into a commercial FPS project are also discussed. To reason about the accuracy of the algorithm, client-server discrepancy metrics are defined, as well as a hit rate metric which expresses the worst-case effect on the shooting experience of a player. Through automated tests, these metrics are used to evaluate the hit registration accuracy. The rewind algorithm was found to make body-part-specific hit registration function well independently of latency. At high latencies, the rewind algorithm is completely necessary to ensure that clients can still aim at where they perceive their targets to be and expect their hits to be registered. Still, inconsistencies in the results remain, with hit rate values sometimes falling below 50%. This is theorized to be due to fundamental networking mechanisms of the game engine which are difficult to control, a counterpoint to the ease of implementation otherwise gained when using Unreal Engine.
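The rewind idea can be sketched independently of any engine: the server keeps a short history of each target's transform and, when a shot arrives, rewinds the target to the time the firing client actually saw (server time minus the client's latency) before testing the hit. The thesis rewinds full animation poses in Unreal Engine 4; the position-only model below is a simplifying assumption for illustration.

```python
import bisect

class LagCompensator:
    """Server-side history of a target's position, rewound on demand."""

    def __init__(self, history_ms=1000):
        self.history_ms = history_ms
        self.snapshots = []   # (timestamp_ms, position) pairs, timestamps ascending

    def record(self, t_ms, pos):
        """Store the target's position at server time t_ms, pruning old entries."""
        self.snapshots.append((t_ms, pos))
        cutoff = t_ms - self.history_ms
        while self.snapshots and self.snapshots[0][0] < cutoff:
            self.snapshots.pop(0)

    def rewind(self, t_ms):
        """Interpolate the position at t_ms, the time the shooter perceived."""
        times = [t for t, _ in self.snapshots]
        i = bisect.bisect_left(times, t_ms)
        if i == 0:
            return self.snapshots[0][1]     # older than all history: clamp
        if i == len(self.snapshots):
            return self.snapshots[-1][1]    # newer than all history: clamp
        (t0, p0), (t1, p1) = self.snapshots[i - 1], self.snapshots[i]
        a = (t_ms - t0) / (t1 - t0)
        return tuple(x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1))
```

On a shot event the server would call `rewind(server_time_ms - client_latency_ms)` for each potential target and run the hit test against the rewound positions, which is the position-level analogue of the pose rewind evaluated in the thesis.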
70

Aplikácia reštrikcie krvného obehu v športovom tréningu lezcov - inovatívna metóda tréningu športovcov? / Restricted blood flow applied in climbers' training - an innovative method of training?

Javorský, Tomáš January 2021 (has links)
Title: Application of blood flow restriction in the sport training of climbers - an innovative training method for athletes? Author: Tomáš Javorský BSc. Department: Department of Physiology Supervisor: doc. Jiří Baláš, Ph.D. Abstract: The most common injuries among performance climbers include tendon injuries of the finger flexors. This kind of injury can leave an athlete unable to follow his training programme for several months, which can have a crucial impact on his peak season. The thesis compares high-intensity training performed at 70% of maximum muscle strength with blood flow restriction training performed at a 30% muscle load, together with the physiological and functional aspects of the training. Objectives: The presumption is that combining a low muscle load with ischemia will achieve the same results as high-intensity training. We also presume that the alterations in muscle oxygenation remain the same despite different amounts of performed muscle work. Methods: 13 participants finished the experiment, performed in the form of a crossover study. During the experiment, muscle oxidative capacity and the extent of muscle deoxygenation were measured by spectroscopy. The maximum force, critical force, impulse and the impulse above the critical force point were measured...
