191 |
The Effects Of Rhythm Training On Tennis Performance. Sogut, Mustafa, 01 August 2009.
The purposes of the study were to compare the effects of tennis-specific and general rhythm training on forehand consistency performance, rhythmic competence, tennis playing level, and agility performance, and to examine the effects of different tempos on the rhythmic competence of tennis players. Thirty university students whose mean International Tennis Number (ITN) was 7.3 (SD = 0.9) were divided randomly into three sub-groups: a tennis group (TG), a general rhythm training group (GRTG), and a tennis-specific rhythm training group (TRTG). Measurement instruments were the ITN, an Agility Test, the Rhythmic Competence Analysis Test (RCAT), and the Untimed Consecutive Rally Test (UCRT). A Kruskal-Wallis test was conducted to detect possible differences between initial scores and to compare the improvement scores of the groups. A Mann-Whitney U test was conducted for pairwise comparisons of the groups' improvement scores and to analyze RCAT scores at different tempos. Results revealed that participants in both rhythm training groups (GRTG and TRTG) improved their forehand consistency performance and rhythmic competence significantly after the training period. Improvement scores indicated a significant difference in the UCRT (3m) between TRTG and TG, and in the RCAT (50) between both rhythm training groups and TG. On the other hand, participation in additional rhythm training did not differentiate the groups' tennis playing level or agility performance. There was no significant difference between the two rhythm training groups on any of the parameters tested. Results also revealed that synchronization of participants' movements with the external stimulus was more precise at a fast tempo than at a slow tempo.
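As a rough illustration of the statistical procedure described in this abstract, the sketch below runs a Kruskal-Wallis test followed by pairwise Mann-Whitney U tests in Python with SciPy; the improvement scores are invented for illustration and are not the thesis data.

```python
# A minimal sketch of the nonparametric analysis described above, using
# invented improvement scores; group values are illustrative, not thesis data.
from scipy.stats import kruskal, mannwhitneyu

# Improvement scores on the Untimed Consecutive Rally Test (illustrative)
tg = [1, 2, 1, 0, 2, 1, 1, 0, 2, 1]        # tennis-only group
grtg = [3, 4, 2, 3, 5, 3, 4, 2, 3, 4]      # general rhythm training group
trtg = [4, 5, 3, 4, 6, 4, 5, 3, 4, 5]      # tennis-specific rhythm training group

# Omnibus test across the three groups
h, p = kruskal(tg, grtg, trtg)
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

# Pairwise follow-up, as in the study
for name, group in [("GRTG", grtg), ("TRTG", trtg)]:
    u, p = mannwhitneyu(group, tg, alternative="two-sided")
    print(f"{name} vs TG: U={u:.1f}, p={p:.4f}")
```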
|
192 |
Effectiveness Of Set Accelerating Admixtures With Different Cement Types. Ustuner, Didem Tugba, 01 September 2009.
Accelerating and mineral admixtures, among the major ingredients in concrete, are primarily used to modify the properties of both fresh and hardened concrete.
Within the scope of this thesis, four types of cement with almost identical fineness were used. The mixes were prepared using natural pozzolan, blast furnace slag, and limestone conforming to TS EN 197-1, and two types of accelerating admixtures, namely triethanolamine (TEA) and calcium formate (CF).
The effects of the set accelerating admixtures with the different cement types on setting time, water demand, and compressive strength were analyzed in an experimental study conducted in accordance with the relevant ASTM standards.
Finally, it has been observed that the amounts of accelerating admixture used are suitable in view of their effects on water demand, setting, and strength. Due to the density difference between mineral admixtures and clinker, the normal consistency and 110% flow water contents should be considered on a volumetric basis. The effectiveness of the accelerating admixtures on the normal consistency water, the 110% flow water content, and the setting time depends on the type and amount of mineral admixture. The increase caused by CF in the normal consistency and 110% flow water contents is higher than that caused by TEA. The accelerating effect of TEA and CF on setting times is more significant for cements incorporating 6% mineral admixture. The effects of the accelerating admixtures on compressive strength change with specimen age and with the type and amount of mineral admixture. Generally, for all cement types, early-age compressive strengths increase with increasing TEA, whereas long-term strengths increase with increasing CF.
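The remark about comparing water content on a volumetric basis can be illustrated with a small back-of-the-envelope calculation; the densities and water amounts below are assumed for illustration, not taken from the thesis.

```python
# A back-of-the-envelope sketch of why water demand is better compared on a
# volumetric basis; all densities and quantities here are assumed values.
def water_per_binder_volume(water_g, binder_g, binder_density_g_cm3):
    """Water (g) per cm^3 of binder solids."""
    binder_volume_cm3 = binder_g / binder_density_g_cm3
    return water_g / binder_volume_cm3

# Same 500 g of binder and 140 g of water for normal consistency:
clinker_rich = water_per_binder_volume(140, 500, 3.15)  # assumed clinker density
slag_blended = water_per_binder_volume(140, 500, 2.90)  # assumed blended density

print(f"Water per binder volume, clinker-rich: {clinker_rich:.3f} g/cm^3")
print(f"Water per binder volume, slag-blended: {slag_blended:.3f} g/cm^3")
# Equal water-by-mass hides that the lower-density binder occupies more volume,
# so its water content per unit binder volume is lower.
```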
|
193 |
Improving Edge Detection Using Intersection Consistency. Ciftci, Serdar, 01 October 2011.
Edge detection is an important step in computer vision since edges are used by subsequent visual processing stages for many tasks, including motion estimation, stereopsis, and shape representation and matching. In this study, we test whether a local consistency measure based on image orientation (which we call Intersection Consistency, or IC), previously shown to improve the detection of junctions, can improve the quality of edge detection for seven different detectors: Canny, Roberts, Prewitt, Sobel, Laplacian of Gaussian (LoG), Intrinsic Dimensionality, and the Line Segment Detector (LSD). IC works well on images that contain prominent objects differing in color from their surroundings. IC gives good results on natural images, especially those with cluttered backgrounds, and it leads to good results on images of human-made objects as well. However, depending on the amount of clutter, the loss of true positives may become more critical. Through a comprehensive investigation, we show that an approximately 21% increase in f-score is obtained, although some important edges are lost. We conclude from our experiments that IC is suitable for improving the quality of edge detection with detectors such as Canny, LoG, and LSD.
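The abstract does not reproduce the IC formula, so the sketch below uses a structure-tensor coherence measure as a loose stand-in for the general idea of re-weighting edge responses by local orientation agreement; it is not the thesis's method, only an assumed illustration of the same family of techniques.

```python
# A rough sketch of re-weighting an edge map by local orientation agreement.
# This is NOT the thesis's Intersection Consistency measure; structure-tensor
# coherence is used here as a stand-in for the idea of suppressing edges whose
# neighbourhood orientations disagree.
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def coherence_weighted_edges(image, window=7):
    gx = sobel(image.astype(float), axis=1)
    gy = sobel(image.astype(float), axis=0)
    # Structure tensor components, averaged over a local window
    jxx = uniform_filter(gx * gx, window)
    jyy = uniform_filter(gy * gy, window)
    jxy = uniform_filter(gx * gy, window)
    # Coherence in [0, 1]: 1 = one dominant local orientation, 0 = none
    tr = jxx + jyy
    det_term = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2)
    coherence = det_term / (tr + 1e-12)
    magnitude = np.hypot(gx, gy)
    return magnitude * coherence  # edges in incoherent regions are attenuated
```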
|
194 |
Behavioural profiles: a relational approach to behaviour consistency. Weidlich, Matthias, January 2011.
Business Process Management (BPM) emerged as a means to control, analyse, and optimise business operations. Conceptual models are of central importance for BPM. Most prominently, process models define the behaviour that is performed to achieve a business value. In essence, a process model is a mapping of properties of the original business process to the model, created for a purpose. Different modelling purposes, therefore, result in different models of a business process. Against this background, the misalignment of process models often observed in the field of BPM is no surprise. Even if the same business scenario is considered, models created for strategic decision making differ in content significantly from models created for process automation. Despite their differences, process models that refer to the same business process should be consistent, i.e., free of contradictions. Apparently, there is a trade-off between strictness of a notion of consistency and appropriateness of process models serving different purposes. Existing work on consistency analysis builds upon behaviour equivalences and hierarchical refinements between process models. Hence, these approaches are computationally hard and do not offer the flexibility to gradually relax consistency requirements towards a certain setting.
This thesis presents a framework for the analysis of behaviour consistency that takes a fundamentally different approach. As a first step, an alignment between corresponding elements of related process models is constructed. Then, this thesis conducts behavioural analysis grounded on a relational abstraction of the behaviour of a process model, its behavioural profile. Different variants of these profiles are proposed, along with efficient computation techniques for a broad class of process models. Using behavioural profiles, consistency of an alignment between process models is judged by different notions and measures. The consistency measures are also adjusted to assess conformance of process logs that capture the observed execution of a process. Further, this thesis proposes various complementary techniques to support consistency management. It elaborates on how to implement consistent change propagation between process models, addresses the exploration of behavioural commonalities and differences, and proposes a model synthesis for behavioural profiles.
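As a small illustration of the relational abstraction the thesis builds on, the sketch below derives the standard behavioural-profile relations (strict order, exclusiveness, interleaving) from example traces; deriving them from traces rather than from process models is a simplification for illustration.

```python
# A minimal sketch of the relational idea behind behavioural profiles,
# computed here from example traces rather than from a process model.
# Relation names follow the standard definitions over the weak order:
# strict order (->), exclusiveness (+), interleaving (||).
from itertools import combinations

def weak_order(traces):
    """Pairs (a, b) such that a occurs before b in at least one trace."""
    rel = set()
    for trace in traces:
        for i, a in enumerate(trace):
            for b in trace[i + 1:]:
                rel.add((a, b))
    return rel

def behavioural_profile(traces):
    activities = sorted({a for t in traces for a in t})
    wo = weak_order(traces)
    profile = {}
    for a, b in combinations(activities, 2):
        if (a, b) in wo and (b, a) in wo:
            profile[(a, b)] = "||"      # interleaving
        elif (a, b) in wo:
            profile[(a, b)] = "->"      # strict order
        elif (b, a) in wo:
            profile[(a, b)] = "<-"      # reversed strict order
        else:
            profile[(a, b)] = "+"       # exclusiveness
    return profile

# Example: b and c are interleaved, c and d are exclusive
print(behavioural_profile([["a", "b", "c"], ["a", "c", "b"], ["a", "b", "d"]]))
```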
|
195 |
The estimation of the truncation ratio and an algorithm for the parameter estimation in the random interval truncation model. Zhu, Huang-Xu, 01 August 2003.
For interval-censored and truncated failure time data, the truncation ratio is unknown. In this thesis, we propose an algorithm, similar to Turnbull's, to estimate the parameters. The truncation ratio for interval-censored and truncated failure time data can also be estimated from the convergence result of the algorithm. A simulation study is conducted to compare the proposed algorithm with Turnbull (1976); our algorithm appears to give better results.
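The abstract does not spell out the algorithm, so the sketch below shows a generic Turnbull-style self-consistency (EM) iteration for interval-censored data, omitting the truncation handling that is the thesis's contribution; the intervals and support points are illustrative.

```python
# A minimal sketch of a Turnbull-style self-consistency (EM) iteration for
# interval-censored data, WITHOUT the truncation part handled in the thesis.
# Each observation is an interval [L, R] known to contain the failure time.
import numpy as np

def self_consistency(intervals, support, tol=1e-8, max_iter=1000):
    """Estimate probability masses on candidate support points."""
    # alpha[i, j] = 1 if support point j lies inside interval i
    alpha = np.array([[(l <= s <= r) for s in support] for l, r in intervals],
                     dtype=float)
    p = np.full(len(support), 1.0 / len(support))   # uniform start
    for _ in range(max_iter):
        denom = alpha @ p                           # P(interval i) under p
        # E-step + M-step: average each observation's expected share
        p_new = (alpha * p / denom[:, None]).mean(axis=0)
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p

intervals = [(1, 3), (2, 5), (4, 6), (1, 2)]   # illustrative data
support = [1.5, 2.5, 4.5, 5.5]
print(self_consistency(intervals, support).round(3))
```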
|
196 |
Essays on monetary policy and banking regulation. Li, Jingyuan, 15 November 2004.
A central bank is usually assigned two functions: the control of inflation and the maintenance of a safe banking sector. What are the precise conditions under which trigger strategies from the private sector can solve the time-inconsistency problem and induce the central bank to choose zero inflation under a nonstationary natural rate? Can an optimal contract be used together with reputation forces to implement a desired socially optimal monetary policy rule? How can a truth-telling contract be designed to control the risk-taking behavior of banks? My dissertation addresses these issues using three primary methodologies: monetary economics, game theory, and optimal stochastic control theory.
|
197 |
A groupware interface to a shared file system. Faltemier, Timothy Collin, 17 February 2005.
Current shared file systems (NFS and SAMBA) are based on the local area network model. For these file systems, performance is the major concern. However, as the Internet grows, so does the distance between users and the local area network, and with this increase in distance, latency increases as well. This creates a problem when multiple users attempt to work in a shared environment: traditionally, collaborating over the Internet required the use of locks.

These requirements motivated the creation of the State Difference Transformation (SDT) algorithm, which allows users non-blocking and unconstrained interaction across the Internet on a tree-based structure. Fine Grain Locking, on the other hand, allows a user to set a lock on a character or range of characters while using a form of the transformation algorithm described above. This thesis proposes an implementation that integrates these two technologies and demonstrates the effectiveness and flexibility of State Difference Transformation.

The implementation includes two applications that can be used to further research in both the transformation and locking communities. The first application allows users to create tests for SDT and Fine Grain Locking and to verify the correctness of the algorithms in any given situation. The second application furthers this research by creating a real-world groupware interface to a shared file system based on a client-server architecture. This implementation demonstrates the usability and robustness of these algorithms in real-world situations.
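State Difference Transformation itself is not reproduced in the abstract, so the sketch below illustrates the underlying operational-transformation idea with the classic transform for concurrent character insertions; it is a stand-in for the general technique, not SDT, and all names are illustrative.

```python
# A minimal sketch of the inclusion-transformation idea behind non-blocking
# collaborative editing. This classic character-wise insert transform is a
# stand-in for the general technique; it is NOT State Difference Transformation.
from dataclasses import dataclass

@dataclass
class Insert:
    pos: int     # index in the document
    ch: str      # character inserted
    site: int    # site id, used to break ties deterministically

def transform(op, against):
    """Adjust op so it can be applied after `against` has been applied."""
    if against.pos < op.pos or (against.pos == op.pos and against.site < op.site):
        return Insert(op.pos + 1, op.ch, op.site)
    return op

# Two users edit "ac" concurrently: site 1 inserts "b" at 1, site 2 "d" at 2
o1, o2 = Insert(1, "b", 1), Insert(2, "d", 2)
doc = "ac"
s1 = doc[:o1.pos] + o1.ch + doc[o1.pos:]     # site 1 applies o1 -> "abc"
t2 = transform(o2, o1)                       # o2's position shifts 2 -> 3
s1 = s1[:t2.pos] + t2.ch + s1[t2.pos:]       # -> "abcd"
s2 = doc[:o2.pos] + o2.ch + doc[o2.pos:]     # site 2 applies o2 -> "acd"
t1 = transform(o1, o2)                       # o1's position is unchanged
s2 = s2[:t1.pos] + t1.ch + s2[t1.pos:]       # -> "abcd"
print(s1 == s2)  # True: both sites converge without locking
```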
|
198 |
On recovery and consistency preservation in distributed real-time database systems. Gustavsson, Sanny, January 2000.
In this dissertation, we consider the problem of recovering a crashed node in a distributed database. We especially focus on real-time recovery in eventually consistent databases, where the consistency of replicated data is traded off for increased predictability, availability, and performance. To achieve this focus, we consider consistency preservation techniques as well as recovery mechanisms.

Our approach is to perform a thorough literature survey of these two fields. The survey considers not only recovery in real-time, distributed, eventually consistent databases, but also related techniques, such as recovery in main-memory-resident or immediately consistent databases. We also examine different techniques for consistency preservation.

Based on this literature survey, we present a taxonomy and state-of-the-art report on recovery mechanisms and consistency preservation techniques. We contrast different recovery mechanisms and highlight properties and aspects that make them more or less suitable for use in an eventually consistent database. We also identify unexplored areas and uninvestigated problems within the fields of database recovery and consistency preservation. We find that research on real-time recovery in distributed databases is lacking, and we propose further investigation of how the choice of consistency preservation technique affects (or should affect) the design of a recovery mechanism for the system.
|
199 |
How to implement Bounded-Delay replication in DeeDS. Eriksson, Daniel, January 2002.
In a distributed database system, pessimistic concurrency control is often used to ensure consistency, which implies that the execution time of a transaction is not predictable: it depends not only on local transactions, but on every transaction in the system.

In real-time database systems it is important that transactions are predictable. One way to make transactions predictable is to use eventual consistency, where transactions commit locally before they are propagated to other nodes in the system. Transactions can then be made predictable because the execution time of a transaction depends only on concurrent transactions on the local node, not on delays on other nodes or in the network.

This report investigates how a replication protocol using eventual consistency can be designed for, and implemented in, DeeDS, a distributed real-time database prototype. The protocol consists of three parts: a propagation method, a conflict detection algorithm, and a conflict resolution mechanism. The conflict detection algorithm is based on version vectors. The focus is on the propagation mechanism and the conflict detection algorithm of the replication protocol.

An implementation design of the replication protocol is presented, with a discussion of how version vectors may be applied in terms of granularity (container, page, object, or attribute) and how the log filter should be designed and implemented to suit the particular conflict detection algorithm. A number of test cases, focused on regression testing, have been defined.

It is concluded that the feasibility of the conflict detection algorithm depends on the type of application that uses DeeDS.
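As a small illustration of the version-vector conflict detection the protocol relies on, the sketch below compares two vectors and reports whether one dominates the other or the updates are concurrent; the node names and update scenario are illustrative.

```python
# A minimal sketch of version-vector conflict detection of the kind the
# replication protocol builds on; node names and update flow are illustrative.
def dominates(v1, v2):
    """True if v1 has seen every update that v2 has seen."""
    nodes = set(v1) | set(v2)
    return all(v1.get(n, 0) >= v2.get(n, 0) for n in nodes)

def compare(v1, v2):
    if dominates(v1, v2) and dominates(v2, v1):
        return "equal"
    if dominates(v1, v2):
        return "v1 newer"
    if dominates(v2, v1):
        return "v2 newer"
    return "conflict"   # concurrent updates: conflict resolution is needed

# Nodes A and B both update the same object after seeing version {A: 1}
replica_a = {"A": 2}            # A's local commit bumps its own counter
replica_b = {"A": 1, "B": 1}    # B's local commit bumps its own counter
print(compare(replica_a, replica_b))   # -> "conflict"
```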
|
200 |
Consistency management in collaborative modelling and simulation. Ulriksson, Jenny, January 2005.
The aim of this thesis is to exploit the technological capabilities of computer-supported collaborative work (CSCW) in the field of collaborative Modelling and Simulation (M&S). The thesis focuses on two main problems: (i) providing flexible means of consistency management in collaborative M&S, and (ii) providing platform- and application-independent services for collaborative M&S.

In this work, some CSCW technologies have been studied, along with how their concepts can be incorporated in a distributed collaborative M&S environment. An environment for component-based simulation development and visualization, which provides support for collaborative M&S, has been designed. Some consistency policies that can be used in conjunction with distributed simulation and the High Level Architecture (HLA) have been investigated, and the efficient combined use of HLA and XML as the foundation of a CSCW infrastructure has been demonstrated. Two consistency policies, a strict and an optimistic one, were implemented on top of HLA in the distributed collaborative environment, and their performance was compared with that of a totally relaxed policy in various collaboration situations.
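As an illustration of the difference between the strict and optimistic policies compared above, the sketch below contrasts a blocking update with an apply-and-detect update on a single shared attribute; the HLA machinery is omitted and all names are illustrative assumptions.

```python
# A minimal sketch of strict vs. optimistic consistency on one shared
# attribute; HLA specifics are omitted and names are illustrative.
import threading

class SharedAttribute:
    def __init__(self):
        self._value, self._version = None, 0
        self._lock = threading.Lock()

    def strict_update(self, value):
        # Strict policy: serialize all writers; no conflicts, but writers block.
        with self._lock:
            self._value, self._version = value, self._version + 1

    def optimistic_update(self, value, read_version):
        # Optimistic policy: apply without waiting, detect stale writes.
        with self._lock:                      # only to keep the check atomic
            if read_version != self._version:
                return False                  # conflict: caller must resolve
            self._value, self._version = value, self._version + 1
            return True

attr = SharedAttribute()
attr.strict_update("red")                            # version is now 1
ok = attr.optimistic_update("blue", read_version=0)  # based on a stale read
print(ok)  # False: the version moved on since the read, so a conflict is flagged
```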
|