341

Interaction Competence: A concept describing the competence needed for participation in face-to-face interaction

Lindgren, Josefin Astrid Maria January 2008 (has links)
Face-to-face interaction has been studied within sociology and linguistics, as well as within other disciplines. The perspective has often been too narrow, which is not compatible with the diverse and dynamic nature of this type of interaction. This narrow view prevents full understanding of interaction. In this theoretical paper it is suggested that face-to-face interaction has to be studied from a broad perspective; when studying face-to-face interaction it is necessary to acknowledge its dynamic nature, and therefore necessary to combine knowledge from different disciplines. In this paper, I combine theories from linguistics and sociology in order to gain a broader perspective on interaction. What has been missing from earlier research on face-to-face interaction, and on the competence needed to participate in such interaction, is not knowledge of the different features of interaction but a will to connect them all. Existing concepts for describing the competence needed in order to interact have often failed to capture the dynamic, multi-faceted nature of interaction; there has been a tendency to try to explain everything with just one factor. In this paper, a more comprehensive concept of the competence an interactant needs within face-to-face interaction is proposed and sketched; a concept which I name Interaction Competence. This competence is the knowledge and abilities an interactant needs in order to interact with others. The concept, which can be a valuable analytical tool for analyzing face-to-face interaction, has Dell Hymes' concept of Communicative Competence and Erving Goffman's and Ann Warfield Rawls' concept of Interaction Order as building blocks, and consists of four main areas of competence: Control Body, Command Language, Handle Socio-cultural Knowledge and Understand Interaction Order. The paper also examines the effect of two interactant-external factors: the context and acceptability. Both are found to be highly relevant for the Interaction Competence of an interactant, hence the need to acknowledge the role of sufficient and acceptable Interaction Competence. / Presented (in addition to the thesis seminar) within the framework of the Sociology Department's IMER (International Migration and Ethnic Relations) seminar.
342

Analysis of some risk models involving dependence

Cheung, Eric C.K. January 2010 (has links)
The seminal paper by Gerber and Shiu (1998) gave a huge boost to the study of risk theory by not only unifying but also generalizing the treatment and the analysis of various risk-related quantities in one single mathematical function - the Gerber-Shiu expected discounted penalty function, or Gerber-Shiu function for short. The Gerber-Shiu function is known to possess many nice properties, at least in the case of the classical compound Poisson risk model. For example, upon the introduction of a dividend barrier strategy, it was shown by Lin et al. (2003) and Gerber et al. (2006) that the Gerber-Shiu function with a barrier can be expressed in terms of the Gerber-Shiu function without a barrier and the expected value of discounted dividend payments. This result is the so-called dividends-penalty identity, and it holds true when the surplus process belongs to a class of Markov processes which are skip-free upwards. However, one stringent assumption of the model considered by the above authors is that all the interclaim times and the claim sizes are independent, which is in general not true in reality. In this thesis, we propose to analyze the Gerber-Shiu functions under various dependence structures. The main focus of the thesis is the risk model where claims follow a Markovian arrival process (MAP) (see, e.g., Latouche and Ramaswami (1999) and Neuts (1979, 1989)) in which the interclaim times and the claim sizes form a chain of dependent variables. The first part of the thesis puts emphasis on certain dividend strategies. In Chapter 2, it is shown that a matrix form of the dividends-penalty identity holds true in a MAP risk model perturbed by diffusion with the use of integro-differential equations and their solutions. Chapter 3 considers the dual MAP risk model which is a reflection of the ordinary MAP model. A threshold dividend strategy is applied to the model and various risk-related quantities are studied. Our methodology is based on an existing connection between the MAP risk model and a fluid queue (see, e.g., Asmussen et al. (2002), Badescu et al. (2005), Ramaswami (2006) and references therein). The use of fluid flow techniques to analyze risk processes opens the door for further research as to what types of risk model with dependency structure can be studied via probabilistic arguments. In Chapter 4, we propose to analyze the Gerber-Shiu function and some discounted joint densities in a risk model where each pair of the interclaim time and the resulting claim size is assumed to follow a bivariate phase-type distribution, with the pairs assumed to be independent and identically distributed (i.i.d.). To this end, a novel fluid flow process is constructed to ease the analysis. In the classical Gerber-Shiu function introduced by Gerber and Shiu (1998), the random variables incorporated into the analysis include the time of ruin, the surplus prior to ruin and the deficit at ruin. The latter part of this thesis focuses on generalizing the classical Gerber-Shiu function by incorporating more random variables into the so-called penalty function. These include the surplus level immediately after the second last claim before ruin, the minimum surplus level before ruin and the maximum surplus level before ruin. In Chapter 5, the focus will be on the study of the generalized Gerber-Shiu function involving the first two new random variables in the context of a semi-Markovian risk model (see, e.g., Albrecher and Boxma (2005) and Janssen and Reinhard (1985)). It is shown that the generalized Gerber-Shiu function satisfies a matrix defective renewal equation, and some discounted joint densities involving the new variables are derived. Chapter 6 revisits the MAP risk model in which the generalized Gerber-Shiu function involving the maximum surplus before ruin is examined. In this case, the Gerber-Shiu function no longer satisfies a defective renewal equation. Instead, the generalized Gerber-Shiu function can be expressed in terms of the classical Gerber-Shiu function and the Laplace transform of a first passage time that are both readily obtainable. In a MAP risk model, the interclaim time distribution must be phase-type distributed. This leads us to propose a generalization of the MAP risk model by allowing the interclaim time to have an arbitrary distribution. This is the subject matter of Chapter 7. Chapter 8 is concerned with the generalized Sparre Andersen risk model with surplus-dependent premium rate, and some ordering properties of certain ruin-related quantities are studied. Chapter 9 ends the thesis with some concluding remarks and directions for future research.
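For orientation, here is a minimal statement of the classical Gerber-Shiu function the abstract builds on; the notation below is the conventional one and is an assumption for exposition, not taken from the thesis itself.

```latex
% Classical Gerber-Shiu expected discounted penalty function (Gerber and Shiu, 1998).
% Assumed conventional notation: U(t) is the surplus process with U(0) = u, T is the
% time of ruin, \delta \ge 0 is the force of interest, and w is the penalty function.
\[
  \phi(u) \;=\; \mathbb{E}\!\left[ e^{-\delta T}\, w\bigl(U(T^-),\, |U(T)|\bigr)\,
  \mathbf{1}\{T < \infty\} \;\middle|\; U(0) = u \right],
\]
% where U(T^-) is the surplus immediately prior to ruin and |U(T)| is the deficit at ruin:
% the three random variables the abstract says enter the classical penalty function.
```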
343

Policy-Driven Framework for Static Identification and Verification of Component Dependencies

Livogiannis, Anastasios 02 June 2011 (has links)
Software maintenance is considered to be among the most difficult, lengthy and costly parts of a software application's life-cycle. Regardless of the nature of the software application and the software engineering efforts to reduce component coupling to a minimum, dependencies between software components in applications will always exist and initiate software maintenance operations, as they tend to threaten the "health" of the software system during the evolution of particular components. The situation is more serious with modern technologies and development paradigms, such as Service Oriented Architecture systems and Cloud Computing, which introduce larger software systems consisting of a substantial number of components that exhibit numerous types of dependencies on each other. This work proposes a reference architecture and a corresponding software framework that can be used to model the dependencies between components in software systems and can support the verification of a set of policies that are derived from system dependencies and are relevant to the software maintenance operations being applied. Dependency modelling is performed using configuration information from the system, as well as information harvested from component interface descriptions. The proposed approach has been applied to a medium-scale SOA system, namely the SCA Travel Sample from the Apache Software Foundation, and has been evaluated for performance against a configuration specification for a simulated SOA system consisting of up to a thousand web services offered by a few hundred components.
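To make the dependency-modelling and policy-verification idea concrete, here is a minimal sketch; the component names, the single removal policy, and the API are hypothetical illustrations, not part of the proposed framework.

```python
# Minimal, hypothetical sketch of static dependency modelling and policy checking.
# Component names and the single "removal" policy below are illustrative only.
from collections import defaultdict

class DependencyModel:
    def __init__(self):
        # dependents["A"] = {"B", ...} means component B depends on (consumes) component A
        self.dependents = defaultdict(set)

    def add_dependency(self, consumer: str, provider: str) -> None:
        self.dependents[provider].add(consumer)

    def check_removal_policy(self, component: str) -> list[str]:
        """Policy: a component may only be removed if nothing still depends on it.
        Returns the violating dependents (an empty list means the operation is allowed)."""
        return sorted(self.dependents.get(component, set()))

# Example usage with made-up components of a small SOA-style system.
model = DependencyModel()
model.add_dependency("BookingService", "PaymentService")
model.add_dependency("BookingService", "HotelService")
model.add_dependency("FrontendApp", "BookingService")

violations = model.check_removal_policy("PaymentService")
if violations:
    print("Removal blocked; still required by:", violations)   # ['BookingService']
else:
    print("Removal allowed")
```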
344

Modeling and Querying Uncertainty in Data Cleaning

Beskales, George January 2012 (has links)
Data quality problems such as duplicate records, missing values, and violations of integrity constraints frequently appear in real-world applications. Such problems cost enterprises billions of dollars annually and might have unpredictable consequences in mission-critical tasks. The process of data cleaning refers to detecting and correcting errors in data in order to improve data quality. Numerous efforts have been made towards improving the effectiveness and the efficiency of data cleaning. A major challenge in the data cleaning process is the inherent uncertainty about the cleaning decisions that should be taken by the cleaning algorithms (e.g., deciding whether two records are duplicates or not). Existing data cleaning systems deal with the uncertainty in data cleaning decisions by selecting one alternative, based on some heuristics, while discarding (i.e., destroying) all other alternatives, which results in a false sense of certainty. Furthermore, because of the complex dependencies among cleaning decisions, it is difficult to reverse the process of destroying some alternatives (e.g., when new external information becomes available). In most cases, restarting the data cleaning from scratch is inevitable whenever we need to incorporate new evidence. To address the uncertainty in the data cleaning process, we propose a new approach, called probabilistic data cleaning, that views data cleaning as a random process whose possible outcomes are possible clean instances (i.e., repairs). Our approach generates multiple possible clean instances to avoid the destructive aspect of current cleaning systems. In this dissertation, we apply this approach in the context of two prominent data cleaning problems: duplicate elimination and repairing violations of functional dependencies (FDs). First, we propose a probabilistic cleaning approach for the problem of duplicate elimination. We define a space of possible repairs that can be efficiently generated. To achieve this goal, we concentrate on a family of duplicate detection approaches that are based on parameterized hierarchical clustering algorithms. We propose a novel probabilistic data model that compactly encodes the defined space of possible repairs. We show how to efficiently answer relational queries using the set of possible repairs. We also define new types of queries that reason about the uncertainty in the duplicate elimination process. Second, in the context of repairing violations of FDs, we propose a novel data cleaning approach that allows sampling from a space of possible repairs. Initially, we contrast the existing definitions of possible repairs, and we propose a new definition of possible repairs that can be sampled efficiently. We present an algorithm that randomly samples from this space, and we present multiple optimizations to improve the performance of the sampling algorithm. Third, we show how to apply our probabilistic data cleaning approach in scenarios where both data and FDs are unclean (e.g., due to data evolution or inaccurate understanding of the data semantics). We propose a framework that simultaneously modifies the data and the FDs while satisfying multiple objectives, such as consistency of the resulting data with respect to the resulting FDs, (approximate) minimality of changes of data and FDs, and leveraging the trade-off between trusting the data and trusting the FDs. In the presence of uncertainty in the relative trust in data versus FDs, we show how to extend our cleaning algorithm to efficiently generate multiple possible repairs, each of which corresponds to a different level of relative trust.
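As a much-simplified illustration of the FD-repair setting (not the thesis's algorithm), the sketch below detects violations of a single FD and randomly picks one naive repair; the attribute names and data are hypothetical, and the repair space here is far cruder than the ones defined in the dissertation.

```python
# Hypothetical, simplified illustration: detect violations of an FD lhs -> rhs and
# sample one naive repair (overwrite rhs with a value chosen from the conflicting group).
import random
from collections import defaultdict

def fd_violations(rows, lhs, rhs):
    """Group rows by the LHS attribute; a group violates lhs -> rhs if it has >1 RHS value."""
    groups = defaultdict(list)
    for i, row in enumerate(rows):
        groups[row[lhs]].append(i)
    return {key: idxs for key, idxs in groups.items()
            if len({rows[i][rhs] for i in idxs}) > 1}

def sample_repair(rows, lhs, rhs, rng=random):
    """Return one possible repair: for each violating group, pick one of its RHS values at random."""
    repaired = [dict(row) for row in rows]
    for idxs in fd_violations(rows, lhs, rhs).values():
        chosen = rng.choice([rows[i][rhs] for i in idxs])
        for i in idxs:
            repaired[i][rhs] = chosen
    return repaired

# Example: FD zip -> city, with two records disagreeing on the city for zip 10001.
data = [{"zip": "10001", "city": "New York"},
        {"zip": "10001", "city": "Newyork"},
        {"zip": "60601", "city": "Chicago"}]
print(sample_repair(data, "zip", "city"))
```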
345

Characterization and Avoidance of Critical Pipeline Structures in Aggressive Superscalar Processors

Sassone, Peter G. 20 July 2005 (has links)
In recent years, with only small fractions of modern processors now accessible in a single cycle, computer architects constantly fight against propagation issues across the die. Unfortunately this trend continues to shift inward, and now even the most internal features of the pipeline are designed around communication, not computation. To address the inward creep of this constraint, this work focuses on the characterization of communication within the pipeline itself, architectural techniques to avoid it when possible, and layout co-design for early detection of problems. I present work on creating a novel detection tool for common-case operand movement which can rapidly characterize an application's dataflow patterns. The results produced are suitable for exploitation, as a small number of patterns can describe a significant portion of modern applications. Work on dynamic dependence collapsing takes the observations from the pattern results and shows how certain groups of operations can be dynamically grouped, avoiding unnecessary communication between individual instructions. This technique also amplifies the efficiency of pipeline data structures such as the reorder buffer, increasing both IPC and frequency. I also identify the same sets of collapsible instructions at compile time, producing the same benefits with minimal hardware complexity. This technique is also done in a backward-compatible manner, as the groups are exposed by simple reordering of the binary's instructions. I present aggressive pipelining approaches for these resources which avoid the critical timing often presumed necessary in aggressive superscalar processors. As these structures are designed for the worst case, pipelining them can produce greater frequency benefit than IPC loss. I also use the observation that the dynamic issue order for instructions in aggressive superscalar processors is predictable. Thus, a hardware mechanism is introduced for efficiently caching the wakeup order for groups of instructions. These wakeup vectors are then used to speculatively schedule instructions, avoiding the dynamic scheduling when it is not necessary. Finally, I present a novel approach to fast and high-quality chip layout. By allowing architects to quickly evaluate what-if scenarios during early high-level design, chip designs are less likely to encounter implementation problems later in the process.
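As a toy illustration of the dependence-collapsing idea, the sketch below pairs a producer instruction with its sole consumer in a short trace; the instruction format and the single-consumer pairing rule are simplifying assumptions for exposition, not the mechanisms evaluated in the thesis.

```python
# Toy illustration: find producer/consumer instruction pairs where the producer's result
# has exactly one consumer, so the pair could conceptually be grouped (collapsed).
from collections import namedtuple

Instr = namedtuple("Instr", ["idx", "op", "dest", "srcs"])

def collapsible_pairs(trace):
    consumers = {}      # producer index -> indices of instructions reading its result
    last_writer = {}    # register name -> index of its most recent writer
    for ins in trace:
        for src in set(ins.srcs):
            if src in last_writer:
                consumers.setdefault(last_writer[src], []).append(ins.idx)
        last_writer[ins.dest] = ins.idx
    # Collapse only producers whose result is consumed exactly once (no fan-out).
    return [(producer, readers[0]) for producer, readers in consumers.items() if len(readers) == 1]

trace = [Instr(0, "add", "r1", ["r2", "r3"]),
         Instr(1, "mul", "r4", ["r1", "r5"]),   # sole consumer of r1: collapsible with 0
         Instr(2, "sub", "r6", ["r4", "r2"]),   # sole consumer of r4: collapsible with 1
         Instr(3, "add", "r7", ["r2", "r6"])]   # sole consumer of r6: collapsible with 2
print(collapsible_pairs(trace))                  # [(0, 1), (1, 2), (2, 3)]
```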
346

The Political Economy of TNCs and the Host Country's Industrial Policies: A Case Study of the Thai Automotive Industry

Hung, Po-Chih 10 July 2011 (has links)
Thailand is the world's 13th-largest automotive manufacturer and an important parts and components manufacturer. After 2000, Toyota and GM established R&D centres and production bases in Thailand, and Thailand has now become the "Detroit of Asia." How can we characterize the developmental model of the Thai automotive industry? It may not be explainable by any single theory, so we select three theories, of different ideological orientations, to trace the path of industrial development across three periods. We find the following path of automotive industrial development in Thailand: the Triple Alliance of dependent development theory explains how the industry upgraded and became more and more important; the international division of labour then made Thailand a key node in the global supply chain; and, as laissez-faire and comparative advantage characterize the most recent period, the developmental model of neo-classical economic theory explains how Thailand came to shine in the world.
347

The Juristic Construction of the Separation of Public Affairs between Central and Local Governments in Taiwan, ROC: A New-Institutionalist Approach

Wei, Chih-yen 16 January 2004 (has links)
Whether public affairs in Taiwan are executed by the central or local governments is determined by the constitution, laws and orders. The constitutional clauses were derived from the "principle of proper separation of competence", first asserted by Dr. Sun Yat-sen. Under these clauses, the local and central governments in Taiwan should deal with different affairs, properly divided according to their nature. But those clauses were never put into practice because of the unforeseen failure on the mainland. To initiate and regulate the institution of local self-government, many acts and decrees were enacted after 1949, which gradually twisted the meaning of the clauses that ought to have been obeyed. Moreover, the growing disputes in recent years between local and central authorities over expenditure show the problems that result from this deviation. This thesis elaborates on the articles of the institution, combining the constitutional clauses with the ultimate aims of local self-government, including democracy, the vertical separation of powers, and how local governments are protected by law. It also describes and analyses the whole juristic construction for separating local and central affairs and, with a new-institutionalist approach, tries to find the key factors that made this institution evolve, change and depart from the principles it should fulfil. From these factors, the thesis finds that the deviation of the institution from what ought to be fulfilled evolved along a "path", the way in which it changed the previous aim or purpose. Consequently, once the "path" has formed, the actors of the institution, the local and central governments, will not obey the constitutional clauses and will keep operating the deviated system for the separation of central and local affairs.
348

The Study of the Educational Thought of Martin Carnoy: The Relation between Education and the State

Lee, Jowquen 29 June 2004 (has links)
In view of the political economy of education, the purpose of this thesis is to study the educational thought of Martin Carnoy, a political economist and educationist in the U.S. We are concerned with the relationship between education and the capitalist state. Central to this thesis is state theory, and the functions and roles of education are discussed in different contexts, including the colonial period, developing countries and the advanced capitalist state. Since the spread of imperialism in the colonial period, colonial schooling has been dominated by the colonizer and has rationalized colonialism; colonial schooling is therefore a liberating force to help the colonized against the colonizer. According to Lenin's theory of imperialism, Carnoy explains the relation between colonial education and the colonizer in the colonial period. In developing countries, both the conditioned capitalist state and the transition state, the state bureaucracy makes national economic growth its first priority, and the educational goal follows suit. People desire their children to acquire more knowledge, which drives mass education to expand rapidly. Based on educational dependency theory, Carnoy accounts for the roles of education in Third World states. In the advanced capitalist state, the state is both a product and a shaper of class struggle. Thus, the source of educational change is pressed by economic reproduction and democratic dynamics. Drawing on the late thought of Poulantzas, Carnoy constructs a "social-conflict theory" to predict that economic development and social movements influence education policies. It should be concluded from Carnoy's educational thought that the core of his work on education is state theory. He criticizes the problems of capitalist education and affirms the positive functions of schooling.
349

Land Use Optimization for Improved Transportation System Performance, Case Study: Ankara

Alayli, Berna 01 December 2006 (has links) (PDF)
This thesis investigates the effects of urban land use on transportation system performance in terms of various land use factors such as density, mixed or single land use, jobs-housing balance, street patterns and transit accessibility. The reviewed studies show that urban land use has considerable effects on transportation system performance measures such as average travel distance per person, level of service, air quality and gasoline consumption. Based on the obtained results, it is concluded that one of the basic reasons behind increasing auto dependency and the resulting problems in recent years is the lack of coordination between land use and the transportation system. The obtained results are used to analyze land use impacts on the transportation system of Ankara. Urban transportation planning decisions, deficiencies in implementation and the resulting problems are discussed in terms of land use and transportation interaction. Possible land use regulations which can help relieve the transportation problems of Ankara are proposed.
350

Video Distribution over IP Networks

Ozdem, Mehmet 01 February 2007 (has links) (PDF)
As applications like IPTV and VoD (Video on Demand) gain popularity, it is becoming more important to study the behavior of video signals in Internet access infrastructures such as ADSL and cable networks. Average delay, average jitter and packet loss in these networks affect the quality of service, hence transmission and access speeds need to be determined such that these parameters are minimized. In this study the behavior of the above-mentioned IP networks under variable bit rate (VBR) video traffic is investigated. The ns-2 simulator is used for this purpose, and both actual and artificially generated signals are applied to the networks under test. VBR traffic is generated synthetically using ON/OFF sources with ON/OFF times taken from exponential or Pareto distributions. As VBR video shows long-range dependence with a Hurst parameter between 0.5 and 1, this parameter was used as a metric to measure the accuracy of the synthetic sources. Two different topologies were simulated in this study: one similar to ADSL access networks and the other behaving like a cable distribution network. The performance of the networks (delay, jitter and packet loss) under VBR video traffic and different access speeds was measured. Based on the obtained results, minimum access speeds needed to achieve acceptable-quality video delivery to customers are suggested.
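As a rough, self-contained sketch of the kind of synthetic source and Hurst-parameter check described above (the parameter values and the aggregated-variance estimator are illustrative assumptions, not the thesis's ns-2 setup):

```python
# Illustrative sketch: generate ON/OFF traffic with Pareto-distributed ON/OFF durations
# and estimate the Hurst parameter of the per-slot rate series with the aggregated-variance
# method. All parameters are arbitrary; this is not the ns-2 configuration used in the study.
import numpy as np

def on_off_trace(n_slots, shape=1.4, scale=5.0, peak_rate=1.0, seed=0):
    """Per-slot transmission rate of one ON/OFF source with (classical) Pareto period lengths."""
    rng = np.random.default_rng(seed)
    rate, t, on = np.zeros(n_slots), 0, True
    while t < n_slots:
        dur = int(np.ceil((rng.pareto(shape) + 1.0) * scale))  # classical Pareto, minimum = scale
        if on:
            rate[t:t + dur] = peak_rate
        t += dur
        on = not on
    return rate

def hurst_aggregated_variance(x, block_sizes=(1, 2, 4, 8, 16, 32, 64, 128)):
    """Slope of log(variance of block means) versus log(block size) equals 2H - 2."""
    logs_m, logs_v = [], []
    for m in block_sizes:
        k = len(x) // m
        means = x[:k * m].reshape(k, m).mean(axis=1)
        logs_m.append(np.log(m))
        logs_v.append(np.log(means.var()))
    slope = np.polyfit(logs_m, logs_v, 1)[0]
    return 1.0 + slope / 2.0

# Aggregate many independent sources; heavy-tailed ON/OFF aggregation yields long-range dependence.
traffic = sum(on_off_trace(200_000, seed=s) for s in range(20))
print("Estimated Hurst parameter:", round(hurst_aggregated_variance(traffic), 2))  # expected in (0.5, 1)
```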
