311 |
Distributed Detection Using Censoring Schemes with an Unknown Number of NodesHsu, Ming-Fong 04 September 2008 (has links)
Energy efficiency, under an energy constraint, is important for applications in wireless sensor networks. For the distributed detection problem considered in this thesis, each sensor makes a local decision based on its observation and transmits a one-bit message to the fusion center. We consider local sensors employing a censoring scheme, in which a sensor remains silent and transmits nothing to the fusion center when its observation is not very informative. The goal of this thesis is an energy-efficient design for distributed detection under the censoring scheme. Simulation results show that we can achieve the same error probabilities of decision fusion while conserving more energy, as compared with detection without censoring. We also demonstrate that the error probability of decision fusion is a convex function of the censoring probability.
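The censoring idea can be sketched in a few lines. The Gaussian observation model, the thresholds and the majority fusion rule below are illustrative assumptions, not the decision rules derived in the thesis:

```python
def sensor_message(x, t_low, t_high):
    """Censoring rule (illustrative): the sensor computes the log-likelihood
    ratio (LLR) of its observation and stays silent when the LLR is not very
    informative, i.e. falls strictly between the two thresholds."""
    llr = x - 0.5  # LLR for H1: N(1,1) versus H0: N(0,1) (assumed model)
    if llr >= t_high:
        return 1      # transmit one bit: "target present"
    if llr <= t_low:
        return 0      # transmit one bit: "target absent"
    return None       # censored: transmit nothing and save energy

def fuse(messages):
    """Majority fusion over the non-censored one-bit messages; defaults to
    "absent" when every sensor is silent."""
    votes = [m for m in messages if m is not None]
    if not votes:
        return 0
    return 1 if 2 * sum(votes) >= len(votes) else 0
```

Widening the censoring region increases the fraction of silent sensors, which is where the energy saving comes from.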
|
312 |
How does credit rating migration impact an optimal capital structure decision?Chen, Chang-chih 07 December 2009 (has links)
This paper examines the impact of credit rating migration on optimal capital structure. The models we propose capture empirical behavior in two ways: linking the firm's promised coupons to its rating, and targeting a minimum rating. We find that as long as the rating at issue time is not too low, the tax shields of rating-linked coupon debt are larger than those of standard debt with the same par value; hence, the optimal leverage of a firm with a rating-linked coupon scheme is greater. Further, we show that targeting a minimum rating causes mean-reverting leverage dynamics: managers appear to over-repurchase in order to adjust the current rating back to the initial target following a downgrade from the target minimum rating.
|
313 |
Development and Application of Kinetic Meshless Methods for Euler EquationsC, Praveen 07 1900 (has links)
Meshless methods are a relatively new class of schemes for the numerical solution of partial differential equations. Their special characteristic is that they do not require a mesh but only a distribution of points in the computational domain. Spatial derivatives appearing in the partial differential equations are approximated at any point using a local cloud of points called the "connectivity" (or stencil). A point distribution can be generated more easily than a grid since there are fewer constraints to satisfy.
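As an illustration, a connectivity can be built by taking the k nearest neighbours of each point; the point set and the choice of k below are arbitrary, and actual meshless solvers use more elaborate selection criteria:

```python
import math

def build_connectivity(points, k):
    """For each point of a scattered distribution, collect the indices of
    its k nearest neighbours; this local cloud is the "connectivity"
    (stencil) used to approximate spatial derivatives."""
    conn = []
    for i, p in enumerate(points):
        # Sort all other points by distance and keep the closest k.
        dists = sorted(
            (math.dist(p, q), j) for j, q in enumerate(points) if j != i
        )
        conn.append([j for _, j in dists[:k]])
    return conn
```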
The present work uses two meshless methods: an existing scheme called the Least Squares Kinetic Upwind Method (LSKUM) and a new scheme called the Kinetic Meshless Method (KMM). LSKUM is a "kinetic" scheme which uses a "least squares" approximation for discretizing the derivatives occurring in the partial differential equations. The first part of the thesis is concerned with some theoretical properties and the application of LSKUM to 3-D point distributions. Using previously established results we show that first order LSKUM in 1-D is positivity preserving under a CFL-like condition. The 3-D LSKUM is applied to point distributions obtained from the FAME mesh. FAME, which stands for Feature Associated Mesh Embedding, is a composite overlapping grid system developed at QinetiQ (formerly DERA), UK, for store separation problems.
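The least-squares idea behind LSKUM can be illustrated on a generic scalar function. This sketch solves the 2-D normal equations directly and ignores the kinetic splitting, the upwinding and all other solver details:

```python
def ls_gradient(point, f0, neighbours, fvals):
    """First-order least-squares estimate of (df/dx, df/dy) at `point` from
    a cloud of neighbours: the over-determined system [dx dy] g = df is
    solved through its 2x2 normal equations."""
    a = b = c = p = q = 0.0
    for (x, y), f in zip(neighbours, fvals):
        dx, dy, df = x - point[0], y - point[1], f - f0
        a += dx * dx   # accumulate entries of A^T A ...
        b += dx * dy
        c += dy * dy
        p += dx * df   # ... and of A^T b
        q += dy * df
    det = a * c - b * b
    return ((c * p - b * q) / det, (a * q - b * p) / det)
```

For a linear function the estimate is exact, which is the sense in which the approximation is first-order accurate.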
The FAME mesh has a cell-based data structure and this is first converted to a node-based data structure, which leads to a point distribution. For each point in this distribution we find a set of nearby nodes which forms the connectivity. The connectivity at each point (which is also the "full stencil" for that point) is split along each of the three coordinate directions, so that we need six split (or half, or one-sided) stencils at each point. The split stencils are used in LSKUM to calculate the split-flux derivatives arising in kinetic schemes, which gives the upwind character to LSKUM. The "quality" of each of these stencils affects the accuracy and stability of the numerical scheme. In this work we focus on developing numerical criteria to quantify the quality of a stencil for meshless methods like LSKUM.
The first test is based on singular value decomposition of the over-determined problem, and the singular values are used to measure the ill-conditioning (generally caused by a flat stencil). If any of the split stencils is found to be ill-conditioned, we use the full stencil for calculating the corresponding split-flux derivative. A second test is based on an accuracy measurement. The idea of this test is that a "good" stencil must give accurate estimates of derivatives and vice versa. If the error in the computed derivatives is above a specified tolerance, the stencil is classified as unacceptable. In this case we either enhance the stencil (to remove disc-type degenerate structure) or switch to the full stencil. It is found that the full stencil almost always behaves well in terms of both tests. The use of these two tests and the associated modification of defective stencils in an automatic manner allows the solver to converge without blowing up. The results obtained for a 3-D configuration compare favorably with wind tunnel measurements, and the framework developed here provides a rational basis for approaching the connectivity selection problem.
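The SVD-based test can be sketched as follows. For a 2-D stencil the singular values follow from the 2x2 Gram matrix; the condition number returned here is only indicative, and the thesis's actual acceptance thresholds are not reproduced:

```python
import math

def _svals_2col(A):
    """Singular values of an n-by-2 matrix, computed from the eigenvalues
    of the 2x2 Gram matrix A^T A (sufficient for a 2-D stencil sketch)."""
    a = sum(r[0] * r[0] for r in A)
    b = sum(r[0] * r[1] for r in A)
    c = sum(r[1] * r[1] for r in A)
    disc = math.sqrt((a - c) ** 2 + 4 * b * b)
    lam1 = (a + c + disc) / 2
    lam2 = max((a + c - disc) / 2, 0.0)
    return math.sqrt(lam1), math.sqrt(lam2)

def stencil_condition(point, neighbours):
    """SVD-based quality test: a flat (degenerate) stencil has a near-zero
    smallest singular value, hence a huge condition number, and should be
    enhanced or replaced by the full stencil."""
    A = [(q[0] - point[0], q[1] - point[1]) for q in neighbours]
    s1, s2 = _svals_2col(A)
    return s1 / s2 if s2 > 0 else math.inf
```

A symmetric cross stencil gives the ideal condition number 1, while a collinear stencil is detected as infinitely ill-conditioned.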
The second part of the thesis deals with a new scheme called the Kinetic Meshless Method (KMM), which was developed as a consequence of the experience obtained with LSKUM and the FAME mesh. As mentioned before, the full stencil is generally better behaved than the split stencils. Hence the new scheme is constructed so that it does not require split stencils but operates on a full stencil (which is like a centered stencil). In order to obtain an upwind bias we introduce mid-point states (between a point and its neighbour), and the least squares fitting is performed using these mid-point states. The mid-point states are defined in an upwind-biased manner at the kinetic/Boltzmann level, and the moment-method strategy leads to an upwind scheme at the Euler level. On a standard 4-point Cartesian stencil this scheme reduces to a finite volume method with KFVS fluxes. We also show the rotational invariance of the scheme, an important property of the governing equations themselves.
The KMM is extended to higher order accuracy using a reconstruction procedure similar to that of finite volume schemes, even though we do not have (or need) any cells in the present case. Numerical studies on a model 2-D problem show second order accuracy. Some theoretical and practical advantages of using a kinetic formulation for deriving the scheme are recognized. Several 2-D inviscid flows are solved, which also demonstrate many important characteristics. The subsonic test cases show that the scheme produces less numerical entropy than LSKUM and is also better at preserving the symmetry of the flow. The test cases involving discontinuous flows show that the new scheme is capable of resolving shocks very sharply, especially with adaptation. The robustness of the scheme is also very good, as shown in the supersonic test cases.
|
314 |
A Study Of Organizational Rightsizing: Actors, Processes And OutcomeNirmala, Maria Christine 01 1900 (has links)
The pressure for economic integration has been reinforced by developments in technology, changes in market structures and the emergence of transnational corporations. Rightsizing has emerged as a critical process in this era of shrinking space, shrinking time and disappearing borders, in the context of employee engagement and human capital. It is adopted by many organizations to help them become more agile and flexible and thereby meet competitive demands. The diverse impacts of rightsizing on various actors, however, call into question the justice of the entire process.
This study addresses rightsizing from the perspective of social justice by taking into consideration the assessments of the processes by the affected actors namely, the implementers who drive the rightsizing processes; the separated who leave the organization as a result of rightsizing; and the stayers who remain in the organization and have observed the process. It also aims at understanding the various rightsizing processes from an empirical perspective and examines the causal relatedness of the rightsizing processes and outcome across some of the Indian organizations and the actors.
Review of literature:
The body of literature on rightsizing has provided a strong foundation for the researcher to gain a critical understanding of the various processes underlying rightsizing. The key challenge in rightsizing concerns the fairness of the entire process, considering that in most cases rightsizing results in gains for some people and losses for others. Given that judgments of fairness are highly subjective, the lack of an absolute standard for determining fairness in this situation has been identified as a gap.
As many studies highlight the ambivalence of results with regard to the outcome of rightsizing and attribute it to the rightsizing processes, the relationship between the rightsizing processes and the outcome has emerged as an area of interest. Though there have been correlation-based analyses of various rightsizing variables, causal models that link the rightsizing processes to the outcome have been found missing. The dearth of studies from the Indian setting has also prompted the need to build segregate and aggregate causal models of rightsizing processes and outcome at the organization and actor levels.
Aim, objectives and methodology:
The aim of this study has been to identify the rightsizing processes that contribute towards positive outcome for both the organization and the individuals concerned from the social justice perspective.
The objectives were:
1. To compare and contrast the implementation of rightsizing processes in some of the Indian organizations.
2. To develop a framework for understanding and classifying rightsizing processes in relation to the social justice perspective.
3. To identify the effective rightsizing processes that contribute significantly towards minimizing individual stress and maximizing commitment towards the organization.
4. To outline appropriate guidelines based on the justice perspectives of the actors for better implementation of rightsizing in organizations.
The conceptual model links the actors, their assessments of the rightsizing processes, and the outcome of the entire process as it affects their individual stress and commitment towards the organization. The just processes of rightsizing have been decided based on the assessments of the actors and on the extent of their agreement with one another on the implementation of the discrete rightsizing practices. Accordingly, the practices that all three groups of actors (the implementers, stayers and separated) perceive to have been implemented are classified as the "best practices" or system 4 practices; the practices perceived to have been implemented by the implementers and stayers but not the separated are classified as the "better practices" or system 3 practices; the practices that the implementers and separated perceive as implemented are the "ineffective practices" or system 2 practices; and the practices on which all three groups differ with regard to the extent of implementation are termed the "poor practices" or system 1 practices.
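The four-way classification can be sketched as a small decision rule. The boolean inputs (True when a group perceives the practice as implemented) and the fallback of any remaining combination to system 1 are simplifying assumptions of this sketch:

```python
def classify_practice(implementers, stayers, separated):
    """Map the three actor groups' assessments onto the four-system
    framework: agreement of all three groups ranks highest, and any
    combination not named in the framework falls to system 1 here."""
    if implementers and stayers and separated:
        return 4  # "best practices": all three groups agree
    if implementers and stayers:
        return 3  # "better practices": the separated disagree
    if implementers and separated:
        return 2  # "ineffective practices": the stayers disagree
    return 1      # "poor practices": the groups differ
```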
The questionnaire was finalized after a preliminary study and a pilot study. Data was collected from 727 respondents across four organizations: one private manufacturing unit referred to as Org-1, one state public sector unit referred to as Org-2, and two central public sector units referred to as Org-3 and Org-4. The total sample consisted of 137 implementers, 320 stayers and 270 separated.
Results and discussion:
The first part of the analysis focused on validating the rightsizing processes through factor analysis and on testing their reliability using Cronbach's alpha. The implementation of the rightsizing processes across the four organizations was compared using Bonferroni post hoc comparisons. Org-1 and Org-4 had implemented most of the rightsizing practices adequately. The perceptions of the employees of Org-2 and Org-3 were found to be significantly less favorable than those of Org-1 and Org-4 with respect to many of the practices.
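Cronbach's alpha, the reliability coefficient mentioned above, has a short closed form. This sketch uses sample variances, and the item score columns in the example are hypothetical, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale, with `items` given as a list of
    columns (one list of respondent scores per item):
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))
```

Perfectly parallel items give alpha = 1; weaker inter-item agreement drives it down.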
The second set of analyses compares the assessments of the actors with regard to the implementation of the various rightsizing practices, and classifies each practice into one of the four systems based on the framework developed. The system 4 practices consist of the notification period; the severance package; the amount of money that the organizations wished to save after rightsizing; and the avoidance of ineffective cost reduction strategies. The outcomes of rightsizing with respect to role clarity and role sufficiency also fall into system 4. The system 3 practices consist of understanding the need for rightsizing; the need for manpower reduction; proactive cost reduction strategies; separation of the sick; and criteria for separation of the redundant. System 1 practices comprise internal stakeholders; alternate strategies adopted by the organization before resorting to separation of the employees; preparation and communication; leadership; review and control; and assistance provided to the separated. The outcomes with regard to job security and commitment also fall into this category.
The final set of analyses aims at identifying, through path analysis, those processes that contribute significantly towards the outcome at both the organizational level and from the perceptions of the actors. The path analysis was conducted at the segregate and aggregate levels for the organizations and the actors. Initially, a full segregate model in which all the independent variables are linked to the dependent variables was fitted for the four organizations and for the three categories of actors. Those processes that contributed significantly towards the outcome with respect to the actors and the organizations were then structured into two final aggregate models. The validity of these aggregate models was examined for the organizations and actors respectively.
Conclusion:
This study provides a deeper understanding of the various processes underlying rightsizing in the three different stages of implementation. The validated measures can be used as a template by organizations to study and guide further rightsizing initiatives. Through this research, three groups of individuals diversely affected by rightsizing have been brought together under one common framework, which is a methodological innovation. Despite their different interests, it is possible to obtain a consensus in their assessments of some of the rightsizing practices. This is an important conclusion that can be drawn in support of the social justice perspective on rightsizing. The relationship between the rightsizing processes and the outcomes of stress and commitment can also be understood from a causal perspective, across organizations and actors, through the segregate and aggregate models. Future work can include knowledge capital and social capital in understanding the perspectives of the actors and in classifying rightsizing best practices.
|
315 |
Identifying design issues related to the knowledge bases of medical decision support systemsAbbas, Assad January 2010 (has links)
<p>Modern medical diagnostic systems are based on techniques that use digital data formats, a natural feed for computer-based systems. With the use of modern diagnostic techniques the diagnosis process is becoming more complex, as many diseases seem to share the same pre-symptoms at early stages, and computer-based systems require more efficient and effective ways to identify such complexities. However, the existing formalisms for knowledge representation, the tools and technologies, and the learning and reasoning strategies seem inadequate to create meaningful relationships among the entities of medical data, i.e. diseases, symptoms, medicines, etc. This inadequacy is due to the poor design of the knowledge base of the medical system and leads medical systems towards inaccurate diagnoses. This thesis discusses the limitations and issues specific to the design of the knowledge base and suggests that, instead of the deficient approaches and tools for representing, learning and retrieving knowledge, semantic web tools and techniques should be adopted. A design-by-contract approach may be suitable for establishing the relationships between diseases and symptoms, and these relationships and their invariants can be represented more meaningfully using the semantic web. This can lead to more concrete diagnoses by overcoming the deficiencies and limitations of traditional approaches and tools.</p>
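The suggestion can be illustrated with semantic-web-style triples. The disease and symptom names below are hypothetical, and a real system would use RDF/OWL tooling rather than a plain Python set:

```python
# A minimal (subject, predicate, object) store linking diseases, symptoms
# and treatments; every name here is invented for illustration.
triples = {
    ("Influenza", "hasSymptom", "Fever"),
    ("Influenza", "hasSymptom", "Cough"),
    ("Dengue", "hasSymptom", "Fever"),
    ("Influenza", "treatedBy", "Oseltamivir"),
}

def diseases_with_symptom(symptom):
    """Query: which diseases exhibit this symptom?  A shared pre-symptom
    such as "Fever" shows why early-stage diagnosis is ambiguous."""
    return sorted(s for s, p, o in triples
                  if p == "hasSymptom" and o == symptom)
```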
|
316 |
Multi-language symbolic debugging for general-purpose execution platformsCiabrini, Damien 03 October 2006 (has links) (PDF)
This thesis is devoted to improving symbolic debuggers to account for the specificities of high-level languages, in particular their intricate compilation to general-purpose execution platforms. This work led to the implementation of Bugloo, a multi-language debugger for the Java virtual machine.

Two new virtual-representation mechanisms are proposed to eliminate the debugging disturbances caused by the presence on the stack of intermediate functions produced by the compilation of high-level languages. The first uses rules supplied by language implementors to maintain a correspondence between the source code of a program and the code produced by its compilation. This allows the debugger to rebuild a logical view from which the compilation details have been removed. The second mechanism controls step-by-step execution so that it never stops inside the intermediate functions generated by the compiler. Both mechanisms have been adapted to build a memory-allocation profiler that produces statistics in which intermediate functions are hidden.

During this work, full debugging support was developed for the Bigloo language, a dialect of the functional language Scheme. Similar experiments were conducted on the ECMAScript and Python languages. The results obtained show that the virtual-representation techniques developed apply effectively regardless of the compilation scheme adopted, including when programs are composed of several high-level languages.
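The first virtual-representation mechanism amounts to filtering compiler-generated frames out of the stack according to implementor-supplied rules. The sketch below is a loose Python analogy, not Bugloo's JVM implementation; the frame names and the name-mangling rule are invented:

```python
def logical_view(stack, hide_rules):
    """Rebuild a logical stack view: frames matching any rule supplied by
    the language implementor are treated as compiler-generated artifacts
    and hidden from the user."""
    return [f for f in stack if not any(rule(f) for rule in hide_rules)]

# Hypothetical rule: hide every frame whose name contains "__", mimicking
# mangled helper closures emitted by a Scheme-to-JVM compiler.
stack = ["main", "map__closure_17", "user_callback", "dispatch__stub"]
view = logical_view(stack, [lambda f: "__" in f])
```

The same predicate set can drive step-by-step execution: the debugger keeps stepping while the current frame matches a hide rule.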
|
317 |
Privatization of public housing in Hong Kong: a comparison with the privatization of council housing in the UK /Chung, Chik-leung. January 2000 (has links)
Thesis (M. Hous. M.)--University of Hong Kong, 2001. / Includes bibliographical references.
|
318 |
A study of the Hong Kong government's Electronic Service Delivery Scheme /Chak, Man-yee, Rene. January 2001 (has links)
Thesis (M.P.A.)--University of Hong Kong, 2001. / Includes bibliographical references (leaves 107-112).
|
319 |
An evaluation of the pilot scheme of urban renewal in Hong Kong /Mo, Chan-ming. January 1900 (has links)
Thesis (M. Soc. Sc.)--University of Hong Kong, 1980. / Typescript.
|
320 |
Analysis of blockage effects on urban cellular networksBai, Tianyang 22 October 2013 (has links)
Large-scale blockages such as buildings affect the performance of urban cellular networks, especially in the millimeter-wave frequency bands. Unfortunately, such blockage effects are either neglected or characterized by oversimplified models in the analysis of cellular networks. Leveraging concepts from random shape theory, this paper proposes a mathematical framework to model random blockages and quantifies their effects on the performance of cellular networks. Specifically, random buildings are modeled as a process of rectangles with random sizes and orientations whose centers form a Poisson point process on the plane, called a Boolean scheme. The number of blockages on a link is proven to be Poisson distributed with a parameter that depends on the length of the link, which yields the distribution of the penetration loss of a single link. A path loss model incorporating the blockage effects is proposed, which matches experimental trends observed in prior work. The blockage model is applied to analyze blockage effects on cellular networks, assuming blockages are impenetrable, in terms of connectivity, coverage probability, and average rate. Analytic results show that while buildings may block the desired signal, they may still have a positive impact on network performance since they also block more interference. / text
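The counting step can be illustrated with a deliberately simplified scheme: axis-aligned rectangles and a link placed on the x-axis. The linear growth of the Poisson mean with link length assumed in `p_unblocked` is a simplification of this sketch, not the paper's exact parameterization:

```python
import math

def blockages_on_link(d, rects):
    """Count the axis-aligned rectangles, given as (cx, cy, w, h), that
    intersect a link running along the x-axis from (0,0) to (d,0).  The
    paper's Boolean scheme allows random orientations; we restrict to
    axis-aligned rectangles to keep the sketch short."""
    count = 0
    for cx, cy, w, h in rects:
        overlaps_x = (cx - w / 2) <= d and (cx + w / 2) >= 0
        covers_y = (cy - h / 2) <= 0 <= (cy + h / 2)
        if overlaps_x and covers_y:
            count += 1
    return count

def p_unblocked(d, beta):
    """If the number of blockages on a link is Poisson with a mean that
    grows linearly in the link length (rate `beta` per unit length), the
    link is unblocked with probability exp(-beta * d)."""
    return math.exp(-beta * d)
```

Longer links cross more rectangles and are unblocked with exponentially smaller probability, which is the qualitative effect the analysis quantifies.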
|