  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
761

Selected essays on the success of mergers and acquisitions : evidence from the banking and REIT industries /

Keisers, Maximilian. January 1900 (has links)
Thesis (doctoral)--Oestrich-Winkel, European Business School, 2007. / Includes bibliographical references (p. 139-149).
762

Quality competition and mergers : evidence from the Medicare HMO market /

Healy, Deborah A. January 2002 (has links)
Thesis (Ph. D.)--University of Chicago, Dept. of Economics, August 2002. / Includes bibliographical references. Also available on the Internet.
763

Two essays in corporate finance

Low, An Chee, January 2007 (has links)
Thesis (Ph. D.)--Ohio State University, 2007. / Title from first page of PDF file. Includes bibliographical references (p. 95-101).
764

Essays in corporate finance

Wang, Cong. January 2007 (has links)
Thesis (Ph. D. in Management)--Vanderbilt University, Aug. 2007. / Title from title screen. Includes bibliographical references.
765

Die Stellungnahme der Zielgesellschaft zu öffentlichen Angeboten nach dem WpÜG /

Kubalek, Jörn. January 1900 (has links)
Thesis (doctoral)--Johannes Gutenberg-Universität, 2004. / Includes bibliographical references (p. [238]-263) and index.
766

Avaliação dos parâmetros de compressibilidade da camada de argila mole da Baixada de Jacarepaguá, após longo período de sobrecarga de aterro. / Evaluation of Compressibility Parameters of a soft Clay layer at Baixada de Jacarepaguá after a long period of an embankment surcharge.

Bianca da Silva Baldez 30 August 2013 (has links)
SIC and CRS consolidation tests were performed on samples taken from a very soft clay deposit in the Baixada de Jacarepaguá, 15 years after construction of an embankment. The samples were extracted from the same site as those of the first campaign, obtained at the design stage. The CRS tests, run at different strain rates, are compared with the SIC test results of the current investigation campaign, and the geotechnical parameters of the very soft clay layer 15 years after embankment construction are compared with those of the original layer. An increase in the preconsolidation pressure and a reduction in OCR were obtained from interpretation of the current tests. The magnitude of settlement was inferred from the new stratigraphy, through the current thickness of the layer in the investigated region, the change in void ratio, and the change in average water content. The settlements originally predicted, including the secondary compression component, are compared with the inferred settlements and with those measured by settlement plates. The main conclusions suggest that the quality of the specimens from the first campaign was superior to that of the current one, despite the care taken with sampling, transport, and laboratory preparation in the second campaign. This was attributed to the construction process, which imposed excessive movement on the clay mass, disturbing the greater uniformity produced by its natural deposition. The consolidation tests at different loading rates showed similar behaviour, with variation in the relative position of the e x σv curves: faster tests exhibited higher void ratios. The void ratio versus effective stress curves clearly illustrate the significant reduction in void ratio between the natural soil, before embankment placement, and the second campaign. The predicted settlements and those obtained, whether from instrumentation or from the other records, are quite close, given the variability of the stratigraphy and of the geotechnical parameters inherent to sedimentary deposits.
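The settlement inference described in this abstract (from layer thickness and void-ratio change) rests on standard one-dimensional consolidation relations; the abstract does not state them explicitly, so the symbols below are the conventional ones, not necessarily the thesis's notation:

```latex
% Settlement of a layer of initial thickness H_0 inferred from the
% measured change in void ratio (e_0 initial, e_f final):
\rho = H_0 \, \frac{e_0 - e_f}{1 + e_0}

% Predicted primary consolidation settlement of a normally consolidated
% clay, with compression index C_c and initial/final vertical effective
% stresses \sigma'_{v0} and \sigma'_{vf}:
\rho_p = \frac{C_c \, H_0}{1 + e_0} \, \log_{10}\frac{\sigma'_{vf}}{\sigma'_{v0}}
```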
767

Reactivation and reinstatement of hippocampal assemblies

van de Ven, Gido January 2017 (has links)
New memories are labile, but over time some of them are stabilized. This thesis investigates the network mechanisms in the brain underlying the gradual consolidation of memory representations. Specifically, I performed a causal test of the long-standing hypothesis that the offline reactivation of new, memory-representing cell assemblies supports memory consolidation by stabilizing those assemblies and increasing the likelihood of their later reinstatement - and therefore presumably of memory recall. I performed multi-unit extracellular recordings in the dorsal CA1 region of behaving mice, from which I detected short-timescale (25 ms) co-activation patterns of principal neurons during exploration of open-field enclosures. These cell assembly patterns appeared to represent space as their expression was spatially tuned and environment specific; and these patterns were preferentially reactivated during sharp wave-ripples (SWRs) in subsequent sleep. Importantly, after exposure to a novel - but not a familiar - enclosure, the strength with which an assembly pattern was reactivated predicted its later reinstatement strength during context re-exposure. Moreover, optogenetic silencing of hippocampal pyramidal neurons during on-the-fly detected SWRs during the sleep following exposure to a novel - but again not a familiar - enclosure impaired subsequent assembly pattern reinstatement. These results are direct evidence for a causal role of SWR-associated reactivation in the stability of new hippocampal cell assemblies. Surprisingly, offline reactivation was only important for the stability of a subset of the assembly patterns expressed in a novel enclosure. Optogenetic SWR silencing only impaired the reinstatement of "gradually strengthened" patterns that had had a significant increasing trend in their expression strength throughout the initial exposure session. 
Consistent with this result, a positive correlation between reactivation and subsequent reinstatement was only found for these gradually strengthened patterns and not for the other, "early stabilized" patterns. An interesting interpretation is that the properties of the gradually strengthened patterns are all consistent with the Hebbian postulate of "fire together, wire together". To enable investigation of the relation between interneurons and principal cell assembly patterns from extracellular recordings, as a final contribution this thesis describes a statistical framework for the unsupervised classification of interneurons based on their firing properties alone.
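The abstract does not specify how the short-timescale co-activation patterns were detected. One widely used approach for this kind of data is to bin spikes (here at 25 ms), z-score, and keep the eigenvectors of the neuron-neuron correlation matrix whose eigenvalues exceed the Marchenko-Pastur noise bound; the sketch below illustrates that approach only, and the function and variable names are assumptions, not the thesis's method:

```python
import numpy as np

def detect_assembly_patterns(spike_counts):
    """Estimate cell-assembly patterns from a (neurons x time-bins)
    matrix of spike counts binned at a short timescale (e.g. 25 ms).

    Eigenvalues of the correlation matrix above the Marchenko-Pastur
    upper bound indicate co-activation beyond chance; the matching
    eigenvectors are taken as assembly patterns (one weight per neuron).
    """
    n_neurons, n_bins = spike_counts.shape
    # z-score each neuron's binned counts
    z = spike_counts - spike_counts.mean(axis=1, keepdims=True)
    z /= spike_counts.std(axis=1, keepdims=True)
    corr = z @ z.T / n_bins
    eigvals, eigvecs = np.linalg.eigh(corr)
    # Marchenko-Pastur upper bound for eigenvalues of pure noise
    lam_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
    significant = eigvals > lam_max
    return eigvecs[:, significant], int(significant.sum())
```

With a planted group of co-active neurons, the corresponding eigenvalue rises well above the noise bound, so the function recovers at least one pattern whose largest weights fall on that group.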
768

The re-engineering of communication processes to manage post-merger integration challenges at a selected financial institution

Ederies, Nuraan January 2016 (has links)
Thesis (MTech (Business Information Systems))--Cape Peninsula University of Technology, 2016. / In any business, and particularly in complex, merging businesses, there is a need for effective communication with both internal and external stakeholders. This is often difficult when, in the course of a merger, the communication departments of the consolidating entities, each of which may have had its own processes, are required to function as one synergistic unit. Merger-related disruptions to business, and particularly to communication processes, result in communication failure, causing delays, duplication, incoherent flow of communication and greater margins for business error. This becomes a further challenge without a thorough audit and re-evaluation of all the existing and prospective communication-related business process options. Optimal consolidated business processes need to be re-engineered and continuously improved within the new environment. This study aims to explore the importance of re-engineering communication processes during mergers to ensure a successful communication function supported by robust business processes. The analyses and recommendations further aim to strengthen the body of knowledge available to communication specialists operating within the context of severe organisational change, helping them to understand not only why it may be necessary to undertake an exercise of this nature, but also the challenges, issues and questions likely to arise. The study aims to answer three questions: i) What factors affect communication processes when companies merge? ii) How do communication processes need to be structured to be effective when companies merge? iii) Why do communication processes fail when businesses merge? Primary and secondary research was undertaken with a prominent financial services institution as a business case.
The primary research consisted of one-on-one interviews with each interviewee, guided by semi-structured questions; secondary research provided a sound theoretical foundation. In terms of research philosophy, the ontology is subjective and the epistemology interpretivist. Responses were coded, summarised and categorised, and prominent themes were identified as a basis for recommendations. Throughout this process, ethical practice was observed with respect to colleagues, participants, the environment, society and sponsors. The research also aims to further the understanding of the factors contributing to increased productivity, enhanced organisational effectiveness and heightened staff morale, and towards a resultant improvement in profit and perception due to better communication processes in a dynamic environment where communication with stakeholders is a critical factor in business success.
769

Energy-aware adaptation in Cloud datacenters

Mahadevamangalam, Srivasthav January 2018 (has links)
Context: Cloud computing provides services and resources to customers on a pay-per-use basis. As demand for these services grows, Cloud providers operate thousands of data centers, which consume large amounts of energy; the power required for cooling them is particularly high. Recent research therefore seeks models that reduce data center energy consumption. One such approach is dynamic Virtual Machine Consolidation (VM Consolidation), in which VMs are migrated from one host to another so that idle hosts can be switched to sleep mode; switching a host from idle to sleep mode can save up to 70% of its energy consumption. Many energy-adaptive heuristic algorithms exist for VM Consolidation. Its constituent heuristics, host overload detection, host underload detection, VM selection, and VM placement, reduce energy consumption in data centers while meeting Quality of Service (QoS) requirements. In this thesis, we propose new heuristic algorithms to reduce energy consumption. Objectives: The objective of this research is to provide an energy-efficient model that reduces energy consumption, by proposing new VM Consolidation heuristics that consume less energy. Presenting the advantages and disadvantages of the proposed heuristics is a further objective of our experiment. Methods: A literature review was performed to gain knowledge of the workings and performance of existing VM Consolidation algorithms. We then proposed new host overload detection, host underload detection, VM selection, and VM placement heuristics; combining the host overload detection and VM selection heuristics yields 32 combinations, together with two VM placement heuristics.
We proposed a dynamic host underload detection algorithm, which is used for all 32 combinations. The other research method chosen is experimentation: the performance of both the proposed and the existing algorithms was analysed using PlanetLab workload traces, simulated in CloudSim. Results: The following metrics were considered for comparison: energy consumption, number of migrations, Performance Degradation due to VM Migrations (PDM), Service Level Agreement violation Time per Active Host (SLATAH), SLA Violation (SLAV, the product of PDM and SLATAH), and the combined Energy consumption and SLA Violation metric (ESV). We conducted a T-test and computed Cohen's d to measure the significance of differences and the effect sizes between algorithms. For the performance analysis, the results of the proposed algorithms were compared with those of the existing algorithm. Of the 32 combinations of host overload detection and VM selection heuristics, MADmedian_MaxR (Mean Absolute Deviation around the median (MADmedian) with Maximum Requested RAM (MaxR)) using the Modified Worst Fit Decreasing (MWFD) VM placement algorithm, and MADmean_MaxR (Mean Absolute Deviation around the mean (MADmean) with Maximum Requested RAM (MaxR)) using the Modified Second Worst Fit Decreasing (MSWFD) VM placement algorithm, gave the best results, consuming the least energy with minimal SLA violation. Conclusion: The comparisons show that the proposed algorithms perform better than the existing algorithm. Our aim was to propose a more energy-efficient model using VM Consolidation techniques that minimizes power consumption while meeting the SLAs; we proposed energy-efficient VM Consolidation algorithms, compared them with the existing algorithm, and showed that they perform better.
In total, we proposed 32 combinations of heuristics (host overload detection and VM selection) with two adaptive VM placement heuristics, plus a dynamic host underload detection algorithm used for all 32 combinations. Compared with the existing algorithm, 22 combinations with MWFD (Modified Worst Fit Decreasing) VM placement and 20 combinations with MSWFD (Modified Second Worst Fit Decreasing) VM placement performed better. Thus, our proposed heuristic algorithms give better results, with minimum energy consumption and less SLA violation.
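To make the two heuristic categories named above concrete, here is a minimal sketch of a MAD-around-the-median overload detector and a MaxR VM selector. The thesis's exact thresholds and interfaces are not given in the abstract, so the safety parameter and function names below are illustrative assumptions:

```python
import statistics

def mad_median_overloaded(cpu_history, safety=2.5):
    """Host overload detection in the spirit of MADmedian: flag the host
    as overloaded when its current CPU utilization (last element of
    cpu_history, values in 0..1) exceeds a threshold of
    1 - safety * MAD, where MAD is the median absolute deviation
    around the median of recent utilization. A volatile host gets a
    lower threshold (more conservative consolidation)."""
    med = statistics.median(cpu_history)
    mad = statistics.median(abs(u - med) for u in cpu_history)
    threshold = 1.0 - safety * mad
    return cpu_history[-1] >= threshold

def select_vm_max_ram(vms):
    """MaxR VM selection: from an overloaded host, pick the VM with the
    maximum requested RAM (vms maps vm_id -> requested RAM in MB)."""
    return max(vms, key=vms.get)
```

For a stable history around 50% utilization, the MAD is small, so the threshold sits close to 100% and only a genuine spike trips the detector; the selected VM would then be handed to the placement heuristic (MWFD or MSWFD in the thesis).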
770

A comparison of energy efficient adaptation algorithms in cloud data centers

Penumetsa, Swetha January 2018 (has links)
Context: In recent years, Cloud computing has gained considerable attention in both industry and academia, as Cloud services offer a pay-per-use model and the need for reliability and computing capacity has grown, alongside the immense growth of Cloud-based companies and the continuous expansion of their scale. However, the rise in Cloud computing users can have a negative impact on energy consumption, as Cloud data centers consume a huge share of overall energy. To minimize energy consumption in virtual data centers, researchers have proposed various energy-efficient resource management strategies. Dynamic Virtual Machine Consolidation is one of the prominent techniques and an active research area in recent years, used to improve resource utilization and minimize the electric power consumption of a data center. This technique monitors data center utilization, identifies overloaded and underloaded hosts, migrates some or all Virtual Machines (VMs) to other suitable hosts using VM selection and VM placement, and switches underloaded hosts to sleep mode. Objectives: The objective of this study is to define and implement new energy-aware heuristic algorithms that save energy in Cloud data centers, identify the best-performing algorithm, and compare the performance of the proposed heuristics with the old heuristics. Methods: Initially, a literature review was conducted to identify the adaptive heuristic algorithms previously proposed for energy-aware VM Consolidation and to find metrics for measuring their performance.
Based on this knowledge, we proposed 32 combinations of novel adaptive heuristics for host overload detection (8) and VM selection (4), one host underload detection algorithm, and two adaptive VM placement heuristics, which help to minimize both the energy consumption and the overall Service Level Agreement (SLA) violation of a Cloud data center. An experiment was then conducted to measure the performance of all proposed heuristics. We used the CloudSim simulation toolkit for modeling, simulation, and implementation, and evaluated the proposed algorithms using real workload traces of PlanetLab VMs. Results: The results were measured using the metrics energy consumption of the data center (power model), Performance Degradation due to Migration (PDM), Service Level Agreement violation Time per Active Host (SLATAH), Service Level Agreement Violation (SLAV = PDM × SLATAH), and combined Energy consumption and SLA Violation (ESV). For all four categories of VM Consolidation, we compared the performance of the proposed heuristics with each other and presented the best proposed heuristic in each category. We also compared the proposed heuristics with the existing heuristics identified in the literature and reported how many of the newly proposed algorithms work more efficiently than the existing ones. This comparative analysis was done using a T-test and Cohen's d effect size.
From the comparison of all proposed algorithms, we concluded that the Mean Absolute Deviation around the median (MADmedian) host overload detection algorithm paired with Maximum requested RAM (MaxR) VM selection using Modified First Fit Decreasing (MFFD) VM placement, and the Standard Deviation (STD) host overload detection algorithm paired with MaxR VM selection using Modified Last Fit Decreasing (MLFD) VM placement, each outperformed the other 31 combinations of proposed overload detection and VM selection heuristics with regard to Energy consumption and SLA Violation (ESV). From the comparative study between existing and proposed algorithms, 23 and 21 combinations of proposed host overload detection and VM selection algorithms, using MFFD and MLFD VM placement respectively, performed more efficiently than the existing (baseline) heuristics considered in this study. Conclusions: This thesis presents novel heuristic algorithms for minimizing both energy consumption and SLA violation in virtual data centers: 23 new combinations of host overload detection and VM selection algorithms using MFFD VM placement and 21 using MLFD VM placement that consume the minimum amount of energy with minimal SLA violation compared to the existing algorithms. This gives scope for future research on improving resource utilization and minimizing data center power consumption; the work could be extended to other Cloud software platforms and to developing more efficient algorithms for all four categories of VM consolidation.
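The SLA metrics used in both of the studies above compose multiplicatively: SLAV is the product of PDM and SLATAH, and ESV weights energy by SLAV. The sketch below assumes the usual CloudSim-style inputs (per-host active and violation times, per-VM requested capacity and migration-induced degradation); the function and parameter names are illustrative:

```python
def slatah(active_times, violation_times):
    """SLA violation Time per Active Host: mean, over hosts, of the
    fraction of each host's active time during which demand exceeded
    its capacity (a violation)."""
    ratios = [v / a for a, v in zip(active_times, violation_times) if a > 0]
    return sum(ratios) / len(ratios)

def pdm(requested_mips, degradation_mips):
    """Performance Degradation due to Migrations: mean, over VMs, of the
    CPU capacity lost to migrations relative to the capacity requested."""
    ratios = [d / r for r, d in zip(requested_mips, degradation_mips) if r > 0]
    return sum(ratios) / len(ratios)

def slav(pdm_value, slatah_value):
    """Combined SLA Violation metric: SLAV = PDM * SLATAH."""
    return pdm_value * slatah_value

def esv(energy, slav_value):
    """Combined Energy and SLA Violation metric: ESV = energy * SLAV.
    Lower is better on both axes, so heuristics are ranked by ESV."""
    return energy * slav_value
```

Because both SLAV factors and the energy term should shrink under a good consolidation policy, ESV gives a single lower-is-better score, which is why both theses rank their 32 heuristic combinations by it.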
