161
Ordonnancement cumulatif en programmation par contraintes : caractérisation énergétique des raisonnements et solutions robustes / Cumulative scheduling in constraint programming : energetic characterization of reasoning and robust solutions. Derrien, Alban, 27 November 2015
Constraint programming is an approach regularly used to solve a variety of scheduling problems. Cumulative scheduling problems form a class of problems in which non-preemptive tasks may run in parallel. They arise in many real-world contexts, such as virtual machine allocation, process scheduling in the cloud, personnel management, or port management. Many mechanisms have been adapted and proposed in constraint programming to solve scheduling problems, and the various adaptations have resulted in reasonings that appear, a priori, significantly different. In this thesis we carry out a detailed analysis of these reasonings, proposing both a unified theoretical characterization and dominance rules that significantly improve the execution time of state-of-the-art algorithms, by up to a factor of seven. We also propose a new framework for robust cumulative scheduling, which finds solutions that tolerate the delay of one or more tasks at any time, without invalidating the generated schedule and while keeping a satisfactory project completion date. In this context, we propose an adaptation of a state-of-the-art algorithm, Dynamic Sweep.
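The unified characterization itself is not reproduced in the abstract, but the energetic feasibility test that cumulative reasonings build on can be sketched: over any interval, the energy that tasks must necessarily consume cannot exceed the resource capacity times the interval length. The task data and helper names below are hypothetical, not taken from the thesis.

```python
def min_intersection(r, p, d, t1, t2):
    """Minimum time a task (release r, duration p, deadline d) must spend in [t1, t2):
    the smaller of its overlaps when left-shifted to r and right-shifted to d."""
    left = max(0, min(r + p, t2) - max(r, t1))
    right = max(0, min(d, t2) - max(d - p, t1))
    return min(left, right)

def energetic_check(tasks, capacity, t1, t2):
    """True if interval [t1, t2) is not trivially overloaded.
    Each task is a tuple (release, duration, deadline, resource height)."""
    required = sum(c * min_intersection(r, p, d, t1, t2) for (r, p, d, c) in tasks)
    return required <= capacity * (t2 - t1)
```

For example, two tasks of duration 3 and height 2 on a capacity-2 resource, both with release 0 and deadline 5, require 12 units of energy in [0, 5) while only 10 are available, so the check fails and a solver can prune.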
162
Audit Committee Director Turnover. Singhvi, Meghna, 11 July 2011
Actions by both private-sector organizations and legislators in recent years have highlighted the importance of the audit committee of the board of directors in the financial reporting process. For example, the Sarbanes-Oxley Act of 2002 has multiple sections that deal with the composition and functioning of audit committees. My dissertation examines multiple issues related to the composition of audit committees.
In the first two parts of my dissertation, I examine the stock market reactions to disclosures of audit committee appointments and departures in 8-Ks filed with the SEC during 2008 and 2009. I find a positive stock market reaction to the appointment of audit committee directors who are financial experts. The second essay investigates the cumulative abnormal returns around departures of audit committee directors; I find that when an accounting expert leaves the audit committee, the market reaction is significantly negative. These results are consistent with regulators' concerns about having directors with audit, accounting and other financial expertise on corporate audit committees.
The third essay of my dissertation examines the changes in audit committee composition over the last decade. I find that while the increase in audit committee size has been relatively modest, there has been a significant increase in the number of audit committee experts and in the frequency of audit committee meetings; interestingly, the increase in the number of meetings has persisted even after the media focus on the auditing profession, in the immediate aftermath of the Enron and Andersen failures, has waned. My results show that audit committee composition and role continue to evolve with regulatory and other corporate governance changes.
163
In-situ photocatalytic remediation of organic contaminants in groundwater. Lim, Leonard Lik Pueh, January 2010
This research concerns the development of a photocatalytic reactor design, Honeycomb, for in-situ groundwater remediation. Photocatalysis, typically a pseudo-first-order advanced oxidation process, is initiated by illuminating the catalyst, titanium dioxide (TiO2), with UVA light. In the presence of oxygen, highly reactive oxidising agents are generated on the catalyst surface, such as superoxide (O2•-) and hydroxyl (•OH) radicals and holes (hvb+), which can oxidise a wide range of organic compounds. The target contaminant is methyl tert-butyl ether (MTBE), a popular gasoline additive over the past three decades, which gives water an unpleasant taste and odour at 20 μg L-1, making it undrinkable. This research consists of three major parts: (i) establishing a suitable catalyst immobilisation procedure, (ii) characterisation and evaluation of reactor models, and (iii) scale-up studies in a sand tank. TiO2 does not attach well to many surfaces, so the first step was to determine a suitable immobilisation procedure by preparing TiO2 films using several candidate procedures and testing them under the same conditions at small scale. The coatings were evaluated in terms of photocatalytic activity and adhesion; photocatalytic activity was tested using methylene blue dye (MB), a photocatalytic indicator. A hybrid coating, comprising a sol-gel solution enriched with Aeroxide TiO2 P25 powder on woven fibreglass, exhibited the best adhesion and photocatalytic activity among the samples evaluated, and was therefore used to produce the immobilised catalyst for this research. The immobilisation procedure was then scaled up to synthesise TiO2 coatings for the proposed photocatalytic reactor design; the photocatalytic activity of the coatings produced at the larger scale was reasonably comparable to that of those produced at small scale.
Due to UVA irradiation and mass-transfer limitations, photocatalytic reactors are typically compact in order to maximise their efficiency and accommodate high flows, particularly in water and wastewater treatment. In the case of groundwater, however, the treatment area can span metres in width and depth, and the flow is significantly lower than in water treatment, so the reactor design does not need to be compact. Considering both factors, a photocatalytic reactor design of hexagonal cross-section (Honeycomb) was proposed, in which the structures can be arranged adjacent to each other to form a honeycomb. A model was constructed and tested in a 4 L cylindrical column reactor, using the MB test to characterise the reactor performance and operating conditions. This was followed by a hydraulic performance study encompassing single- and double-pass flow studies. The single-pass flow study involved the photocatalytic oxidation (PCO) of MB and MTBE, while the double-pass flow study focused on the PCO of MTBE only; the double pass simulates two serially connected reactors. Single-pass flow studies found that the critical hydraulic residence time (HRT) for the PCO of MB and MTBE is approximately 1 day, achieving up to 84 % MTBE removal. The critical HRT is the minimum average duration a batch of contaminant must remain in the reactor to maintain the potential efficiency of the reactor. Double-pass studies showed the reactor can achieve up to 95 % MTBE removal in 48 hours, and that the field performance of serially connected reactors can be estimated by applying the single-pass removal efficiencies in sequence. In groundwater, other impurities are likely to be present, so the effects of groundwater constituents on the reactor efficiency were also studied.
The MTBE PCO rate is affected by the presence of organic compounds and dissolved ions, mainly through competition for hydroxyl radicals and deactivation of the catalyst surface by adsorption of the more strongly adsorbed organic molecules and ions. Despite their presence, the reactor achieved about 80 % MTBE removal in 48 hours. A double-pass flow study showed that the overall field efficiency of the photocatalytic reactor can be estimated by applying, in sequence, its efficiency in a single-pass flow study using the actual groundwater sample in the laboratory. A sand tank was designed to simulate the clean-up of an MTBE plume from a point-source leakage using the 200 mm i.d. Honeycomb I prototype. Honeycomb I achieved up to 88.1 % MTBE removal when the contaminated groundwater flowed through (single pass) at 14.6 cm d-1. The critical HRT for Honeycomb I was also approximately 1 day, similar to that in the column reactor. The response of MTBE removal efficiency to flow was consistent between the column reactor and the sand tank, indicating that reactor efficiency can be estimated by testing the model in the column reactor. The presence of toluene, ethylbenzene and o-xylene (TEo-X) decreased the MTBE removal efficiency in both the sand tank and the column reactor. The same catalyst set and 15 W Philips Cleo UVA fluorescent lamp were operated for a total of about 582 h (24 d) of the cumulative 1039 h (43 d) of sand tank experiments, achieving an overall MTBE removal efficiency of about 76.2 %. The experiments in the column reactor and sand tank demonstrated the reliability of the immobilised catalyst produced in this research. This research demonstrates the potential of Honeycomb for in-situ groundwater remediation and also proposes fabrication and installation options for the field.
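The pseudo-first-order kinetics and the serial (double-pass) estimate described above can be illustrated with a short sketch. The function names are this sketch's own, and feeding in the reported 84 % / 1-day figure is only an illustration, not the thesis's calculation:

```python
import math

def removal(k, t):
    """Fractional removal after time t under pseudo-first-order decay C(t) = C0*exp(-k*t)."""
    return 1.0 - math.exp(-k * t)

def rate_from_removal(e, t):
    """Back out the apparent rate constant from an observed removal fraction e after time t."""
    return -math.log(1.0 - e) / t

def serial_removal(*stage_efficiencies):
    """Overall removal of serially connected reactors: the remaining fractions multiply."""
    remaining = 1.0
    for e in stage_efficiencies:
        remaining *= 1.0 - e
    return 1.0 - remaining
```

With these helpers, `rate_from_removal(0.84, 1.0)` gives an apparent k of about 1.83 per day, and `serial_removal(0.84, 0.84)` gives about 0.97, the kind of sequential estimate the double-pass study relies on (the measured 95 % is of the same order).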
164
Apports bioinformatiques et statistiques à l'identification d'inhibiteurs du récepteur MET / Bioinformatics and statistical contributions to the identification of inhibitors for the MET receptor. Apostol, Costin, 21 December 2010
The effect of polysaccharides on the HGF-MET interaction was studied using an experimental design with several protein microarrays under different experimental conditions. The goal of the analysis is to select the polysaccharides that best inhibit the HGF-MET interaction; from a statistical point of view, this is a classification problem. The computational and statistical processing of the resulting microarrays required extending the PASE platform with statistical analysis plug-ins for this type of data. The main statistical feature of these data is repetition: the experiment was repeated on 5 microarrays, and each polysaccharide is replicated 3 times within each microarray. We are therefore no longer in the classical case of globally independent data, but have independence only at the inter-subject and intra-subject levels. We propose mixed models for data normalization and represent subjects by their empirical cumulative distribution functions. The Kolmogorov-Smirnov statistic arises naturally in this context, and we study its behaviour in k-means-type and hierarchical classification algorithms. The choice of the number of classes, and of the number of repetitions needed for a robust classification, are treated in detail. The effectiveness of this methodology is measured on simulations and applied to the HGF-MET data. The results helped the biologists and chemists of the Institute of Biology of Lille choose the best polysaccharides in their assays, and some of the results also confirmed the researchers' intuition. The R scripts implementing this methodology are integrated into the PASE platform. Applying functional data analysis to this type of data is an immediate perspective of this work.
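The core statistical step — representing each subject by its empirical CDF and comparing subjects with the two-sample Kolmogorov-Smirnov distance, whose matrix can then feed hierarchical or k-means-type clustering — can be sketched generically. This is illustrative code, not the PASE plug-ins:

```python
import numpy as np

def ks_distance(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of samples x and y."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    Fx = np.searchsorted(x, grid, side="right") / len(x)
    Fy = np.searchsorted(y, grid, side="right") / len(y)
    return np.max(np.abs(Fx - Fy))

def ks_matrix(subjects):
    """Pairwise KS distance matrix, usable as input to hierarchical
    clustering or a medoid-based k-means variant."""
    n = len(subjects)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = ks_distance(subjects[i], subjects[j])
    return D
```

Two identical samples give distance 0, two samples with disjoint supports give distance 1, so the matrix behaves as a bounded dissimilarity for clustering.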
165
Fondförvaltares riskhantering av företagsobligationer : En kvalitativ studie utifrån den kumulativa prospektteorin / Fund managers' risk management of corporate bonds : a qualitative study based on cumulative prospect theory. Karlsson, Philip; Karlsson, Olle, January 2017
Abstract: Until 1979, behavioural economics was a research field without major controversy. Since the eighteenth century, the prevailing view had been that decisions made by individuals under risk were based on rational behaviour. Daniel Kahneman and Amos Tversky took the opposite position and, building on their criticism of earlier studies in behavioural economics, presented prospect theory in 1979, a theory that later earned the Nobel Prize. The theory was subsequently developed, and in 1992 Tversky and Kahneman published cumulative prospect theory. Cumulative prospect theory (1992) holds that individuals depart from objective probabilities and instead base decisions on subjective preferences, and hence on irrational behaviour. Kahneman and Tversky argued that individuals do not always choose the alternative that generates the highest utility; earlier experiences lead them to act differently. Several studies have found empirical evidence that cumulative prospect theory is applicable to investors, including fund managers and private banking advisers. The purpose of this study is to use twelve qualitative interviews to gain a deeper understanding of whether cumulative prospect theory is applicable to Swedish fund managers specializing in corporate bonds. While, according to the interviewed managers, the public tends to have insufficient knowledge of the risks associated with corporate bonds, many journalists argue, partly because of expected interest rate increases, that the bond market is in a bubble; this makes the corporate bond market interesting to examine. The study concludes that the managers, in line with cumulative prospect theory, act irrationally when making investment decisions: they indicate that they do not invest only in the corporate bonds that generate the highest utility, i.e. return, but give considerable weight to the risks associated with corporate bonds. In line with the theory, the managers tend to handle liquidity problems and credit risk as cumulative prospect theory predicts. The study further concludes that, in contrast to the theory, the managers overweight an already high probability that interest rate and inflation risk will affect the funds negatively. There are also indications that the managers, in line with the theory, are risk averse over gains but, in contrast to the theory, also risk averse over losses. This is supported, among other things, by the high degree of caution shown by the majority of the managers and by their treatment of credit risk.
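For reference, cumulative prospect theory is usually parameterised by a concave, loss-averse value function and an inverse-S probability weighting function. A sketch using the parameter estimates commonly cited from Tversky and Kahneman (1992) — the thesis applies the theory qualitatively, so this code is only illustrative:

```python
def value(x, alpha=0.88, lam=2.25):
    """Tversky-Kahneman (1992) value function: concave for gains,
    steeper for losses (lam > 1 encodes loss aversion)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are
    overweighted, large probabilities underweighted."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)
```

The shape captures the two behaviours the study probes: a loss of 100 hurts more than a gain of 100 pleases (`value(-100)` is larger in magnitude than `value(100)`), and `weight(0.01)` exceeds 0.01 while `weight(0.99)` falls below 0.99.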
166
Cumulative mild head injury (CMHI) in contact sports : an evaluation of pre- and post-season cognitive profiles of rugby players compared with non-contact sport controls at the University of Limpopo (Turfloop Campus). Rapetsoa, Mokgadi Johanna, January 2015
Thesis (M.A. (Clinical Psychology)) -- University of Limpopo, 2015. / The effect of cumulative mild head injury (CMHI) in contact sports, such as rugby, is seen increasingly at school level, where more and more injuries are reported. Research on CMHI in contact sport is needed, specifically among previously disadvantaged groups where little or no research has taken place. This research is thus intended to seek a better understanding of CMHI in rugby, specifically among amateur players. A quantitative research approach was utilised with a quasi-experimental research design, using a sample of 18 student rugby players and 18 volleyball (non-contact sport) controls. In terms of mean performances, the tests did not reveal the consistent pattern of deficits typically associated with the effects of cumulative mild head injuries. There were, however, significant results in terms of variability, suggesting potential deficits in attention in the rugby group. The results are therefore indicative of a poorer overall cognitive profile for the rugby-playing group. It is concluded that the increased variability may be displayed in individuals who suffered CMHI at an earlier age.
167
Broadening Horizons : The FMECA-NETEP model, offshore wind farms and the permit application process. Ohlson, John, January 2013
Abstract: The permit application process for offshore wind farms (OWF) in Sweden conceivably requires a comprehensive and transparent complement within risk management. The NETEP framework (covering risks concerning navigation, economics, technology, environment and politics), based on a futures-planning mechanism (STEEP), has consequently been brought forward as a structure for applying FMECA (Failure Mode, Effects, and Criticality Analysis) methodology to the permit application process of the Swedish offshore wind farm sector. FMECA, originating in the aeronautical and automobile industries, provides a systematic method for predicting future failure in a product, part or process, evaluating the consequences of that failure, and suggesting possible measures for its mitigation or eradication. Its application to attitude and acceptance, safety, and environmental effect remains limited, however, which creates the research gap for this thesis. Three Swedish OWF projects in the Baltic Sea area (Lillgrund, Taggen and Trolleboda) were put forward as case studies for the evaluation of the proposed FMECA-NETEP methodology, which proceeded in two stages. The first-stage results showed that the model accommodates the precautionary principle, the consideration of stakeholder viewpoints, the mitigation of negative effects, the analysis of alternative sites, the observation of relevant legislation and the utilisation of contemporary research. In the second stage, the factor incorporated into the adapted model was intra- and inter-sector cumulative impact; results showed that neutral and negative cumulative impact can be illustrated by the model, whereas positive cumulative impact cannot.
The model’s added value is that it facilitates decision making by providing a rigorous, transparent and structured methodology, the holistic approach of which provides a sound basis for the incorporation of contemporary research.
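The FMECA-NETEP adaptation itself is not spelled out in the abstract, but classical FMECA ranks failure modes by a Risk Priority Number (severity × occurrence × detection, each scored on a 1-10 scale). A sketch with hypothetical NETEP-style failure modes for an OWF permit analysis — the modes and scores are invented for illustration:

```python
def rpn(severity, occurrence, detection):
    """Classical FMECA Risk Priority Number; each factor scored 1-10,
    with higher detection scores meaning the failure is harder to detect."""
    return severity * occurrence * detection

# Hypothetical NETEP-sector failure modes: (description, severity, occurrence, detection)
modes = [
    ("navigation: vessel collision risk in adjacent shipping lane", 9, 3, 4),
    ("environment: piling noise impact on marine mammals", 7, 5, 5),
    ("politics: local opposition delays the permit decision", 5, 6, 8),
]

# Rank the modes so mitigation effort goes to the highest RPN first
ranked = sorted(modes, key=lambda m: rpn(*m[1:]), reverse=True)
```

With these invented scores the political mode ranks first (RPN 240) despite its moderate severity, which is exactly the kind of cross-sector comparison a combined FMECA-NETEP table enables.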
168
Implementace síťového protokolu do prostřední network simulator 2 / Implementation of a network protocol in the Network Simulator 2 environment. Janiga, Robin, January 2009
This thesis describes a protocol for a multiple-data-collection system and its implementation in the Network Simulator 2 environment. The system defines two kinds of communication units: a central unit (CU) and measuring units (MU), which operate according to rules defined by the communication protocol. The work is organised as follows. It begins by describing the simulation tool, the NS-2 system, and NAM, the tool for visualising simulation results. This is followed by a description of the proposed protocol: its principle of operation, the units, and the communication messages. The communication between units is then covered, mainly multicast, including the ASM and SSM multicast variants, together with the principle of unicast communication. A further chapter describes how to extend the simulator with a custom protocol and with support for SSM multicast: adding a new protocol means programming a new agent, a new application, and a new protocol header definition. This chapter also describes the changes to the source files needed for recompilation. The main objective of this thesis is the implementation of the proposed protocol. Two agents representing the central and measuring units were written in C++, compiled into the simulator, and tested for functionality with a simple script defining 200 MUs and one CU. The conclusion of the work examines the load on the line between the central unit and the access node, investigating whether cumulative acknowledgement saves transmission capacity compared with per-message acknowledgement.
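The acknowledgement comparison in the final simulation can be illustrated with a toy model — an assumed message scheme, not the thesis's C++ NS-2 agents: a cumulative ACK confirms the highest in-order sequence number received, so one ACK can cover a whole block of reports instead of one ACK per report.

```python
import math

def cumulative_ack(received):
    """Value of a TCP-style cumulative ACK: the highest n such that
    every report with sequence number 1..n has arrived."""
    got = set(received)
    n = 0
    while n + 1 in got:
        n += 1
    return n

def ack_messages(n_reports, block=1):
    """ACK messages the CU must send: one per report when block=1,
    one per block of reports under cumulative acknowledgement."""
    return math.ceil(n_reports / block)
```

For 200 MU reports (the size of the simulation script), per-report acknowledgement costs 200 ACK messages on the CU-to-access-node line, while acknowledging blocks of 20 costs only 10; the gap in sequence `[1, 2, 3, 5]` keeps the cumulative ACK at 3 until report 4 arrives.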
169
An event study : the market reactions to share repurchase announcements on the JSE. Punwasi, Kiran, 24 February 2013
This study examines the market reactions to share repurchase announcements made by companies listed on the Johannesburg Stock Exchange from 2003 to 2012. We use an event study methodology and the Capital Asset Pricing Model to determine whether there is an announcement effect when a share repurchase announcement is made. Our analysis shows that, consistent with signalling theory and the announcement effect, share repurchase announcements are associated with positive abnormal returns. The average abnormal return and cumulative average abnormal return were 0.46% and 3.81% respectively for the event window (t -20, t +20). There was an observable trend of declining share prices before the share repurchase announcement; however, the decline in share prices was not significant. We found some evidence of market-timing ability in 2005 and 2010; collectively, however, we found no significant timing of share repurchase announcements. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
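A bare-bones sketch of the event-study computation described above, with CAPM-expected returns subtracted from actual returns and the daily average abnormal returns summed over the window. Input data and helper names are illustrative, not the study's actual code:

```python
def abnormal_returns(stock, market, riskfree, beta):
    """Daily abnormal return: actual return minus the CAPM-expected
    return rf + beta * (rm - rf), day by day."""
    return [r - (f + beta * (m - f)) for r, m, f in zip(stock, market, riskfree)]

def caar(ar_per_event):
    """Cumulative average abnormal return over an event window:
    average the ARs across events for each day, then sum over the days."""
    n_events = len(ar_per_event)
    n_days = len(ar_per_event[0])
    aar = [sum(ev[d] for ev in ar_per_event) / n_events for d in range(n_days)]
    return sum(aar)
```

In the study the window runs from t -20 to t +20 (41 days), so `ar_per_event` would hold one 41-element AR series per announcement and `caar` would yield the reported 3.81 % figure.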
170
Enhancing a value portfolio with price acceleration momentum. Schoeman, Cornelius Etienne, 24 February 2013
Value shares are notorious for remaining stagnant for extended periods, forcing value investors to remain locked into their investments, often for excessive periods. This research study applied the price acceleration momentum indicator of Bird and Casavecchia (2007) to a value portfolio with the objective of improving the timing of value share acquisitions. A time-series study was conducted, taking into account the top 160 JSE shares over the period 1 January 1985 to 31 August 2012. A price acceleration momentum indicator was applied to enhance a value portfolio formed on the basis of book-to-market ratio, dividend yield and EBITDA/EV. Cumulative average abnormal returns (CAAR) were used to compare portfolio results statistically. A substantial contribution is made to the literature by showing that a value-only portfolio can be significantly enhanced by combining it with price acceleration momentum: results indicated an increase in CAAR from 199.83% to 321.29%. Risk-adjusted returns (Sharpe ratio) also improved without the detriment of increased share-price volatility (standard deviation). The study further contributes to the literature by showing that a price acceleration momentum indicator adds no additional value over a value portfolio combined with ordinary price momentum. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
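One plausible formalisation of price acceleration is the difference between recent momentum and the momentum of the preceding period of equal length — positive when a stagnant value share's trend is speeding up. This is an assumption for illustration, not necessarily Bird and Casavecchia's (2007) exact definition:

```python
def momentum(prices, window):
    """Ordinary price momentum: total return over the trailing window."""
    return prices[-1] / prices[-window - 1] - 1.0

def acceleration(prices, short=6, long=12):
    """Price acceleration: momentum over the recent `short` periods minus
    momentum over the `short` periods before that (long = 2 * short here)."""
    recent = prices[-1] / prices[-short - 1] - 1.0
    earlier = prices[-short - 1] / prices[-long - 1] - 1.0
    return recent - earlier
```

A screen in the spirit of the study would then buy only those value shares whose `acceleration` has just turned positive, rather than all cheap shares, to avoid the long stagnant stretches.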