731

Knowledge Transfer and the Timing of Information Technology Methods: A study within six organizations in Sweden during the COVID-19 pandemic

Larsson, Filip, Thorsell, Anna January 2021
Abstract

Title: Information Technology and the Timing of Knowledge Transfer Methods: A study within six organizations in Sweden
Level: Master thesis for a Master's degree in Business Administration
Authors: Anna Thorsell and Filip Larsson
Supervisor: Daniella Fjellström
Examiner: Ehsanul Huda Chowdhury
Date: June 2021

Aim: Knowledge transfer processes have been reported to fail because of poorly timed transfer methods, making it important to understand the underlying mechanisms of transfer. It has also been argued that information technology (IT) systems can increase knowledge transfer in organizations. This study examines the influence IT has on the timing of transfer methods and on knowledge transfer, addressing a gap the authors identified in existing research on the use and timing of IT transfer methods and their influence on knowledge transfer within organizations.

Method: The study combines secondary data from a literature review with primary data from a qualitative, inductive multi-case study based on semi-structured in-depth interviews with individuals working in six organizations in Sweden. Content analysis and data coding were used to analyse and report the collected data.

Findings and Conclusion: The study showed that IT provides platforms for knowledge transfer through the different types of IT and IT transfer methods used. The type of IT chosen can depend on the urgency of the transfer, the type of knowledge transferred, and the amount of knowledge transferred. The timing of IT transfer methods can speed up knowledge transfer, since IT enables the transfer to happen faster and in later stages of the transfer process by providing direct knowledge transfer channels and knowledge that is accessible to all members of a team.
It was also shown that IT transfer methods can influence internal stickiness, decrease causal ambiguity, and reduce the knowledge barrier of arduous relationships, as well as influence the use of front-loading and back-loading modes of transfer. The urgency of transfer, rather than the level of causal ambiguity of the knowledge, can be the determining factor for the modes used.

Contribution of the Study: This study contributes to research on the types of IT and IT transfer methods used in organizations for knowledge transfer. It adds to existing research on the timing of knowledge transfer methods, covering both explicit and tacit knowledge, and on the influence of IT on the timing of knowledge transfer methods, knowledge transfer, internal stickiness, knowledge barriers, and affordance of interaction. It offers new findings on the timing of knowledge transfer based on the types of IT and IT transfer methods used, the urgency of transfer, and the type of knowledge being transferred. The study highlights the value of IT for managing knowledge transfer within organizations, especially in light of global events such as the COVID-19 pandemic, and provides a basis for managers to examine their use of IT for knowledge transfer. It also showcases the continuously increasing need for effective knowledge transfer between organizations, people, and locations, and how IT can facilitate it.

Study Reflections and Suggestions for Future Research: This study included individuals from six organizations. Future studies should include more participants and investigate teams, departments, and organizations at both narrow and broader levels in different sectors to gain deeper insight into the field. Further research on how the urgency of transfer affects the modes of transfer used is also advised.
Keywords: Knowledge, Knowledge Transfer, Methods of Transfer, Timing, Modes of Transfer, Internal Stickiness, Information Technology, COVID-19
732

Étude probabiliste des contraintes de bout en bout dans les systèmes temps réel / Probabilistic study of end-to-end constraints in real-time systems

Maxim, Cristian 11 December 2017
Social interaction, education, and health are just a few examples of fields in which the rapid evolution of technology has had a great impact on quality of life. Companies rely more and more on embedded systems to increase their productivity, efficiency, and value. In factories, the precision of robots tends to replace human versatility. Although connected devices such as drones, smart watches, and smart homes have become increasingly popular in recent years, this type of technology has long been used in industries concerned with user safety. The avionics industry has used computers in its products since 1972, with the production of the first A300 aircraft; it achieved astonishing progress with the development of the Concorde in 1976, an aircraft that was many years ahead of its contemporaries and was considered a miracle of technology. Some of the innovations and knowledge acquired for the Concorde are still used in recent models such as the A380 and the A350. An embedded system is a microprocessor-based system built to control a function or a range of functions and not designed to be programmed by the end user in the same way as a personal computer. A real-time system is an information-processing system that must respond to externally generated input stimuli within a finite, specified period; its correctness depends not only on the logical result but also on the time at which that result is delivered.
Real-time systems can be found in industries such as aeronautics, aerospace, automotive, and rail, but also in sensor networks, image processing, multimedia applications, medical technologies, robotics, communications, computer games, and household systems. In this thesis we focus on embedded real-time systems and, for ease of notation, simply call them real-time systems, referring to cyber-physical systems where appropriate. The worst-case execution time (WCET) of a task is the maximum time its execution can possibly take. The WCET is obtained through timing analysis and often cannot be determined precisely by enumerating all possible executions. This is why, in industry, measurements are made only on a subset of possible scenarios, the one expected to generate the highest execution times, and an upper bound on execution time is estimated by adding a safety margin to the largest observed time. Timing analysis is a key concept used in real-time systems to assign an upper bound to the WCETs of tasks or program fragments. This bound can be obtained either by static analysis or by measurement-based analysis. In their deterministic variants, both static and measurement-based methods tend to be extremely pessimistic. Unfortunately, this level of pessimism and the resulting over-provisioning cannot be accepted by all real-time systems, and in such cases other approaches should be considered. / In our times, we are surrounded by technologies meant to improve our lives, to assure their security, or programmed to realize different functions and to respect a series of constraints.
We consider them embedded systems, or often parts of cyber-physical systems. An embedded system is a microprocessor-based system that is built to control a function or a range of functions and is not designed to be programmed by the end user in the same way that a PC is. The Worst Case Execution Time (WCET) of a task represents the maximum time it can take to be executed. The WCET is obtained after analysis, and most of the time it cannot be accurately determined by exhausting all the possible executions. This is why, in industry, measurements are done only on a subset of possible scenarios (the one that would generate the highest execution times) and an execution time bound is estimated by adding a safety margin to the greatest observed time. Amongst all branches of real-time systems, an important role is played by the Critical Real-Time Embedded Systems (CRTES) domain. CRTESs are widely used in fields like automotive, avionics, railway, and health care. The performance of CRTESs is analyzed not only from the point of view of their correctness, but also from the perspective of time. In the avionics industry, such systems have to undergo a strict process of analysis in order to fulfill a series of certification criteria demanded by the certification authorities, such as the European Aviation Safety Agency (EASA) in Europe or the Federal Aviation Administration (FAA) in the United States. The avionics industry in particular, and the real-time domain in general, are known for being conservative and adopting new technologies only when it becomes inevitable. For the avionics industry, this is motivated by the high cost that any change to the existing functional systems would bring: any change in the software or hardware has to undergo another certification process, which costs the manufacturer money, time, and resources.
Despite their conservative tendency, aircraft manufacturers cannot remain indifferent to the constant change in technology and ignore the performance benefits brought by COTS processors, which nowadays are mainly multi-processors. As a curiosity, most of the microprocessors found in airplanes currently flying have less computing power than a modern home PC; their chipsets are specifically designed for embedded applications characterized by low power consumption, predictability, and many I/O peripherals. In the current context, where critical real-time systems are being invaded by multi-core platforms, WCET analysis using deterministic approaches becomes difficult, if not impossible. The timing constraints of real-time systems need to be verified in the context of certification. This verification, carried out throughout the development cycle, must take into account increasingly complex architectures. These architectures raise the cost and complexity of existing deterministic tools for identifying all possible timing constraints and dependencies that can occur inside the system, at the risk of overlooking extreme cases. An alternative is the probabilistic approach, which is better suited to dealing with these hazards and uncertainties and which allows precise modeling of the system.

2. Contributions. The contribution of the thesis is threefold: the conditions necessary for applying the theory of extremes to execution time measurements, the methods developed using the theory of extremes for analyzing real-time systems, and experimental results.

2.1. Conditions for the use of EVT in the real-time domain. In this chapter we establish the environment in which our work is done. The use of EVT in any domain comes with a series of restrictions on the data being analyzed; in our case, the data consists of execution time measurements.
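The measurement-based practice described above (take the largest observed execution time, add a safety margin) and the extreme-value-theory alternative can be sketched as follows. This is a minimal illustration, assuming a simple block-maxima model with a Gumbel distribution fitted by the method of moments; the thesis details the statistical conditions under which such a fit is valid, and a production MBPTA tool would verify those conditions first.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def block_maxima(samples, block_size):
    """Group execution-time samples into blocks and keep each block's maximum."""
    return [max(samples[i:i + block_size])
            for i in range(0, len(samples) - block_size + 1, block_size)]

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel distribution to the block maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta          # location parameter
    return mu, beta

def pwcet(mu, beta, p):
    """Execution-time bound exceeded with probability p under the Gumbel model."""
    return mu - beta * math.log(-math.log(1.0 - p))
```

Where the deterministic industrial practice multiplies the largest observed time by an ad-hoc margin, the probabilistic bound `pwcet(mu, beta, 1e-9)` attaches an explicit exceedance probability to the estimate.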
733

ATLAS: Search for Supersymmetry and optimization of the High Granularity Timing Detector / ATLAS : recherche de la supersymétrie et optimisation du détecteur de temps fortement segmenté

Allaire, Corentin 27 September 2019
The Standard Model of particle physics has so far been extremely successful in describing the elementary particles and their interactions. Nevertheless, open questions remain. Whether Supersymmetry can answer some of them is currently being studied in 13 TeV proton-proton collisions within the ATLAS experiment at the LHC. This thesis presents a search in ATLAS for the pair production of colored particles decaying into pairs of jets, using the data collected in 2016, 2017, and 2018. Such particles escape standard Supersymmetry searches because of the absence of missing transverse energy in the final state. Two signatures were considered: the decay of stops via R-parity-violating couplings, and the production of the sgluon, the scalar partner of the gluino. In the absence of a signal, an improvement of 200 GeV on the highest excluded mass is expected. The HL-LHC will increase the delivered integrated luminosity, allowing searches for more massive particles and improved precision measurements of the Standard Model. The instantaneous luminosity will increase by a factor of 5, and an integrated luminosity of 4000 fb⁻¹ should be reached by the end of the LHC in 2037. This thesis also presents a study, performed with SFitter, of the prospects for measuring Higgs couplings at the HL-LHC. It shows that, within both the Delta and EFT frameworks, the increased luminosity improves the precision of the coupling measurements. Finally, the High Granularity Timing Detector, which will be installed in ATLAS for the HL-LHC, is presented. The simulation of this detector was developed to take its time resolution into account and was used to optimize its geometry.
The performance of this detector was studied: more than 80% of tracks have their times correctly assigned, with a resolution of 20 ps before irradiation and 50 ps after. Using the timing information, electron isolation can be improved by 10%. / The Standard Model of particle physics has been extremely successful in describing the elementary particles and their interactions. Nevertheless, there are open questions that are left unanswered. Whether Supersymmetry can provide answers to some of these is being studied in 13 TeV proton-proton collisions in the ATLAS experiment at the LHC. In this thesis, a search for pair-produced colored particles in ATLAS decaying into pairs of jets, using data from 2016, 2017, and 2018, is presented. Such particles would escape standard Supersymmetry searches due to the absence of missing transverse energy in the final state. Stops decaying via an R-parity-violating coupling and sgluons, scalar partners of the gluino, were considered. In the absence of a signal, an improvement of 200 GeV on the limit on the stop mass is expected. The HL-LHC will increase the integrated luminosity delivered to probe even higher mass ranges as well as to improve the precision of Standard Model measurements. The instantaneous luminosity will be increased by a factor of 5, and an integrated luminosity of 4000 fb⁻¹ should be reached by the end of the LHC in 2037. A study of the Higgs coupling measurement prospects at the HL-LHC using SFitter is performed; within the Delta and EFT frameworks, it shows that the increase in luminosity will result in a significant improvement in the precision of the coupling measurements. Finally, the High Granularity Timing Detector will be installed in ATLAS for the HL-LHC. A simulation of the detector that takes the timing resolution into account was developed and used to optimize its layout. The detector performance was studied.
More than 80 % of the tracks have their time correctly reconstructed with a resolution of 20 ps before irradiation and 50 ps after. Using the timing information, the electron isolation efficiency is improved by 10 %.
734

The Effect of Explicit Timing on Math Performance Using Interspersal Assignments with Students with Mild/Moderate Disabilities

Hou, Fangjuan 01 May 2010
Explicit timing and interspersal assignments have been validated as effective methods to facilitate students' math practice, but no researchers have explored the combined effect of these two methods. In Study 1, we extended the literature by comparing explicitly timed interspersal assignments with untimed interspersal assignments. Generally, participants' rate of digits correct on easy and hard addition problems was higher during the explicit timing condition than during the untimed condition. However, participants' rate of digits correct decreased after the initial implementation of the explicit timing condition. Motivation plays a crucial role in maintaining performance levels and helping students make continuous progress, and preferred reinforcers and academic targets have been widely utilized as motivational components to increase the likelihood of a strategy succeeding in school settings. In Study 2, we employed a brief multiple-stimulus-without-replacement (MSWO) reinforcer assessment to identify each student's low- and high-preference reinforcers, and examined the effects of explicit timing on interspersed assignments combined with high- or low-preference reinforcers and academic targets. In general, explicit timing combined with preferred reinforcers and academic targets produced a more sustainable effect on participants' rate of digits correct than explicit timing alone. In addition, high-preference reinforcers were more effective than low-preference reinforcers for three of five participants; for the other two, an increasing trend was observed when low-preference reinforcers were contingent on meeting academic targets. These results are discussed relative to using preference assessments with students with mild/moderate disabilities.
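The dependent measure in this line of research, rate of digits correct, is conventionally scored digit by digit against the correct answer and divided by the length of the timed interval. A minimal sketch of that scoring, assuming the common curriculum-based measurement convention of aligning digits from the ones place (the study's exact rubric may differ):

```python
def digits_correct(answer, correct):
    """Count digits in a student's answer that match the correct answer,
    comparing digit by digit from the right (ones place first)."""
    a, c = str(answer)[::-1], str(correct)[::-1]
    return sum(1 for x, y in zip(a, c) if x == y)

def digits_correct_per_minute(scored_pairs, minutes):
    """Rate of digits correct: total matching digits over a timed interval."""
    total = sum(digits_correct(a, c) for a, c in scored_pairs)
    return total / minutes
```

For example, an answer of 124 against a correct 123 earns two digits correct (hundreds and tens places), so partial accuracy still contributes to the rate.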
735

User Choice in Elderly Care in Sweden: Quality, Cost, and Covid-19

Westin, Karolina January 2021
This thesis investigates the impacts of user choice in Swedish elderly care on quality and cost, as well as the impact of marketisation on the Covid-19 death toll. Over the last three decades, welfare service provision in Sweden has been increasingly marketised. Since 2009, Swedish municipalities have been able to introduce user choice in elderly care, and it has been widely adopted in home care. To investigate the impact of introducing user choice, new insights from the econometrics literature are used to estimate a staggered Difference-in-Differences model on panel data covering the years 2003–2019 and all 290 Swedish municipalities. The impact of marketisation on the Covid-19 death toll is estimated by Ordinary Least Squares on a cross-sectional data set. There are three main findings. (i) The impact of introducing user choice on quality and cost has been heterogeneous across adoption groups, calendar time, and length of exposure to treatment; hence, the standard Difference-in-Differences approach is likely to produce biased estimates in this setting. (ii) The introduction of user choice has no clear effect on non-contractible quality, measured by mortality rate and fall accidents, nor on cost. However, user choice has increased subjective quality, as measured by user satisfaction. (iii) A higher degree of marketisation in home care is associated with a higher Covid-19 death toll among those who had home care.
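The staggered Difference-in-Differences design builds on the canonical two-group, two-period contrast, extended to one effect per adoption cohort and calendar year. A minimal sketch of such a group-time treatment effect, in the spirit of the heterogeneity-robust estimators the thesis draws on, using not-yet-treated municipalities as controls; the function and data layout are illustrative assumptions, not the thesis's actual implementation:

```python
def att_group_time(panel, adoption, cohort, year):
    """ATT(g, t): average treatment effect at `year` for units that adopted
    user choice in `cohort`. Compares each group's outcome change from the
    last pre-treatment year (cohort - 1) to `year`, using units that are
    never treated or not yet treated at `year` as the control group.

    `panel` maps (unit, year) -> outcome; `adoption` maps unit -> adoption
    year (None for never-treated units)."""
    base = cohort - 1
    treated = [u for u, g in adoption.items() if g == cohort]
    controls = [u for u, g in adoption.items() if g is None or g > year]
    d_treat = sum(panel[(u, year)] - panel[(u, base)] for u in treated) / len(treated)
    d_ctrl = sum(panel[(u, year)] - panel[(u, base)] for u in controls) / len(controls)
    return d_treat - d_ctrl
```

Averaging ATT(g, t) over cohorts and exposure lengths then recovers summary effects without the bias that a single two-way fixed-effects coefficient can suffer under staggered adoption.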
736

The Effect of Cotton Growth Stage on Injury and Yield Effects When Exposed to Sub-Lethal Concentrations of the Auxinic Herbicides 2,4-D and Dicamba

Buol, John Tyler 06 May 2017
Seed companies have developed novel weed control technologies to combat herbicide-resistant (HR) weeds, based on new genetically modified (GM) crop cultivars and auxin herbicide formulations. These herbicides can variably affect the growth and yield of susceptible cotton even at low concentrations, depending on the growth stage at exposure. As such, research was conducted at two locations in Mississippi in each of 2014, 2015, and 2016 to determine the cotton growth stage most susceptible to injury and yield effects from simulated misapplications of sub-lethal 2,4-D or dicamba concentrations. Results indicate that, generally, a decrease in yield partitioned to lower nodes and inner positions was accompanied by a compensatory increase in yield partitioned to vegetative branches and aborted terminals. However, the magnitude of these yield effects differed based on the growth stage at exposure and on which herbicide was used.
737

Robust Stationary Time and Frequency Synchronization with Integrity in Support of Alternative Position, Navigation, and Timing

Smearcheck, Matthew A. 13 June 2013
No description available.
738

Meal Patterns and Practical Applications for Obesity Management

Good, Matthew F. 15 May 2008
No description available.
739

Polymer Coated Urea in Kentucky Bluegrass

Buss, Jessica Chelise 01 March 2016
Nitrogen (N) is the most commonly over-applied nutrient in urban environments because of the large visual and growth responses it produces. This over-application has led to increased losses of N gas in the forms of ammonia and nitrous oxide, as well as increased nitrate leaching to surface water and groundwater. Furthermore, excess N increases maintenance costs and landfill volume through the greater shoot growth removed as mowed clippings. Polymer coated urea (PCU) has proven to be an excellent source for reducing these losses of N to the environment, but rate and timing parameters need study. A two-year field study on sand and sandy loam soils in Provo, UT, was initiated in April 2014. Seven fertilized treatments were compared to an untreated control: urea split-applied monthly; a single application of PCU (Agrium One Ap) in spring; a single PCU application in fall; two evenly split PCU applications in spring and late summer; and three evenly split PCU applications in spring, late summer, and late fall, with the two-application treatment also applied at half and three-quarter rates in addition to the full rate. Height and verdure measurements were taken weekly, along with periodic visual and biomass readings. All fertilized treatments resulted in a significant response to N compared to the control. The single annual application treatments had significantly greater shoot growth during the weeks immediately after application and a significant reduction in verdure months later and, therefore, were unacceptable for consumer recommendation. Two applications of PCU, at either the three-quarter or the full rate, were nearly identical in all measurements to the monthly spoon feeding of urea. The half rate of two applications showed signs of inadequate N. Three applications of PCU were identical to two and, therefore, are not recommended.
This study shows that two applications of PCU at the three-quarter rate are as effective as spoon feeding the N, and would require less labor for fertilization. Further work is needed to evaluate other timing approaches for a single annual application, as well as the long-term effects of a reduced rate of N.
740

Effect of Initial Scarification and Overlay Treatment Timing on Chloride Concentrations in Concrete Bridge Decks

Nolan, Curtis Daniel 19 November 2008
Considering the pervasive presence of chlorides in concrete bridge decks, bridge engineers have a critical responsibility to perform proper and effective preventive maintenance and rehabilitation. Engineers often perform scarification and overlay (SO) procedures on concrete bridge decks to minimize the corrosion of reinforcing steel due to chloride ingress. Given the need for guidelines on the initial timing of SO treatments, the specific objectives of this research were to collect information from personnel at several state departments of transportation (DOTs) about their SO procedures and, subsequently, to determine the recommended timing of initial SO procedures on concrete bridge decks for preventing the accumulation of corrosion-inducing levels of chlorides and extending deck service life. A questionnaire survey of state DOTs was conducted, and numerical modeling of SO treatments was performed for decks both with and without stay-in-place metal forms (SIPMFs). Numerical modeling was performed for each unique combination of variables through a service life of 50 years to determine the recommended initial timing of SO treatment in each case. The results show that, overall, bridge decks without SIPMFs can endure longer delays in SO treatment timing than those with SIPMFs; in all cases, the absence of SIPMFs extended the time before an SO treatment was needed. For decks with SIPMFs, the allowable delay in SO timing ranged from 2 to 6 years, while for decks without SIPMFs it ranged from 6 to 18 years. These delays are only 1 to 3 years longer than the allowable delays associated with the surface treatments investigated in previous research. On average, each additional 0.5 in. of OCD allowed 2 additional years of delay before an SO treatment was required in decks with SIPMFs.
In decks without SIPMFs, the presence of a greater OCD had a more pronounced effect on the latest recommended timing of treatment than in the decks with SIPMFs; an average additional delay period of 5 years was obtained with each additional 0.5 in. of OCD in decks without SIPMFs. Together with the findings of this research and the specific properties of the bridge deck under scrutiny, engineers can determine the appropriate timing of rehabilitation procedures to prevent or mitigate corrosion of the steel reinforcement of a bridge deck and ensure the usability of the deck for its intended service life. Although the conditions studied in this research were consistent with bridges located in the state of Utah, bridge decks that exist in similar environments and that are subjected to similar treatments of deicing salts as part of winter maintenance could exhibit similar properties to the decks simulated in this research. Engineers should carefully consider the results of this research and implement proper timing of SO treatments on their respective bridge decks to protect against and minimize the effects of corrosion due to chloride ingress.
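Numerical modeling of chloride ingress of the kind described above is commonly based on Fick's second law of diffusion. A minimal sketch, assuming a one-dimensional explicit finite-difference scheme with a constant surface chloride concentration and a zero-flux bottom boundary; the diffusion coefficient, slab depth, and concentrations below are illustrative placeholders, not values from this research, whose model also accounts for SIPMFs and SO treatments:

```python
def chloride_profile(D, C_s, depth, years, nx=50, dt_frac=0.4):
    """Explicit finite-difference solution of Fick's second law for chloride
    ingress into a deck slab: dC/dt = D * d2C/dx2, with constant surface
    concentration C_s, zero initial chloride, and a zero-flux bottom boundary.
    Returns the chloride concentration at nx evenly spaced depths."""
    dx = depth / (nx - 1)
    dt = dt_frac * dx * dx / D      # satisfies the stability limit dt <= dx^2 / (2D)
    steps = int(years / dt)
    C = [0.0] * nx
    C[0] = C_s                      # exposed deck surface
    for _ in range(steps):
        new = C[:]
        for i in range(1, nx - 1):
            new[i] = C[i] + D * dt / (dx * dx) * (C[i + 1] - 2 * C[i] + C[i - 1])
        new[-1] = new[-2]           # zero flux at the bottom of the slab
        C = new
    return C
```

Comparing the modeled concentration at the rebar depth against a corrosion-initiation threshold, year by year, is one way to locate the latest acceptable treatment time of the kind this research recommends.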
