51

Towards Scalable Performance Analysis of MPI Parallel Applications

Aguilar, Xavier January 2015
A considerable fraction of scientific discovery nowadays relies on computer simulations. High Performance Computing (HPC) provides scientists with the means to simulate processes ranging from climate modeling to protein folding. However, achieving good application performance and making optimal use of HPC resources is a heroic task due to the complexity of parallel software. Therefore, performance tools and runtime systems that help users execute applications in the most optimal way are of utmost importance in the landscape of HPC. In this thesis, we explore different techniques to tackle the challenges of collecting, storing, and using fine-grained performance data. First, we investigate the automatic use of real-time performance data in order to run applications in an optimal way. To that end, we present a prototype of an adaptive task-based runtime system that uses real-time performance data for task scheduling. This runtime system has a performance monitoring component that provides real-time access to the performance behavior of an application while it runs. The implementation of this monitoring component is presented and evaluated within this thesis. Second, we explore lossless compression approaches for MPI monitoring. One of the main problems that performance tools face is the huge amount of fine-grained data that can be generated from an instrumented application. Collecting fine-grained data from a program is the best method to uncover the root causes of performance bottlenecks; however, it is infeasible with extremely parallel applications or applications with long execution times. On the other hand, collecting coarse-grained data is scalable but sometimes not enough to discern the root cause of a performance problem. Thus, we propose a new method for performance monitoring of MPI programs using event flow graphs. Event flow graphs incur very low overhead in terms of execution time and storage size, and can be used to reconstruct fine-grained trace files of application events ordered in time.
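As a loose illustration of the event-flow-graph idea described in this abstract (a minimal sketch, not the thesis's actual implementation), nodes can represent MPI events and weighted edges can count the transitions observed between consecutive events; the ordered event stream can then be replayed from the compressed graph:

```python
from collections import defaultdict

class EventFlowGraph:
    """Per-rank event flow graph: nodes are MPI events, edge weights count
    observed transitions between consecutive events."""

    def __init__(self):
        self.edges = defaultdict(int)  # (event, next_event) -> count
        self.first = None              # first recorded event
        self._prev = None

    def record(self, event):
        # Would be called from an interposed MPI wrapper on every call.
        if self._prev is None:
            self.first = event
        else:
            self.edges[(self._prev, event)] += 1
        self._prev = event

    def replay(self):
        # Greedy walk consuming edge counts; real implementations store
        # extra sequence information to disambiguate branching nodes.
        remaining = dict(self.edges)
        trace, node = [self.first], self.first
        while True:
            nxt = next((b for (a, b), c in remaining.items()
                        if a == node and c > 0), None)
            if nxt is None:
                return trace
            remaining[(node, nxt)] -= 1
            trace.append(nxt)
            node = nxt

g = EventFlowGraph()
for ev in ["MPI_Init", "MPI_Send", "MPI_Recv",
           "MPI_Send", "MPI_Recv", "MPI_Finalize"]:
    g.record(ev)
assert g.replay() == ["MPI_Init", "MPI_Send", "MPI_Recv",
                      "MPI_Send", "MPI_Recv", "MPI_Finalize"]
```

The compression win comes from repetition: a million iterations of the same send/receive pattern add nothing to the graph structure beyond larger edge counts.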
52

The Impact of Remote Work on Performance Monitoring

Gustavsson, Robert, Söderlund, Marcus January 2021
This study examines how the shift from on-site to remote work, driven by digitalization, has affected performance monitoring in organizations and impacted management control systems. Earlier studies on the subject show that such a shift either reduces performance monitoring or replaces classical performance monitoring with electronic performance monitoring (EPM). To examine how performance monitoring changes as a result of the shift, and whether management control has become tighter or looser, this thesis draws on four semi-structured interviews held with managers from three different Swedish organizations. The shift to remote work shows tendencies toward a loosening of control in the organizations, and to compensate for the loss, managers have implemented various EPM systems. The most commonly implemented EPM systems were video meetings, but others, such as online monitoring, were also introduced. Nevertheless, managers still experience an overall loosening of control as a result of the shift to remote work. Trust appears to become more important as a result of the shift and seems to have a mitigating effect on the degree of EPM the managers choose to implement.
53

Die Bedeutung des Thalamus für das menschliche Handlungsüberwachungssystem im fronto-striato-thalamo-corticalen Netzwerk [The role of the thalamus in the human performance-monitoring system within the fronto-striato-thalamo-cortical network]

Seifert, Sebastian 20 September 2012
A functioning performance-monitoring system is an important prerequisite for goal-directed human behavior. It allows errors to be registered and processed so that behavior can subsequently be better adapted to the situation at hand. An important neuroanatomical correlate of this performance-monitoring system is the anterior portion of the midcingulate cortex (anterior midcingulate cortex, aMCC), which is closely linked to the basal ganglia and the lateral prefrontal cortex in error processing. The present work examined more closely the role of the thalamus in the error-processing network. Using diffusion-weighted tractography in 16 healthy subjects, it was shown that the ventral anterior nucleus (VA) and the anterior ventral lateral nucleus (VLa) in particular have quantitatively stronger fiber connections with the aMCC than the remaining thalamic nuclei. Furthermore, 15 patients with thalamic lesions showed error-specific behavioral differences in the Eriksen flanker task compared with a healthy control group. Although the error rate was nearly identical between these patients and the controls, the patients were significantly worse at detecting their errors as such and consequently also worse at adapting their behavior after an error. The EEG data showed, for the patient group, an error-related negativity (ERN, an event-related brain potential elicited by errors, e.g., in flanker tasks) with significantly reduced amplitude compared with the control group. In 6 patients with lesions of the VA/VLa nuclear group, the ERN was almost completely abolished, whereas in the 9 patients whose lesions did not affect VA and VLa, the ERN was merely reduced. / Performance monitoring is an essential prerequisite of successful goal-directed behavior. Research of the last two decades implicates the anterior midcingulate cortex (aMCC) in the human medial frontal cortex and frontostriatal basal ganglia circuits in this function. Here, we addressed the function of the thalamus in detecting errors and adjusting behavior accordingly. Using diffusion-based tractography we found that, among the thalamic nuclei, the ventral anterior and ventral lateral anterior nuclei (VA, VLa) have the relatively strongest connectivity with the aMCC. Patients with focal thalamic lesions showed diminished error-related negativity, behavioral error detection, and post-error adjustments. When the lesions specifically affected the thalamic VA/VLa nuclei, these effects were significantly pronounced, reflected in the complete absence of the error-related negativity. These results reveal that the thalamus, particularly its VA/VLa region, is a necessary constituent of the performance-monitoring network, anatomically well connected and functionally closely interacting with the aMCC.
54

Concentrated network tomography and bound-based network tomography

Feng, Cuiying 17 September 2020
Modern computer networks pose a great challenge for monitoring network performance due to their large scale and high complexity. Directly measuring the performance of internal network elements is prohibitive due to the tremendous overhead. Alternatively, network tomography, a technique that infers unobserved network characteristics (e.g., link delays) from a small number of measurements (e.g., end-to-end path delays), is a promising solution for monitoring the internal network state in an efficient and effective manner. This thesis introduces two variants of network tomography: concentrated network tomography and bound-based network tomography. The former is motivated by the practical need of network operators to concentrate on the performance of critical paths; the latter by the need to estimate performance bounds whenever exact performance values cannot be determined. The thesis tackles core technical difficulties in concentrated network tomography and bound-based network tomography, including (1) the path identifiability problem and the monitor deployment strategy for identifying a set of target paths, (2) strategies for controlling the total error bound as well as the maximum error bound over all network links, and (3) methods of constructing measurement paths to obtain the tightest total error bound. We evaluate all the solutions on real-world Internet service provider (ISP) networks. The theoretical results and the algorithms developed in this thesis are directly applicable to network performance management in various types of networks where directly measuring all links is practically impossible.
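To make the bound-based variant concrete, here is a small hypothetical sketch (the network, paths, and delay values are invented for illustration; this is not code from the thesis). With unknown non-negative link delays x and a routing matrix R, end-to-end measurements give path delays y = Rx; when there are fewer independent paths than links, exact link delays are unidentifiable, but the tightest per-link bounds follow from a pair of linear programs per link:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical network: 3 links, 2 measurement paths.
# Path 0 traverses links {0, 1}; path 1 traverses links {1, 2}.
R = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])   # routing matrix (paths x links)
y = np.array([5.0, 7.0])          # measured end-to-end path delays

def link_delay_bounds(R, y):
    """Tightest [lo, hi] for each link delay consistent with R x = y, x >= 0."""
    n = R.shape[1]
    result = []
    for j in range(n):
        c = np.zeros(n)
        c[j] = 1.0
        lo = linprog(c, A_eq=R, b_eq=y, bounds=[(0, None)] * n).fun
        hi = -linprog(-c, A_eq=R, b_eq=y, bounds=[(0, None)] * n).fun
        result.append((lo, hi))
    return result

for j, (lo, hi) in enumerate(link_delay_bounds(R, y)):
    print(f"link {j}: delay in [{lo:.1f}, {hi:.1f}], error bound {hi - lo:.1f}")
# link 0: [0.0, 5.0]; link 1: [0.0, 5.0]; link 2: [2.0, 7.0]
```

Under this reading, the total error bound is roughly the sum of the per-link interval widths, and constructing measurement paths amounts to choosing rows of R that shrink those widths.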
55

An assessment of the performance management system for senior managers at Chris Hani district municipality

Sotenjwa, Fundiswa Patience January 2021
This study examines the implementation of the Performance Management System (PMS) in local government, with specific reference to municipalities in the Chris Hani District in the Eastern Cape. The research is premised on the assumption that even though a PMS has been adopted in municipalities with the aim of assisting them to function effectively, municipalities in the Eastern Cape, particularly in the Chris Hani District, continue to experience performance challenges. The study presupposes that the implementation of the performance management system at the municipality, whether effective or ineffective, has a direct relationship with the performance of the municipality. The study includes a historical overview of local government, with the aim of understanding the government reforms introduced to help municipalities build the capacity to perform well. It utilises purposive sampling to identify the most appropriate participants based on the research objectives. The data was collected through semi-structured interviews and a review of relevant documents. As part of the analysis, summaries of the interviewees' responses were written up in line with the thematic areas determined by the research objectives. The municipality uses the Balanced Scorecard as a performance management tool to determine the performance level of individuals and to detect areas that need corrective measures across the local municipalities. There are inconsistencies in implementation, depending on how well a particular local municipality is resourced. In any municipality, effective implementation of the PMS requires the municipality to reward excellent performers, which in turn requires increases in the personnel budget to cater for monetary rewards.
56

Meaningful Metrics in Software Engineering : The Value and Risks of Using Repository Metrics in a Company

Jacobsson, Frida January 2023
Many large companies use various business intelligence solutions to filter, process, and visualize their software source code repository data. These tools focus on improving continuous integration and are used to gain insights about people, products, and projects in the organization. However, research has shown that the quality of measurement programs in software engineering is often low, since the science behind them is unexplored. In addition, code repositories contain a considerable amount of information about the developers, and several ethical and legal aspects need to be considered before using these tools, such as compliance with the GDPR. This thesis aims to investigate how companies can use repository metrics and these business intelligence tools in a safe and valuable way. To answer the research questions, a case study was conducted in a Swedish company, and repository metrics from a real business intelligence tool were analyzed against several questions related to software measurement theory, ethical and legal aspects of software engineering and metrics, and institutional theory. The results show how these metrics can be of value to a company in different ways, for instance by visualizing collaboration in a project or by differentiating between read and active repositories. The metrics can also add value when linked to other data in the company, such as bug reports and repository downloads. The findings show that the visualizations could be perceived as a form of performance monitoring by developers, causing stress and unhealthy incentives in the organization. In addition, repository metrics are based on identifiable data from Git, which the GDPR classifies as personal data. Further, there is a risk that these tools are used simply because they are available, as a way to legitimize the company. To mitigate these risks, the thesis argues that the metrics should be anonymized and should focus on teams and processes rather than individual developers, and that the teams themselves should take part in creating the Goal-Question-Metric definitions that link the metrics to what the teams wish to achieve.
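As a rough sketch of the anonymization and team-level aggregation recommended above (hypothetical data and field names, not the tool studied in the thesis), author identities can be pseudonymized with a salted one-way hash and metrics reported per team rather than per developer:

```python
import hashlib
from collections import Counter

# Hypothetical commit records: (author_email, team, files_changed).
commits = [
    ("alice@example.com", "team-blue",  3),
    ("bob@example.com",   "team-blue",  1),
    ("carol@example.com", "team-green", 5),
    ("alice@example.com", "team-blue",  2),
]

def pseudonymize(email: str, salt: str = "rotate-this-secret") -> str:
    # Salted one-way hash so dashboards never show raw identities.
    # Under the GDPR this is pseudonymization, not full anonymization,
    # so the salt must be kept secret (and ideally rotated).
    return hashlib.sha256((salt + email).encode()).hexdigest()[:8]

# Team-level aggregation: report activity per team, never per developer.
commits_per_team = Counter(team for _, team, _ in commits)
print(commits_per_team)   # Counter({'team-blue': 3, 'team-green': 1})

# Where individual granularity is unavoidable, store only pseudonyms.
pseudonymized = [(pseudonymize(a), t, f) for a, t, f in commits]
print(pseudonymized[0])
```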
57

PathCase SB: Automating Performance Monitoring And Bugs Detection

Azzam, Yves Said 24 August 2012
No description available.
58

True-time all optical performance monitoring by means of optical correlation

Abou-Galala, Feras Moustafa 06 June 2007
No description available.
59

Analytics adoption in manufacturing – benefits, challenges and enablers

Cupertino Ribeiro, Junia January 2022
Digitalisation is changing the manufacturing landscape, with promises to enhance industrial competitiveness through new technologies and business approaches. Various data-driven applications, enabled by digital technologies, can support process monitoring, production quality control, smart planning, and optimisation by making relevant data available and accessible to different roles in production. In this context, analytics is a relevant tool for improved decision-making in production activities, since it entails extracting insights from data to create value for decision-makers. However, previous research has identified a lack of guidelines for managing the technological implementation needed for analytics. Furthermore, there are few studies in real manufacturing settings that describe how companies are exploiting analytics. To address this gap, the purpose of this study is to investigate the implementation and use of analytics for production activities in the manufacturing industry. To fulfil this purpose, the following research questions were formulated: RQ1: What does the adoption of analytics look like, and what results can it bring to the production activities of a manufacturing company? RQ2: What are the challenges and enablers for analytics adoption in the production activities of a manufacturing company? The study was based on a literature review in addition to a single case study at a large multinational machinery manufacturing company. Data collection included observations and semi-structured interviews covering three analytics use cases: production performance follow-up, production disturbance tracking, and production planning and scheduling. The first use case was based on the Design Thinking process and tools, while the other two were narrower in scope and did not cover the development process in detail. Qualitative data analysis was used to examine the empirical and theoretical data. The empirical findings indicate that analytics solutions for production activities need not be sophisticated, highly automated, or complex to bring meaningful value to manufacturing companies. The three analytics use cases investigated improved the effectiveness and efficiency of production performance follow-up, production disturbance tracking, and production planning and scheduling. The main contributor to these benefits was a higher level of transparency in factory manufacturing operations, which in turn aids collaboration, preventive decision-making, prioritization, and better resource allocation. The identified challenges for analytics adoption related to information systems and to people and organization. To address these challenges, the study suggests that manufacturing companies should focus on securing sponsorship from senior management and leadership, implementing cultural change to embrace fact-based decisions, training the existing workforce in analytics skills, and empowering and recruiting people with digital skills. Moreover, it is recommended that manufacturing companies integrate information systems vertically and horizontally, link and aggregate data to deliver contextualised information to different roles, and invest in data-related Industry 4.0 technologies to capture, transfer, store, and process manufacturing data efficiently.
60

EMPIRICALLY-BASED INTERVENTIONS FOR ERROR MONITORING DEFICITS IN DEMENTIA

Bettcher, Brianne Magouirk January 2010
The diminished ability to perform everyday tasks is a salient problem for individuals diagnosed with dementia. Recent research suggests that dementia patients detect significantly fewer action errors than age-matched controls; however, very little is known about the origin of their error-monitoring difficulties. The primary aims of my dissertation were to evaluate a novel task-training action intervention (TT-NAT) designed to increase error monitoring in dementia patients, and to pinpoint the relation between error monitoring and neuropsychological processes in participants who received the task-training intervention. Results indicated that dementia participants in the TT-NAT condition produced fewer total errors and detected significantly more of their errors than individuals in the Standard condition (z = 3.0 and t = 3.36, respectively; p < .05). Error detection in the TT-NAT condition was strongly related to the language/semantic knowledge composite index only (r = .57, p = .00), whereas it was moderately related to both the language and executive composite indices in the Standard condition. No differences in error correction rates were noted, although patients in all groups corrected the majority of the errors they detected. The findings suggest that the TT-NAT may be a promising intervention for error-monitoring deficits in dementia patients and have considerable implications for neuropsychological rehabilitation.
