451

Investigating the integration of ICT in Mathematics and Science in Swaziland classrooms

Simelane, Armstrong Siboniso January 2013 (has links)
The purpose of this study is to investigate the status quo of the integration of Information and Communication Technology (ICT) in Swaziland Mathematics and Science Form Four classrooms. The rationale is to understand the current status in order to propose an appropriate strategy for policy and practice in education. The research design is a survey conducted in the Manzini region, the most central and largest region, with the largest number of schools. The study focused on all schools that had computers for educational purposes. There were 149 participants from 43 schools: all Mathematics, Science, and ICT teachers, as well as head teachers. The study instruments were adapted from the Second Information Technology in Education Study (SITES), undertaken under the auspices of the International Association for the Evaluation of Educational Achievement. Permission to adapt the South African instruments was granted through the Centre for Evaluation and Assessment at the University of Pretoria. Four types of questionnaires were adapted and used, and the data was analysed using descriptive statistics. The study revealed that very little had been done in Swaziland to integrate ICT in Mathematics and Science classrooms. The student-computer ratio was very high (14:1). In terms of ICT literacy, 67% of ICT teachers were diploma holders, 72% of Mathematics teachers and 78% of Science teachers had no ICT qualification, and most learners were not competent. The computers were found to be old, out of date, and lacking maintenance. The lack of a school ICT policy, a possible factor explaining the pedagogical use of ICT, was found to be a major obstacle to the integration of ICT in classrooms, and a major thematic area to be addressed. It is viewed as an obstacle by Mathematics, Science, and ICT teachers, together with their principals. Eliminating this obstacle could help ensure sufficient funding, appropriate knowledge and skills, efficient leadership, proficient teachers, and competent students, and thereby give impetus to ICT integration in Mathematics and Science classrooms. It was concluded that, although Mathematics and Science are core subjects, their teachers do not integrate ICT in their practice. The channelling of more resources by government to ICT education was strongly recommended. / Dissertation (MEd)--University of Pretoria, 2013. / gm2014 / Science, Mathematics and Technology Education / unrestricted
452

Design in globalen Industrien – Ein Blick hinter die Kulissen von Dräger / Design in global industries – a look behind the scenes at Dräger

Glass, Herbert, Willner, Matthias 19 July 2017 (has links) (PDF)
From the introduction: "Dräger is an international leader in medical and safety technology. Founded in Lübeck in 1889, the family company is now in its fifth generation and has developed into a global, publicly listed group. 'Technology for Life' ('Technik für das Leben') is the company's guiding principle. Whether in the operating theatre, in intensive care, with the fire service, or in the emergency medical services: Dräger products protect, support, and save lives. Dräger offers its customers anaesthesia workstations, ventilators for intensive and emergency care, patient monitoring, and medical care for premature babies and newborns, among other products. With ceiling supply units, IT solutions for the operating theatre, and gas management systems, the company supports its customers throughout the entire hospital. Fire services, emergency medical services, public authorities, and industry rely on Dräger's holistic hazard management, in particular for the protection of people and facilities. This includes respiratory protection equipment, stationary and mobile gas detection systems, professional diving technology, and alcohol and drug testing devices. In addition, Dräger develops tailor-made solutions together with its customers, such as complete fire-training facilities, training concepts, and courses. ..."
453

How can the Principal-Agent Theoretical Framework be used to attain a better understanding of Performance Management in Primary Care?

Rogan, Lisa January 2013 (has links)
Principal-agent theory has been used by researchers to understand relationships among different professional groups and to explain contractual arrangements and performance management between organisations. Use of principal-agent theory in primary care research is limited. This study uses the principal-agent theoretical framework to attain a better understanding of performance management in primary care. The relationship between managers and primary care clinicians is explored, and the main theoretical assumptions of information asymmetry and goal alignment are used to explain the findings. Case study methodology was used to explore the research question. Purposive sampling techniques were used to select participants, based on practice size, percentage of South Asian population, levels of deprivation, and relative performance on selected national productivity performance indicators. Managers were selected based on their corporate responsibility for organisational performance or their role in performance management within the organisation. Twenty-six formal recorded interviews were carried out with directors, managers, and clinicians across NHS East Lancashire and NHS North West. Interviews were transcribed and organised using NVivo 8 software and analysed using matrix analysis techniques. There is confusion between performance measurement and performance management; the terms are often used interchangeably. The NHS tends to focus on things that are easy to measure, at the risk of omitting important aspects of care that are more difficult to capture. Soft performance intelligence, such as training, multidisciplinary teamwork, communication, and shared vision, is just as important as tangible metrics. Three primary care performance management systems emerged from the findings: the Quality and Outcomes Framework (QOF), the Practice Development Framework (PDF), and the Better Care Better Value (BCBV) Productivity Performance Framework. Performance indicators do not always reflect quality outcomes; this can lead to poorer outcomes for patients, especially where incentives are involved. The study addresses the research question through analysis of the following themes: understanding performance management; use of performance management systems; and barriers to performance management in primary care. A common understanding of performance management is required across all stakeholders. Clear strategic direction and consistent interpretation of organisational objectives at all levels are essential to achieve goal alignment. Soft intelligence and qualitative measures need to be considered alongside tangible metrics. Clinical engagement and ownership are key factors in achieving goal alignment. Trust between principal and agent affects the degree to which information asymmetry is overcome. Appropriate use of incentives, good communication, and strategies that overcome the problem of information asymmetry and address 'professional superiority' improve goal alignment. Achieving the right balance between accountability and clinical autonomy is important to ensure governance and financial balance without stifling innovation. Goal alignment depends on the quality and accuracy of information and on robust targets, to avoid misunderstanding and misinterpretation of information and inadvertent use of incentives. Principal-agent theory suggests goal alignment is achieved by applying the underlying assumptions of the theoretical framework.
In complex systems like the English NHS, it is likely that only partial goal alignment will be achieved. The degree of achievement depends on the number and level of alignment of individual factors.
454

Designing a knowledge management architecture to support self-organization in a hotel chain

Kaldis, Emmanuel January 2014 (has links)
Models are incredibly insidious; they slide undetected into discussions and then dominate the way people think. Since information systems (ISs), and particularly knowledge management systems (KMSs), are socio-technical systems, they unconsciously embrace the characteristics of the dominant models of management thinking. Thus, their limitations can often be attributed to the deficiencies of the organizational models they aim to support. Through the case study of a hotel chain, this research suggests that contemporary KMSs in the hospitality sector are still grounded in the assumptions of the mechanistic organizational model, which conceives of an organization as a rigid hierarchical entity governed from the top. Despite recent technological advances in supporting dialogue and participation between members, organizational knowledge is still transferred vertically: from the top to the bottom or from the bottom to the top. A number of limitations still exist in effectively supporting the horizontal transfer of knowledge between the geographically distributed units of an organization. Inspired by the key concepts of the more recent complex systems model, frequently referred to as complexity theories, a Knowledge Management Architecture (KMA) is proposed that aims to re-conceptualize existing KMSs towards conceiving of an organization as a set of self-organizing communities of practice (CoPs). In every such CoP, order is created from the dynamic exchange of knowledge between the structurally similar community members. Thus, the focus of the KMA is placed on systematically capturing for reuse the architectural knowledge created by every initiative for change, and on sharing such knowledge with the rest of the members of the CoP. A KMS was also developed to support the dynamic dimensions that the KMA proposes. The KMS was then applied in the case of the hotel chain, where it brought significant benefits that constitute evidence of an improved self-organizing ability. The previously isolated hotel units residing in distant regions could now easily trace, and also reapply, changes undertaken by other community members. Top management's intervention to promote change was reduced, while the pace of change increased. Moreover, organizational cohesion, the integration of new members, and the level of management alertness were enhanced. The case of the hotel chain is indicative: it is believed that the proposed KMA can also be applicable to geographically distributed organizations operating in other sectors. At the same time, this research contributes to the recent discourse between the fields of IS and complexity by demonstrating how fundamental concepts from complexity, such as self-organization, emergence, and edge-of-chaos, can be embraced by contemporary KMSs.
455

Využití Balanced ScoreCard k měření výkonnosti procesů agentury CzechTrade / Measurement of Process Performance in CzechTrade using Balanced ScoreCard

Achrerová, Gabriela January 2007 (has links)
Theoretical part: measurement of process performance in a quality management system according to ISO 9001. Applied part: description of the agency and its processes, the current BSC methodology, and a suggested new BSC methodology according to the new strategy.
456

Řízení identit v bance / Identity management in the bank

Gustab, Tomáš January 2015 (has links)
The main focus of this thesis is to analyse the identity management of a Czech bank. The author deals with the authentication and authorization of bank employees and the lifecycle of identities, and also suggests operational improvements to the systems that relate to identity management. The aim is to present a view of identity management from every angle, highlight problems, and propose measures to improve the running of identity management. The main research methods are analysis of documents, unstructured interviews with employees, and observation of systems in operation. The benefit of this work lies in the author's opinions and recommendations, which can lead to better-run identity management in the bank overall.
457

An Empirical Investigation of the Willingness of US Intelligence Community Analysts to Contribute Knowledge to a Knowledge Management System (KMS) in a Highly Classified and Sensitive Environment

Hambly, Robert 01 January 2016 (has links)
Since September 11, 2001, the United States Government (USG) has possessed unparalleled capability in terms of dedicated intelligence and information collection assets supporting the analysts of the Intelligence Community (IC). The USG IC has sponsored, developed, and borne witness to extraordinary advances in technology, techniques, and procedures focused on knowledge harvesting, knowledge sharing, and collaboration. Knowledge, within successful (effective and productive) organizations, exists as a commodity: a commodity that can be created, captured, imparted, shared, and leveraged. The research problem that this study addressed is the challenge of maintaining strong organizational effectiveness and productivity through the use of an information technology-based knowledge management system (KMS). The main goal of this study was to empirically assess a model testing the impact of the factors of rewards, power, centrality, trust, collaborative environment, resistance to sharing, ease of using the KMS, organizational structure, and top management support on inducement, willingness to share, and opportunity to contribute knowledge to a KMS, and thereby on knowledge sharing in the highly classified and sensitive environment of the USG IC. This study capitalized on prior literature to measure each of the 15 model constructs. It was conducted with a select group of USG departments and agencies whose primary interest is intelligence operations, soliciting responses from more than 1,000 current and former intelligence analysts of the USG IC using an unclassified, anonymous survey instrument. A total of 525 (52.5%) valid responses were analyzed using the partial least squares (PLS) structural equation modeling (SEM) statistical technique to perform model testing. Pre-analysis data screening was conducted to ensure the accuracy of the data collected and to correct irregularities or errors within the gathered data. The 14 propositions outlined in this research study were tested using PLS-SEM analysis along with reliability and validity checks. The results provide insights into the key factors that shed light on the willingness of US intelligence community analysts to contribute knowledge to a KMS in a highly classified and sensitive environment. Specifically, a knowledge worker's willingness to contribute his or her knowledge to a KMS, along with the opportunity to contribute knowledge, was significant, while inducement was not a significant factor when it comes to knowledge sharing using a KMS in highly classified environments.
458

As estações rádio base na cidade de São Paulo: uma abordagem sobre os riscos e uma contribuição para os sistemas de gerenciamento / Radio base stations in the city of São Paulo: a discussion of risks and a contribution to the management systems

Margarete Ponce Padueli 15 June 2012 (has links)
The development of mobile telecommunication technology gave rise to the massive deployment of RBSs (radio base stations) to meet the growing demand of the last decade. RBSs, in turn, are emitting sources of non-ionizing radiation. Scientific studies concluded in recent years indicate that exposure to non-ionizing radiation, especially in the long term, can produce negative consequences for human health; it therefore becomes important to apply the Precautionary Principle, which evokes caution in situations of risk. Of the main rules in force around the world, not all adopted the precautionary principle in setting their limits; as a result, most of the rules that determine maximum exposure limits were based only on the harmful biological effects of exposure to non-ionizing radiation already known to the scientific community and related exclusively to acute exposure. Thus, given the risk resulting from exposure, another major issue arises: the management of these RBSs. Does a management system for RBSs exist in the city of São Paulo that ensures compliance with the limits and standards defined by local law? In this scenario, this study aims to: (i) conduct a survey of the three public entities responsible for managing the RBSs in São Paulo on the current stage of the management systems practiced by each public entity legally responsible for that function; (ii) develop an analysis of the basic features of the RBS management systems in São Paulo, studying their evolution between 2006 and 2011; and (iii) assess the possibilities for improvement in the current approach to RBS management in São Paulo. This survey starts from data collected in 2006 in São Paulo, which at the time indicated the existence of an incipient and fragile RBS management system on the part of the responsible public agencies (Secretaria Estadual da Saúde, Secretaria Municipal do Verde e Meio Ambiente, and ANATEL). From this scenario, an analysis of the basic features of the RBS management systems in São Paulo is drawn up, studying their evolution between 2006 and 2011. From this evaluation, gaps and opportunities for improving the RBS management systems currently practiced in São Paulo are identified. The methodology consisted of a field survey through structured interviews with representatives of the public agencies responsible for RBS management in São Paulo. Finally, the data are interpreted, overall and individually, so that the results enabled the considerations and proposals made in the concluding chapter of this research.
459

Management of generic and multi-platform workflows for exploiting heterogeneous environments on e-Science

Carrión Collado, Abel Antonio 01 September 2017 (has links)
Scientific workflows (SWFs) are widely used to model applications in e-Science. In this programming model, scientific applications are described as a set of tasks with dependencies among them. During the last decades, the execution of scientific workflows has been successfully performed on the available computing infrastructures (supercomputers, clusters, and grids) using software programs called Workflow Management Systems (WMSs), which orchestrate the workload on top of these computing infrastructures. However, because each computing infrastructure has its own architecture and each scientific application efficiently exploits one of these infrastructures, it is necessary to organize the way in which they are executed, and WMSs need to get the most out of all the available computing and storage resources. Traditionally, scientific workflow applications have been extensively deployed on high-performance computing infrastructures (such as supercomputers and clusters) and grids. In recent years, however, the advent of cloud computing infrastructures has opened the door to using on-demand infrastructures to complement or even replace local infrastructures. New issues have arisen in turn, such as the integration of hybrid resources and the trade-off between infrastructure reutilization and elasticity, all on the basis of cost-efficiency. The main contribution of this thesis is an ad-hoc solution for managing workflows that exploits the capabilities of cloud computing orchestrators to deploy resources on demand according to the workload, and to combine heterogeneous cloud providers (such as on-premise and public clouds) and traditional infrastructures (supercomputers and clusters) to minimize cost and response time. The thesis does not propose yet another WMS, but demonstrates the benefits of integrating cloud orchestration when running complex workflows. It presents experiments with several configurations and multiple heterogeneous backends, using a realistic comparative genomics workflow called Orthosearch, to migrate memory-intensive workload to public infrastructures while keeping other blocks of the experiment running locally. The running time and cost of the experiments are computed and best practices are suggested. / Carrión Collado, AA. (2017). Management of generic and multi-platform workflows for exploiting heterogeneous environments on e-Science [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/86179 / TESIS
460

Use of Hair Cortisol Analysis to Evaluate the Influence of Housing Systems on Long-Term Stress in Horses

Werner, Jade Elizabeth January 2020 (has links)
No description available.
