About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
681

From machine learning to learning with machines: remodeling the knowledge discovery process

Tuovinen, L. (Lauri) 19 August 2014 (has links)
Abstract Knowledge discovery (KD) technology is used to extract knowledge from large quantities of digital data in an automated fashion. The established process model represents the KD process in a linear and technology-centered manner, as a sequence of transformations that refine raw data into more and more abstract and distilled representations. Any actual KD process, however, has aspects that are not adequately covered by this model. In particular, some of the most important actors in the process are not technological but human, and the operations associated with these actors are interactive rather than sequential in nature. This thesis proposes an augmentation of the established model that addresses this neglected dimension of the KD process. The proposed process model is composed of three sub-models: a data model, a workflow model, and an architectural model. Each sub-model views the KD process from a different angle: the data model examines the process from the perspective of different states of data and transformations that convert data from one state to another, the workflow model describes the actors of the process and the interactions between them, and the architectural model guides the design of software for the execution of the process. For each of the sub-models, the thesis first defines a set of requirements, then presents the solution designed to satisfy the requirements, and finally, re-examines the requirements to show how they are accounted for by the solution. The principal contribution of the thesis is a broader perspective on the KD process than what is currently the mainstream view. The augmented KD process model proposed by the thesis makes use of the established model, but expands it by gathering data management and knowledge representation, KD workflow and software architecture under a single unified model. Furthermore, the proposed model considers issues that are usually either overlooked or treated as separate from the KD process, such as the philosophical aspect of KD. The thesis also discusses a number of technical solutions to individual sub-problems of the KD process, including two software frameworks and four case-study applications that serve as concrete implementations and illustrations of several key features of the proposed process model. / Tiivistelmä Tiedonlouhintateknologialla etsitään automoidusti tietoa suurista määristä digitaalista dataa. Vakiintunut prosessimalli kuvaa tiedonlouhintaprosessia lineaarisesti ja teknologiakeskeisesti sarjana muunnoksia, jotka jalostavat raakadataa yhä abstraktimpiin ja tiivistetympiin esitysmuotoihin. Todellisissa tiedonlouhintaprosesseissa on kuitenkin aina osa-alueita, joita tällainen malli ei kata riittävän hyvin. Erityisesti on huomattava, että eräät prosessin tärkeimmistä toimijoista ovat ihmisiä, eivät teknologiaa, ja että heidän toimintansa prosessissa on luonteeltaan vuorovaikutteista eikä sarjallista. Tässä väitöskirjassa ehdotetaan vakiintuneen mallin täydentämistä siten, että tämä tiedonlouhintaprosessin laiminlyöty ulottuvuus otetaan huomioon. Ehdotettu prosessimalli koostuu kolmesta osamallista, jotka ovat tietomalli, työnkulkumalli ja arkkitehtuurimalli. Kukin osamalli tarkastelee tiedonlouhintaprosessia eri näkökulmasta: tietomallin näkökulma käsittää tiedon eri olomuodot sekä muunnokset olomuotojen välillä, työnkulkumalli kuvaa prosessin toimijat sekä niiden väliset vuorovaikutukset, ja arkkitehtuurimalli ohjaa prosessin suorittamista tukevien ohjelmistojen suunnittelua. 
Väitöskirjassa määritellään aluksi kullekin osamallille joukko vaatimuksia, minkä jälkeen esitetään vaatimusten täyttämiseksi suunniteltu ratkaisu. Lopuksi palataan tarkastelemaan vaatimuksia ja osoitetaan, kuinka ne on otettu ratkaisussa huomioon. Väitöskirjan pääasiallinen kontribuutio on se, että se avaa tiedonlouhintaprosessiin valtavirran käsityksiä laajemman tarkastelukulman. Väitöskirjan sisältämä täydennetty prosessimalli hyödyntää vakiintunutta mallia, mutta laajentaa sitä kokoamalla tiedonhallinnan ja tietämyksen esittämisen, tiedon louhinnan työnkulun sekä ohjelmistoarkkitehtuurin osatekijöiksi yhdistettyyn malliin. Lisäksi malli kattaa aiheita, joita tavallisesti ei oteta huomioon tai joiden ei katsota kuuluvan osaksi tiedonlouhintaprosessia; tällaisia ovat esimerkiksi tiedon louhintaan liittyvät filosofiset kysymykset. Väitöskirjassa käsitellään myös kahta ohjelmistokehystä ja neljää tapaustutkimuksena esiteltävää sovellusta, jotka edustavat teknisiä ratkaisuja eräisiin yksittäisiin tiedonlouhintaprosessin osaongelmiin. Kehykset ja sovellukset toteuttavat ja havainnollistavat useita ehdotetun prosessimallin merkittävimpiä ominaisuuksia.
682

Corporate publishing in South African banks : focus on formal, external publications

Mostert, Aleta 06 December 2004 (has links)
“What constitutes corporate publishing?” is the question that motivated the research for this study. It is not easily defined, but can be contextualised as part of the communications and marketing strategy of an organisation. In essence it entails the conceptualisation, planning and realisation of professional publications in an organisation. By conducting interviews with publishing personnel in selected South African banks, best practices pertaining to corporate publishing structures and processes were derived. It was found that traditional book publishing activities, such as commissioning; planning and creating content; reviewing, copy-editing and proofreading; design and layout; production; marketing; printing; and distribution can be used as a basis for a corporate publishing venture. The convergence of media, however, is challenging publishers to rethink traditional methods of publishing. Electronic publishing is opening new vistas for organisations, as it is an efficient tool for them to build and strengthen their corporate identity and to reach wider markets. To accommodate electronic dissemination, the adoption of an integrated, parallel publishing workflow is proposed in the study. Utilising a single source document for creating multiple formats enhances the publishing process and ensures the longevity of information. In order to draw all the publishing activities in an organisation together in a consistent and cohesive way, a centralised publishing strategy seems to be the most effective solution. The golden thread running through this study is the important role of corporate publishers as service providers in information-rich organisations. / Dissertation (MA (Publishing))--University of Pretoria, 2005. / Information Science / unrestricted
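As an illustration of the single-source, parallel-publishing idea recommended above, the Python sketch below renders one structured source document to two output formats. The document structure, field names and output channels are hypothetical assumptions, not taken from the dissertation; in a real corporate publishing setting the same idea scales to XML- or Markdown-based sources feeding web, print and e-mail channels.

```python
# Minimal sketch of single-source, multi-format publishing (hypothetical example).
# One structured source document is rendered to both HTML and plain text,
# so content is written once and reused across output channels.

source_doc = {  # assumed structure; not from the dissertation
    "title": "Annual Client Newsletter",
    "sections": [
        {"heading": "Market Overview", "body": "Interest rates remained stable."},
        {"heading": "New Services", "body": "Online banking was extended."},
    ],
}

def render_html(doc: dict) -> str:
    """Render the single source document as HTML for the web channel."""
    parts = [f"<h1>{doc['title']}</h1>"]
    for sec in doc["sections"]:
        parts.append(f"<h2>{sec['heading']}</h2><p>{sec['body']}</p>")
    return "\n".join(parts)

def render_text(doc: dict) -> str:
    """Render the same source as plain text for print or e-mail."""
    lines = [doc["title"], "=" * len(doc["title"])]
    for sec in doc["sections"]:
        lines += ["", sec["heading"], "-" * len(sec["heading"]), sec["body"]]
    return "\n".join(lines)

if __name__ == "__main__":
    print(render_html(source_doc))
    print(render_text(source_doc))
```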
683

Podpora webových služeb v prostředí .NET framework / Web services support in the .NET framework

Fischer, Roman January 2008 (has links)
This work targets a very important part of service-oriented architecture: web services. It focuses on one practical implementation, that of Microsoft, especially in the last two releases of the .NET Framework. The main goal of the work is a comprehensive summary and demonstration of web service development on the Microsoft platform. The principles of web services and their foundations in the .NET Framework are described. At the beginning of the work, web services are placed in the context of service-oriented architecture and their main principles are explained. Subsequently, the .NET Framework and its main principles are introduced and compared with the Java technology. This part concludes with an explanation of classic ASMX web services. The main part of the work gives a detailed description of the theoretical and practical aspects of web services in the context of Windows Communication Foundation and Windows Workflow Foundation. The development of Windows Forms applications is also covered, because they too have progressed considerably and are often consumers of web services. The options for securing web services in the .NET Framework, including authentication and authorization, are shown both theoretically and practically, as are methods for achieving transactional behavior. The work also covers the orchestration of web services, including an explanation of workflows with a deployment example. Finally, it shows how to guarantee interoperability with other technologies, how to manage web services, and how to handle their versioning.
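To make the interoperability point concrete from the consumer side, the hedged sketch below calls a SOAP service (such as one exposed by ASMX or WCF) from Python using the third-party zeep library. The WSDL URL and the GetOrderStatus operation are invented placeholders, not services from the thesis; any real service would publish its own WSDL contract.

```python
# Hypothetical sketch: consuming a SOAP web service (e.g. one hosted by WCF/ASMX)
# from Python via the zeep library. The URL and operation name are invented
# placeholders used only to show the calling pattern.
from zeep import Client

WSDL_URL = "http://example.org/OrderService.svc?wsdl"  # assumed endpoint

def get_order_status(order_id: int) -> str:
    client = Client(WSDL_URL)                                  # download and parse the WSDL
    result = client.service.GetOrderStatus(orderId=order_id)   # assumed operation
    return str(result)

if __name__ == "__main__":
    print(get_order_status(42))
```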
684

Automatizace digitalizačního workflow NTK / Automatization of the digitization workflow of the National Library of Technology

Řihák, Jakub January 2013 (has links)
This diploma thesis focuses on the automation of the digitization workflow at the National Library of Technology, Prague, Czech Republic. It examines the possibilities of automating digitization processes by means of scripts written in the Perl programming language and in the Apache Ant build tool. The advantages and disadvantages of both solutions are analyzed, as well as their suitability for automating the digitization workflow. Based on the comparison, scripts in Perl are selected as the more suitable solution. The thesis also answers the question of whether the Ant build tool could be used for this purpose: it could, but once the activities in the given process diverge from the general scope of tasks that Ant provides, the complexity of the whole solution increases rapidly. This complexity stems from the need to use predefined tasks -- sets of functions that have to be combined to create a functional automation script. Even though Ant is an extensible tool, creating a new Ant task requires an understanding of the Java programming language. The Perl programming language, on the other hand, allows easier customization of the scripts for the purposes of the digitization workflow, and its modularity makes the scripts easier to create, modify, correct and develop further.
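The thesis implements its automation in Perl; purely as an illustration of the kind of batch step such scripts perform, here is a minimal sketch in Python. The directory layout, file extension and manifest name are assumptions for the example, not the NTK's actual workflow.

```python
# Illustrative sketch of one automated digitization-workflow step:
# walk a batch directory of scanned images, compute MD5 checksums,
# and write a manifest file. Paths and naming are hypothetical.
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """Return the MD5 checksum of a file, read in chunks."""
    md5 = hashlib.md5()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            md5.update(chunk)
    return md5.hexdigest()

def write_manifest(batch_dir: Path) -> Path:
    """Create an MD5 manifest for every TIFF scan in the batch directory."""
    manifest = batch_dir / "checksums.md5"   # assumed manifest name
    with manifest.open("w", encoding="utf-8") as out:
        for tiff in sorted(batch_dir.glob("*.tif")):
            out.write(f"{checksum(tiff)}  {tiff.name}\n")
    return manifest

if __name__ == "__main__":
    print(write_manifest(Path("batch_0001")))  # hypothetical batch folder
```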
685

Investigation of wireless sensor nodes with energy awareness for multichannel signal measurement

Zhu, Zhenhuan January 2015 (has links)
Wireless Sensor Networks (WSNets), consisting of many Wireless Sensor Nodes (WSNs), play an important role in structural health and machine condition monitoring. However, the WSNs currently available on the market cannot meet the diversity of application requirements, because they have limited functions, unreliable node performance, high node cost, high system redundancy, and short node lifespan. The aim of the research is to design the architecture of a WSN with low power consumption and low node cost which can be dynamically configured according to application requirements for structural health and machine condition monitoring. The research investigates improving node performance and reliability through new design methodologies, and extending node lifespan by interfacing energy harvesters and implementing node power management. The main contributions of the research are as follows:

1. Model development of node architecture for application diversity. The merits of the model include: (1) the proposed node architecture can be dynamically configured according to application requirements, reducing system redundancy, power consumption and cost; (2) it supports multichannel signal measurement with synchronous and asynchronous signal sampling modules and three interface circuits; (3) the model parameters can be calculated; (4) as the model is based on discrete electronic components, it can be implemented using Components-Off-The-Shelf (COTS).

2. A novel pipeline design of the built-in ADC inside a microprocessor. The merit of the proposed pipeline solution is that the sampling time of the built-in ADCs is reduced to one third of the original value when the ADC operates in sequential sampling mode for multichannel signal measurement.

3. Self-adjusting measurement of sampled signal amplitude. This work provides a novel method to avoid distortion of the sampled signals even when the environmental signal changes randomly and exceeds the sampling range of the node ADC. The proposed method can be implemented with four different solutions.

4. Interface design to support energy harvesting. The proposed interface allows the node to: (1) collect as much of the paroxysmal ambient energy as possible; (2) store energy in a distribution super-capacitor array; (3) harvest electrical energy at high voltage using piezoelectric materials without any transformer; (4) support a diversity of energy transducers; and (5) perform with high conversion efficiency.

5. A new network task scheduling model for the node wireless transceiver. The model allows: (1) calculating node power consumption according to the network task scheduling; (2) obtaining the optimal policy for scheduling network tasks.

6. A new work-flow model for a WSN. The model provides an easy way to (1) calculate node power consumption according to the work flow inside a WSN (a rough illustrative sketch follows this abstract); (2) take full advantage of the power modes of node electronic components rather than outside factors; (3) effectively improve node design.
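As an illustration only of the work-flow-based power estimate in contribution 6, node energy per duty cycle can be approximated by summing voltage × current × dwell time over the power modes the node passes through. The mode names, currents and durations in the Python sketch below are invented placeholder values, not measurements or models from the thesis.

```python
# Rough illustration (hypothetical numbers) of estimating wireless-sensor-node
# energy per duty cycle from a work-flow model: each workflow state has a
# supply current and a dwell time, and energy is the sum of V * I * t terms.

SUPPLY_VOLTAGE = 3.0  # volts, assumed

# (state, current in milliamps, duration in milliseconds) -- invented values
WORKFLOW = [
    ("wake_up",    1.5,    2.0),
    ("sample_adc", 4.0,   10.0),
    ("process",    6.0,    5.0),
    ("transmit",  18.0,    8.0),
    ("sleep",      0.005, 975.0),
]

def energy_per_cycle_mj(workflow, voltage=SUPPLY_VOLTAGE) -> float:
    """Energy per duty cycle in millijoules: sum of V * I[mA] * t[ms] / 1000."""
    return sum(voltage * i_ma * t_ms / 1000.0 for _, i_ma, t_ms in workflow)

if __name__ == "__main__":
    e = energy_per_cycle_mj(WORKFLOW)
    print(f"Estimated energy per 1 s duty cycle: {e:.3f} mJ")
```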
686

Från analogt till digitalt : digitaliseringen av svensk radiologi i ett produktions- och organisationsperspektiv

Selim, Marianne January 2015 (has links)
Syfte: Att beskriva den organisationsförändring som fem röntgenkliniker i Sverige genomgick i samband med att radiologin digitaliserades. Frågeställningar: Hur förändrades antalet genomförda radiologiska undersökningar och undersökningstyper från tidpunkten två år före digitaliseringen till två, fyra och sex år efter digitaliseringen? Hur förändrades antalet arbetade timmar och fördelningen av arbetade timmar mellan de olika personalkategorierna från tidpunkten två år före digitaliseringen till två, fyra och sex år efter digitaliseringen? Hur förändrades arbetsuppgifterna inom och mellan de olika personalkategorierna på röntgenklinikerna efter digitaliseringen? Hur beskriver personal med en nyckelfunktion genomförandet av digitaliseringen och eventuell förändring av arbetet efter digitaliseringen? Metod: Kvantitativ och kvalitativ metod har använts och kombinerats för att besvara studiens frågeställningar. Fem svenska röntgenkliniker ingick i studien och material avseende produktionsutfall samlades in från respektive röntgenklinik. Totalt intervjuades 22 personer, med tre till sex intervjuer per röntgenklinik. Teoretiska ramverk: För att försöka förstå de positiva fynden, avseende utökade antalet undersökningar och minskning av arbetade timmar, efter digitaliseringen, genomfördes vid en av röntgenklinikerna en fallstudie med ytterligare två frågeställningar: Hur genomfördes digitaliseringen? Hur har uppföljningen vad avser arbetsflöden, genomförandet och förändringar i detalj genomförts efter digitaliseringen? Delar av Bramson m.fl. (2005) perspektiv och Kotter och Cohens (2002) framgångsfaktorer har applicerats vid analysen av det resultatet. Resultat: Samtliga röntgenkliniker uppvisade en liten ökning av antalet radiologiska undersökningar, medan datortomografi- och magnetkameraundersökningarna ökade markant under den studerade tiden. Efter digitaliseringen genomförs således fler komplicerade undersökningar, vilket anses ha medfört en kvalitativt bättre och säkrare diagnosticering. Övriga undersökningstyper uppvisar varierande resultat. Vid tre av röntgenklinikerna minskade den totala arbetade tiden, och vid två ökade den. Antalet arbetade timmar ökade för röntgenläkare och röntgensjuksköterskor, men minskade för sekreterare och undersköterskor under den studerade tiden. Generellt utför röntgenläkarna och röntgensjuksköterskorna fler arbetsuppgifter efter digitaliseringen, och många av de traditionella arbetsuppgifterna för sekreterare och undersköterskor har försvunnit eller tagits över av datorn. Bättre tillgänglighet, förhöjd kvalitet och utökad diagnostik, tydligare arbetslistor som styr arbetsflödet, förbättrad ergonomi och miljö samt samordning med andra har möjliggjorts efter digitaliseringen. Utifrån intervjuerna i fallstudien framkom teman som beskriver: information, superanvändare, utbildning, tydliga arbetsflöden och rutiner, personalbehov och motivation, liksom ett tydligt ledarskap som genomsyrat förändringsprocessen. Dessa teman är, enligt Bramson och Bramson (2005) och Kotter och Cohen (2002), viktiga att beakta för att lyckas med en förändring. Konklusion: Ingen av de intervjuade vill återgå till ett analogt arbetssätt. Digitaliseringen har inneburit att mycket förändrats för de berörda personalkategorierna, men fördelarna anses uppväga nackdelarna. För att lyckas med en förändring bör Bramsons perspektiv och Kotters framgångsfaktorer tas i beaktande. / Background: Since Wilhelm Conrad Röntgen’s discovery of X-rays in 1896, radiology has undergone great changes. 
In Sweden, the process of digitalising radiology was initiated in the mid-1990s. Aim: To describe the organisational changes that took place in five radiology clinics in Sweden in connection with the digitalisation of radiology. Study questions: How did the number and type of radiological examinations change when comparing two years before with two, four and six years after digitalisation? How did the number and distribution of hours worked change among different staff categories when comparing two years before with two, four and six years after digitalisation? How did the duties change after digitalisation within and between different staff categories at the radiology clinics? How do staff members in key positions describe the implementation of digitalisation and any changes in their professional practice after digitalisation? Method: Quantitative and qualitative methods were combined in order to answer the study questions. Five Swedish radiology clinics took part in the study, and production outcome data were gathered from each clinic. A total of 22 individuals were interviewed, with three to six interviews per radiology clinic. In order to gain an understanding of the positive results at one of the radiology clinics, in terms of the increased number of examinations and reduction in the number of hours worked after digitalisation, a case study was conducted at this clinic, in which two questions were posed: How was the digitalisation carried out? How were workflow, implementation and changes followed up in detail after digitalisation? Elements of Bramson and Bramson's (2005) perspectives and Kotter and Cohen's (2002) success factors were applied. Findings: All radiology clinics had a small increase in the number of radiological examinations, while computed tomography and magnetic resonance imaging examinations increased significantly during the period under investigation. Thus, more complex examinations were carried out post digitalisation, a fact considered to have led to qualitatively better and safer diagnoses. The total working time was reduced at three clinics, while two exhibited an increase. The total number of hours worked increased for radiologists and radiographers but decreased for secretaries and assistant nurses. In general, radiologists and radiographers performed more tasks post digitalisation, and many of the traditional duties of secretaries and assistant nurses were eliminated or taken over by the computer. Digitalisation enabled greater access to radiological images and patient data in addition to improved quality and diagnostics of the radiological examination. Digitalisation also allowed a better-controlled workflow, as well as improving ergonomics, the environment and the coordination with other clinics. The case study interviews revealed themes describing: information, super users, education, clear work flows and routines, staff needs and motivation, as well as clear leadership throughout the change process. According to Bramson and Kotter, these themes are important for successful change. Conclusion: None of the individuals interviewed wished to return to the analogue mode of working. The digitalisation led to great changes for the different staff categories, with the advantages outweighing the disadvantages. To succeed with change, Bramson's perspectives and Kotter's success factors should be taken into consideration.
687

Using Workflows to Automate Activities in MDE Tools

Gamboa, Miguel 09 1900 (has links)
Le génie logiciel a pour but de créer des outils logiciels qui permettent de résoudre des problèmes particuliers d’une façon facile et efficace. À cet égard, l’ingénierie dirigée par les modèles (IDM) facilite la création d’outils logiciels, en modélisant et transformant systématiquement des modèles. À cette fin, l’IDM s’appuie sur des workbenches de langage : des environnements de développement intégré (IDE) pour modéliser des langages, concevoir des modèles, les exécuter et les vérifier. Mais l’utilisation des outils est loin d’être efficace. Les activités de l’IDM typiques, telles que la création d’un langage de domaine dédié ou la création d’une transformation de modèles, sont des activités complexes qui exigent des opérations souvent répétitives. Par conséquent, le temps de développement augmente inutilement. Le but de ce mémoire est de proposer une approche qui augmente la productivité des modélisateurs dans leurs activités quotidiennes en automatisant le plus possible les tâches à faire dans les outils IDM. Je propose une solution utilisant l’IDM où l’utilisateur définit un flux de travail qui peut être paramétré lors de l’exécution. Cette solution est implémentée dans un IDE pour la modélisation graphique. À l’aide de deux évaluations empiriques, je montre que la productivité des utilisateurs est augmentée et améliorée. / Software engineering aims to create software tools that allow people to solve particular problems in an easy and efficient way. In this regard, Model-driven engineering (MDE) makes it possible to generate software tools by systematically modeling and transforming models. To do this, MDE relies on language workbenches: Integrated Development Environments (IDEs) for engineering modeling languages and for designing, executing and verifying models. However, using these tools is far from efficient. Common MDE activities, such as creating a domain-specific language or developing a model transformation, are nontrivial and often require repetitive tasks. This results in unnecessary increases in development time. The goal of this thesis is to increase the productivity of modelers in their daily activities by automating the tasks performed in current MDE tools. I propose an MDE-based solution in which the user defines a reusable workflow that can be parameterized at run time and executed. This solution is implemented in an IDE for graphical modeling. Two empirical evaluations show that users’ productivity is improved.
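The core idea of a reusable workflow whose parameters are bound at run time rather than at definition time can be sketched in a few lines. The step names and parameters below are hypothetical and the sketch does not reproduce the actual tool built in the thesis.

```python
# Hypothetical sketch of a reusable, parameterizable workflow: steps are plain
# callables, and the user supplies parameter values when the workflow is run,
# not when it is defined (mirroring run-time parameterization).
from dataclasses import dataclass, field
from typing import Callable, Dict, List

Step = Callable[[Dict[str, str]], None]

@dataclass
class Workflow:
    name: str
    steps: List[Step] = field(default_factory=list)

    def run(self, **params: str) -> None:
        """Execute each step with the parameters bound at run time."""
        for step in self.steps:
            step(params)

# Invented example steps for a "create DSL" workflow
def create_metamodel(p): print(f"Creating metamodel '{p['language']}'")
def generate_editor(p):  print(f"Generating editor for '{p['language']}'")
def run_validation(p):   print(f"Validating models in '{p['project']}'")

dsl_workflow = Workflow("new-dsl", [create_metamodel, generate_editor, run_validation])

if __name__ == "__main__":
    # The same workflow definition is reused with different parameter values.
    dsl_workflow.run(language="TrafficLightDSL", project="demo")
    dsl_workflow.run(language="PetriNetDSL", project="experiments")
```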
688

A Dynamic Workflow Framework for Mass Customization Using Web Service and Autonomous Agent Technologies

Karpowitz, Daniel J. 07 December 2006 (has links)
Custom software development and maintenance is one of the key expenses associated with developing automated systems for mass customization. This paper presents a method for reducing the risk associated with this expense by developing a flexible environment for determining and executing dynamic workflow paths. Strategies for developing an autonomous agent-based framework and for identifying and creating web services for specific process tasks are presented. The proposed methods are outlined in two different case studies to illustrate the approach for both a generic process with complex workflow paths and a more specific sequential engineering process.
689

Distribuovaný dokumentový server založený na databázi CouchDB / Distributed Document Server Based on CouchDB Database

Kanis, Martin January 2017 (has links)
This thesis discusses distributed database systems and their advantages and disadvantages. It then describes the CouchDB document database, document storage, synchronization, and the CAP theorem. The aim of the thesis is to implement a distributed document management and workflow management system based on CouchDB. The system consists of a cluster of three CouchDB nodes behind HAProxy, which performs load balancing. The system allows the creation of any document based on a template and manages its life cycle and workflow. It is also possible to create a custom workflow using BRMS rules. The implemented solution simplifies document and workflow management and allows a high degree of customization for an organization's needs.
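For readers unfamiliar with CouchDB, the sketch below shows the flavour of the HTTP/JSON interface such a system is built on: creating a database and inserting a document that carries a workflow state. The host, credentials, database name and document fields are assumptions for illustration and do not reproduce the thesis's implementation, which also involves HAProxy and BRMS rules.

```python
# Minimal sketch of talking to CouchDB over its HTTP/JSON API with `requests`.
# Host, credentials, database and document fields are invented placeholders.
import requests

COUCH = "http://localhost:5984"   # assumed CouchDB endpoint
AUTH = ("admin", "secret")        # assumed credentials
DB = "documents"

def ensure_database() -> None:
    """Create the database if it does not exist (CouchDB returns 412 if it does)."""
    r = requests.put(f"{COUCH}/{DB}", auth=AUTH)
    if r.status_code not in (201, 412):
        r.raise_for_status()

def create_document(template: str, title: str) -> str:
    """Insert a new document with an initial workflow state; return its id."""
    doc = {"template": template, "title": title, "workflow_state": "draft"}
    r = requests.post(f"{COUCH}/{DB}", json=doc, auth=AUTH)
    r.raise_for_status()
    return r.json()["id"]

if __name__ == "__main__":
    ensure_database()
    print(create_document("contract", "Example contract"))
```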
690

Herní engine pro ITIL trenažér / An ITIL Simulator Game Engine

Pučálka, Martin January 2018 (has links)
This master's thesis is focused on the Information Technology Infrastructure Library (ITIL). The objective of the project was to analyze, design and implement a game engine that simulates IT service operation in real time, in accelerated time, or in turns. A basic part of the engine is a creator mode, which allows users to create custom IT services and specify their behaviour during operation, as if the service were used in the real world. Another part of the engine is a player mode with a simple service desk. In this mode, players take care of the smooth operation of their services and can thereby learn and practise the processes described in ITIL.
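To make the "real, accelerated, or turn-based time" requirement concrete, here is a hedged sketch of a simulation clock such an engine might use. The class and its API are invented for illustration and are not the engine described in the thesis.

```python
# Hypothetical sketch of a simulation clock that supports real time,
# accelerated time, and turn-based advancement, as an ITIL-style service
# simulation might need. Names and API are invented for illustration.
import time

class SimulationClock:
    def __init__(self, acceleration: float = 1.0, turn_based: bool = False):
        self.acceleration = acceleration   # 1.0 = real time, 10.0 = 10x faster
        self.turn_based = turn_based
        self.sim_seconds = 0.0             # elapsed simulated time

    def advance(self, real_seconds: float = 1.0, turn_length: float = 60.0) -> float:
        """Advance the simulated clock and return total simulated seconds."""
        if self.turn_based:
            self.sim_seconds += turn_length          # one turn = fixed simulated time
        else:
            time.sleep(real_seconds)                 # wait in wall-clock time
            self.sim_seconds += real_seconds * self.acceleration
        return self.sim_seconds

if __name__ == "__main__":
    clock = SimulationClock(acceleration=60.0)       # 1 real second = 1 simulated minute
    for _ in range(3):
        print(f"Simulated time: {clock.advance(real_seconds=0.1):.1f} s")
```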
