571

Implementation Of Concurrent Constraint Transaction Logic And Its User Interface

Altunyuva, Fethi 01 September 2006 (has links) (PDF)
This thesis implements a logical formalism framework called Concurrent Constraint Transaction Logic (abbreviated CCTR), which was defined for the modeling and scheduling of workflows under resource allocation and cost constraints, and develops an extensible and flexible graphical user interface for the framework. CCTR extends Concurrent Transaction Logic and integrates it with Constraint Logic Programming to find correct schedules for tasks that involve resource and cost constraints. The developed system, which integrates the Prolog and Java platforms, is designed to serve as the basic environment for enterprise applications that involve CCTR-based workflows and schedulers. The full implementation described in this thesis clearly illustrates that CCTR can be used as a workflow scheduler that handles not only temporal and causal constraints but also resource and cost constraints.
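The kind of scheduling CCTR targets, ordering workflow tasks under causal, resource, and cost constraints, can be illustrated with a small sketch that is independent of the thesis's Prolog/Java implementation; the task names, costs, limits, and brute-force search below are hypothetical and for illustration only.

```python
from itertools import permutations

# Hypothetical workflow tasks: name -> (cost, resource units needed). Illustrative only.
tasks = {"prepare": (2, 1), "review": (3, 2), "approve": (1, 1)}
precedes = {("prepare", "review"), ("review", "approve")}  # causal constraints
BUDGET, RESOURCE_CAP = 7, 2                                # cost and resource limits

def feasible(order):
    # Causal constraints: every (a, b) pair must keep that relative order.
    pos = {t: i for i, t in enumerate(order)}
    if any(pos[a] > pos[b] for a, b in precedes):
        return False
    # Cost and resource constraints over the schedule as a whole.
    total_cost = sum(tasks[t][0] for t in order)
    peak_resource = max(tasks[t][1] for t in order)
    return total_cost <= BUDGET and peak_resource <= RESOURCE_CAP

schedules = [order for order in permutations(tasks) if feasible(order)]
print(schedules)  # [('prepare', 'review', 'approve')]
```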
572

Information categories and editorial processes in multiple channel publishing

Sabelström Möller, Kristina January 2001 (has links)
No description available.
573

Transactional Workflow Discovery for Reliable Executions

Gaaloul, Walid 03 January 2007 (has links) (PDF)
A continuous evolution of the parameters, constraints, and needs of a business process, which cannot be fully foreseen at design time, requires process management systems to support continuous design and a reliable process model. In this thesis, we focus on ensuring reactive design through the analysis of execution traces, enabling re-engineering of the business process and making its executions more reliable.

To this end, we first introduce a transactional workflow model that extends workflow systems by merging them with advanced transactional models. We then propose execution-trace analysis techniques for workflow discovery and for improving their transactional behaviour. Our approach begins by collecting execution traces. Using statistical analysis techniques, we then build an intermediate representation specifying elementary dependencies between activities. These dependencies are refined to discover the transactional workflow model. Analysing the discrepancies between the discovered model and the initially designed model allows us to detect design gaps (anomalies), particularly in the recovery mechanisms. Based on these observations, we finally apply a set of rules to improve and/or correct the initial schema.

The major contribution of our proposal is that it takes into account the process evolution needs observed during the execution phase. This allows us to ensure a continuous design that guarantees, among other things, correct and reliable executions.
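The first discovery step described above, deriving elementary dependencies between activities from execution traces by statistical analysis, can be approximated with a short sketch; the log format, activity names, and frequency threshold are assumptions for illustration and not the thesis's actual algorithm.

```python
from collections import Counter

# Hypothetical execution traces: each trace is the ordered list of activities
# observed for one workflow instance (illustrative data, not from the thesis).
traces = [
    ["receive", "check", "pay", "ship"],
    ["receive", "check", "reject"],
    ["receive", "check", "pay", "ship"],
]

# Count direct-follows relations (a immediately followed by b) across all traces.
follows = Counter((a, b) for trace in traces for a, b in zip(trace, trace[1:]))

# Keep only dependencies frequent enough to be considered elementary
# (the 0.5 threshold is an assumed parameter).
threshold = 0.5 * len(traces)
dependencies = {pair for pair, count in follows.items() if count >= threshold}
print(sorted(dependencies))
# [('check', 'pay'), ('pay', 'ship'), ('receive', 'check')]
```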
574

An investigation of factors impacting life-cycle application of Civil Integrated Management (CIM)

Sankaran, Bharathwaj 02 February 2015 (has links)
Highway projects are delivered in a complex environment that involves the participation of diverse stakeholders with different objectives. Technological advancements have provided better tools and techniques that, if incorporated, can lead to effective project delivery that complies with this multitude of objectives. Often the projects are cost-driven, schedule-driven, or both. The presence of ongoing traffic poses an additional challenge for the developers, as it impacts the safety and comfort of both the commuters and the construction workers. A wide variety of tools, techniques, and work processes are adopted across many projects, depending on the project and agency requirements, to make the process of project management efficient across its life-cycle. Civil Integrated Management (CIM) is a term that encompasses all such tools and technologies that can facilitate the process of digital project delivery and asset management. This study examines the current state of practice for CIM through surveys conducted at the agency and project level. The results of these surveys are summarized to provide an understanding of the organizational and contractual issues related to CIM implementation and to comprehend the process of technology implementation and the associated performance benefits at the project level. Significant factors impacting successful life-cycle CIM utilization are elicited through the surveys and follow-up interviews and are investigated further under four main categories: technology implementation planning, model-based workflows and processes, design for construction automation, and information management. Specific examples are provided for each of these factors to demonstrate their utility on projects. The findings of this study will provide practitioners with a list of key issues to be considered for profitable and effective implementation of CIM technologies across a project's life-cycle.
575

Computational Environment Design

Zhang, Haoqi 26 October 2012 (has links)
The Internet has evolved into a platform on which large numbers of individuals take action and join in collaborations via crowdsourcing, social media, and electronic commerce. When designing social and economic systems on the Internet, a key challenge is understanding how to promote particular desired behaviors and outcomes. I call this problem computational environment design. Notable abilities afforded by the Internet, such as the ability to recruit large numbers of individuals to join problem-solving efforts via crowdsourcing and social media, and the ability to engage in a data-driven iterative design process, are creating new opportunities and inspiring new methods for computational environment design. This dissertation focuses on these abilities and proposes an approach for arriving at effective designs by reasoning and learning about characteristics of participants and how these characteristics interact with a system’s design to influence behavior. The dissertation consists of two major components. The first component focuses on designing crowdsourcing and human computation systems that leverage a crowd to solve complex problems that require effective coordination among participants or the recruitment of individuals with relevant expertise. I show how reasoning about crowd abilities and limitations can lead to designs that make crowdsourcing complex tasks feasible, effective, and efficient. The solutions introduce new design patterns and methods for human computation and crowdsourcing; notable contributions include a crowdware design for tackling human computation tasks with global constraints, and incentive mechanisms for task routing that harness people’s expertise and social expertise by engaging them in both problem solving and routing. The second component focuses on understanding how to design effective environments automatically. I introduce a general active, indirect elicitation framework for automated environment design that learns relevant characteristics of participants based on observations of their behavior and optimizes designs based on learned models. Theoretical contributions include developing an active, indirect elicitation algorithm for a sequential decision-making setting that is guaranteed to discover effective designs after only a few interactions. Practical contributions include applications of the active, indirect elicitation framework to crowdsourcing. Specifically, I demonstrate how to automatically design tasks and synthesize workflows when optimizing for desired objectives given resource constraints. / Engineering and Applied Sciences
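The active, indirect elicitation loop mentioned above, observing behavior, updating a model of participant characteristics, and re-optimizing the design, can be caricatured with a small sketch; the hidden effort cost, the probing rule, and the payment design space are hypothetical assumptions, not the dissertation's algorithms.

```python
# Hypothetical setting: choose the lowest task payment (the "design") that a worker
# with an unknown effort cost will accept. All numbers are illustrative assumptions.
true_cost = 0.62                     # hidden participant characteristic

low, high = 0.0, 1.0                 # current belief: the cost lies in [low, high]
for round_ in range(6):
    # Active step: probe the design that is most informative under the current model.
    offer = (low + high) / 2
    accepted = offer >= true_cost    # indirect observation of the hidden characteristic
    # Learning step: the observed response tightens the bounds on the cost.
    if accepted:
        high = offer
    else:
        low = offer
    print(f"round {round_}: offered {offer:.3f}, accepted={accepted}")

# Optimization step: the cheapest design the learned model predicts will be accepted.
print(f"recommended payment: about {high:.3f}")
```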
576

A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS

December 2013 (has links)
In service-oriented environments, services are put together in the form of a workflow with the aim of distributed problem solving. Capturing the execution details of the services' transformations is a significant advantage of using workflows. These execution details, referred to as provenance information, are usually traced automatically and stored in provenance stores. Provenance data contains the data recorded by a workflow engine during a workflow execution. It identifies what data is passed between services, which services are involved, and how results are eventually generated for particular sets of input values. Provenance information is of great importance and has found its way into areas of computer science such as bioinformatics, databases, social networks, and sensor networks. Current exploitation and application of provenance data are very limited, because provenance systems were initially developed for specific applications. Applying learning and knowledge discovery methods to provenance data can, however, provide rich and useful information on workflows and services. In this work, therefore, the challenges with workflows and services are studied to discover the possibilities and benefits of providing solutions using provenance data. A multi-functional architecture is presented which addresses these workflow and service issues by exploiting provenance data. The challenges include workflow composition, abstract workflow selection, refinement, evaluation, and graph model extraction. The specific contribution of the proposed architecture is its novelty in providing a basis for taking advantage of the previous execution details of services and workflows, together with artificial intelligence and knowledge management techniques, to resolve the major challenges regarding workflows. The presented architecture is application-independent and could be deployed in any area. The requirements for such an architecture, along with its building components, are discussed. Furthermore, the responsibilities of the components, related work, and the implementation details of the architecture and each component are presented.
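The provenance records described above, capturing which data passed between services, which services ran, and how a result was produced from given inputs, might be represented with a minimal structure like the following sketch; the field names, workflow, and lineage query are hypothetical and do not reflect the architecture's actual schema.

```python
from dataclasses import dataclass

@dataclass
class ProvenanceRecord:
    """One service invocation captured by the workflow engine (hypothetical schema)."""
    workflow_id: str
    service: str
    inputs: dict       # data passed into the service
    outputs: dict      # data the service produced

# Illustrative trace of a two-step workflow execution.
store = [
    ProvenanceRecord("wf-42", "align_sequences", {"raw": "reads.fq"}, {"aligned": "reads.bam"}),
    ProvenanceRecord("wf-42", "call_variants", {"aligned": "reads.bam"}, {"variants": "calls.vcf"}),
]

def lineage(artifact, records):
    """Walk backwards through the store to find every service that contributed to an artifact."""
    producers = [r for r in records if artifact in r.outputs.values()]
    services = set()
    for rec in producers:
        services.add(rec.service)
        for upstream in rec.inputs.values():
            services |= lineage(upstream, records)
    return services

print(lineage("calls.vcf", store))  # {'call_variants', 'align_sequences'}
```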
577

Semantics and planning based workflow composition and execution for video processing

Nadarajan, Gayathri January 2011 (has links)
Traditional workflow systems have several drawbacks, e.g. in their inability to react rapidly to changes, to construct workflows automatically (or with user involvement), and to improve performance autonomously (or with user involvement) in an incremental manner according to specified goals. Overcoming these limitations would be highly beneficial for complex domains where such adversities are exhibited. Video processing is one such domain that increasingly requires attention as larger amounts of images and videos become available to people who are not technically adept in modelling the processes involved in constructing complex video processing workflows. Conventional video and image processing systems, on the other hand, are developed by programmers possessing image processing expertise. These systems are tailored to produce highly specialised, hand-crafted solutions for very specific tasks, making them rigid and non-modular. The knowledge-based vision community has attempted to produce more modular solutions by incorporating ontologies. However, these have not been maximally utilised to encompass aspects such as application context descriptions (e.g. lighting and clearness effects) and qualitative measures. This thesis aims to tackle some of the research gaps yet to be addressed by the workflow and knowledge-based image processing communities by proposing a novel workflow composition and execution approach within an integrated framework. This framework distinguishes three levels of abstraction via the design, workflow and processing layers. The core technologies that drive the workflow composition mechanism are ontologies and planning. Video processing problems provide a fitting domain for investigating the effectiveness of this integrated method, as tackling such problems has not been fully explored by the workflow, planning and ontological communities despite the combined strengths these fields bring to this known hard problem. In addition, the pervasiveness of video data has increased the need for more automated assistance for image-processing-naive users, but no adequate support has been provided as of yet. A video and image processing ontology that comprises three sub-ontologies was constructed to capture the goals, video descriptions and capabilities (video and image processing tools). The sub-ontologies are used for representation and inference. In particular, they are used in conjunction with an enhanced Hierarchical Task Network (HTN) domain-independent planner to help with performance-based selection of solution steps based on preconditions, effects and postconditions. The planner, in turn, makes use of process models contained in a process library when deliberating on the steps and then consults the capability ontology to retrieve a suitable tool at each step. Two key features of the planner are its ability to support workflow execution (it interleaves planning with execution) and its ability to operate in automatic or semi-automatic (interactive) mode. The first feature is highly desirable for video processing problems because the execution of image processing steps yields visual results that are intuitive and verifiable by the human user, as automatic validation is non-trivial. In the semi-automatic mode, the planner is interactive and prompts the user to make a tool selection when there is more than one tool available to perform a task. The user makes the tool selection based on the recommended descriptions provided by the workflow system.
Once planning is complete, the result of applying the tool of their choice is presented to the user textually and visually for verification. This plays a pivotal role in providing the user with control and the ability to make informed decisions. Hence, the planner extends the capabilities of typical planners by guiding the user to construct more optimal solutions. Video processing problems can also be solved in more modular, reusable and adaptable ways compared to conventional image processing systems. The integrated approach was evaluated on a test set consisting of videos of varying quality originating from an open-sea environment. Experiments to evaluate the efficiency, the adaptability to the user's changing needs, and the user learnability of this approach were conducted with users who did not possess image processing expertise. The findings indicate that using this integrated workflow composition and execution method: 1) provides a speed-up of over 90% in execution time for video classification tasks using fully automatic processing compared to manual methods, without loss of accuracy; 2) is more flexible and adaptable in response to changes in user requests (be it in the task, the constraints on the task, or the descriptions of the video) than modifying existing image processing programs when the domain descriptions are altered; 3) assists the user in selecting optimal solutions by providing recommended descriptions.
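The per-step tool selection described above, where the planner consults the capability ontology and either picks a tool automatically or prompts the user when several tools qualify, might look roughly like the following sketch; the task names, tools, performance scores, and prompt are hypothetical assumptions rather than the thesis system's actual interfaces.

```python
# Hypothetical capability "ontology": task -> candidate tools with performance scores.
capabilities = {
    "detect_fish": [("template_matcher", 0.71), ("hog_detector", 0.83)],
    "classify_species": [("svm_classifier", 0.78)],
}

def select_tool(task, interactive=False):
    """Pick a tool for one plan step, prompting the user when several tools qualify."""
    candidates = capabilities.get(task, [])
    if not candidates:
        raise ValueError(f"no capability found for task {task!r}")
    if interactive and len(candidates) > 1:
        # Semi-automatic mode: show recommended descriptions and let the user decide.
        for i, (name, score) in enumerate(candidates):
            print(f"  [{i}] {name} (expected performance {score:.2f})")
        choice = int(input(f"choose a tool for {task}: "))
        return candidates[choice][0]
    # Automatic mode: take the best-performing tool.
    return max(candidates, key=lambda c: c[1])[0]

plan = ["detect_fish", "classify_species"]
workflow = [(task, select_tool(task)) for task in plan]   # automatic mode
print(workflow)  # [('detect_fish', 'hog_detector'), ('classify_species', 'svm_classifier')]
```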
578

Location-Aware Business Process Management for Real-time Monitoring of Patient Care Processes

Bougueng Tchemeube, Renaud 24 July 2013 (has links)
Long wait times are a global issue in the healthcare sector, particularly in Canada. Despite numerous research findings on wait time management, the issue persists. This is partly because, for a given hospital, the data required to conduct wait time analysis is currently scattered across various information systems. Moreover, such data is usually inaccurate (because of possible human errors), imprecise, and late. The whole situation contributes to the current state of wait times. This thesis proposes a location-aware business process management system for real-time care process monitoring. More precisely, the system enables improved visibility of process execution by gathering accurate and granular process information, including wait time measurements, as processes execute. The major contributions of this thesis include an architecture for the system, a prototype that takes advantage of a commercial real-time location system combined with a business process management system to accurately measure wait times, and a case study based on a real cardiology process from an Ontario hospital.
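The core measurement the prototype automates, deriving a wait time from location events as a care process executes, can be illustrated with a small sketch; the event format, zone names, and timestamps are hypothetical assumptions, not the system's actual data model.

```python
from datetime import datetime

# Hypothetical RTLS events for one patient: (timestamp, zone the tag entered).
events = [
    (datetime(2013, 5, 6, 9, 0), "registration"),
    (datetime(2013, 5, 6, 9, 5), "waiting_room"),
    (datetime(2013, 5, 6, 9, 47), "exam_room"),
    (datetime(2013, 5, 6, 10, 20), "discharge"),
]

def wait_time(events, zone="waiting_room"):
    """Time spent in a zone = interval between entering it and entering the next zone."""
    for (t_in, z), (t_out, _next_zone) in zip(events, events[1:]):
        if z == zone:
            return t_out - t_in
    return None

print(wait_time(events))  # 0:42:00
```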
579

Retuschörens arbetsprocess : Djupintervjuer med professionella retuschörer / Retoucher’s work process : In-depth interviews with professional retouchers

Nilsson, Josefina January 2014 (has links)
The present study aimed, through semi-structured in-depth interviews with professional retouchers active in the industry, to examine whether common steps could be identified in the respondents' descriptions of the work process. The study revealed that the respondents described similar processes and that collaboration with colleagues was an important factor in developing their work. The descriptions of the work process indicated that all respondents had a personal interest in the work and in the industry. Respondents also felt that a retoucher's previous experience is significant and largely affects the duration of a job. They also pointed out that the duration is affected by the amount of detail specified in the work order at the start of a project. It also emerged that the respondents themselves did not carry out any detailed follow-up or review of how the final material was used.
580

Workflow modelling of coordinated inter-health-provider care plans

Browne, Eric Donald January 2005 (has links)
Workflow in healthcare, particularly for the shared and coordinated management of chronic illnesses, is very difficult to model. It is also difficult to support with current clinical information systems and current information technologies. This dissertation contributes significant enhancements to the current methodologies for designing and implementing Workflow Management Systems (WfMSs) suitable for healthcare. The contribution comprises three interrelated aspects of workflow system architecture. Firstly, it shifts the emphasis of workflow modelling and enactment to a focus on goals, and on the monitoring and facilitation of their achievement. Secondly, it introduces the concept of self-modifying workflow in the context of health care planning, whereby explicit tasks in the goal-based care plan are devoted to assessing and modifying downstream workflow. Thirdly, this dissertation proposes methodologies for identifying and dealing with tasks which overlap, subsume or interfere with other tasks elsewhere in a given workflow. / PhD Doctorate
