11

Sharing is Caring : Map the communication chain for a manufacturing company between their dealers and customers

Svensson, Oskar, Henriksson, Erik January 2016 (has links)
The purpose of this case study is to create a greater understanding of how the case company manages communication and information between the company and its customers and dealers. The objective of the thesis is to map the invisible value chain for a manufacturing company, make its information flow visible, and point out flaws in the communication process between its customers and dealers. Theory and empirical findings are analyzed, resulting in recommendations and suggestions from the authors. Each recommendation is discussed with its positive and negative attributes to broaden the reader's perspective. All recommendations have been carefully selected to ensure that an implementation would make the information flow more efficient for the case company.
12

Operationalization of lean thinking through value stream mapping with simulation and FLOW

bin Ali, Nauman January 2015 (has links)
Background: The continued success of Lean thinking beyond manufacturing has led to an increasing interest in utilizing it in software engineering (SE). Value Stream Mapping (VSM) had a pivotal role in the operationalization of Lean thinking. However, this has not been recognized in SE adaptations of Lean. Furthermore, there are two main shortcomings in existing adaptations of VSM for an SE context. First, the assessments of the potential of the proposed improvements are based on idealistic assertions. Second, the current VSM notation and methodology are unable to capture the myriad of significant information flows, which in software development go beyond just the schedule information about the flow of a software artifact through a process.

Objective: This thesis seeks to assess Software Process Simulation Modeling (SPSM) as a solution to the first shortcoming of VSM. In this regard, guidelines to perform simulation-based studies in industry are consolidated, and the usefulness of VSM supported with SPSM is evaluated. To overcome the second shortcoming of VSM, a suitable approach for capturing rich information flows in software development is identified and its usefulness to support VSM is evaluated. Overall, an attempt is made to supplement existing guidelines for conducting VSM to overcome its known shortcomings and support adoption of Lean thinking in SE. The usefulness and scalability of these proposals are evaluated in an industrial setting.

Method: Three literature reviews, one systematic literature review, four industrial case studies, and a case study in an academic context were conducted as part of this research.

Results: Little evidence to substantiate the claims of the usefulness of SPSM was found. Hence, prior to combining it with VSM, we consolidated the guidelines to conduct an SPSM-based study and evaluated the use of SPSM in academic and industrial contexts. In education, it was found to be a useful complement to other teaching methods, and in industry, it triggered useful discussions and was used to challenge practitioners' perceptions about the impact of existing challenges and proposed improvements. The combination of VSM with FLOW (a method and notation to capture information flows, since existing VSM adaptations for SE are insufficient for this purpose) was successful in identifying challenges and improvements related to information needs in the process. Both proposals, supporting VSM with simulation and with FLOW, led to the identification of waste and improvements (which would not have been possible with conventional VSM), generated more insightful discussions, and resulted in more realistic improvements.

Conclusion: This thesis characterizes the context and shows how SPSM was beneficial in both the industrial and academic contexts. FLOW was found to be a scalable, lightweight supplement to strengthen the information flow analysis in VSM. Through successful industrial application and uptake, this thesis provides evidence of the usefulness of the proposed improvements to the VSM activities.
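
As a rough illustration of what "supporting VSM with simulation" can mean, the sketch below (stage names and cycle times invented, not data from the thesis) computes the usual static VSM figures of lead time and flow efficiency, then replaces the point estimates for waiting times with sampled ones — the kind of variability a process simulation exposes and a static map cannot.

```python
import random

# Hypothetical software value stream: each stage has a value-adding
# processing time and a non-value-adding wait time (figures are made up).
stages = [
    {"name": "requirements", "process_h": 8,  "wait_h": 40},
    {"name": "development",  "process_h": 60, "wait_h": 24},
    {"name": "code review",  "process_h": 4,  "wait_h": 16},
    {"name": "testing",      "process_h": 16, "wait_h": 48},
]

def lead_time(stream):
    return sum(s["process_h"] + s["wait_h"] for s in stream)

def flow_efficiency(stream):
    value_adding = sum(s["process_h"] for s in stream)
    return value_adding / lead_time(stream)

# Static VSM figures -- the "idealistic assertion" a simulation would test.
print(f"lead time: {lead_time(stages)} h, "
      f"flow efficiency: {flow_efficiency(stages):.1%}")

def simulate(stream, runs=10_000):
    # Crude stochastic run: sample wait times instead of taking point values.
    totals = sorted(
        sum(s["process_h"] + random.uniform(0.5, 1.5) * s["wait_h"]
            for s in stream)
        for _ in range(runs)
    )
    return totals[len(totals) // 2], totals[int(len(totals) * 0.95)]

median, p95 = simulate(stages)
print(f"simulated lead time: median {median:.0f} h, 95th percentile {p95:.0f} h")
```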
13

Idiosyncratic risk, information flow, and earnings informativeness for family businesses

2013 February 1900 (has links)
Many previous studies find that family firms are prevalent among U.S. firms. In particular, more than 35 percent of S&P 500 firms are family firms, in which families control about 18 percent of their firms' shares. According to agency theory, the characteristics of a firm's ownership, governance, and control play a critical role in the firm's risk-taking activities and information flow to the market. Our study investigates two controversies in the family business literature: whether family firms undertake fewer or more risks than non-family firms do, and whether family firms exhibit higher or lower information flow, reflected in their stock price informativeness and earnings informativeness, to the market. Using a sample of the S&P 500 companies as of 2003 for the period 2003-2007, we find that compared with non-family firms, the stock prices of family firms have more firm-specific information impounded, and the accounting earnings of family firms are more informative and thereby have more explanatory power for stock returns. These results are robust to different model specifications and variable proxies. In terms of risk-taking levels in corporate investment, our results indicate that family firms, on average, undertake fewer risks than non-family firms do. In particular, we find that although the G-index is negatively associated with corporate risk-taking in non-family firms, as previous studies (e.g. John et al., 2008) find for firms in general, governance provisions do not have any influence on corporate risk-taking decisions in family firms. Numerous additional sensitivity tests using different corporate risk-taking proxies confirm the robustness of the findings.
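
Earnings-informativeness claims of this kind are commonly tested with an earnings-response regression of stock returns on earnings plus a family-firm interaction term; a positive coefficient on the interaction means family firms' earnings carry more information for returns. The toy regression below runs on synthetic data and is not the authors' actual specification:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic cross-section: earnings surprises and a family-firm indicator.
earnings = rng.normal(0.0, 0.05, n)    # scaled earnings surprise
family   = rng.integers(0, 2, n)       # 1 = family firm
noise    = rng.normal(0.0, 0.04, n)

# Simulated "truth": family firms get a larger earnings response coefficient.
returns = 0.01 + 1.0 * earnings + 0.8 * family * earnings + noise

# OLS of returns on earnings, the family dummy, and their interaction.
X = np.column_stack([np.ones(n), earnings, family, family * earnings])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)

print(f"ERC, non-family firms: {beta[1]:.2f}")
print(f"incremental ERC, family firms: {beta[3]:.2f}")
```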
14

Tags: Augmenting Microkernel Messages with Lightweight Metadata

Saif Ur Rehman, Ahmad January 2012 (has links)
In this work, we propose Tags, an efficient mechanism that augments microkernel interprocess messages with lightweight metadata to enable the development of new, systemwide functionality without requiring the modification of application source code. Therefore, the technology is well suited for systems with a large legacy code base and for third-party applications such as phone and tablet applications. As examples, we detail use cases in the areas of mandatory security and runtime verification of process interactions. In the area of mandatory security, we use tagging to assess the feasibility of implementing a mandatory integrity propagation model in the microkernel. The process interaction verification use case shows the utility of tagging to track and verify interaction history among system components. To demonstrate that tagging is technically feasible and practical, we implemented it in a commercial microkernel and executed multiple sets of standard benchmarks on two different computing architectures. The results clearly demonstrate that tagging has only negligible overhead and strong potential for many applications.
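
To make the tagging idea concrete, here is a small user-space Python model — not the commercial microkernel implementation — of the integrity-propagation use case: the kernel stamps each message with the sender's integrity level, and receiving a lower-integrity message demotes the receiver (a low-watermark rule), with the applications never touching the tag themselves.

```python
from dataclasses import dataclass, field

# Integrity levels, higher = more trusted (a simplification for illustration).
LOW, HIGH = 0, 1

@dataclass
class Message:
    payload: bytes
    tags: dict = field(default_factory=dict)  # lightweight metadata sidecar

@dataclass
class Process:
    name: str
    integrity: int = HIGH

    def send(self, payload: bytes) -> Message:
        # The "kernel" stamps the sender's integrity onto the message;
        # the application code only ever supplies the payload.
        return Message(payload, tags={"integrity": self.integrity})

    def receive(self, msg: Message) -> None:
        # Low-watermark propagation: accepting lower-integrity input
        # demotes the receiver.
        self.integrity = min(self.integrity, msg.tags["integrity"])

untrusted = Process("downloader", integrity=LOW)
editor = Process("editor", integrity=HIGH)

editor.receive(untrusted.send(b"pasted text"))
print(editor.integrity == LOW)  # True: taint propagated via the tag
```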
15

Efficiency in Emergency medical service system : An analysis on information flow

Zhang, Xiang January 2007 (has links)
In an information system comprising many information services, we always seek a solution to enhance efficiency and reusability. An emergency medical service system is a classic information system using application integration, in which reliable transmission of information flows is essential. The system should always run in its best condition, with the highest efficiency and reusability, since its efficiency directly affects human life.

The aim of this thesis is to analyse the emergency medical service system in both qualitative and quantitative ways. A further aim is to suggest a method to judge the information flow through an analysis of system efficiency and of the correlations between information flow traffic and system applications.

The result is that the system is a main platform integrating five information services. Each provides distinct, independent functions, while all are based on unified information resources. System efficiency can be judged by a method called performance evaluation; the correlations can be judged by the multi-factorial analysis of variance method.
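
As a sketch of the multi-factorial analysis of variance the abstract refers to, the snippet below fits a two-factor ANOVA of message traffic against service and shift. The service names, shift factor, and traffic figures are invented for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)

# Synthetic traffic log: message volume per service per shift.
services = ["dispatch", "triage", "records", "billing", "ambulance"]
shifts = ["day", "night"]
rows = []
for svc in services:
    for shift in shifts:
        base = 100 + 20 * services.index(svc) + (30 if shift == "day" else 0)
        for _ in range(12):
            rows.append({"service": svc, "shift": shift,
                         "traffic": base + rng.normal(0, 10)})
df = pd.DataFrame(rows)

# Two-factor ANOVA: does traffic vary with service, shift, or their interaction?
model = smf.ols("traffic ~ C(service) * C(shift)", data=df).fit()
print(anova_lm(model, typ=2))
```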
16

Improvements to Information Flow in the Physician Order Tracking Process

Doudareva, Evgueniia 22 November 2013 (has links)
In an emergency department (ED), information flow is of high value, as the ability to react quickly directly affects patients' well-being. One of the gaps in the information flow is in the order tracking process. This paper focuses on modelling the feedback in this process, from the order being issued until it has been fulfilled. We address this problem using discrete-event simulation. Additionally, we use the mathematical theory of communication to evaluate the information content in the current and proposed systems. We perform computational tests on these models to compare their performance. Experimental results show that the problem can be effectively modelled using our approach and that the effects of feedback on physician decision-making can be better understood. The results indicate that the addition of as little as one point of feedback has practically significant effects on the amount of time that an order spends in the system.
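
The "mathematical theory of communication" angle can be illustrated with a small entropy calculation: a feedback point partitions the order states a physician must consider, and the information it delivers is the resulting reduction in Shannon entropy. The status distribution below is hypothetical, not taken from the paper:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical distribution over the states an order might be in when the
# physician checks on it (probabilities invented for illustration).
states = {"issued": 0.15, "in lab": 0.35, "results ready": 0.30, "fulfilled": 0.20}

h_prior = entropy(states.values())

# One feedback point: an automatic "results ready" notification. After it,
# the physician knows whether the order has reached {results ready, fulfilled}
# or is still pending, so residual uncertainty is the average entropy within
# each group, weighted by the group's probability mass.
pending = {k: v for k, v in states.items() if k in ("issued", "in lab")}
done = {k: v for k, v in states.items() if k in ("results ready", "fulfilled")}

def conditional_entropy(groups):
    h = 0.0
    for g in groups:
        mass = sum(g.values())
        h += mass * entropy(v / mass for v in g.values())
    return h

h_post = conditional_entropy([pending, done])
print(f"uncertainty without feedback: {h_prior:.2f} bits")
print(f"uncertainty with one feedback point: {h_post:.2f} bits")
print(f"information delivered by the feedback: {h_prior - h_post:.2f} bits")
```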
18

Information flows in a biotechnology company

Martin, Helen January 2000 (has links)
This case study of the information flows within a British biotechnology company involved a population of 156 and took place over five years. It included information provision and information management as embedded studies. The main investigation into information flows was done in three parts, using questionnaires. The parts were: Use of Information Centre information resources, company-wide information flows and assessment of the perceived effectiveness of existing information flows. Combined, these three parts represent a 'snapshot' of the flows over a timespan of about three months. The methodology used to present the individual information flows is novel. The results showed that inter-personal communication or information flows were good, with e-mail being extensively used; that most inter-Group flows were functional, but that flows through the company were poor. Information flow out of the company was restricted. The main barriers to effective flows were excessive secrecy which prevented open exchange of information, lack of finance and the split sites. Although these were only a few miles from the main building, the staff felt isolated. The results further show that the most used information resources were colleagues, and that the most used non-human information resources were not held in the IC. The main users of the IC were the R&D staff, while more than 50% of the company rarely or never used the facility. The investigation represents an early example of Knowledge Management and further documents a stage in the evolution of biotechnology companies.
20

Foundations of Quantitative Information Flow: Channels, Cascades, and the Information Order

Espinoza Becerra, Barbara 25 March 2014 (has links)
Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output, given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel. This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system. We explore this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels. We also establish fundamental new results about the recently-proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone. These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when a channel is always at least as secure as another from a leakage point of view.
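
The definitions behind min-entropy leakage, and the cascade bound the abstract describes, are compact enough to check numerically. In the sketch below, a channel is a row-stochastic matrix, prior vulnerability is the adversary's best one-guess probability over the secret, leakage is the log-ratio of posterior to prior vulnerability, and the final assertion verifies on random channels that cascading a channel with a second one never increases leakage:

```python
import numpy as np

def posterior_vulnerability(prior, channel):
    """V(pi, C) = sum_y max_x pi_x * C[x, y]: best guess per observed output."""
    joint = prior[:, None] * channel
    return joint.max(axis=0).sum()

def min_entropy_leakage(prior, channel):
    """Leakage L = log2( V(pi, C) / V(pi) ), in bits."""
    return np.log2(posterior_vulnerability(prior, channel) / prior.max())

rng = np.random.default_rng(7)

# Random prior over 4 secrets, and two random row-stochastic channels.
prior = rng.random(4); prior /= prior.sum()
B = rng.random((4, 5)); B /= B.sum(axis=1, keepdims=True)  # 4 inputs, 5 outputs
R = rng.random((5, 3)); R /= R.sum(axis=1, keepdims=True)  # post-processes B's output

# The cascade of B then R is the matrix product: P(z|x) = sum_y P(y|x) P(z|y).
cascade = B @ R

lb, lc = min_entropy_leakage(prior, B), min_entropy_leakage(prior, cascade)
print(f"leakage of B: {lb:.3f} bits, leakage of cascade B;R: {lc:.3f} bits")
assert lc <= lb + 1e-9  # post-processing can only lose information about the secret
```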
