231
Resource complementarity of the internet and its role in creating sustained competitive advantage in firms. Chow, Lo-sing, Charles, 周路成. January 2010.
Published or final version / Business / Doctoral / Doctor of Philosophy
232
The Use of Abstractions in Model Management. Dolk, Daniel Roy. January 1982.
The concept of a generalized model management system (GMMS) and its role in a decision support system are discussed. A paradigm for developing a GMMS which integrates artificial intelligence techniques with data management concepts is presented. The paradigm views a GMMS as a knowledge-based modeling system (KBMS), with knowledge abstractions as the vehicle of knowledge and model representation. Knowledge abstractions are introduced as a hybrid of the predicate calculus, semantic network, and frame representations in artificial intelligence (AI), embodied in an equivalent of a programming-language data abstraction structure. As a result, models represented by knowledge abstractions are not only subject to the powerful problem-reduction and inference techniques available in the AI domain but are also in a form conducive to model management. The knowledge abstraction in its most general form is seen as a frame which serves as a template for generating abstraction instances for specific classes of models. The parallels between an abstraction-based GMMS and current data management concepts are explored. A CODASYL implementation of an abstraction-based GMMS for the class of linear programming models is described and demonstrated.
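The abstract describes the frame-based knowledge abstraction only in prose. As a minimal, hypothetical sketch (in Python; the `ModelAbstraction` class, its slot names, and the product-mix example are illustrative assumptions rather than the dissertation's actual representation), a frame-style template and one abstraction instance generated from it for the class of linear programming models might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class ModelAbstraction:
    """Frame-style template for a class of models (slot names are illustrative)."""
    name: str
    decision_variables: list = field(default_factory=list)  # slot: quantities the model may vary
    constraints: list = field(default_factory=list)         # slot: restrictions on those quantities
    objective: str = ""                                      # slot: criterion to optimise
    solver: str = "simplex"                                  # slot: procedural attachment

# An abstraction *instance* generated from the template for one specific LP model.
product_mix = ModelAbstraction(
    name="product_mix_lp",
    decision_variables=["x_tables", "x_chairs"],
    constraints=["3*x_tables + 2*x_chairs <= 120  # labour hours",
                 "x_tables >= 0",
                 "x_chairs >= 0"],
    objective="maximize 25*x_tables + 15*x_chairs",
)

print(product_mix)
```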
233
Data manipulation in collaborative research systems. Lynch, Kevin John. January 1989.
This dissertation addresses data manipulation in collaborative research systems, including what data should be stored, the operations to be performed on that data, and a programming interface to effect this manipulation. Collaborative research systems are discussed, and requirements for next-generation systems are specified, incorporating a range of emerging technologies including multimedia storage and presentation, expert systems, and object-oriented database management systems. A detailed description of a generic query processor constructed specifically for one collaborative research system is given, and its applicability to next-generation systems and emerging technologies is examined. Chapter 1 discusses the Arizona Analyst Information System (AAIS), a successful collaborative research system being used at the University of Arizona and elsewhere. Chapter 2 describes the generic query processing approach used in the AAIS as an efficient, nonprocedural, high-level programmer interface to databases. Chapter 3 specifies requirements for next-generation collaborative research systems that encompass the entire research cycle for groups of individuals working on related topics over time. These requirements are being used to build a next-generation collaborative research system at the University of Arizona called CARAT, for Computer Assisted Research and Analysis Tool. Chapter 4 addresses the underlying data management systems in terms of the requirements specified in Chapter 3. Chapter 5 revisits the generic query processing approach used in the AAIS in light of the requirements of Chapter 3 and the range of data management solutions described in Chapter 4, and demonstrates that the approach is viable for both. The significance of this research takes several forms. First, Chapters 1 and 3 provide detailed views of a current collaborative research system and of a set of requirements for next-generation systems, based on years of experience both using and building the AAIS. Second, the generic query processor described in Chapters 2 and 5 is shown to be an effective, portable programming-language-to-database interface that ranges across the set of requirements for collaborative research systems as well as a number of underlying data management solutions.
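The AAIS query syntax itself is not reproduced in the abstract. The following is only an illustrative sketch (Python with sqlite3; the `generic_query` helper and toy schema are assumptions, not the AAIS design) of what a nonprocedural, specification-driven interface to a database can look like: the caller states what is wanted and the processor derives how to retrieve it.

```python
import sqlite3

def generic_query(conn, spec):
    """Run a query built from a declarative specification (illustrative only)."""
    filters = spec.get("filter", {})
    sql = f"SELECT {', '.join(spec['select'])} FROM {spec['from']}"
    if filters:
        sql += " WHERE " + " AND ".join(f"{col} = ?" for col in filters)
    return conn.execute(sql, tuple(filters.values())).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER, topic TEXT, author TEXT)")
conn.execute("INSERT INTO documents VALUES (1, 'arms control', 'smith')")

# The caller specifies *what* is wanted, not *how* to retrieve it.
rows = generic_query(conn, {"select": ["id", "author"],
                            "from": "documents",
                            "filter": {"topic": "arms control"}})
print(rows)  # -> [(1, 'smith')]
```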
234
Formalisms for Business Information System Development. Kottemann, Jeffrey Ernst. January 1984.
The Development Environment (DE) developed in this research includes a methodology together with the specification and implementation of a tool environment for Management Information Systems (IS) development. In the proposed DE, business IS development includes, and indeed hinges on, organizational modeling. Specifically, the objective and strategy, task, and agent structures are modeled and analyzed. This initial analysis uncovers incompletenesses and inconsistencies in the organizational model and allows top-down prioritization of business areas for further IS development. For information required to support decision making, various information attributes--e.g., currency--are used in modeling the information requirements. These attributes represent variables of information benefit levels to the user and of cost factors in IS development and operations. In later stages of development, these attributes serve as inference parameters that dictate system aspects such as communication architectures, database design, and process scheduling. Information outputs are decomposed to form an information processing architecture--an architecture comprised of interlinked data and processes--that minimizes the redundancy of IS resources. A specification of computer-aided tools for this and all other steps in IS development is given. Methods and tools are developed for the determination of data store contents and physical structure, information processing requirements, system input requirements, data/process distribution, and data acquisition, disposal, and information processing scheduling. The DE developed as part of the dissertation research attempts to draw together, and extend upon, many notions and methods of system development, decision support mechanisms (including artificial intelligence based systems), value of information, and organizational planning and modeling, to form an integrated system development environment.
235
Three Case Studies on Business Collaboration and Process Management. Fan, Shaokun. January 2012.
The importance of collaboration has been recognized for more than 2000 years. While recent improvements in technology create vast opportunities for collaboration, effective collaboration remains challenging as ad hoc teams work across time, geographical, language, and technical boundaries, and suffer from process inefficiency. My dissertation addresses part of these challenges by proposing theoretical frameworks for business collaboration and process management. The case study is used as the research strategy for this thesis, which consists of three studies. The first study proposes a process modeling framework to support efficient process model design via model transformation and validation. First, we divide process modeling into three layers and formally define the workflow models at each layer. Second, we develop a procedure for transforming a conceptual process model into its corresponding logical process model. Third, we create a validation procedure that can verify whether the derived logical model is consistent with its original conceptual model. The second study proposes a framework for analyzing the relationship between interaction processes and collaboration efficiency in software issue resolution in open source communities. We first develop an algorithm to identify frequent interaction process structures, referred to as interaction process patterns. We then assess each pattern's impact through a time-dependent Cox regression model. By applying the interaction process analysis framework to software issue resolution processes, we identify several patterns that are significantly correlated with collaboration efficiency. We further conduct a case study to validate the findings of pattern efficiency in software issue resolution. The third study addresses the suitability of virtual collaboration. Virtual collaboration seems to work well in some cases, but not in others. We define collaboration virtualization as the suitability of a task to be conducted virtually and propose a Collaboration Virtualization Theory (CVT) to explain it. Three categories of constructs (task, technology, and team) that determine the suitability of collaboration virtualization are derived from a systematic literature review of related areas. In summary, this dissertation addresses challenges in collaboration and process management, and we believe that our research will have important theoretical and practical impacts on the development of collaboration management systems.
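The second study's pattern-impact assessment is described only at a high level. As a hedged sketch (Python with the lifelines package; the pattern names and synthetic data are invented, and a plain proportional-hazards model is used here instead of the time-dependent formulation referred to in the abstract), regressing issue-resolution durations on pattern indicators might look like this:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Synthetic issue-resolution data (one row per issue); column names are hypothetical.
rng = np.random.default_rng(42)
n = 200
pattern_ping_pong = rng.integers(0, 2, n)  # 1 if a back-and-forth interaction pattern occurred
pattern_triage = rng.integers(0, 2, n)     # 1 if an early-triage pattern occurred
# Resolution times constructed so that triage speeds resolution and ping-pong slows it.
duration = rng.exponential(scale=10 * np.exp(0.7 * pattern_ping_pong - 0.5 * pattern_triage))
resolved = (rng.random(n) > 0.1).astype(int)  # ~10% of issues censored (still open)

df = pd.DataFrame({"duration_days": duration, "resolved": resolved,
                   "pattern_ping_pong": pattern_ping_pong,
                   "pattern_triage": pattern_triage})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration_days", event_col="resolved")
cph.print_summary()  # hazard ratio > 1 suggests the pattern is associated with faster resolution
```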
236
AEGIS platforms using KVA analysis to assess Open Architecture in sustaining engineering. Ahart, Jennifer L. 06 1900.
The purpose of this thesis is to estimate the potential performance improvement in sustaining engineering (SE) when an Open Architecture (OA) approach to system development is used. Its basis is that in Integrated Warfare Systems (IWS) acquisition, 80% of total lifecycle costs occur during the Operation and Support phase. This statistic demonstrates the necessity of measuring how the OA approach will affect software upgrade and maintenance processes over the AEGIS IWS lifecycle. Using the OA approach, advances in distance support and monitoring and in maintenance-free operating periods (MFOP) are possible; these advances are significant in supporting the need to reduce costs and manpower while improving performance. To estimate the potential Return on Investment (ROI) that an OA approach might enable in SE, in the form of software maintenance and upgrades, this thesis will apply the Knowledge Value Added (KVA) methodology to establish the baseline, "As Is," configuration of the current solutions in AEGIS. The KVA analysis will yield the ROIs and the current models for the approach to software maintenance and upgrades. Based on the assumptions of OA design for original system development, new approaches to distance support, maintenance, and monitoring will be explored in "To Be" solutions, and their ROIs will be estimated. The "To Be" solutions are rooted in the assumptions of MFOP and ARCI, and the results indicate that these solutions yield a potential improvement of 720% and a cost savings of $365,104.63 over the current methodology for just one ship. For all ships using AEGIS, ROI improves by 71,967%, with a cost savings of $26,543,824.56. The conclusion is that OA enables extension of these best-practice approaches to AEGIS maintenance and upgrade solutions.
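The abstract quotes the resulting ROI figures without the underlying calculation. As a purely illustrative sketch (Python; the benefit and cost figures below are hypothetical placeholders, and a full KVA analysis would first allocate value to each sub-process in proportion to the knowledge embedded in it), the "As Is" versus "To Be" comparison reduces to simple arithmetic:

```python
def roi(benefit, cost):
    """Return on investment, in percent."""
    return (benefit - cost) / cost * 100

as_is_roi = roi(benefit=1_200_000, cost=1_000_000)  # baseline ("As Is") process, hypothetical
to_be_roi = roi(benefit=1_200_000, cost=250_000)    # OA-enabled ("To Be") process, hypothetical

print(f"As-Is ROI: {as_is_roi:.0f}%")
print(f"To-Be ROI: {to_be_roi:.0f}%")
print(f"Cost savings: ${1_000_000 - 250_000:,}")
```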
237
Multimethodology: an alternative management paradigm to process quality improvement. 06 May 2008.
This thesis is about the formulation of a structured sequence of events, using a multimethodology approach, to facilitate the intervention in and subsequent management of key factors contributing to the failure of management information system development projects undertaken in the financial services industry. Furthermore, a clear distinction is made between information system development projects undertaken within the ambit of the broader development context of ‘information technology’ and information system development projects undertaken within the ambit of the financial services industry; the latter is the focus of this thesis. The formulation of the structured sequence of events serving as mitigating factors was mooted specifically as a result of known failure factors of management information system development projects undertaken in the financial services industry. In terms of this research, these factors fall into two mainstream categories, namely:
- The quality of business requirement functional specifications.
- Change to business requirement functional specifications while the latter are still in the process of being developed.
From the field research undertaken for this thesis, both locally and abroad, the analogy was drawn that the above two factors are normally juxtaposed, contributing to multi-faceted impacts on information system development project lifecycles. Key impacts point not only to the escalation of previously approved budgets, but also to extended timelines and already mapped processes. The research shows that these two entities would typically lead to an executive call for rework not only of the business case, but also of the processes supporting the whole development. This could invariably culminate in the termination of the project, or in extensive recoding and process changes, which in turn would lead to the requirement for extensive change management initiatives. Alternatively, the additional rework could result in benefits harvesting from the initiative being delayed or severely impacted. This statement is made with the clear caveat that, should the rework result in end-user effectiveness being significantly boosted, to the extent that the ratio of operating profit over the benefit life span of the system to total development cost is raised, such rework would undoubtedly be justified. The structured sequence of events serving as mitigating factors to facilitate the intervention in and subsequent management of key factors contributing to the failure of management information system development projects is formulated from selected key elements of the following system methodologies:
- The ‘Capability Maturity Model’, which Herbsleb et al. define as ‘a reference model for appraising software process maturity and a normative model for helping software organizations progress along an evolutionary path from ad hoc, chaotic processes to mature, disciplined software processes’.
- The ‘Balanced Scorecard’, which Kaplan & Norton define as ‘a management system that can motivate breakthrough improvements in such critical areas as product, process, customer, and market development’.
A multimethodology approach will be deployed in the formulation of the mitigating factors from the above listed system methodologies, underpinned by the concept ‘system’.
This would then be further enhanced by the author’s own contributions, gleaned from experience spanning some 34 years in systems development for the financial services industry, both locally and abroad. These mitigating factors will come into play at two specific levels of a typical information technology project lifecycle, namely:
- At the formulation of business requirement functional specifications.
- During the development and testing stages, which are typically associated with change in the systems development lifecycle.
Using a multimethodology approach, the interrelationship of the various core entities gleaned from the above listed system methodologies, ultimately supporting the structured sequence of events serving as mitigating factors, is graphically depicted below. In addition, the mitigating factors are positioned to reflect their potential position in a typical systems development life cycle commonly associated with information system development for the financial services industry. The purpose of this thesis is then to determine whether a set of mitigating factors can be developed, from a structured sequence of events using a multimethodology approach, to facilitate the intervention in and subsequent management of key factors contributing to the failure of management information systems development undertaken in the financial services industry. Furthermore, the thesis proposes that the structured set of mitigating factors be incorporated as an alternative methodology within the ambit of the greater information technology project management life cycle for all project initiatives in the financial services industry. / Prof. N. Lessing
238
The Risks and Effects of Outsourcing on the Information Systems Function and the Firm. Peak, Daniel Alan. 05 1900.
IS outsourcing, especially large-scale IS outsourcing, is a comparatively recent and rapidly growing IS phenomenon, but it is also an inherently risky activity. In an IS outsourcing arrangement, the outsourcing vendor accepts responsibility for IS resources and functions formerly controlled directly by the firm. This research examines IS outsourcing from two perspectives. (1) From an IS perspective, it examines the risk perceptions of IS managers of fourteen Fortune-500 firms who had recently conducted an outsourcing evaluation. (2) From a financial perspective, it examines the theoretical relationship of IS outsourcing with financial performance, and investigates the empirical effects of IS outsourcing on the firm's market value and market risk. This research views IS outsourcing as an independent variable whose effects on the firm may be measured as changes in security returns, changes in asset risk, changes in capital structure, and long-term changes in profitability. To accomplish this, it characterizes IS outsourcing as a sale-and-leaseback transaction.
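The abstract says the effects are measured as changes in security returns and market risk but does not spell out an estimation procedure. One common way to operationalize this is a market-model event study; the sketch below (Python/NumPy, with synthetic return series and a hypothetical announcement window) is offered only as an illustration of that general technique, not as the dissertation's actual method.

```python
import numpy as np

def abnormal_returns(stock_ret, market_ret, event_window):
    """Market-model event study: fit alpha/beta outside the event window,
    then compute abnormal returns inside it."""
    est_stock = np.delete(stock_ret, event_window)
    est_mkt = np.delete(market_ret, event_window)
    beta, alpha = np.polyfit(est_mkt, est_stock, 1)     # OLS slope and intercept
    expected = alpha + beta * market_ret[event_window]  # market-model expected returns
    return stock_ret[event_window] - expected

rng = np.random.default_rng(0)
mkt = rng.normal(0.0005, 0.01, 250)                     # synthetic daily market returns
stk = 0.0002 + 1.2 * mkt + rng.normal(0, 0.01, 250)     # synthetic firm returns (beta ~ 1.2)
event = np.arange(200, 206)                             # days around an outsourcing announcement
ar = abnormal_returns(stk, mkt, event)
print("Cumulative abnormal return over the event window:", ar.sum())
```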
239
Analysis of magnetoencephalographic data as a nonlinear dynamical system. Woon, Wei Lee. January 2002.
This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG encompasses both the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - by directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis will be to prove that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio that is obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. The result is that the use of MEG has, hitherto, been restricted to large institutions which are able to afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
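Treating a single unaveraged channel as the output of a dynamical system typically starts by reconstructing a state-space trajectory from the scalar time series. As a minimal sketch (Python/NumPy; the synthetic signal stands in for a real MEG trace, and the embedding dimension and delay are arbitrary choices, not values prescribed by the thesis), a time-delay embedding looks like this:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: each row is one point of the reconstructed
    state-space trajectory (Takens-style reconstruction)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Synthetic stand-in for an unaveraged single-channel MEG trace (not real data).
t = np.linspace(0, 10, 5000)
signal = (np.sin(2 * np.pi * 10 * t) + 0.3 * np.sin(2 * np.pi * 23 * t)
          + 0.1 * np.random.default_rng(1).normal(size=t.size))

embedded = delay_embed(signal, dim=5, tau=8)
print(embedded.shape)  # (number of trajectory points, embedding dimension)
```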
240
Pragmatic algorithms for implementing geostatistics with large datasets. Ingram, Benjamin R. January 2008.
With the ability to collect and store increasingly large datasets on modern computers comes the need to be able to process the data in a way that can be useful to a geostatistician or application scientist. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity, in terms of memory and speed, scales quadratically and cubically respectively for likelihood-based geostatistics. Various methods have been proposed, and are extensively used, in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to produce techniques for further improving computational efficiency.
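The cubic cost referred to here comes from factorizing the dense n-by-n covariance matrix in the Gaussian-process (kriging) likelihood. As one illustrative example of a reduced-complexity covariance approach (a subset-of-regressors, low-rank approximation written in Python/NumPy; the kernel, inducing-point selection, and noise level are assumptions, and this is not necessarily one of the specific algorithms developed in the thesis), prediction cost drops from O(n^3) to O(n m^2) for m much smaller than n support points:

```python
import numpy as np

def rbf(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential covariance between point sets a (n, d) and b (m, d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)

def sor_predict(X, y, Xstar, Xm, noise=0.1):
    """Subset-of-Regressors (low-rank) GP prediction: O(n m^2) instead of O(n^3).
    Xm holds m << n inducing/support points (here just a random subset of X)."""
    Kmn = rbf(Xm, X)
    Kmm = rbf(Xm, Xm)
    A = Kmn @ Kmn.T + noise ** 2 * Kmm + 1e-8 * np.eye(len(Xm))  # jitter for stability
    w = np.linalg.solve(A, Kmn @ y)
    return rbf(Xstar, Xm) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(2000, 1))            # a "large" dataset
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=2000)
Xm = X[rng.choice(2000, size=50, replace=False)]  # 50 inducing points
Xstar = np.linspace(0, 10, 5)[:, None]
print(sor_predict(X, y, Xstar, Xm))
```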