271. Upgrading Packaged Software: An Exploratory Study of Decisions, Impacts, and Coping Strategies from the Perspectives of Stakeholders
Khoo, Huoy Min, 11 January 2006
Packaged software is widely adopted and has become an integral part of most organizations’ IT portfolios. Once packaged software is adopted, upgrades to subsequent versions appear inevitable. To date, research on packaged software upgrades has not received the attention it warrants, as academic research continues to focus on initial technology adoption. To explore this understudied yet important area, three research questions were proposed: (1) What influences the decision to upgrade packaged software? (2) How do stakeholders cope with a software upgrade? (3) How does a packaged software upgrade affect stakeholders? A qualitative research method was used to study these questions. Two case studies were conducted at a Fortune 500 company located in the Southeastern United States: the first examined Windows 2000 upgrades and the second examined an SAP 4.6C upgrade. A theoretical model with six components was induced from the study: decision, motivating forces, contingency forces, planned strategies, corrective actions, and impacts. Upgrade decisions are the outcome of interaction between motivating forces, which can originate from internal and external environments, and contingency forces. A decision to upgrade leads to both positive and negative impacts as experienced by users and IT groups; however, stakeholders’ experiences differ according to the type of software and their roles in the company. Two types of strategies were observed: planned strategies, used to tackle anticipated issues, and corrective actions, adopted to solve ad hoc problems when negative impacts arose. Both can affect the final impacts. Finally, when a corrective action is used, it may become a permanent planned strategy.
272. A Process to Reuse Experiences via Narratives Among Software Project Managers
Petter, Stacie Clark, 10 May 2006
Committee Chair: Dr. Vijay Vaishnavi. Major Department: Computer Information Systems.
Software project management is a complex process requiring extensive planning, effective decision-making, and proper monitoring throughout the course of a project. Unfortunately, software project managers rarely capture the knowledge gained during a project and reuse it on subsequent projects. To enable the repetition of prior successes and the avoidance of previous mistakes, I propose that software project managers can improve their management abilities by reusing their own and others’ past experiences through written narratives. I use multiple methodologies, including literature review, grounded theory, design science research, and experimentation, to create a process for software project managers to reuse knowledge gained through experience on software projects. In the literature review, I examine relevant research areas to inspire ideas on how to reuse knowledge via written narratives in software project management. Interviews with software project managers, analyzed using grounded theory, provide insight into the current challenges of reusing knowledge during a project. I use the design science research methodology to develop an experience-reuse process that incorporates narratives and wikis, enabling software project managers to share their experiences in written form. Experimentation evaluates whether this process improves the current knowledge-reuse practices of software project managers.
273. A Study of the Quality of Service in Group Oriented Mobile Transactions
Ahluwalia, Punit, 17 July 2006
In the emerging wireless Internet environment of m-commerce and other mobile applications, an increasing number of users are likely to adopt mobile transactions. These transactions have very diverse requirements, and some may require significant amounts of network resources and/or bounded delays. Most quality-of-service (QoS) research in wireless networks has hitherto focused on call- or connection-level QoS. Many mobile transactions, such as those carrying financial value, are expected to differ from previously investigated applications in their criticality, the level of resources they require, and their group characteristics. These unique requirements necessitate the introduction of new QoS metrics. To measure the QoS effectiveness of mobile transactions, this research introduces two new metrics: transaction completion probability and transaction response time. Moreover, wireless networks are well known to be bandwidth-constrained, and mobile transactions are expected to require varying degrees of bandwidth, which makes resource allocation at the connection level alone very inefficient. This research proposes a new framework that supports the QoS requirements of mobile transactions by allocating bandwidth at both the connection and transaction levels. The proposed framework helps achieve a balance between transaction completion probability and response time. Simulation and analytical modeling are used to evaluate the QoS metrics under varying network and traffic scenarios and to validate the effectiveness of the new framework. The results show that balanced transaction- and connection-level resource allocation can improve transaction completion probability and resource utilization, at the cost of a slightly increased response time.
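The trade-off summarized in this abstract can be illustrated with a toy simulation. Everything below is an illustrative assumption, not taken from the dissertation's actual models: a fixed connection-level blocking probability stands in for contention, and a transaction-level reservation lets a blocked transaction wait instead of aborting, which raises completion probability while adding queuing delay to response time.

```python
import random

def simulate(num_txn, blocking_prob, service_time, allow_reservation, max_wait, seed=1):
    """Toy model of two-level bandwidth allocation (illustrative only).

    Each transaction is blocked at the connection level with probability
    `blocking_prob`. With `allow_reservation`, a blocked transaction waits
    up to `max_wait` time units for a transaction-level reservation and
    then completes; without it, the transaction simply aborts.
    Returns (completion probability, mean response time of completed txns).
    """
    random.seed(seed)
    completed, total_rt = 0, 0.0
    for _ in range(num_txn):
        if random.random() > blocking_prob:
            # enough connection-level bandwidth: complete immediately
            completed += 1
            total_rt += service_time
        elif allow_reservation:
            # wait for a transaction-level reservation instead of aborting
            completed += 1
            total_rt += service_time + random.uniform(0, max_wait)
        # else: abort (connection-level allocation only)
    prob = completed / num_txn
    mean_rt = total_rt / completed if completed else float("inf")
    return prob, mean_rt
```

In this sketch, enabling the reservation path drives completion probability to 1.0 while mean response time rises by roughly the expected wait incurred per blocked transaction, mirroring the balance the framework is said to strike.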
274. Individual-Technology Fit: Matching Individual Characteristics and Features of Biometric Interface Technologies with Performance
Randolph, Adriane, 18 May 2007
Committee Chair: Dr. Melody Moore Jackson. Major Department: Computer Information Systems.
The term biometric literally means “to measure the body” and has recently been associated with physiological measures commonly used for personal verification and security applications. In this work, biometric describes physiological measures that may be used for non-muscularly controlled computer applications, such as brain-computer interfaces. Biometric interface technology is generally targeted at users with severe motor disabilities, which may be long-term due to illness or injury or short-term due to temporary environmental conditions. Performance with a biometric interface can vary widely across users, depending on many factors ranging from health to experience. Unfortunately, there is no systematic method for pairing users with biometric interface technologies to achieve the best performance. Current trial-and-error methods of accommodating users waste valuable time and resources, which matters greatly when users have diminishing abilities or suffer from terminal illnesses. This dissertation presents a framework and methodology that links user characteristics and features of biometric interface technologies with performance, thus expediting the technology-fit process. The contributions include an outline of the underlying components for capturing and representing individual user characteristics and their impact on the performance of basic interaction tasks, using a methodology called biometric user profiling. In addition, this work describes a methodology for objectively measuring an individual’s ability to control a specific biometric interface technology, such as one based on galvanic skin response or neural activity. Finally, this work incorporates these concepts into a new individual-technology fit framework for biometric interface technologies, drawing on the task-technology fit literature.
Keywords: user profiles, biometric user profiling, biometric interfaces, fit, individual-technology fit, galvanic skin response, functional near-infrared, brain-computer interface
275. Escalation of Commitment in Information Technology Projects: A Goal Setting Theory Perspective
Kasi, Vijay, 3 December 2007
Committee Chair: Dr. Mark Keil. Major Academic Unit: Center for Process Innovation.
Information technology (IT) projects are prone to failure. One explanation for the high failure rate among IT projects is that managers overly commit to a failing course of action, a phenomenon referred to as escalation of commitment. While the notions of goals and commitment are central to the phenomenon of escalation, very few prior studies have investigated their impact on it. In this study, a research model rooted in goal setting theory is advanced to better understand the escalation of commitment of IT project managers. A role-playing experiment with 350 IT managers was used to test the proposed model. The results suggest that IT managers are more willing to escalate their commitment under easy and vague goals than under difficult and specific goals. IT managers’ initial goal commitment and the level of project completion were found to have a significant effect on their willingness to continue. Initial goal commitment was also found to moderate the relationship between goal difficulty and willingness to continue: when goal commitment is higher, an easy goal has a greater effect in promoting an individual’s willingness to continue.
276. Managing the Tension between Standardization and Customization in IT-enabled Service Provisioning: A Sensemaking Perspective
Lewis, Mark O., 18 August 2008
The outsourcing literature has offered a plethora of perspectives and models for understanding the decision determinants and outcomes of outsourcing business processes. While past studies have contributed significantly to scholarly research in this area, few are provider-centric. Consequently, there is a need to understand how service providers address a core challenge: achieving scalable growth by developing standardized offerings that can be sufficiently customized to meet the unique demands of individual customers. This study explores how patterns of collective action within and between a provider and two of its largest customers relate to the tension between standardization and customization of information technology (IT)-enabled service provisioning. Specifically, it investigates the relationship between such behavioral patterns and the development of an enterprise architecture designed to address that tension. A socio-cognitive sensemaking framework consisting of six core properties provides the analytical lens through which the relationship is investigated. The study adopts an interpretive case study methodology, guided by the assumption that distinct dimensions of the social world exist but that understanding them comes from inter-subjective interaction between researcher and subject. The approach combines literal and theoretical replication strategies (Yin 1994) to help identify similarities and dissimilarities during cross-case comparison. Data were collected from semi-structured interviews, direct observations, participant observations, and analysis of documentation and archival records. Our findings suggest that localized action at the expense of global coordination exacerbates the tension between standardization and customization. Furthermore, attempts to address the tension through logics of spatial and temporal separation proved largely ineffective, as these initiatives put added pressure on the sensemaking processes responsible for guiding collective action. Our findings further suggest that a paradigm modification might be useful for service providers: shifting their focus from reducing equivocality to improving their internal ability to respond to it. The results contribute to a large body of outsourcing literature that has too often neglected a provider-centric perspective. By uncovering key factors that exacerbate the tension within and between organizations, and by providing practical methods for addressing them, this study also offers valuable insight for practicing managers.
277. An Examination of the Deaf Effect Response to Bad News Reporting in Information Systems Projects
Cuellar, Michael John, 29 April 2009
Information systems project management has historically been a problematic area. One reason is escalation, in which resources continue to be committed to a failing course of action. While many causes of escalation have been proposed, this dissertation investigates one possible cause: that the project manager may not hear, may ignore, or may overrule a report of bad news and thus continue a failing course of action, a phenomenon termed the Deaf Effect response to bad news reporting. This effect has not previously been studied in the information systems literature. The Deaf Effect is examined here through a series of three laboratory experiments and a case study. The findings show that in a conducive environment, where the bad news reporter is not seen as credible and the risk of project failure is seen as low, decision makers tend to view the report of bad news as irrelevant and thus ignore or overrule it, continuing the current course of action. The role prescription of the bad news reporter, illusion of control, and the perception of a highly politicized environment are additional factors that increase the occurrence of the Deaf Effect.
278. Quality in IS Research: Theory and Validation of Constructs for Service, Information, and System
Ding, Yi, 16 November 2010
IS quality is an important concept. Basing their model on information communication theory, DeLone and McLean formulated Information Quality and System Quality as two quintessential elements in their 1992 IS Success Model. In recent years, DeLone and McLean (2003) added Service Quality to form a triumvirate of antecedents to success. Unfortunately, the addition of this construct has unintentionally uncovered an overall lack of coherence in the theoretical modeling of IS Success. Research to date on IS Service Quality has largely ignored the impacts of Information Quality and System Quality when service is delivered through an information system (IS).
We believe deeper theoretical insights are needed to reconceptualize Service Quality and rationalize IS quality. After reviewing related literature, we apply marketing exchange theory as a reference framework to redefine service related terms and identify possible scenarios of delivering service through systems. Thereafter, we model IS quality in a new way, based on analysis of alternative scenarios. In validating our proposed model, we discuss our research methods and data analysis that will serve as empirical evidence. In particular, we focus on content validity, construct validity, nomological validity, and unidimensionality of the three IS quality dimensions: System Quality, Information Quality, and Service Quality.
By furthering our understanding of IS quality, we hope to initiate coherent theory development; this exercise should then lead to a theory that integrates IS quality elements and helps organizations implement effective strategies for using IS to deliver service. Through the empirical validation of the IS quality model, we contribute an assessment of the content, construct, and nomological validity of the IS quality constructs proposed by DeLone and McLean in their 2003 updated IS Success Model.
279. Ecological Evolution of MIS Research
Li, Liu-Pin, 22 July 2002
The field of information systems (IS) research has evolved as a distinct discipline for over thirty years. The rapid development of information technology has influenced not only enterprises and personal life but also the IS research field itself. Although the IS research community in Taiwan developed later than those abroad, it has grown tremendously over the past decade and now plays an important role in local information systems practice and in accumulating knowledge about electronic business. However, researchers face several difficulties in positioning their work: they do not fully understand the ecology of IS research, and it is hard for them to anticipate future directions of the field, so they tend to follow trends rather than create them.
To move local IS research into the mainstream of international academic study, we must gain insight into the ecology and diversity of IS studies. Many earlier studies have examined the IS research community, but most focused on classifying IS studies and offered no concise, accurate predictions, so this research applies rigorous ecological modeling to investigate the IS research community. It examines the foundings, mortality, evolution, density dependence, and key species of different IS research issues, as well as the main internal and external environmental forces shaping the diversity of IS research, in order to build an ecological model of IS studies. Through this model we hope not only to grasp the evolution and trends of IS studies but also to enhance the quality of IS research and of information systems applications in Taiwan.
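The density-dependence mechanism named above is commonly formalized in organizational ecology as a founding rate that is log-quadratic in population density: legitimation raises it at low density, competition suppresses it at high density. A minimal sketch under that standard formulation follows; all parameter values and the "research groups" framing are illustrative assumptions, not results from this study.

```python
import math

def founding_rate(n, a=0.5, b=0.05, c=-0.0005):
    """Density-dependent founding rate, log-quadratic in density n:
    legitimation (b > 0) raises the rate at low density, while
    competition (c < 0) suppresses it at high density.
    Parameter values are illustrative, not estimated from data."""
    return math.exp(a + b * n + c * n * n)

def simulate_community(steps=50, n0=5.0, mortality=0.03):
    """Evolve the density of a hypothetical research community (e.g.,
    active research groups or topics): foundings arrive at founding_rate(n)
    per step, and each member faces a constant per-capita mortality hazard
    (kept simple here; density-dependent mortality would mirror the
    founding rate with opposite signs)."""
    n, history = n0, [n0]
    for _ in range(steps):
        n = max(0.0, n + founding_rate(n) - mortality * n)
        history.append(n)
    return history
```

In this sketch the founding rate peaks at density n = -b / (2c) = 50, producing the inverted-U pattern that density-dependence studies of organizational populations typically report, and the simulated community grows toward a carrying capacity where foundings balance mortality.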
280. Competence systems
Lindgren, Rikard. Thesis (doctoral), Göteborgs universitet, 2002. Includes bibliographical references.