281.
E-mentoring and information systems effectiveness models: a useful nexus for evaluation in the small business context
Rickard, Kim. January 2007 (PDF)
While information communications technology provides new opportunities for supporting mentoring, there is a need to explore how effectively these potential benefits are being realised. Evaluating the effectiveness of structured e-mentoring in the small business context is problematic because it is contingent on a multitude of contextual factors and characterised by a range of research difficulties. A review of 31 effectiveness studies across the mentoring, e-mentoring and small business fields, undertaken as part of this study, provided a basis for systematically determining the nature of these research challenges. They included the heterogeneity and divergent pedagogical needs of individuals, the complexity of the mentoring phenomenon, measurement difficulties, the paradigm location of evaluation models, inherent problems with evaluation methodologies and data quality, and the almost contradictory imperatives to evaluate individualised outcomes while exploring commonalities and patterns in effectiveness. To extend understanding and knowledge in the field of e-mentoring for small business, it will be necessary to develop empirically based theories of effective e-mentoring systems. As a means of contributing to the generation and refinement of theory, this study proposed a framework as a potential solution to some of the research challenges and contextual contingencies identified. The framework builds on the DeLone and McLean model of Information Systems Success (1992), which is based on the principle that Information Systems success is best evaluated by considering the dimensions of effectiveness (System quality, Information quality, Use, User satisfaction and Impact) together as a system rather than in isolation. The Rickard model extends this principle to structured e-mentoring, and adapts and redefines DeLone and McLean's Information Systems dimensions for the mentoring context.
The study investigated the framework as a means of consolidating and classifying the metrics used in the informing disciplinary areas, as a reference tool for designing qualitative and quantitative effectiveness measurement instruments, for selecting situationally responsive research strategies and, most critically, for describing, classifying and interpreting variability in effectiveness outcomes. The framework was applied to evaluate the effectiveness of Mentors Online, an Australian e-mentoring program targeted at self-employed professional contractors. This examination of actual practice provided a basis for proposing a set of determinants of e-mentoring effectiveness, which in turn provided a basis for understanding how the potential benefits of structured e-mentoring are being realised. Creating a nexus between structured e-mentoring effectiveness evaluation and DeLone and McLean's Information Systems success model was shown to provide a justified, sufficient and useful basis for evaluating structured e-mentoring effectiveness, and therefore a means of contributing to the body of international literature on e-mentoring effectiveness.
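The thesis does not prescribe an algorithmic scoring, but the principle of evaluating the five dimensions together rather than in isolation can be sketched as follows. The dimension names follow DeLone and McLean; the 0-1 scale, the example scores and the weakest-link summary are purely illustrative assumptions, not part of the framework itself:

```python
from dataclasses import dataclass, fields

@dataclass
class EffectivenessScores:
    """Scores (assumed 0-1 scale) on the five DeLone and McLean
    dimensions, as adapted for structured e-mentoring."""
    system_quality: float
    information_quality: float
    use: float
    user_satisfaction: float
    impact: float

def evaluate(scores: EffectivenessScores) -> dict:
    """Treat the dimensions as a system: report the mean level
    alongside the weakest dimension, since a single weak dimension
    (e.g. low Use) can undermine overall effectiveness."""
    values = {f.name: getattr(scores, f.name) for f in fields(scores)}
    weakest = min(values, key=values.get)
    return {
        "mean": sum(values.values()) / len(values),
        "weakest_dimension": weakest,
        "weakest_score": values[weakest],
    }

# Hypothetical program with strong system quality but low actual use.
report = evaluate(EffectivenessScores(0.8, 0.7, 0.4, 0.75, 0.6))
```

The point of the sketch is only that a summary statistic computed over one dimension in isolation would hide exactly the interaction the model is designed to expose.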
282.
Examining a technology acceptance model of internet usage by academics within Thai business schools
Kripanont, Napaporn. January 2007 (PDF)
Information Technology has been a significant research area for some time, but its nature has changed considerably since the Internet became prominent just over a decade ago. Many researchers have studied and proposed theories and models of technology acceptance in order to predict and explain user behaviour with technology, and to account for rapid change in both technologies and their environments. Each theory or model has been proposed with a different set of determinants and moderators, and most have been developed in the U.S. It is therefore questionable whether the theories and models of technology acceptance that have been developed, modified and extended in the U.S. can be used in other countries, especially in Thailand, and whether other determinants and moderators might also play important roles in this specific environment. This thesis (1) reviewed the literature on nine prominent theories and models, (2) reviewed previous literature on IT acceptance and usage within four contexts of study, (3) investigated the extent to which academics use and intend to use the Internet in their work, (4) investigated how to motivate academics to make full use of the Internet in their work, (5) investigated to what extent using the Internet helps improve academics' professional practice, professional development and quality of working life, (6) formulated a research model of technology acceptance regarding Internet usage by Thai academics, and (7) generated and validated the research model that best describes Thai academics' Internet usage behaviour and behaviour intention. These last two objectives represent the main focus of the thesis. A questionnaire survey was used to collect primary data from 927 academics within business schools in 20 public universities in Thailand. The survey yielded 455 usable questionnaires, a response rate of 49%.
Statistical analysis and Structural Equation Modelling with AMOS version 6.0 were used to analyse the data. The research model was formulated with five core determinants of usage and up to nine moderators of key relationships. It was then tested and modified; the final modified model, supported by its goodness of fit to the data, explained 31.6% (squared multiple correlation) of the variance in usage behaviour in teaching, 42.6% in usage behaviour in other tasks, 55.7% in behaviour intention in teaching and 59.8% in behaviour intention in other tasks. From the findings, three core determinants (perceived usefulness, perceived ease of use and self-efficacy) significantly determined usage behaviour in teaching, and two (perceived usefulness and self-efficacy) significantly determined usage behaviour in other tasks. Usage behaviour, in turn, significantly influenced behaviour intention. Three moderators (age, e-university plan, and level of reading and writing) affected the influence of the key determinants on usage behaviour, while only two (age and research university plan) affected the influence of usage behaviour on behaviour intention. The remaining moderators (gender, education level, academic position, experience and Thai language usage) affected neither relationship. Consequently, the final modified research model, called the "Internet Acceptance Model" (IAM), has the power to explain and predict user behaviour in the Thai business school environment. A thorough understanding of the model may help practitioners analyse the reasons for resistance to the technology and take effective measures to improve user acceptance and usage.
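As a minimal illustration of the "variance explained" statistic reported above, the sketch below fits an ordinary least-squares model on synthetic data and computes the squared multiple correlation. The determinant names mirror the thesis, but the data, coefficients and resulting R-squared are fabricated for the example; the thesis itself used structural equation modelling in AMOS, not plain regression:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 455  # matches the number of usable survey responses

# Synthetic stand-ins for the three significant determinants of
# usage behaviour in teaching (hypothetical data, not the survey's).
usefulness = rng.normal(size=n)
ease_of_use = rng.normal(size=n)
self_efficacy = rng.normal(size=n)
X = np.column_stack([np.ones(n), usefulness, ease_of_use, self_efficacy])

# Usage behaviour as a noisy linear function of the determinants
# (coefficients invented for the example).
y = 0.5 * usefulness + 0.3 * ease_of_use + 0.4 * self_efficacy \
    + rng.normal(scale=1.0, size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Squared multiple correlation: the share of variance in the
# outcome explained by the model, the statistic quoted above.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

With these made-up coefficients the expected R-squared is around a third, comparable in magnitude to the 31.6% reported for usage behaviour in teaching, though that resemblance is coincidental.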
283.
Design and analysis of hash functions
Danda, Murali Krishna Reddy. January 2007 (PDF)
A function that compresses an arbitrarily large message into a fixed, small 'message digest' is known as a hash function. Over the last two decades many types of hash function have been defined, but the most widely used in cryptographic applications are currently hash functions based on block ciphers, and the dedicated hash functions. Almost all dedicated hash functions are generated using the Merkle-Damgård construction, developed independently by Merkle and Damgård in 1989 [6, 7]. A hash function is said to be broken if an attacker is able to show that its design violates at least one of its claimed security properties. There are various attack strategies on hash functions, such as attacks based on block ciphers, attacks depending on the algorithm, attacks independent of the algorithm, attacks based on signature schemes, and high-level attacks. In addition, many structural weaknesses have been found in recent years in the Merkle-Damgård construction [51-54], which indirectly affect the hash functions developed from it. MD5, SHA-0 and SHA-1 are currently the most widely deployed hash functions. However, they were all broken by Wang using a differential collision attack in 2004 [55-60], which increased the urgency of replacing these widely used hash functions. Since then, many replacements and modifications have been proposed; the first alternative proposed was to replace the affected hash functions with the SHA-2 group. This thesis presents a survey of the different types of hash functions, the different types of attacks on them, and their structural weaknesses. It also presents a new classification based on the number of inputs to the hash function, and on the streamability or non-streamability of the design.
This classification includes an explanation of the working process of existing hash functions and their security analysis. A comparison of the Merkle-Damgård construction with related constructions is also presented. Moreover, three major methods of strengthening hash functions, so as to avoid the recent threats, are presented: 1) generating a collision-resistant hash function using a new message preprocessing method called reverse interleaving; 2) enhancing hash functions such as MD5 and SHA-1 using a different message expansion coding; and 3) a proposal for a new hash function called 3-branch. The first two methods can be considered modifications, and the third a replacement for the existing hash functions affected by the recent differential collision attacks. The security analysis of each proposal against the known generic attacks is also presented, along with some applications of the dedicated hash function.
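As background to the construction the survey centres on, here is a toy sketch of Merkle-Damgård iteration with length padding (MD strengthening): pad the message, then chain a fixed-width compression function over fixed-size blocks. The compression function below is a stand-in with no cryptographic strength; real designs such as SHA-2 use carefully analysed compression functions and much larger states:

```python
import struct

def toy_compress(state: int, block: bytes) -> int:
    """Stand-in compression function (NOT cryptographically secure):
    XORs an 8-byte block into a 64-bit state, then mixes by an
    FNV-style odd-constant multiplication modulo 2**64."""
    (m,) = struct.unpack(">Q", block)
    state ^= m
    return (state * 0x100000001B3) % (1 << 64)

def merkle_damgard(message: bytes,
                   iv: int = 0xCBF29CE484222325,
                   block_size: int = 8) -> int:
    """Merkle-Damgard iteration: append a 0x80 marker byte, zero-pad,
    append the 8-byte message length (MD strengthening), then chain
    the compression function over consecutive blocks."""
    padded = message + b"\x80"
    while (len(padded) + 8) % block_size:
        padded += b"\x00"
    padded += struct.pack(">Q", len(message))  # length padding
    state = iv
    for i in range(0, len(padded), block_size):
        state = toy_compress(state, padded[i:i + block_size])
    return state

digest = merkle_damgard(b"abc")
```

Because each step here is a bijection in the state, any single-block difference propagates to the final digest; the structural attacks cited above ([51-54]) exploit properties of the chaining itself, such as multicollisions, rather than weaknesses of any one compression function.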
284.
Enterprise modelling: the key to successful business systems integration
Brudenell, John Francis. January 2007 (PDF)
The Enterprise Modelling (EM) approach to systems design is followed to promote business information systems integration and a high degree of data integrity. This research reports on a comprehensive case study of one of Australia's leading telecommunications carriers and service providers. The case study relates to the advent of Mobile Number Portability (MNP) in the Australian telecommunications market on 25 September 2001, a world first for a real-time 'churn' business process. Specifically, it reports on the Service Level Agreement (SLA) and reporting performance of two similar systems, evaluated in terms of accepted Information Systems architectural criteria, which the researcher derived from the literature on evaluating information systems. One purpose-built operational system, the Mobile Number Portability System (MNPS), was designed and built using the latest object-oriented techniques and tools. The other, the Data Repository System (DRS), was designed using the EM approach. The MNPS failed to meet the SLA and reporting functionality, and performed poorly when evaluated in terms of the architectural criteria; for example, its support of fundamental business rules was extremely poor. It should be noted that the SLA functionality was the most complex aspect of the system to design and implement, as it constantly changes according to the requirements of the regulator (the ACA). Hence, it was decided to build this functionality into a separate system, the DRS, using a different, top-down approach based on EM. The DRS successfully met all SLA and reporting functionality, performed extremely well when evaluated against the architectural criteria, and significantly outperformed the MNPS, confirming the claims made for the EM approach.
285.
Computer recognition of musical instruments: an examination of within class classification
Moore, Robert. January 2007 (PDF)
This dissertation records the development of a process that enables within-class classification of musical instruments, that is, a process that identifies a particular instrument of a given type; in this study, four guitars and five violins. In recent years there have been numerous studies in which between-class classification has been attempted, but there have been no attempts at within-class classification. Since timbre is the quality/quantity that enables one musical sound to be differentiated from another, before any classification can take place a means of measuring and describing it in physical terms needs to be devised. Towards this end, a review of musical timbre is presented, covering research from the work of Helmholtz through to the present, together with related work in speech recognition and musical instrument synthesis. The representation of timbre used in this study is influenced by the work of Hourdin and Charbonneau, who used an adaptation of multi-dimensional scaling, based on frequency analysis over time, to represent the evolution of each musical tone. A trajectory path, a plot of frequencies over time, was used to represent the evolution of each tone. This is achieved by taking a sequence of samples from the initial waveform and applying the discrete Fourier transform (DFT) or the constant Q transform (CQT) to obtain a frequency analysis of each data window. The classification technique used is based on statistical distance methods. Two sets of data were recorded for each of the guitars and violins in the study, across the pitch range of each instrument type. In the classification trials, one set was used as reference tones and the other as test tones. To measure the similarity of timbre of a pair of tones, the closeness of their trajectory paths was measured by summing the squared distances between corresponding points along the paths.
With four guitars, a 97% correct classification rate was achieved for tones of the same pitch (fundamental frequency), and with five violins, a 94% correct classification rate was achieved for tones of the same pitch. The robustness of the classification system was tested by comparing a smaller portion of the whole tone, by comparing tones of differing pitch, and through a number of other variations. Classification of both guitars and violins was found to be highly sensitive to pitch: the classification rate fell away markedly when tones of different pitch were compared. Further investigation examined the timbre of each instrument across its range. This confirmed that the timbres of the guitar and violin are highly frequency dependent, and suggested the presence of formants, that is, certain fixed frequencies that are boosted when the tone contains harmonics at or near those frequencies.
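The distance measure described above (summed squared distances between corresponding trajectory points) can be sketched in miniature. The window size, bin count and synthetic 'tones' below are illustrative assumptions standing in for the recorded guitar and violin tones; a real implementation would apply the DFT or CQT to sampled audio:

```python
import math

def spectral_trajectory(signal, window=64, bins=8):
    """Trajectory path: one row of DFT magnitudes (bins 1..bins)
    per non-overlapping analysis window, tracing the spectrum
    over time."""
    traj = []
    for start in range(0, len(signal) - window + 1, window):
        frame = signal[start:start + window]
        row = []
        for k in range(1, bins + 1):
            re = sum(frame[n] * math.cos(2 * math.pi * k * n / window)
                     for n in range(window))
            im = -sum(frame[n] * math.sin(2 * math.pi * k * n / window)
                      for n in range(window))
            row.append(math.hypot(re, im))
        traj.append(row)
    return traj

def trajectory_distance(a, b):
    """Summed squared distances between corresponding trajectory
    points, the timbral similarity measure described above."""
    return sum((x - y) ** 2
               for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def tone(harmonic_weights, length=256, window=64):
    """Synthetic tone: fundamental at 4 cycles per window plus
    weighted harmonics (a crude stand-in for a recorded note)."""
    return [sum(w * math.sin(2 * math.pi * (k + 1) * 4 * n / window)
                for k, w in enumerate(harmonic_weights))
            for n in range(length)]

bright = tone([1.0, 0.8, 0.6])   # strong upper harmonics
mellow = tone([1.0, 0.2, 0.05])  # same pitch, weak upper harmonics
same = tone([1.0, 0.8, 0.6])     # identical to 'bright'

d_same = trajectory_distance(spectral_trajectory(bright),
                             spectral_trajectory(same))
d_diff = trajectory_distance(spectral_trajectory(bright),
                             spectral_trajectory(mellow))
```

Two tones of identical timbre give a distance of zero, while tones sharing a pitch but differing in harmonic content give a large distance, which is the property the classification trials exploit.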
286.
Cognitive Support during Object-Oriented Software Development: The Case of UML Diagrams
Costain, Gay. January 2008
The Object Management Group (OMG) accepted the Unified Modelling Language (UML) as a standard in 1997, yet there is sparse empirical evidence to justify its choice. This research aimed to address that lack by investigating the modification of programs for which external representations, drawn using the UML notations most commonly used in industry, were provided. The aim was to discover whether diagrams using those UML notations provided the modifying programmer with cognitive support. Program modification was chosen as the task to study as a result of interviews carried out in New Zealand and North America to discover how workers in the software industry used modelling and, if so, whether UML notation satisfied their needs. The most preferred UML diagrams were identified from the interviews, and a framework of modelling use in software development was derived. A longitudinal study at a Seattle-based company was the source that suggested that program modification should be investigated. The methodology chosen required subjects to modify two non-trivial programs, one of which was supplied with UML documentation. There were two aspects to the methodology. First, the subjects' performances with and without the aid of UML documentation were compared: modifying a program is an exercise in problem solving, which is a cognitive activity, so if the use of UML improved subjects' performances it could be said that the UML had aided the subjects' cognition. Second, concurrent verbal protocols were collected while the subjects modified the programs. The protocols for the modification with UML documentation, for ten of the more successful subjects, were transcribed and analysed according to a framework derived from the literature, which listed the possible cognitive steps involved in problem solving where cognition could be distributed to and from external representations.
The categories of evidence that would confirm cognitive support were also derived from the literature. The experiments confirmed that programmers from similar backgrounds varied widely in ability and style. Twenty programmers modified both an invoice application and a diary application. There was some indication that the UML diagrams aided performance. The analyses of all ten of the transcribed subjects showed evidence of UML cognitive support.
287.
A methodology for business processes identification: developing instruments for an effective enterprise system project
Berkowitz, Zeev. January 2006
Whole document restricted.
Since the mid-1990s, thousands of companies around the world have implemented Enterprise Systems (ES), considered to be the most important development in the corporate use of information technology. By providing computerized support to business processes spanning both the enterprise and the supply chain, these systems have become an indispensable tool used by organizations to achieve and maintain efficient and effective operational performance. However, there are many cases in which ES implementation has failed in terms of the required time and budget and, more importantly, in terms of functionality and performance. One of the main causes of these failures is the misidentification and improper selection of the business processes to be implemented in the ES, a crucial element of the system's implementation life cycle. To achieve effective implementation, a 'necessary and sufficient' set of business processes must be designed and implemented: implementing an excessive set is costly, yet implementing an insufficient set is ruinous. The heuristic identification of the set of business processes, based on requirements elicitation, is flawed; there is no guarantee that all the necessary processes have been captured (Type I error), or that no superfluous processes have been selected for implementation (Type II error). Existing implementation methods do not address this vital issue. This thesis aims to resolve this problem by providing a methodology that will generate a necessary and sufficient set of business processes in a given organization, based on its specific characteristics, to be used as a baseline for implementing an ES. A proper definition of business processes and their associated properties is proposed and detailed.
The properties are then used as parameters to generate the complete set of all the possible business processes in the organization; from this set, necessary and sufficient processes are selected. The methodology exposes the fundamental level of business processes, which are then used as a baseline for further phases in the implementation process. The proposed methodology has been tested through the analysis of companies that have implemented ES. In each of these cases, the identification of business processes utilizing the proposed methodology has proven to provide superior results to those obtained through all other implemented practices, producing a better approximation of their existing business processes.
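The generate-then-select idea described above can be sketched combinatorially: enumerate every combination of process properties, then keep only the combinations applicable to the organization. The property dimensions, their values and the selection rule below are invented for illustration; the thesis derives the actual properties from the organization's specific characteristics:

```python
from itertools import product

# Hypothetical property dimensions; every combination of values
# names one candidate business process.
properties = {
    "object": ["sales_order", "purchase_order", "invoice"],
    "action": ["create", "change", "cancel"],
    "trigger": ["customer", "supplier", "internal"],
}

# Generate the complete set of candidate processes.
candidates = [dict(zip(properties, combo))
              for combo in product(*properties.values())]

def is_applicable(process, org_profile):
    """Toy selection rule: an organization that does no purchasing
    needs neither purchase-order processes nor supplier-triggered
    processes (illustrative, not the thesis's criteria)."""
    if process["object"] == "purchase_order" and not org_profile["purchases"]:
        return False
    if process["trigger"] == "supplier" and not org_profile["purchases"]:
        return False
    return True

org = {"purchases": False}
selected = [p for p in candidates if is_applicable(p, org)]
```

Generating from the full cartesian product and then filtering guards against Type I errors (a necessary process never enumerated), while the applicability rule guards against Type II errors (a superfluous process carried into implementation).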
288.
Multi-Vendor System Network Management: A Roadmap for Coexistence
Gutierrez, Jairo A. January 1997
Whole document restricted.
As computer networks become more complex and more heterogeneous (often involving systems from multiple vendors), the importance of integrated network management increases. This thesis summarises research carried out 1) to identify the characteristics and requirements of an Integrated Network Management Environment (INME) and its individual components, 2) to propose a model to represent the INME, 3) to demonstrate the validity of the model, 4) to describe the steps needed to formally specify the model, and 5) to suggest an implementation plan for the INME. One of the key aspects of this thesis is the introduction of three different and complementary models used to integrate the emerging OSI management standards with the tried and proven network management solutions promoted by the Internet Activities Board. The Protocol-Oriented Network Management Model represents the existing network management systems supported by the INME, i.e., OSI and Internet-based systems. The Element-Oriented Network Management Model represents the components used within individual network systems: it describes the managed objects and the platform application program interfaces (APIs), and includes the translation mechanisms needed to support interaction between OSI managers and Internet agents. The Interoperability Model represents the underlying communications infrastructure supporting network management: communication between agents and managers is represented using the required protocol stacks (OSI or TCP/IP) and by depicting the interconnection between the entities using the network management functions. This three-pronged classification provides a richer level of abstraction, facilitating the coexistence of standard network management systems, allowing different levels of modeling complexity, and improving access to managed objects. The ultimate goal of this thesis is to describe a framework that assists developers of network management applications in integrating their solutions with an open systems network management platform. This framework will also help network managers minimise the risks involved in the transition from first-generation network management systems to more integrated alternatives as they become available.
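The attribute-to-OID translation the Element-Oriented model calls for can be sketched with stub classes: an OSI-style manager reads a named attribute, and a proxy translates the name to an SNMP object identifier served by an Internet agent. The sysName OID is the real MIB-II identifier; the operationalState pairing and the stub classes are illustrative assumptions (a real ifOperStatus lookup would also need an interface index, and real agents speak the SNMP protocol rather than a method call):

```python
# Toy translation table between OSI-style managed-object attribute
# names and SNMP MIB-II object identifiers. The pairings are
# illustrative, not from the thesis.
OSI_TO_SNMP = {
    "systemTitle": "1.3.6.1.2.1.1.5.0",        # sysName.0 (MIB-II)
    "operationalState": "1.3.6.1.2.1.2.2.1.8",  # ifOperStatus (no index)
}

class SnmpAgentStub:
    """Stand-in for an Internet (SNMP) agent holding MIB variables."""
    def __init__(self, mib):
        self.mib = mib

    def get(self, oid):
        return self.mib[oid]

class OsiManagerProxy:
    """Translation mechanism: lets an OSI-style manager read
    attributes from an SNMP agent by mapping names to OIDs."""
    def __init__(self, agent, table):
        self.agent = agent
        self.table = table

    def m_get(self, attribute):
        oid = self.table[attribute]
        return self.agent.get(oid)

agent = SnmpAgentStub({"1.3.6.1.2.1.1.5.0": "core-router-1",
                       "1.3.6.1.2.1.2.2.1.8": 1})
proxy = OsiManagerProxy(agent, OSI_TO_SNMP)
name = proxy.m_get("systemTitle")
```

The proxy sits exactly where the Element-Oriented model places its translation mechanisms: the manager side never sees OIDs, and the agent side never sees OSI attribute names.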