21

Error rate performance metrics for digital communications systems

Hassanien, Mohamed A. M. January 2011 (has links)
In this thesis, novel error rate performance metrics and transmission solutions are investigated for delay-limited communication systems and for co-channel interference scenarios. The following four research problems in particular were considered. The first research problem is devoted to the analysis of the higher-order ergodic moments of error rates for digital communication systems with time-unlimited ergodic transmissions; the statistics of the conditional error rates of digital modulations over fading channels are considered. The probability density function and the higher-order moments of the conditional error rates are obtained. Non-monotonic behavior of the moments of the conditional bit error rates versus some channel model parameters is observed for a Ricean distributed channel fading amplitude at the detector input. Properties and possible applications of the second central moments are proposed. The second research problem is the non-ergodic error rate analysis and signaling design for communication systems processing a single finite-length received sequence. A framework to analyze the error rate properties of non-ergodic transmissions is established. Bayesian credible intervals are used to estimate the instantaneous bit error rate. A novel degree-of-ergodicity measure is introduced using the credible interval estimates to quantify the level of ergodicity of the received sequence with respect to the instantaneous bit error rate and to describe the transition of the data detector from the non-ergodic to the ergodic zone of operation. The developed non-ergodic analysis is used to define adaptive forward error correction control and adaptive power control policies that can guarantee, with a given probability, the worst-case instantaneous bit error rate performance of the detector in its transition from the non-ergodic to the ergodic zone of operation. In the third research problem, novel retransmission schemes are developed for delay-limited retransmissions. The proposed scheme relies on a reliable reverse link for error-free feedback message delivery. Unlike conventional automatic repeat request schemes, the proposed scheme does not require the use of cyclic redundancy check bits for error detection. In the proposed scheme, random permutations are exploited to locate the bits for retransmission within a predefined window in the packet. The retransmitted bits are combined using maximal-ratio combining. The complexity-performance trade-offs of the proposed scheme are investigated by mathematical analysis as well as computer simulations. The bit error rate of the proposed scheme is independent of the packet length, while the throughput depends on the packet length. Three practical techniques suitable for implementation are proposed. The performance of the proposed retransmission scheme was compared to the block repetition code corresponding to a conventional ARQ retransmission strategy. It was shown that, for the same number of retransmissions and the same packet length, the proposed scheme always outperforms such repetition coding, and, in some scenarios, the performance improvement is found to be significant. Most of the analysis was carried out for the AWGN channel; however, the case of a slow Rayleigh block fading channel was also investigated. The proposed scheme appears to provide throughput and BER reduction gains only for medium to large SNR values.
Finally, the last research problem investigates the link error rate performance with a single co-channel interferer. A novel metric to assess whether the standard Gaussian approximation of a single interferer underestimates or overestimates the link bit error rate is derived. This metric is a function of the interference channel fading statistics; however, it is otherwise independent of the statistics of the desired signal. The key step in the derivation of the proposed metric is to construct the standard Gaussian approximation of the interference by a non-linear transformation. A closed-form expression of the metric is obtained for a Nakagami distributed interference fading amplitude. Numerical results for the case of Nakagami- and lognormal-distributed interference fading amplitudes confirm the validity of the proposed metric. The higher moments, interval estimators and non-linear transformations were investigated to evaluate the error rate performance for different wireless communication scenarios. The synchronization channel is also used jointly with the communication link to form transmission diversity and, subsequently, to improve the error rate performance.
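As a hedged illustration of the first research problem, the sketch below estimates the mean and second central moment (variance) of the conditional bit error rate of BPSK over a Ricean fading channel by Monte Carlo simulation; the BPSK error-rate expression and the Ricean parameters (K-factor, average SNR) are illustrative assumptions, not the thesis's exact setup.

```python
import numpy as np
from scipy.stats import norm

def conditional_ber_moments(k_factor=4.0, avg_snr_db=10.0, n_samples=200_000, seed=0):
    """Monte Carlo estimate of moments of the conditional BER of BPSK over
    Ricean fading (illustrative parameters, not taken from the thesis)."""
    rng = np.random.default_rng(seed)
    avg_snr = 10 ** (avg_snr_db / 10)
    # Ricean amplitude: line-of-sight component plus scattered component,
    # normalised so that E[|h|^2] = 1.
    s = np.sqrt(k_factor / (k_factor + 1))
    sigma = np.sqrt(1 / (2 * (k_factor + 1)))
    h = (s + sigma * rng.standard_normal(n_samples)) \
        + 1j * sigma * rng.standard_normal(n_samples)
    inst_snr = avg_snr * np.abs(h) ** 2
    # Conditional BER of BPSK given the instantaneous SNR gamma: Q(sqrt(2*gamma)).
    cond_ber = norm.sf(np.sqrt(2 * inst_snr))
    return cond_ber.mean(), cond_ber.var()  # mean and second central moment

print(conditional_ber_moments())
```

The second central moment returned here is the quantity whose properties and applications the abstract discusses; sweeping the K-factor would reveal the kind of non-monotonic behavior mentioned above.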
22

Inligtingontsluiting in 'n geintegreerde biblioteekrekenaarstelsel [Information retrieval in an integrated library computer system]

De Kock, Martha Georgina 13 February 2014 (has links)
M.Bibl. / Please refer to full text to view abstract
23

StreamER: Evaluation Framework For Streaming Recommender Systems

Kosaraju, Sai Sri January 2018 (has links)
Recommender systems have gained a lot of popularity in recent times due to their application in a wide range of fields. Recommender systems are intended to support users in finding relevant items based on their interests and preferences. Recommender algorithms proposed by researchers have evolved over time from simple matching recommendations to machine learning algorithms. One such class of algorithms with increasing focus is streaming recommender systems; these algorithms treat input data as a stream of events and make recommendations. To evaluate algorithms that work with continuous data streams, stream-based evaluation techniques are needed, yet little research has so far addressed the evaluation of recommender systems in streaming environments. In this thesis, a simple evaluation framework named StreamER, which evaluates recommender algorithms that work on streaming data, is proposed. StreamER is intended for the rapid prototyping and evaluation of incremental algorithms. StreamER is designed and implemented using an object-oriented architecture to make it more flexible and expandable. StreamER can be configured via a configuration file, which can configure algorithms, metrics and other properties individually. StreamER has inbuilt support for calculating accuracy metrics, namely click-through rate, precision, and recall. The popular-seller and random recommender are two algorithms supported out of the box with StreamER. Evaluation of StreamER is performed via a combination of hypothesis and manual evaluation. Results have matched the proposed hypothesis, thereby successfully evaluating the proposed framework StreamER.
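The following sketch illustrates the kind of prequential, stream-based evaluation the abstract describes, paired with a simple popularity ("popular-seller"-style) recommender; the event format, the recommender, and the metric bookkeeping are assumptions made for illustration and are not the actual StreamER implementation.

```python
from collections import Counter

def evaluate_stream(events, k=10):
    """Prequential evaluation sketch: for each event, recommend first, then learn
    from it. Tracks a click-through-rate-style hit ratio for a popularity model."""
    popularity = Counter()          # item -> number of past interactions
    hits, total = 0, 0
    for user, item in events:       # each event is an assumed (user, item) interaction
        recommended = [i for i, _ in popularity.most_common(k)]
        if recommended:             # only score once a recommendation can be made
            total += 1
            hits += int(item in recommended)   # did the top-k list contain the next item?
        popularity[item] += 1       # incremental update: learn from the event
    return hits / total if total else 0.0

# Usage: a tiny synthetic stream of (user, item) events.
stream = [("u1", "a"), ("u2", "a"), ("u1", "b"), ("u3", "a"), ("u2", "b"), ("u3", "b")]
print(evaluate_stream(stream, k=1))
```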
24

IS-MBNQA: A new framework for the relationship between Information Systems and organizational quality.

Chong, Hyonsong 12 1900 (has links)
Despite numerous frameworks and models proposed in the literature, Information Systems (IS) assessment still remains elusive. In addition, little agreement exists on the contribution of the IS function within an organization and on how IS is related to the other organizational dimensions. Frameworks that show the relationship between IS and the organization are in the developmental stage, and this work proposes a more comprehensive framework to assist in better understanding the relationship between IS and organizational quality. This research examines two popular IS quality assessment frameworks - the Malcolm Baldrige National Quality Award (MBNQA) and Information Systems Assessment (ISA) - and suggests a new framework, IS-MBNQA. This work integrates these two IS quality assessment frameworks into a single comprehensive model that provides a holistic view of how IS quality is interrelated with organizational quality. The two existing IS assessment frameworks attempted to measure IS quality at different levels within an organization. The MBNQA model is the most comprehensive quality framework because it takes an organization-wide perspective. On the other hand, ISA employs an IS-specific perspective and reflects the relationships of eight major IS success dimensions. ISA is a modified version of DeLone & McLean's model with the inclusion of a success factor for Service Quality. For this study, survey instruments are developed from the MBNQA and ISA frameworks, and they are consolidated to allow testing of the single IS-MBNQA framework. Exploratory factor analysis is performed for instrument refinement and confirmatory factor analysis for validity of the models. The instruments developed in this work are utilized as a foundation for identifying the relationships among the dimensions within and between each model. A major contribution of this work is the validation of the 2000 MBNQA model and the extension of existing models/frameworks to better explain the IS contribution to an organization.
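As a hedged sketch of the instrument-refinement step mentioned above, the snippet below runs an exploratory factor analysis on a matrix of Likert-style survey responses using scikit-learn; the synthetic data, the number of factors, and the choice of scikit-learn's FactorAnalysis are illustrative assumptions, not the analysis actually performed in the dissertation.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for survey data: 200 respondents x 12 Likert-scale items (1-7).
rng = np.random.default_rng(42)
responses = rng.integers(1, 8, size=(200, 12)).astype(float)

# Standardise the items, then extract a small number of latent factors.
X = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=3, random_state=42)
fa.fit(X)

# Loadings show how strongly each survey item relates to each extracted factor;
# items that load weakly everywhere are candidates for removal during refinement.
loadings = fa.components_.T          # shape: (n_items, n_factors)
print(np.round(loadings, 2))
```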
25

Information systems assessment: development of a comprehensive framework and contingency theory to assess the effectiveness of the information systems function.

Myers, Barry L. 08 1900 (has links)
The purpose of this research is to develop a comprehensive, IS assessment framework using existing IS assessment theory as a base and incorporating suggestions from other disciplines. To validate the framework and to begin the investigation of current IS assessment practice, a survey instrument was developed. A small group of subject matter experts evaluated and improved the instrument. The instrument was further evaluated using a small sample of IS representatives. Results of this research include a reexamination of the IS function measurement problem using new frameworks of analyses yielding (a) guidance for the IS manager or executive on which IS measures might best fit their organization, (b) a further verification of the important measures most widely used by IS executives, (c) a comprehensive, theoretically-derived, IS assessment framework, and by (d) the enhancement of IS assessment theory by incorporating ideas from actual practice. The body of knowledge gains a comprehensive, IS assessment framework that can be further tested for usefulness and applicability. Future research is recommended to substantiate and improve on these findings. Chapter 2 is a complete survey of prior research, subdivided by relevant literature divisions, such as organizational effectiveness, quality management, and IS assessment. Chapter 3 includes development of and support for the research questions, IS assessment framework, and the research model. Chapter 4 describes how the research was conducted. It includes a brief justification for the research approach, a description of how the framework was evaluated, a description of how the survey instrument was developed and evaluated, a description of the participants and how they were selected, a synopsis of the data collection procedures, a brief description of follow-up procedures, and a summary. Chapter 5 presents the results of the research. Chapter 6 is a summary and conclusion of the research. Finally, included in the appendices are definitions of terms, and copies of the original and improved survey instruments.
26

Improving System Reliability for Cyber-Physical Systems

Wu, Leon L. January 2015 (has links)
Cyber-physical systems (CPS) are systems featuring a tight combination of, and coordination between, the system’s computational and physical elements. Cyber-physical systems range from critical infrastructure such as power grids and transportation systems to health and biomedical devices. System reliability, i.e., the ability of a system to perform its intended function under a given set of environmental and operational conditions for a given period of time, is a fundamental requirement of cyber-physical systems. An unreliable system often leads to disruption of service, financial cost and even loss of human life. An important and prevalent type of cyber-physical system meets the following criteria: processing large amounts of data; employing software as a system component; running online continuously; and having an operator in the loop because of human judgment and accountability requirements for safety-critical systems. This thesis aims to improve system reliability for this type of cyber-physical system. To that end, I present a system evaluation approach entitled automated online evaluation (AOE), a data-centric runtime monitoring and reliability evaluation approach that works in parallel with the cyber-physical system to conduct automated evaluation along the workflow of the system continuously, using computational intelligence and self-tuning techniques, and to provide operator-in-the-loop feedback on reliability improvement. For example, abnormal input and output data at or between the multiple stages of the system can be detected and flagged through data quality analysis. As a result, alerts can be sent to the operator-in-the-loop. The operator can then take actions and make changes to the system based on the alerts in order to achieve minimal system downtime and increased system reliability. One technique used by the approach is data quality analysis using computational intelligence, which applies computational intelligence to evaluate data quality in an automated and efficient way in order to make sure the running system performs reliably as expected. Another technique used by the approach is self-tuning, which automatically self-manages and self-configures the evaluation system to ensure that it adapts itself based on changes in the system and feedback from the operator. To implement the proposed approach, I further present a system architecture called the autonomic reliability improvement system (ARIS). This thesis investigates three hypotheses. First, I claim that automated online evaluation empowered by data quality analysis using computational intelligence can effectively improve system reliability for cyber-physical systems in the domain of interest as indicated above. In order to prove this hypothesis, a prototype system needs to be developed and deployed in various cyber-physical systems, while certain reliability metrics are required to measure the system reliability improvement quantitatively. Second, I claim that self-tuning can effectively self-manage and self-configure the evaluation system based on changes in the system and feedback from the operator-in-the-loop to improve system reliability. Third, I claim that the approach is efficient. It should not have a large impact on overall system performance and should introduce only minimal extra overhead to the cyber-physical system. Some performance metrics should be used to measure the efficiency and added overhead quantitatively.
Additionally, in order to conduct efficient and cost-effective automated online evaluation for data-intensive CPS, which requires large volumes of data and devotes much of its processing time to I/O and data manipulation, this thesis presents COBRA, a cloud-based reliability assurance framework. COBRA provides automated multi-stage runtime reliability evaluation along the CPS workflow using data relocation services, a cloud data store, data quality analysis and process scheduling with self-tuning to achieve scalability, elasticity and efficiency. Finally, in order to provide a generic way to compare and benchmark system reliability for CPS and to extend the approach described above, this thesis presents FARE, a reliability benchmark framework that employs a CPS reliability model, a set of methods and metrics on evaluation environment selection, failure analysis, and reliability estimation. The main contributions of this thesis include validation of the above hypotheses and empirical studies of ARIS automated online evaluation system, COBRA cloud-based reliability assurance framework for data-intensive CPS, and FARE framework for benchmarking reliability of cyber-physical systems. This work has advanced the state of the art in the CPS reliability research, expanded the body of knowledge in this field, and provided some useful studies for further research.
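A minimal sketch of the data quality analysis idea described in this abstract is given below, assuming a simple rolling z-score check over a stream of sensor readings; the window size, threshold, and alerting mechanism are illustrative assumptions rather than ARIS's actual algorithms.

```python
from collections import deque
import math

def monitor_stream(readings, window=50, z_threshold=4.0):
    """Flag readings that deviate strongly from recent history (rolling z-score).
    In an operator-in-the-loop setting, each flagged index would raise an alert."""
    history = deque(maxlen=window)
    alerts = []
    for i, x in enumerate(readings):
        if len(history) >= 10:                      # need some history before judging quality
            mean = sum(history) / len(history)
            var = sum((v - mean) ** 2 for v in history) / len(history)
            std = math.sqrt(var)
            if std > 0 and abs(x - mean) / std > z_threshold:
                alerts.append(i)                    # abnormal data -> operator alert
        history.append(x)
    return alerts

# Usage: a mostly steady signal with one injected fault.
signal = [10.0 + 0.1 * (i % 5) for i in range(200)]
signal[120] = 50.0
print(monitor_stream(signal))                       # expect [120]
```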
27

Evaluation of Information Resource Management: Measuring Change in a Federal Bureaucracy

Perrin, Randolph D. 02 December 1993 (has links)
This is a case study of a federal bureaucracy and its Information Resource Management (IRM) organization. The Bonneville Power Administration (BPA), a federal power marketing agency and part of the Department of Energy, significantly impacts the western United States. BPA is responsible for developing electric power resources, transmission of electric resources, power conservation programs, and fish and wildlife programs. The focus of the study is the perception of the quality of information supplied to management, assuming that better information makes for better decisions. The study uses a pre-experimental research design to evaluate the satisfaction of executive and middle management with the information they use to make decisions. The study uses both questionnaire and interview methodologies to examine management opinions before the establishment of IRM and two years after the establishment of IRM. Literature on bureaucracy indicates that decision making has limits and processes. Channels of communication, both formal and subformal, are used by decision makers to gather information to fill information gaps. The gaps exist because formal channels of information do not supply sufficient information. Consequently, decision makers constantly search for information. There is a great deal of literature addressing IRM and other similar organizations. The technical and operational sides of information management are occasionally conflicting but adequately addressed. Information assessment and evaluation are approached inadequately.
28

Searching without SQL: Re-engineering a database-centric web application with open-source information retrieval software.

Timothy A. Ross 26 November 2008 (has links)
This paper seeks to describe the process by which a database-centric web application was redesigned and rewritten to take advantage of Apache’s Lucene - an open-source information retrieval software library written in the Java programming language. After the implementation of a Lucene-based text index of “semi-structured data”, a college radio station's card catalog application was able to deliver higher-quality search results in significantly less time than it was able to do using just a relational database alone. Additionally, the dramatic improvements in speed and performance even allowed the search results interface to be redesigned and enhanced with an improved pagination system and new features such as faceted search/filtering.
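To illustrate the kind of retrieval a Lucene-style text index provides compared with SQL pattern matching, here is a deliberately simplified inverted-index sketch in Python; it stands in for the concept only and does not use the actual Lucene API or the card catalog's real schema.

```python
from collections import defaultdict

class TinyIndex:
    """A toy inverted index: term -> set of document ids. Lucene's real index adds
    analysis, scoring, and on-disk segments, but the lookup idea is the same."""
    def __init__(self):
        self.postings = defaultdict(set)
        self.docs = {}

    def add(self, doc_id, text):
        self.docs[doc_id] = text
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        # AND semantics: return documents containing every query term.
        terms = query.lower().split()
        if not terms:
            return []
        result = set.intersection(*(self.postings.get(t, set()) for t in terms))
        return [self.docs[d] for d in sorted(result)]

# Usage with a few made-up catalog entries.
idx = TinyIndex()
idx.add(1, "Miles Davis Kind of Blue")
idx.add(2, "John Coltrane Blue Train")
idx.add(3, "Kind of Like Spitting")
print(idx.search("kind blue"))    # -> ['Miles Davis Kind of Blue']
```

Because lookups hit the postings lists directly instead of scanning table rows, this style of index is what made the faster, faceted search results described above feasible.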
29

Interactive visualization tools for spatial data & metadata

Antle, Alissa N. 11 1900 (has links)
In recent years, the focus of cartographic research has shifted from the cartographic communication paradigm to the scientific visualization paradigm. With this, there has been a resurgence of cognitive research that is invaluable in guiding the design and evaluation of effective cartographic visualization tools. The design of new tools that allow effective visual exploration of spatial data and data quality information in a resource management setting is critical if decision-makers and policy setters are to make accurate and confident decisions that will have a positive long-term impact on the environment. The research presented in this dissertation integrates the results of previous research in spatial cognition, visualization of spatial information and on-line map use in order to explore the design, development and experimental testing of four interactive visualization tools that can be used to simultaneously explore spatial data and data quality. Two are traditional online tools (side-by-side and sequenced maps) and two are newly developed tools (an interactive "merger" bivariate map and a hybrid of the merger map and the hypermap). The key research question is: Are interactive visualization tools, such as interactive bivariate maps and hypermaps, more effective for communicating spatial information than less interactive tools such as sequenced maps? A methodology was developed in which subjects used the visualization tools to explore a forest species composition map and associated data quality information in order to perform a range of map-use tasks. Tasks focused on an imaginary land-use conflict for a small region of mixed boreal forest in Northern Alberta. Subject responses in terms of performance (accuracy and confidence) and preference are recorded and analyzed. Results show that theory-based, well-designed interactive tools facilitate improved performance across all tasks, but there is an optimal matching between specific tasks and tools. The results are generalized into practical guidelines for software developers. The use of confidence as a measure of map-use effectiveness is verified. In this experimental setting, individual differences (in terms of preference, ability, gender, etc.) did not significantly affect performance.
30

A presentation service for rapidly building interactive collaborative web applications

Sweeney, Michael, Engineering & Information Technology, Australian Defence Force Academy, UNSW January 2009 (has links)
Web applications have become a large segment of the software development domain but their rapid rise in popularity has far exceeded the support in software engineering. There are many tools and techniques for web application development, but the developer must still learn and use many complex protocols and languages. Products still closely bind data operations, business logic, and the user interface, limiting integration and interoperability. This thesis introduces an innovative new presentation service to help web application developers create better applications faster, and help them build high quality web user interfaces. This service acts as a broker between web browsers and applications, adding value with programming-language independent object-oriented technology. The second innovation is a generic graphics applet (GGA) component for the web browser user interface. This light component provides interactive graphics support for the creation of business charts, maps, diagrams, and other graphical displays in web applications. The combination of a presentation service program (BUS) and the GGA is explored in a series of experiments to evaluate their ability to improve web applications. The experiments provide evidence that the BUS and GGA technology enhances web application designs.
