101

A Study on Applying of Business Rules Management Systems in Enterprises / Verslo taisyklių valdymo sistemų taikymo įmonėse tyrimas

Avdejenkov, Vladimir 04 May 2009 (has links)
The paper analyses topical issues in the implementation of business rules and business rules management systems in enterprise business management information systems. It is rare nowadays to find a company that is not using an ERP (Enterprise Resource Planning) system for its own needs. These systems are intended to perform centralized accounting of enterprise activities, including financial, logistics, production, service-management and human-resource accounting. / The thesis analyses the application of business rules and business rules management systems (rule engines) in enterprise resource planning (ERP) systems and examines the problems that arise in this application.
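To make the idea concrete, here is a minimal sketch (not the thesis's implementation) of what externalizing a business rule from ERP application code might look like: the rule is plain data plus a condition and an action, so it can be changed without touching the accounting logic. The names and the credit-limit rule are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class BusinessRule:
    name: str
    condition: Callable[[dict], bool]   # evaluated against a business fact
    action: Callable[[dict], None]      # fired only when the condition holds

def run_rules(rules: list[BusinessRule], fact: dict) -> None:
    for rule in rules:
        if rule.condition(fact):
            rule.action(fact)

# Hypothetical rule: flag purchase orders above a credit limit for review.
rules = [
    BusinessRule(
        name="credit-limit-check",
        condition=lambda order: order["total"] > order["credit_limit"],
        action=lambda order: order.update(status="needs_review"),
    )
]

order = {"total": 12_000, "credit_limit": 10_000, "status": "open"}
run_rules(rules, order)
print(order["status"])  # needs_review
```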
102

Leveraging service-oriented business applications to a rigorous rule-centric dynamic behavioural architecture

Alqahtani, Ali January 2010 (has links)
Today's market competitiveness and globalisation are putting pressure on organisations to join their efforts, to focus more on cooperation and interaction, and to add value to their businesses. As a result, most information systems supporting such cross-organisational collaboration are characterised as service-oriented business applications, where the emphasis is placed on inter-service interactions rather than intra-service computations. Unfortunately, for the development of such inter-organisational service-oriented business systems, current service technology offers only ad hoc, manual and static standard web-service languages such as WSDL, BPEL and WS-CDL [3, 7]. The main objective of the work reported in this thesis is thus to leverage the development of service-oriented business applications towards greater reliability and dynamic adaptability, placing emphasis on the use of business rules to govern activities while composing services. The best available software-engineering techniques for adaptability, mainly aspect-oriented mechanisms, are also integrated with advanced formal techniques. More specifically, the proposed approach consists of the following incremental steps. First, it models any business activity behaviour governing a service-oriented business process as Event-Condition-Action (ECA) rules. Second, such informal rules are made more interaction-centric using adapted architectural connectors. Third, still at the conceptual level, with the aim of adapting such ECA-driven connectors, the approach borrows aspect-oriented ideas and mechanisms and proposes to intercept events, select the properties required for the interacting entities, explicitly and separately execute such ECA-driven behavioural interactions, and finally dynamically weave the results into the entities involved. To ensure compliance and to preserve the implementation of this architectural conceptualisation, the work adopts the Maude language as an executable operational formalisation. For that purpose, Maude is first endowed with the notions of components and interfaces. Further, ECA-driven behavioural interactions are specified and implemented as aspects. Finally, capitalising on Maude reflection, the thesis demonstrates how to weave such interaction executions into the associated services.
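As a rough illustration of the ECA-driven connector idea (the thesis itself formalizes this in Maude, not in a general-purpose language), the sketch below routes service events through a connector that intercepts them and executes Event-Condition-Action rules separately from the services. The event names and the rule are hypothetical.

```python
from typing import Callable

class EcaConnector:
    """Registers ECA rules and applies them to intercepted service events."""

    def __init__(self) -> None:
        self.rules: dict[str, list[tuple[Callable, Callable]]] = {}

    def on(self, event: str, condition: Callable[[dict], bool],
           action: Callable[[dict], None]) -> None:
        self.rules.setdefault(event, []).append((condition, action))

    def emit(self, event: str, payload: dict) -> None:
        # Interception point: every service event is routed through the
        # connector, so interaction behaviour stays outside the services.
        for condition, action in self.rules.get(event, []):
            if condition(payload):
                action(payload)

connector = EcaConnector()
# Hypothetical inter-service rule: cancel shipment when an order is refunded.
connector.on("order.refunded",
             condition=lambda p: p["shipped"] is False,
             action=lambda p: print(f"cancelling shipment for {p['id']}"))

connector.emit("order.refunded", {"id": "o-17", "shipped": False})
```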
103

A multilayer framework for quality of context in context-aware systems

Al-Shargabi, Asma Abdulghani Qassem January 2015 (has links)
Context-aware systems use context information to decide what adaptation actions to perform in response to changes in their environment. Depending on the application, context information includes physical context (e.g. temperature and location), user context (e.g. user preferences and user activity), and ICT context (e.g. device capabilities and battery power). Sensors are the main means of capturing context. Unfortunately, sensed context data are commonly prone to imperfection due to the technical limitations of sensors, their availability and dysfunction, and the highly dynamic nature of the environment. Consequently, sensed context data might be imprecise, erroneous, conflicting, or simply missing. To limit the impact of context imperfection on the behavior of a context-aware system, a notion of Quality of Context (QoC) is used to measure the quality of any information that is used as context information. Adaptation is performed only if the context data used in the decision-making have an appropriate quality level. This thesis develops a novel framework for QoC in context-aware systems, called MCFQoC (Multilayered-Context Framework for Quality of Context). The main innovative features of our framework, MCFQoC, include: (1) a new definition that generalizes the notion of QoC to encompass sensed context as well as user-profiled context; (2) a novel multilayer context model that distinguishes between three context abstractions, in descending order: context situation, context object, and context element. A context element represents a single value, many context elements can be combined into a context object, and many context objects in turn form a context situation; (3) a novel model of QoC parameters which extends the existing parameters with a new quality parameter and explicitly distributes the quality parameters across the three layers of context abstraction; (4) a novel algorithm, RCCAR (Resolving Context Conflicts Using Association Rules), developed to resolve conflicts in context data using the Association Rules (AR) technique; (5) a novel mechanism to define a QoC policy by assigning weights to QoC parameters using a multi-criteria decision-making technique called the Analytical Hierarchy Process (AHP); (6) and finally, a novel quality-control algorithm called IPQP (Integrating Prediction with Quality of context Parameters for Context Quality Control) for handling context conflicts, missing context values, and erroneous context values; IPQP is an extension of RCCAR. Our framework, MCFQoC, has been implemented in MATLAB and evaluated using a case study of a flood forecast system. Results show that the framework is expressive and modular, thanks to the multilayer context model and to the notion of a QoC policy, which enables us to assign weights to QoC parameters depending on the quality requirements of each specific application. This flexibility makes it easy to apply our approach to a wider range of context-aware applications. As part of the MCFQoC framework, the IPQP algorithm has been successfully tested and evaluated for QoC control using a variety of scenarios. The RCCAR algorithm has been tested and evaluated both individually and as part of the MCFQoC framework, with strong performance in resolving context conflicts. In addition, RCCAR has achieved good results compared with traditional prediction methods such as the moving average (MA), weighted moving average, exponential smoothing, double exponential smoothing, and autoregressive moving average (ARMA).
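A minimal sketch of one ingredient, the QoC policy: quality parameters scored per context element and aggregated with application-specific weights (which in MCFQoC would come from AHP). The parameter names, weights, and acceptance threshold are assumptions of this sketch, not the thesis's actual values.

```python
# Hypothetical QoC policy: per-parameter scores in [0, 1], weighted by
# application priorities (in MCFQoC the weights would be derived via AHP).
def qoc_score(params: dict[str, float], weights: dict[str, float]) -> float:
    total = sum(weights.values())
    return sum(params[p] * w for p, w in weights.items()) / total

reading = {"accuracy": 0.9, "freshness": 0.6, "completeness": 1.0}
policy  = {"accuracy": 0.5, "freshness": 0.3, "completeness": 0.2}

score = qoc_score(reading, policy)
if score >= 0.7:          # application-specific quality threshold
    print(f"context accepted (QoC = {score:.2f})")
else:
    print(f"context rejected (QoC = {score:.2f})")
```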
104

The tax policy-making process in practice : a field study in Chile

Ormeño Pérez, Rodrigo Andres January 2014 (has links)
The purpose of this research project is to examine the technical, political, social, organisational and cultural 'practices' of tax policy making in order to gain an in-depth understanding of certain tax rules in the Chilean context. Consistent with a qualitative interpretivist approach, this study is informed by documents and three phases of face-to-face interviews with a range of actors engaged in the process of (re)making tax regulation. Through the views of a wide spectrum of participants, including policy makers (broadly defined), tax administrators, academics, tax practitioners and taxpayers, theoretical concepts were inductively developed. These concepts were combined with the related tax policy literature and Bourdieusian concepts to construct a theoretical/conceptual framework, which was later applied in interpreting the findings. The findings reveal how an élite group of agents forms a social space connected with the field of power. In this space, these agents define tax policy, draft legislation and budget for economic effects. This thesis illustrates how these agents mobilise different forms of capital from their respective fields in order to reach and access this social space. Transfer pricing processes highlight the fluidity of these spaces, allowing the access and influence of external forces. The research also shows that other stages are more distant from the field of power. The findings suggest the importance of tax knowledge and information in the development of tax regulation. Tax knowledge and information become a capital at stake that agents struggle to acquire. Empirical data show that the amount of tax knowledge and information in the space relating to the field of power is connected with the content and robustness of the transfer pricing rules under analysis. This research also suggests a high concentration of transfer pricing tax knowledge in very few agents across the bureaucratic, professional-services and corporate/business fields. The research also shows the influence of social capital in the tax policy-making field: bureaucrats and politicians consult those connected with them whom they trust. In the particular case of transfer pricing, the findings illustrate the importance of social capital in defining the content of tax rules. Finally, the study shows how domination and two forms of violence are present and exercised across the tax policy-making field. This is one of only a few studies that have examined the practice of tax policy making holistically, from the very early stages to the application of the rules in practice, contributing in this respect to the tax policy strand of the literature. In contrast to previous descriptive and partial studies, this study captures the views of the actors responsible for making tax rules. It also contributes to theory development by translating Bourdieusian tools to the analysis of tax policy making.
105

Semantic Validation of T&E XML Data

Moskal, Jakub, Kokar, Mieczyslaw, Morgan, John October 2015 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / It is anticipated that XML will heavily dominate the next generation of telemetry systems. The syntax of XML-based languages can be constrained by a schema that describes the structure of valid documents. However, schemas cannot express all dependencies between XML elements and attributes, whether within a single document or across multiple documents, which prevents the XML validation process from being fully automated with standard schema processors. This paper presents an approach, based on W3C Semantic Web technologies, that allows different vendors and system integrators to independently develop their own semantic validation rules. The rules are equipped with powerful semantics, which allows complex types of constraints to be specified and validated. The approach is not specific to a particular T&E standard and is entirely standards-based.
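A small sketch of the general idea, assuming the T&E XML has first been lifted into RDF: a SPARQL ASK rule checks a cross-element dependency that a schema alone cannot express. The telemetry vocabulary and the bandwidth constraint are hypothetical, and this is not the paper's actual rule language.

```python
from rdflib import Graph

# Hypothetical constraint: a measurement's sample rate must not exceed the
# bandwidth declared by the channel it is assigned to -- a cross-element
# dependency that an XML schema alone cannot express.
data = """
@prefix tm: <http://example.org/telemetry#> .
tm:ch1  tm:bandwidth  1000 .
tm:m1   tm:onChannel  tm:ch1 ;
        tm:sampleRate 2000 .
"""

g = Graph()
g.parse(data=data, format="turtle")

violation = g.query("""
PREFIX tm: <http://example.org/telemetry#>
ASK {
    ?m  tm:onChannel ?ch ;
        tm:sampleRate ?rate .
    ?ch tm:bandwidth ?bw .
    FILTER (?rate > ?bw)
}""")

print("constraint violated" if violation.askAnswer else "valid")
```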
106

Prudent ranking rules: theoretical contributions and applications

Lamboray, Claude 03 October 2007 (has links)
Arrow and Raynaud introduced a set of axioms that a ranking rule should verify. Among these, axiom V' states that the compromise ranking should be a so-called prudent order. Intuitively, a prudent order is a linear order such that the strongest opposition against this solution is minimal. Since the related literature lacks solid theoretical foundations for this type of aggregation rule, our main objective in this thesis was to thoroughly study and gain a better understanding of the family of prudent ranking rules. We provide characterizations of several prudent ranking rules in a conjoint axiomatic framework. We also prove that profiles can be constructed for which the results of a prudent ranking rule and a non-prudent ranking rule are contradictory. Finally, we illustrate the use of prudent ranking rules in a group decision context and on the composite indicator problem.
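A brute-force sketch of the prudent-order idea on a toy profile: among all linear orders, keep those whose strongest opposition (the largest number of voters preferring some y to an x ranked above it) is minimal. The support matrix is invented, and enumerating permutations is only feasible for a handful of alternatives; the sketch illustrates the definition, not the thesis's rules.

```python
from itertools import permutations

# Pairwise support S[a][b]: number of voters preferring a to b (toy profile).
S = {
    "a": {"b": 6, "c": 4, "d": 7},
    "b": {"a": 3, "c": 5, "d": 6},
    "c": {"a": 5, "b": 4, "d": 8},
    "d": {"a": 2, "b": 3, "c": 1},
}

def strongest_opposition(order: tuple[str, ...]) -> int:
    # Ranking x above y is opposed by the S[y][x] voters who prefer y to x.
    return max(S[y][x]
               for i, x in enumerate(order)
               for y in order[i + 1:])

orders = list(permutations(S))
best = min(strongest_opposition(o) for o in orders)
prudent = [o for o in orders if strongest_opposition(o) == best]
print(best, prudent)
```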
107

匯率預測模型之分析與比較 / Analysis of exchange rates forecasting models

謝耀慶, Hsieh, Yao Ching Unknown Date (has links)
In this research we review the relevant literature on the predictability of foreign exchange rates. We also trace the development of fundamental models, market models, and technical analysis and trading rules, and we compare and evaluate the precision of these models. Moreover, we present a case study of a leading global investment bank to discuss how these models are used in practice. The results show that fundamental models help to establish the long-term equilibrium but have some shortcomings; market models can be adopted to address these shortcomings, while technical analysis and trading rules can set the exact price levels for trading purposes.
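As an example of the kind of trading rule discussed, here is a sketch of a moving-average crossover rule; the window lengths and exchange-rate series are illustrative, not from the case study.

```python
# Signal when the short moving average crosses the long one.
def moving_average(series: list[float], window: int) -> list[float]:
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

prices = [1.10, 1.12, 1.11, 1.09, 1.08, 1.10, 1.13, 1.16, 1.15, 1.17]
short, long_ = 3, 5
ma_s = moving_average(prices, short)
ma_l = moving_average(prices, long_)

offset = long_ - short          # align the two series on the same dates
for t in range(1, len(ma_l)):
    prev = ma_s[t - 1 + offset] - ma_l[t - 1]
    curr = ma_s[t + offset] - ma_l[t]
    if prev <= 0 < curr:
        print(f"day {t + long_ - 1}: short MA crossed above long MA -> buy")
    elif prev >= 0 > curr:
        print(f"day {t + long_ - 1}: short MA crossed below long MA -> sell")
```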
108

Design and Analysis of Decision Rules via Dynamic Programming

Amin, Talha M. 24 April 2017 (has links)
The areas of machine learning, data mining, and knowledge representation use many different formats to represent information. Among these formats, decision rules are the most expressive and the most easily understood by humans. In this thesis, we use dynamic programming to design and analyze decision rules. The use of dynamic programming allows us to work with decision rules in ways that were previously possible only for brute-force methods. Our algorithms allow us to describe the set of all rules for a given decision table. Further, we can perform multi-stage optimization by repeatedly reducing this set to contain only rules that are optimal with respect to selected criteria. One application of this study is to generate small systems with short rules by simulating a greedy algorithm for the set cover problem. We also compare maximum path lengths (depth) of deterministic and non-deterministic decision trees (a non-deterministic decision tree is effectively a complete system of decision rules) with regard to Boolean functions. Another area of advancement is the presentation of algorithms for constructing Pareto optimal points for rules and rule systems. This allows us to study the existence of “totally optimal” decision rules (rules that are simultaneously optimal with regard to multiple criteria). We also utilize Pareto optimal points to compare and rate greedy heuristics with regard to two criteria at once. Another application of Pareto optimal points is the study of trade-offs between cost and uncertainty, which allows us to find reasonable systems of decision rules that strike a balance between length and accuracy.
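A sketch of the greedy set-cover simulation mentioned above, under the assumption that each candidate rule condition is represented simply by the set of table rows it covers; the toy universe and conditions are invented.

```python
def greedy_cover(universe: set[int], candidates: dict[str, set[int]]) -> list[str]:
    """Repeatedly pick the condition covering the most uncovered rows."""
    uncovered, chosen = set(universe), []
    while uncovered:
        best = max(candidates, key=lambda c: len(candidates[c] & uncovered))
        if not candidates[best] & uncovered:
            break  # remaining rows cannot be covered by any condition
        chosen.append(best)
        uncovered -= candidates[best]
    return chosen

rows = {1, 2, 3, 4, 5, 6}
conds = {"age<30": {1, 2, 3}, "income=high": {3, 4}, "owns_car": {4, 5, 6}}
print(greedy_cover(rows, conds))  # ['age<30', 'owns_car']
```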
109

A comparison of the Effects of Different Sizes of Ceiling Rules on the Estimates of Reliability of a Mathematics Achievement Test

Somboon Suriyawongse 05 1900 (has links)
This study compared the estimates of reliability made using one, two, three, four, five, and unlimited consecutive failures as ceiling rules in scoring a mathematics achievement test which is part of the Iowa Tests of Basic Skills (ITBS), Form 8. Seven hundred students were randomly selected from a population (N = 2640) of students enrolled in the eighth grade in a large urban school district in the southwestern United States. These 700 students were randomly divided into seven subgroups of 100 students each. The responses of all these students to three subtests of the mathematics achievement battery, which included mathematical concepts (44 items), problem solving (32 items), and computation (45 items), were analyzed to obtain the item difficulties and a total score for each student. The items in each subtest were then rearranged by item difficulty from the highest to the lowest value. In each subgroup, the methods using one, two, three, four, five, and unlimited consecutive failures as the ceiling rule were applied to score the individual responses. The total score for each individual was the sum of the correct responses prior to the point defined by the ceiling rule; correct responses after the ceiling point were not counted in the total score. The estimate of reliability for each method was computed with the alpha coefficient procedure of SPSS-X. The results of this study indicated that the estimates of reliability using two, three, four, and five consecutive failures as the ceiling rule were an improvement over the methods using one and unlimited consecutive failures.
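A sketch of the scoring procedure as described, assuming 0/1 item scores ordered from easiest to hardest: a ceiling rule of k consecutive failures zeroes out everything after the k-th failure in a row, and reliability is then estimated with Cronbach's alpha. The toy responses are invented.

```python
def apply_ceiling(responses: list[int], k: int) -> list[int]:
    """Zero out all responses after k consecutive failures.
    `responses` are 0/1 scores on items ordered from easiest to hardest."""
    run = 0
    for i, r in enumerate(responses):
        run = run + 1 if r == 0 else 0
        if run == k:
            return responses[:i + 1] + [0] * (len(responses) - i - 1)
    return responses

def cronbach_alpha(scores: list[list[int]]) -> float:
    """scores[person][item]; classical alpha from item and total variances."""
    n_items = len(scores[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([p[i] for p in scores]) for i in range(n_items)]
    total_var = var([sum(p) for p in scores])
    return n_items / (n_items - 1) * (1 - sum(item_vars) / total_var)

# Toy data: three examinees, six items, ceiling rule of k=2 failures.
raw = [[1, 1, 0, 0, 1, 0], [1, 1, 1, 0, 1, 1], [1, 0, 0, 1, 1, 0]]
scored = [apply_ceiling(p, k=2) for p in raw]
print(scored)
print(round(cronbach_alpha(scored), 3))
```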
110

New Probabilistic Interest Measures for Association Rules

Hahsler, Michael, Hornik, Kurt January 2006 (has links) (PDF)
Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start by presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a real-world database from a grocery outlet to explore the behavior of confidence and lift, two popular interest measures used for rule mining. The results show that confidence is systematically influenced by the frequency of the items in the left-hand side of rules and that lift performs poorly at filtering random noise in transaction data. Based on the probabilistic framework, we develop two new interest measures, hyper-lift and hyper-confidence, which can be used to filter or order mined association rules. The new measures show significantly better performance than lift for applications where spurious rules are problematic. / Series: Research Report Series / Department of Statistics and Mathematics
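A sketch of the measures discussed, on invented counts: confidence and lift are standard, and hyper-lift is computed here as the observed joint count divided by a high quantile of the hypergeometric null model, which is our reading of the paper's definition.

```python
from scipy.stats import hypergeom

# Toy counts from a transaction database (all values invented).
m    = 10_000   # transactions
c_x  = 400      # transactions containing X
c_y  = 600      # transactions containing Y
c_xy = 40       # transactions containing both X and Y

confidence = c_xy / c_x
lift = confidence / (c_y / m)          # = P(Y|X) / P(Y)

# Under independence, the joint count is hypergeometric: a population of m
# transactions with c_y "successes", from which c_x are drawn. Hyper-lift
# compares the observed count with a high quantile of that null model.
delta = 0.99
q = hypergeom(M=m, n=c_y, N=c_x).ppf(delta)
hyper_lift = c_xy / q

print(f"confidence = {confidence:.3f}, lift = {lift:.2f}, "
      f"hyper-lift(0.99) = {hyper_lift:.2f}")
```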
