1

The viable system model (VSM) and organisation theory : a complementary approach to the case of a pharmaceutical manufacturer

Ja'bari, Nasser Wahid January 1995 (has links)
The primary purpose of this research is to explore the relationships between Beer's viable system model (VSM) and mainstream functionalist organisation theory. The latter is taken to include the classical, behavioural and systems models of organisation. For completeness, we also consider organisation theory situated in the interpretive, radical humanist and radical structuralist paradigms of Burrell and Morgan's (1979) sociological grid. Models of mainstream organisation theory have been used extensively by organisation theorists in the structuring of organisations and the design of information systems. Little attention, however, has been paid by organisation theorists to Beer's VSM, which is also used by cyberneticians to structure organisations and design information systems. The problem is that the two camps have developed in isolation from one another: theorists in each camp advocate their own stance regardless of what the other might have to offer. This gap is the result of a lack of dialogue between the two camps. The aim of this thesis is to bridge that gap. It is the author's firm belief that this is best done by adopting a complementary approach that pinpoints the domains of support each camp may offer the other. The outcome of this approach is an enhanced model of organisation. Part One of the research begins by introducing the science of cybernetics; its history, tools, techniques and concepts are then put in place. Building on cybernetic tools and techniques, Beer developed a model of any viable system, and his VSM is presented in Chapter 2. Part Two of the thesis is devoted entirely to organisation theory. First, we take up the models of mainstream functionalist organisation theory. The approach adopted is first to elaborate on each model and then to contrast it with the VSM. Attention is then directed to organisation theory located in the alternative paradigms, that is, the interpretive, radical humanist and radical structuralist paradigms, respectively; again, theory of organisation within these paradigms is contrasted with the VSM. Part Two closes with an enhanced model of organisation, the outcome of the comparison between functionalist organisation theory and the VSM. The argument is that the likelihood of the classical model providing support to the VSM is slim; in fact, the former stands to gain much from the VSM, particularly from the notion of recursive structures, which explains how control and communication systems must be designed and organised. The behavioural model, which takes the informal aspects of organisation as its core, appears to be a useful adjunct to the VSM, which concentrates primarily on the formal organisation. Again, the behavioural model stands to gain much from the insights offered by the VSM; at the least, the view of openness to the environment would give the behavioural model a boost in the right direction. However, we focus our interest on the systems model of organisation, specifically the notion of semiautonomous work groups encapsulated in the sociotechnical systems approach. By incorporating this notion into the VSM we can, it is hoped, enhance the VSM. Once again, the insights of the VSM, especially the recursivity of its structure, are of immense significance. In Part Three, the enhanced model is put to the test by applying it to an existing pharmaceutical manufacturer. The model proves to be not only practical but also powerful in highlighting domains requiring attention if the effectiveness and efficiency of the organisation concerned are to improve, domains which the VSM on its own cannot reveal.
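The recursion principle this abstract emphasises can be made concrete with a small sketch: each viable system embeds Beer's management subsystems, and its System 1 operational units are themselves viable systems. This is an illustrative data structure only, not taken from the thesis; all names are hypothetical.

```python
# Illustrative sketch (not from the thesis): a viable system whose
# System 1 operational units are themselves viable systems, as in Beer's VSM.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ViableSystem:
    name: str
    # Systems 2-5: coordination, control, intelligence, policy.
    management: List[str] = field(default_factory=lambda: [
        "S2 coordination", "S3 control", "S4 intelligence", "S5 policy"])
    # System 1: operational units, each a viable system in its own right.
    operations: List["ViableSystem"] = field(default_factory=list)

    def levels(self) -> int:
        """Number of levels of recursion in this system."""
        return 1 + max((u.levels() for u in self.operations), default=0)

plant = ViableSystem("pharmaceutical manufacturer", operations=[
    ViableSystem("tablet production", operations=[
        ViableSystem("granulation line"), ViableSystem("packaging line")]),
    ViableSystem("quality control"),
])
print(plant.levels())  # 3: the same subsystem pattern repeats at each level
```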
2

Modelling information usage and decision processes in new product introductions: An information processing perspective

Abraham, Thomas 01 January 1990 (has links)
The objective of this study is to understand the problem-solving process used in new product introductions and other unstructured business problems, in the hope that this understanding will contribute to improved decision support systems. Based on cognitive psychology theories (in particular Anderson, 1983, 1987), a set of propositions was outlined and investigated using a computer model. The expert system shell used here is applied to model the expert's knowledge: the shell is used to develop a system that simulates the expert's approach to problem solving. The differences between this application and conventional expert system development are: (i) the focus is on trying to understand the mind of the expert rather than replacing the expert; and (ii) the problem area is ill-structured rather than narrow and well-defined. The introduction of new products into markets is an example of an ill-structured problem in a business setting; in particular, identifying opportunities to create new products is critical, since firms' future growth and competitiveness often depend on it. The method adopted, computer simulation, has both advantages and limitations. The advantages include: (i) in-depth analysis of the problem-solving process; (ii) operationalizing the theory; and (iii) producing a program that can act as a research vehicle for future projects. The limitations are: (i) small sample size; (ii) lack of clear-cut validation procedures; and (iii) dependence on shell features. The findings, for the most part, supported the propositions: (i) the expert model clearly had more procedural knowledge than the textbook model, supporting the proceduralization theory of skill acquisition; (ii) reasoning by analogy was used by both the expert and novices, and this use of weak methods by the expert does not support the theory; (iii) the expert adopted a forward reasoning strategy within a task agenda, supporting Anderson's hierarchical goal structure theory; and (iv) the use of soft information was also observed.
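A minimal sketch of the forward-reasoning strategy attributed to the expert: rules fire from known facts toward conclusions, rather than chaining backward from a goal. The rules and facts below are invented for illustration and are not the study's model.

```python
# Minimal forward-chaining sketch (illustrative rules, not the study's model):
# rules fire from known facts toward conclusions, as in forward reasoning.
rules = [
    ({"market gap", "in-house technology"}, "product opportunity"),
    ({"product opportunity", "budget approved"}, "develop prototype"),
    ({"develop prototype", "positive test market"}, "launch product"),
]

facts = {"market gap", "in-house technology", "budget approved",
         "positive test market"}

changed = True
while changed:                        # keep firing rules until quiescence
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)     # rule fires: add its conclusion
            changed = True

print("launch product" in facts)      # True
```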
3

Environment centered analysis and design of coordination mechanisms

Decker, Keith S 01 January 1995 (has links)
Coordination, the act of managing interdependencies between activities, is one of the central research issues in Distributed Artificial Intelligence. Our thesis is that the design of coordination mechanisms cannot rely on the principled construction of agents alone, but must also rely on the structure and other characteristics of the agents' task environment. For example, the presence of both uncertainty and high variance in a task structure can lead to better performance from coordination algorithms that adapt to each problem-solving episode. Furthermore, the structure and characteristics of an environment can and should be used as the central guide to the design of coordination mechanisms, and thus must be part of our eventual goal, a comprehensive theory of coordination, partially developed here. Our approach is first to develop a framework, TAEMS, to directly represent the salient features of a computational task environment. TAEMS is unique in that it quantitatively represents complex task interrelationships and divides a task environment model into generative, objective, and subjective levels. We then extend a standard methodology to use the framework and apply it to the first published analysis, explanation, and prediction of agent performance in a distributed sensor network problem. We predict the effect of adding more agents, of changing the relative cost of communication and computation, and of changing how the agents are organized. Finally, we show how coordination mechanisms can be designed to respond to particular features of the task environment structure by developing the Generalized Partial Global Planning (GPGP) family of algorithms. GPGP is a cooperative (team-oriented) coordination component that is unique in being built of modular mechanisms that work in conjunction with, but do not replace, a fully functional agent with a local scheduler. GPGP differs from previous approaches in that it is not tied to a single domain, allows agent heterogeneity, exchanges less global information, communicates at multiple levels of abstraction, and allows the use of a separate local scheduling component. We demonstrate that GPGP can be adapted to different domains and characterize its performance through simulation, in conjunction with a heuristic real-time local scheduler and randomly generated abstract task environments.
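The flavour of TAEMS's quantitative task interrelationships can be suggested with a toy sketch, assuming a much-simplified model in which one method "enables" another; the method names and quality values are invented, and TAEMS itself is far richer.

```python
# Illustrative sketch of quantitative task interrelationships in the spirit
# of TAEMS (heavily simplified; names and numbers are invented). An
# "enables" link makes one method's quality a precondition for another's.
from typing import Optional

quality = {"track_vehicle": 0.0, "fuse_data": 0.0}

def execute(method: str, base_quality: float,
            enabled_by: Optional[str] = None) -> float:
    """Run a method; it accrues quality only if its enabler already has some."""
    if enabled_by is not None and quality[enabled_by] <= 0.0:
        return 0.0                    # precondition not met: no quality accrues
    quality[method] = base_quality
    return base_quality

# fuse_data is enabled by track_vehicle, so execution order matters.
print(execute("fuse_data", 0.8, enabled_by="track_vehicle"))  # 0.0
print(execute("track_vehicle", 0.6))                          # 0.6
print(execute("fuse_data", 0.8, enabled_by="track_vehicle"))  # 0.8
```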
4

Distributed algorithms for optimized resource management of LTE in unlicensed spectrum and UAV-enabled wireless networks

Challita, Ursula January 2018 (has links)
Next-generation wireless cellular networks are morphing into a massive Internet of Things (IoT) environment that integrates a heterogeneous mix of wireless-enabled devices such as unmanned aerial vehicles (UAVs) and connected vehicles. This unprecedented transformation will not only drive an exponential growth in wireless traffic, but will also lead to the emergence of new wireless service applications that differ substantially from conventional multimedia services. To realize the fifth generation (5G) mobile network vision, a new wireless radio technology paradigm shift is required to meet the quality of service requirements of these emerging use cases. In this respect, one of the major components of 5G is self-organized networks: future cellular networks will have to rely on autonomous, self-organizing behavior to manage the large scale of wireless-enabled devices. Such autonomous capability can be realized by integrating fundamental notions of artificial intelligence (AI) across various network devices. The main objective of this thesis is therefore to propose novel self-organizing and AI-inspired algorithms for optimizing the available radio resources in next-generation wireless cellular networks. First, heterogeneous networks that encompass licensed and unlicensed spectrum are studied. In this context, a deep reinforcement learning (RL) framework based on long short-term memory cells is introduced. The proposed scheme aims at proactively allocating licensed assisted access LTE (LTE-LAA) radio resources over the unlicensed spectrum while ensuring efficient coexistence with WiFi. The proposed deep learning algorithm is shown to reach a mixed-strategy Nash equilibrium when it converges. Simulation results using real data traces show that the proposed scheme can yield up to 28% and 11% gains over a conventional reactive approach and a proportional fair coexistence mechanism, respectively. In terms of priority fairness, results show that efficient utilization of the unlicensed spectrum is guaranteed when both technologies, LTE-LAA and WiFi, are given equal weighted priority for transmission on the unlicensed spectrum. Furthermore, an optimization formulation for LTE-LAA holistic traffic balancing across the licensed and unlicensed bands is proposed, and a closed-form solution is derived. An attractive aspect of the derived solution is that it can be applied online by each LTE-LAA small base station (SBS), which adapts its transmission behavior in each of the bands without explicit communication with WiFi nodes. Simulation results show that the proposed traffic balancing scheme provides a better tradeoff between maximizing total network throughput and achieving fairness among all network flows than alternative approaches from the literature. Second, UAV-enabled wireless networks are investigated; in particular, the problems of interference management for cellular-connected UAVs and of using UAVs to provide backhaul connectivity to SBSs are studied. Specifically, a deep RL framework based on echo state network cells is proposed for optimizing the trajectories of multiple cellular-connected UAVs while minimizing the interference they cause on the ground network. The proposed algorithm is shown to reach a subgame perfect Nash equilibrium upon convergence. Moreover, upper and lower bounds on the altitude of the UAVs are derived, reducing the computational complexity of the proposed algorithm. Simulation results show that the proposed path planning scheme allows each UAV to achieve a tradeoff along its path between energy efficiency, wireless latency, and the interference level caused on the ground network. Moreover, in the context of UAV-enabled wireless networks, a UAV-based on-demand aerial backhaul network is proposed. For this framework, a network formation algorithm, which is guaranteed to reach a pairwise stable network upon convergence, is presented. Simulation results show that the proposed scheme achieves substantial performance gains in rate and delay, reaching up to 3.8-fold and 4-fold increases, respectively, compared to forming direct communication links with the gateway node. Overall, the results show that the proposed schemes yield significant improvements in total network performance compared to the existing literature. The proposed algorithms can also provide self-organizing solutions for several resource management problems arising in new 5G use cases, such as connected autonomous vehicles and virtual reality headsets.
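The kind of decision faced by the LTE-LAA agent can be illustrated with a toy: a node learning to pick the least WiFi-occupied unlicensed channel. This sketch uses plain tabular Q-learning with invented occupancy probabilities, not the thesis's LSTM-based deep RL framework.

```python
# Toy illustration of the decision an LTE-LAA node faces: pick an unlicensed
# channel while avoiding busy WiFi channels. Tabular Q-learning with invented
# occupancy probabilities, not the LSTM-based framework of the thesis.
import random

random.seed(0)
wifi_busy_prob = [0.8, 0.3, 0.5]        # assumed WiFi load per channel
Q = [0.0, 0.0, 0.0]                     # value estimate per channel
alpha, epsilon = 0.1, 0.1

for step in range(5000):
    if random.random() < epsilon:       # explore occasionally
        a = random.randrange(3)
    else:                               # otherwise pick best-known channel
        a = max(range(3), key=lambda c: Q[c])
    # reward 1 if the channel is free (successful LAA transmission)
    reward = 0.0 if random.random() < wifi_busy_prob[a] else 1.0
    Q[a] += alpha * (reward - Q[a])     # incremental value update

print(max(range(3), key=lambda c: Q[c]))  # settles on channel 1, the least busy
```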
5

A FORCEnet framework for analysis of existing naval C4I architectures /

Roche, Patrick G. January 2003 (has links) (PDF)
Thesis (M.S. in Systems Technology)--Naval Postgraduate School, June 2003. / Thesis advisor(s): William G. Kemple, John S. Osmundson. Includes bibliographical references (p. 103-105). Also available online.
6

A FORCEnet framework for analysis of existing naval C4I architectures

Roche, Patrick G. January 1900 (has links) (PDF)
Thesis (M.S.)--Naval Postgraduate School, 2003. / Title from title screen (viewed Oct. 31, 2005). "June 2003." Electronic book. Includes bibliographical references (p. 103-105). Also issued in paper format.
7

Utilisation of embedded information devices to support a sustainable approach to product life-cycle management

Kamal, Khurram January 2008 (has links)
The huge landfills of solid waste generated by the massive utilisation of products from domestic sources are badly affecting the environment. About 70% of solid municipal waste, two thirds of which comprises household waste, is dumped as landfill worldwide. For efficient product lifecycle management via upgrade, maintenance, reuse, refurbishment, and reclamation of components, storage of product-related information throughout the product's lifecycle is indispensable. Efficient use of information technology integrated with product design can enable products to manage themselves in a semiautomatic and intelligent manner: products themselves should carry information about what to do with them when they are of no further use. More advanced products may locate themselves and communicate with their recyclers through the internet or some other communication technology. In this regard, different types of technologies have been investigated, broadly classified as passive and active embedded information devices. Methods of automatic identification combined with information technology can act as passive Embedded Information Devices (EIDs), making products intelligent enough to manage associated information throughout their life cycles. Barcodes, Radio Frequency Identification (RFID) tags, and a newer technology called i-button technology were investigated as candidate passive EIDs; i-button technology is presented from the perspective of product lifecycle management for the first time in the literature. Experiments demonstrated that RFID and i-button technologies have the potential to store not only static data but also, to some extent, dynamic data such as small maintenance logs. As passive EIDs are unable to store sensory data and detailed maintenance logs for a product, an advanced active EID demonstrator for lifecycle management of products with high functional complexity is also presented, in addition to the passive EID demonstrators. Initially, the idea is presented as a smart EID system that records the sensory data of a refrigerator compressor and stores detailed maintenance logs in the product itself. The idea is then extended into an intelligent EID, implemented on a gearbox, to predict the gearbox lifetime under an accelerated life test. This involves the development of a novel on-chip life prediction algorithm that combines artificial neural networks with an appropriate reliability distribution. Results of the accelerated life testing, of the simulation used to choose an appropriate reliability distribution, and of the life prediction algorithm are presented. Bi-directional communication software, developed to retrieve lifecycle data from the intelligent EID and to keep it updated, is also explained. Overall, embedded information devices can be proposed as a good solution to support a sustainable approach to lifecycle management.
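The combination of a learned degradation model with a reliability distribution can be sketched as follows, assuming a Weibull distribution and a hard-coded linear stand-in for the trained neural network; all parameters are invented, and this is not the thesis's actual on-chip algorithm.

```python
# Illustrative sketch of coupling a learned degradation model with a Weibull
# reliability distribution, in the spirit of the life prediction described
# above. The "model" is a hard-coded linear stand-in for the trained neural
# network, and all parameters are invented.
import math

def degradation_index(vibration_rms: float) -> float:
    """Stand-in for the ANN: map a sensed feature to a wear index in [0, 1]."""
    return max(0.0, min(1.0, 0.9 * vibration_rms - 0.2))

def weibull_reliability(t_hours: float, shape: float, scale: float) -> float:
    """R(t) = exp(-(t/eta)^beta): probability of surviving past time t."""
    return math.exp(-((t_hours / scale) ** shape))

wear = degradation_index(vibration_rms=0.7)   # feature sensed on the gearbox
scale = 2000.0 * (1.0 - wear)                 # wear shortens characteristic life
print(round(weibull_reliability(500.0, shape=2.0, scale=scale), 3))  # ~0.825
```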
8

Využití prostředků umělé inteligence při řízení rizik / The Use of Artificial Intelligence in Risk Management

Zitterbart, Erik January 2010 (has links)
This diploma thesis deals with the use of artificial intelligence in risk management, in the context of Princ parket, a small manufacturing company. The thesis introduces the company and presents a risk analysis, which leads to the decision to focus on the risk of damage to the company's reputation caused by the production of defective products. Its outcome is the developed RETUNN tools, which use neural network methods to predict this risk and to support the subsequent implementation of measures to reduce it.
9

A framework for managing global risk factors affecting construction cost performance

Baloi, Daniel January 2002 (has links)
Poor cost performance of construction projects has been a major concern for both contractors and clients. The effective management of risk is thus critical to the success of any construction project, and the importance of risk management has grown as projects have become more complex and competition has increased. Contractors have traditionally used financial mark-ups to cover the risk associated with construction projects, but as competition increases and margins become tighter they can no longer rely on this strategy and must improve their ability to manage risk. Furthermore, the construction industry has witnessed significant changes, particularly in procurement methods, with clients allocating greater risks to contractors. Evidence shows that there is a gap between existing risk management techniques and tools, mainly built on normative statistical decision theory, and their practical application by construction contractors. The main reason for this lack of use is that risk decision making within construction organisations is based heavily on experience, intuition and judgement rather than on mathematical models. This thesis presents a model for managing global risk factors affecting the cost performance of construction projects. The model has been developed using a behavioural decision approach, fuzzy logic, and artificial intelligence technology. The methodology adopted involved a thorough literature survey on risk management; informal and formal discussions with construction practitioners to assess the extent of the problem; a questionnaire survey to evaluate the importance of global risk factors; and, finally, repertory grid interviews aimed at eliciting relevant knowledge. There are several approaches to categorising the risks permeating construction projects. This research groups risks into three main categories: organisation-specific, global, and Acts of God. It focuses on global risk factors because they are ill-defined, less well understood by contractors, and difficult to model, assess and manage, although they have a huge impact on cost performance. Generally, contractors, especially in developing countries, have insufficient experience and knowledge to manage them effectively. The research identified the following groups of global risk factors as having significant impact on cost performance: estimator-related, project-related, fraudulent-practice-related, competition-related, construction-related, economy-related and politics-related factors. The model was tested for validity through a panel of expert validators and cross-sectional case studies, and the general conclusion was that it could provide valuable assistance in the management of global risk factors, since it is effective, efficient, flexible and user-friendly. The findings stress the need to depart from traditional approaches and to explore new directions in order to equip contractors with effective risk management tools.
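A minimal sketch of fuzzy risk assessment in the spirit of the model described above, assuming triangular memberships and Sugeno-style weighted-average defuzzification; the factor, rules and numbers are invented.

```python
# Minimal fuzzy risk-assessment sketch (illustrative only: the factor,
# memberships, rules and numbers are invented, not the thesis's model).
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b on the interval [a, c]."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def cost_impact(economic_instability: float) -> float:
    """Map a 0-10 rating of a global risk factor to a 0-10 cost impact."""
    low = tri(economic_instability, 0.0, 0.0, 10.0)    # membership in "low"
    high = tri(economic_instability, 0.0, 10.0, 10.0)  # membership in "high"
    # Rule base: low instability -> low impact (2); high -> high impact (8).
    # Sugeno-style weighted-average defuzzification of the two rules.
    return (low * 2.0 + high * 8.0) / (low + high)

print(cost_impact(7.5))  # 6.5: a moderately high predicted cost impact
```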
10

Chatbot v podnikovém informačním systému / Chatbot in an Enterprise Information System

Novák, Miroslav January 2019 (has links)
This diploma thesis deals with the development of chatbots. The theoretical part introduces the concept of the conversational interface in general and analyzes available technologies for its development. The practical part deals with the design and implementation of a particular chatbot intended as a virtual assistant in the process of selecting and purchasing goods; this is accomplished by connecting the chatbot to a product information management system using OData web services. One of the biggest problems was determining the order of the questions asked about product properties, for which decision tree theory was used.
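The question-ordering problem lends itself to a decision-tree criterion: ask about the property that best splits the remaining products. The sketch below is illustrative, with an invented catalogue, and is not the thesis's implementation.

```python
# Illustrative sketch of ordering chatbot questions with a decision-tree
# criterion: ask about the property that best narrows the remaining products.
# The product catalogue and its properties are invented for the example.
from collections import Counter

products = [
    {"type": "oak",   "finish": "oiled",     "width": "wide"},
    {"type": "oak",   "finish": "lacquered", "width": "narrow"},
    {"type": "maple", "finish": "oiled",     "width": "wide"},
    {"type": "maple", "finish": "oiled",     "width": "narrow"},
]

def split_score(prop: str) -> int:
    """Worst-case number of candidates left after asking about prop."""
    return max(Counter(p[prop] for p in products).values())

next_question = min(["type", "finish", "width"], key=split_score)
print(next_question)  # 'type': it guarantees at most 2 candidates remain
```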
