About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
281

Verslo valdymo modeliavimo programinė įranga / Software for business management modeling

Čelkys, Donatas 25 August 2010 (has links)
Įmonių sėkmė labai priklauso nuo veiklos organizavimo strategijos, metodikos ir principų bei vadovų priimamų sprendimų savalaikiškumo. Kad įmonės personalas sugebėtų laiku priimti strateginius sprendimus, reikia kuo operatyviau reaguoti į įmonės veiklą lemiančius dinaminius mikro ir makro aplinkos veiksnius, juos sisteminti bei analizuoti turimą ir naujai gautą informaciją. Šioje srityje labai svarbi organizaciją pasiekiančių informacijos srautų judėjimo greičio bei apdorojimo kokybė. Informacijos cirkuliacija rinkoje, jos įsisavinimas ir kryptingas panaudojimas priklauso ne tik nuo pačių informacijos vartotojų, bet ir nuo rinkoje sudarytų sąlygų šiai informacijai gauti ir naudotis. Dinamiškoje rinkos bei žinių ekonomikos aplinkoje labai reikšmingi tampa sprendimų priėmimo ir greito bei tikslaus reagavimo į rinkos pokyčius įgūdžiai, kuriuos kiekvienas verslininkas turėtų nuolat lavinti. Juk kiekvienos verslo organizacijos sugebėjimą dirbti pelningai ir būti konkurencingais lemia žinios apie tam tikrus rinkos veiksnius ir padėtį joje. / In a dynamic market and knowledge economy, the ability to react immediately to changes in a company's environment becomes very important. Moreover, every business organization's ability to operate profitably and to remain competitive depends on knowledge of relevant market factors and of the company's position in the market. To reach the highest quality of company management and decision-making, it is useful for companies to create conditions for staff to train in business management. Such training should be implemented using business modelling software, which would make it possible to educate business leaders and company decision-makers. To maximize the system's usefulness, it is appropriate to use expert systems: an expert system makes it possible to evaluate how unexpected factors can affect the company and its profit.
Expert systems also help to apply theoretical knowledge in practical situations, so that this knowledge can be used in the company administration process. There is currently no publicly available business modelling software on the market, so creating this type of system is highly relevant. Finally, business management modelling software oriented towards the educational process was created. The use of an expert system made it possible to relate theoretical knowledge to practical experience.
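The expert-system evaluation described above (how unexpected factors can affect the company and its profit) can be sketched as a minimal rule base. Every factor name, rule, and magnitude below is an illustrative assumption, not taken from the thesis:

```python
# Minimal rule-based expert system sketch: each rule maps observed
# market factors to an estimated effect on profit. All factor names,
# rules, and magnitudes are hypothetical illustrations.

RULES = [
    # (condition over facts, profit effect in percent)
    (lambda f: f.get("demand") == "falling", -8.0),
    (lambda f: f.get("demand") == "rising", +6.0),
    (lambda f: f.get("new_competitor", False), -4.0),
    (lambda f: f.get("supply_cost") == "up", -3.0),
]

def estimate_profit_effect(facts):
    """Sum the effects of every rule whose condition matches the facts."""
    return sum(effect for cond, effect in RULES if cond(facts))

scenario = {"demand": "falling", "new_competitor": True}
effect = estimate_profit_effect(scenario)
print(f"Estimated profit change: {effect:+.1f}%")  # -8.0 + -4.0 = -12.0%
```

A trainee could vary the scenario dictionary and observe how the estimated profit changes, which is the educational use the abstract envisions.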
282

Informing dialogue strategy through argumentation-derived evidence

Emele, Chukwuemeka David January 2011 (has links)
In many settings, agents engage in problem-solving activities which require them to share resources, act on each other's behalf, coordinate individual acts, etc. If autonomous agents are to effectively interact (or support interaction among humans) in situations such as deciding whom and how to approach for the provision of a resource or the performance of an action, there are a number of important questions to address. Whom do I choose to delegate a task to? What do I need to say to convince him/her to do something? Were similar requests granted by similar agents in similar circumstances? What arguments were most persuasive? What are the costs involved in putting certain arguments forward? Research in argumentation strategies has received significant attention in recent years, and a number of approaches have been proposed to enable agents to reason about which arguments to present in order to persuade another. However, current approaches do not adequately address situations where agents may be operating under social constraints (e.g., policies) that regulate behaviour in a society. In this thesis, we propose a novel combination of techniques that takes into consideration the policies that others may be operating under. First, we present an approach where evidence derived from dialogue is utilised to learn the policies of others. We show that this approach enables agents to build more accurate and stable models of others more rapidly. Secondly, we present an agent decision-making mechanism where models of others are used to guide future argumentation strategy. This approach takes into account the learned policy constraints of others, the cost of revealing information, and anticipated resource availability in deciding whom to approach. We empirically evaluate our approach within a simulated multi-agent framework, and demonstrate that through the use of informed strategies agents can improve their performance.
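The decision mechanism summarised above weighs learned policy models, the cost of revealing information, and the expected payoff when deciding whom to approach. A minimal sketch of that trade-off, in which the agents, grant probabilities, costs, and reward are all hypothetical placeholders:

```python
# Sketch of choosing whom to delegate a task to: score each candidate
# by the (learned) probability that their policy permits the request,
# minus the cost of the information we would have to reveal to them.
# All agents, probabilities, costs, and the reward are illustrative.

candidates = {
    # agent: (P(policy permits the request), cost of information revealed)
    "agent_a": (0.9, 3.0),
    "agent_b": (0.6, 1.0),
    "agent_c": (0.3, 0.5),
}

REWARD = 10.0  # value obtained if the request is granted

def expected_utility(p_grant, reveal_cost):
    return p_grant * REWARD - reveal_cost

best = max(candidates, key=lambda a: expected_utility(*candidates[a]))
print("Approach:", best)  # agent_a: 0.9*10 - 3.0 = 6.0, the highest score
```

In the thesis the grant probabilities come from policy models learned from dialogue evidence; here they are fixed numbers purely for illustration.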
283

Capture and maintenance of constraints in engineering design

Ajit, Suraj January 2009 (has links)
The thesis investigates two domains: initially the kite domain, and then part of a more demanding Rolls-Royce domain (jet engine design). Four main types of refinement rules that use the associated application conditions and domain ontology to support the maintenance of constraints are proposed. The refinement rules have been implemented in ConEditor, and the extended system is known as ConEditor+. With the help of ConEditor+, the thesis demonstrates that an explicit representation of application conditions, together with the corresponding constraints and the domain ontology, can be used to detect inconsistencies, redundancy, subsumption and fusion, to reduce the number of spurious inconsistencies, and to prevent the identification of inappropriate refinements of redundancy, subsumption and fusion between pairs of constraints.
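The kinds of relationships detected between pairs of constraints (inconsistency, redundancy, subsumption) can be illustrated over simple numeric interval constraints. The interval representation is an assumption made here for illustration; the thesis works over application conditions and a domain ontology, not bare intervals:

```python
# Simplified sketch of detecting redundancy, subsumption, and
# inconsistency between pairs of constraints, each modelled here as a
# closed numeric interval (lo, hi) that a value must fall within.

def subsumes(c1, c2):
    """c1 subsumes c2 if every value satisfying c2 also satisfies c1."""
    lo1, hi1 = c1
    lo2, hi2 = c2
    return lo1 <= lo2 and hi2 <= hi1

def classify(c1, c2):
    if c1 == c2:
        return "redundant"       # identical constraints
    if subsumes(c1, c2) or subsumes(c2, c1):
        return "subsumption"     # one constraint implies the other
    if c1[0] > c2[1] or c2[0] > c1[1]:
        return "inconsistent"    # no value can satisfy both
    return "independent"

print(classify((0, 100), (10, 50)))  # (10, 50) lies inside (0, 100)
```

Detecting these relationships automatically is what lets a maintenance tool flag constraint pairs for a human to review, which is the role ConEditor+ plays in the thesis.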
284

Accuracy of tropical cyclone induced winds using TYDET at Kadena AB

Fenlason, Joel W. 03 1900 (has links)
When a tropical cyclone (TC) is within 360 nautical miles of Kadena AB, the Air Force's Typhoon Determination (TYDET) program is used to estimate TC-induced winds expected at the base. Best-track data and Joint Typhoon Warning Center (JTWC) forecasts are used to evaluate systematic errors in TYDET. The largest contributors to TYDET's errors are a systematic bias toward wind speeds that are too large and the lack of size and symmetry parameters. To examine these parameters, best-track data and forecasts are used to classify TCs as small or large and as symmetric or asymmetric. A linear regression technique is then used to adjust TYDET forecasts based on the best-track and forecast position, size, and symmetry categories. Using independent data, over 65 percent of the overall cross-wind forecasts were improved, and more than 60 percent of cross-wind forecasts were improved when verifying conditions noted a cross-wind of 20 knots or greater. The effectiveness of the corrections and the implications for TYDET forecasts are examined in relation to errors in the forecast data used to initialize TYDET. An approach similar to the one developed here for the TYDET model at Kadena AB is proposed for other bases within the Pacific theater.
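The regression-based adjustment described above can be sketched as an ordinary least-squares fit of observed winds against TYDET forecast winds, applied to de-bias new forecasts. The wind values below are synthetic placeholders, not best-track data, and the real study fits separate corrections per size/symmetry category:

```python
# Sketch of a linear-regression forecast correction: fit observed winds
# as y = a*x + b over forecast winds x, then apply the fit to a new
# forecast. All wind values are synthetic illustrations.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Forecast winds (kt) vs verifying winds (kt); the forecasts here are
# systematically 25% too strong, mimicking the bias the study found:
forecast = [20, 30, 40, 50, 60]
observed = [16, 24, 32, 40, 48]

a, b = fit_line(forecast, observed)
corrected = a * 45 + b  # de-bias a new 45-kt forecast
print(f"corrected wind: {corrected:.1f} kt")  # 0.8 * 45 + 0 = 36.0 kt
```

In practice separate (a, b) pairs would be fitted for each size and symmetry category, matching the stratification the abstract describes.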
285

Effective use of artificial intelligence in predicting energy consumption and underground dam levels in two gold mines in South Africa

12 February 2015 (has links)
D.Ing. (Electrical and Electronic Engineering) / The electricity shortage in South Africa has required the implementation of demand side management (DSM) projects. The DSM projects were implemented by installing energy monitoring and control systems to monitor certain mining aspects, such as water pumping systems. Certain energy-saving procedures and control systems followed by the mining industry are not sustainable and must be updated regularly in order to accommodate changes in the water pumping system. In addition, the present water pumping, monitoring, and control system does not predict energy consumption or underground water dam levels. Hence, there is a need for a new monitoring system that could control and predict the energy consumption of the underground water pumping system and the dam levels based on present and historical data. This work investigates the feasibility of using artificial intelligence in certain aspects of the mining industry. If successful, artificial intelligence systems could lead to improved safety, reduced electrical energy consumption, and decreased human error throughout the pump station monitoring and control process ...
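As a minimal illustration of predicting a dam level from present and historical data, the sketch below uses simple exponential smoothing. The readings and smoothing factor are synthetic assumptions; the thesis investigates artificial-intelligence models rather than this elementary method:

```python
# Minimal sketch of one-step-ahead prediction of an underground dam
# level from its recent history, via simple exponential smoothing.
# Readings and the smoothing factor alpha are synthetic placeholders.

def exp_smooth_forecast(history, alpha=0.5):
    """One-step-ahead forecast: blend each new reading into the level."""
    level = history[0]
    for obs in history[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

dam_levels = [62.0, 63.0, 61.0, 64.0, 66.0]  # % full, hourly readings
print(f"next-hour forecast: {exp_smooth_forecast(dam_levels):.2f}% full")
```

A predictive model of this shape (historical inputs, one forecast output) is the interface an AI-based replacement would also present to the pump-control system.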
286

A Timescale Estimating Model for Rule-Based Systems

Moseley, Charles Warren 12 1900 (has links)
The purpose of this study was to explore the subject of timescale estimating for rule-based systems. A model for estimating the timescale necessary to build rule-based systems was built and then tested in a controlled environment.
287

An intelligent system for a telecommunications network domain.

02 June 2008 (has links)
Knowledge is today considered one of the most important assets an organization possesses. A considerable part of this knowledge is held by the individuals the organization employs. For intelligent systems to perform some of the tasks their human counterparts perform in an organization, the systems need to acquire the knowledge those counterparts possess for the specific task. This knowledge must be extracted from the individuals in the organization via knowledge acquisition, and then represented so that the intelligent system can understand it and perform the task. Developing an intelligent system requires an ontology representing the domain under consideration, as well as the rules that constitute the reasoning behind the system. In this dissertation a development environment for building intelligent systems, called the Collaborative Ontology Builder for Reasoning and Analysis (COBRA), was developed. COBRA provides an environment for developing the ontology and rules of an intelligent system. COBRA was used in this study to develop a Cellular telecommunications Network Consistency Checking Intelligent System (CNCCIS), which was implemented in a cellular telecommunications network. / Prof. E.M. Ehlers
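A consistency check of the CNCCIS kind, where facts about network elements are validated against an ontology, might be sketched as follows. The element types, required attributes, and facts are hypothetical, not taken from the dissertation:

```python
# Sketch of ontology-plus-rules consistency checking: a tiny "ontology"
# states which attributes each network element type must define, and a
# rule flags elements that violate it. All types/attributes are
# illustrative placeholders for a real cellular-network ontology.

ONTOLOGY = {
    # element type: attributes it must define
    "cell": {"id", "bsc"},   # every cell must link to a BSC
    "bsc": {"id", "msc"},    # every BSC must link to an MSC
}

def check_consistency(facts):
    """Return (element id, missing attributes) for each violation."""
    violations = []
    for element in facts:
        required = ONTOLOGY.get(element["type"], set())
        missing = required - set(element)
        if missing:
            violations.append((element.get("id", "?"), sorted(missing)))
    return violations

network = [
    {"type": "cell", "id": "C1", "bsc": "B1"},
    {"type": "bsc", "id": "B1"},  # inconsistent: missing its MSC link
]
print(check_consistency(network))  # [('B1', ['msc'])]
```

Separating the ontology (what must hold) from the checking rule (how violations are found) mirrors the COBRA split between ontology development and rule development.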
288

Modeling and analysis of security

Unknown Date (has links)
Cloud Computing is a new computing model consisting of a large pool of hardware and software resources in remote datacenters that are accessed through the Internet. Cloud Computing faces significant obstacles to its acceptance, such as security, virtualization, and lack of standardization. There is a long debate about the role of Cloud standards, and more demands for them are being put on the table; the Cloud standardization landscape remains ambiguous. To model and analyze security standards for Cloud Computing and web services, we surveyed Cloud standards, focusing on the standards for security, and classified them by groups of interest. Cloud Computing leverages a number of technologies such as Web 2.0, virtualization, and Service-Oriented Architecture (SOA). SOA uses web services to facilitate the creation of SOA systems by adopting different technologies despite their differences in formats and protocols. Several committees, such as the W3C and OASIS, are developing standards for web services; their standards are rather complex and verbose. We have expressed web services security standards as patterns to make it easy for designers and users to understand their key points. We have written patterns for two web services standards, WS-SecureConversation and WS-Federation, completing earlier work we had done on web services standards. We showed relationships between web services security standards and used them to address major Cloud security issues, such as authorization and access control, trust, and identity management. Close to web services, we investigated the Business Process Execution Language (BPEL), addressing security considerations in BPEL and how to enforce them. To see how Cloud vendors treat web services standards, we took Amazon Web Services (AWS) as a case study; our review of the AWS documentation found that web services security standards are barely mentioned.
We highlighted areas where web services security standards could address some AWS limitations and improve the AWS security process. Finally, we studied the security guidance of two major Cloud-developing organizations, CSA and NIST; both overlook the quality attributes offered by web services security standards. We expanded their work and added the benefits of adopting web services security standards in securing the Cloud. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2013.
289

Método de aquisição de conhecimento para sistemas especialistas destinados à diagnose de falhas: aplicação de técnicas de análise de confiabilidade e de risco. / Knowledge acquisition method for expert system to fault diagnosis: application of technical of reliability analysis and risk.

Hidalgo, Erick Miguel Portugal 24 November 2014 (has links)
O processo de aquisição do conhecimento é uma das principais etapas de desenvolvimento de um sistema especialista e é considerado como um dos estágios mais difíceis. Essa dificuldade se dá em virtude da inexistência de uma metodologia eficiente, confiável e padrão para extração e organização do conhecimento das várias fontes. O método apresentado neste trabalho é uma alternativa que pode ser empregada para adquirir o conhecimento para desenvolver sistemas especialistas para diagnóstico de falhas em diferentes áreas da indústria. Este trabalho apresenta um método que integra as técnicas de confiabilidade e risco, tais como, Análise de Modos e Efeitos de Falha (FMEA), Análise de Árvore de Falhas (FTA) e Estudo de Perigo e Operabilidade (HAZOP) para aquisição do conhecimento para o diagnóstico de falhas. O método também permite estimar a periodicidade da manutenção preventiva aplicando os conceitos de manutenção imperfeita e teoria de decisão multicritério. O método utiliza técnicas empregadas em análise de confiabilidade e risco para determinar a relação entre efeito da falha em um sistema e as suas causas raiz com o objetivo de estabelecer um procedimento estruturado para aquisição do conhecimento associado à relação causa-efeito em um sistema. O método foi validado com a comparação do histórico de falhas de um sistema hidráulico de uma usina hidrelétrica e, considerando-se que os eventos definidos como causa raiz registrados no histórico de falhas foram encontrados como resultados da análise pelo sistema especialista, tem-se a validação. O método para determinar a periodicidade da manutenção preventiva foi validado com os resultados de artigos e com os planos de manutenção da usina. / The process of knowledge acquisition is a major step in developing an expert system and is considered one of the most difficult stages.
This difficulty is due to the lack of an efficient, reliable, standard methodology for extracting and organizing knowledge from various sources. The method presented in this thesis is an alternative that can be used to acquire the knowledge needed to develop expert systems for fault diagnosis in different areas of industry. This thesis presents a method that integrates risk and reliability analysis techniques such as Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA) and Hazard and Operability Study (HAZOP) for the acquisition of knowledge for fault diagnosis. The method also allows estimating the optimal intervention times for preventive maintenance by applying the concepts of imperfect maintenance and multicriteria decision theory. The method uses techniques employed in reliability and risk analysis to determine the relationship between a failure effect in the system and its root causes, in order to establish a structured procedure for acquiring the knowledge associated with the cause-effect relationship in a system. The method was validated by comparison against the failure history of a hydropower plant's hydraulic system: since the events defined as root causes recorded in the failure history were found by the expert system, the method was considered validated. The method for determining the optimal intervention time for preventive maintenance was validated against the results of published articles and the maintenance plans of the plant.
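The FTA side of the acquisition method relates a failure effect to its root causes through a tree of intermediate events. A minimal sketch, with hypothetical hydraulic-system events rather than the thesis's actual fault trees:

```python
# Sketch of tracing a failure effect back to its root causes through a
# fault-tree-like structure, as the knowledge-acquisition method does
# with FTA. The hydraulic-system events are illustrative placeholders.

FAULT_TREE = {
    # effect: list of immediate causes
    "pump_no_flow": ["motor_failure", "valve_closed"],
    "motor_failure": ["winding_burnout", "power_loss"],
}

def root_causes(event, tree):
    """Collect leaf events (those with no further causes) under `event`."""
    causes = tree.get(event)
    if not causes:
        return [event]  # a leaf: this is a root cause
    leaves = []
    for cause in causes:
        leaves.extend(root_causes(cause, tree))
    return leaves

print(root_causes("pump_no_flow", FAULT_TREE))
# ['winding_burnout', 'power_loss', 'valve_closed']
```

Encoding the tree explicitly like this is what turns FMEA/FTA/HAZOP worksheets into a knowledge base an expert system can query, which is the structured acquisition step the abstract describes.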
290

Sparse Coding and Compressed Sensing: Locally Competitive Algorithms and Random Projections

Unknown Date (has links)
For an 8-bit grayscale image patch of size n x n, the number of distinguishable signals is 256^(n²). Natural images (e.g., photographs of a natural scene) comprise a very small subset of these possible signals. Traditional image and video processing relies on band-limited or low-pass signal models. In contrast, we will explore the observation that most signals of interest are sparse, i.e. in a particular basis most of the expansion coefficients will be zero. Recent developments in sparse modeling and L1 optimization have allowed for extraordinary applications such as the single-pixel camera, as well as computer vision systems that can exceed human performance. Here we present a novel neural network architecture combining a sparse filter model and locally competitive algorithms (LCAs), and demonstrate the network's ability to classify human actions from video. Sparse filtering is an unsupervised feature learning algorithm designed to optimize the sparsity of the feature distribution directly, without needing to model the data distribution. LCAs are defined by a system of differential equations in which the initial conditions define an optimization problem and the dynamics converge to a sparse decomposition of the input vector. We applied this architecture to train a classifier on categories of motion in human action videos. Inputs to the network were small 3D patches taken from frame differences in the videos. Dictionaries were derived for each action class, and activation levels for each dictionary were then assessed during reconstruction of a novel test patch. We discuss how this sparse modeling approach provides a natural framework for multi-sensory and multimodal data processing, including RGB video, RGBD video, hyper-spectral video, and stereo audio/video streams. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
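The LCA dynamics described above (neuron states driven by the input, mutual inhibition through dictionary correlations, and a threshold producing sparse codes) can be sketched in discretised form. The dictionary and input below are tiny synthetic examples, not video patches:

```python
# Discretised sketch of a locally competitive algorithm (LCA): membrane
# states u follow leaky dynamics u_dot = b - u - (G - I)a, where b is
# the input drive D^T x, G is the dictionary Gram matrix, and a is the
# soft-thresholded code. Dictionary and input are toy examples.

def soft(u, lam):
    """Soft threshold: small states produce exactly zero activation."""
    return max(u - lam, 0.0) if u > 0 else min(u + lam, 0.0)

def lca(dictionary, x, lam=0.1, steps=200, dt=0.1):
    """dictionary: list of unit-norm atoms; x: input vector."""
    dot = lambda a, b: sum(p * q for p, q in zip(a, b))
    drive = [dot(d, x) for d in dictionary]                  # b = D^T x
    gram = [[dot(di, dj) for dj in dictionary] for di in dictionary]
    u = [0.0] * len(dictionary)                              # membrane states
    for _ in range(steps):
        a = [soft(ui, lam) for ui in u]                      # current codes
        for i in range(len(u)):
            inhibit = sum(gram[i][j] * a[j] for j in range(len(u)) if j != i)
            u[i] += dt * (drive[i] - u[i] - inhibit)
    return [soft(ui, lam) for ui in u]

D = [[1.0, 0.0], [0.0, 1.0]]   # trivial orthonormal dictionary
code = lca(D, [0.8, 0.0])
print(code)                    # sparse: the second coefficient is exactly zero
```

With an overcomplete dictionary the inhibition term is what makes atoms compete, so only a few coefficients survive thresholding; that sparse activation pattern is what the classifier in the dissertation assesses per action-class dictionary.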
