21

The COMPASS Paradigm For The Systematic Evaluation Of U.S. Army Command And Control Systems Using Neural Network And Discrete Event Computer Simulation

Middlebrooks, Sam E. 15 April 2003 (has links)
In today's technology-based society the rapid proliferation of new machines and systems that would have been undreamed of only a few short years ago has become a way of life. Developments and advances, especially in the areas of digital electronics and micro-circuitry, have spawned subsequent technology-based improvements in transportation, communications, entertainment, automation, the armed forces, and many other areas that would not have been possible otherwise. This rapid "explosion" of new capabilities and ways of performing tasks has been motivated as often as not by the philosophy that if it is possible to make something better, work faster, be more cost effective, or operate over greater distances, then it must inherently be good for the human operator. Taken further, these improvements typically are envisioned to consequently produce a more efficient operating system in which the human operator is an integral component. The formal concept of human-system interface design has only emerged this century as a recognized academic discipline; however, the practice of developing ideas and concepts for systems containing human operators has been in existence since humans started experiencing cognitive thought. An example of a human-system interface technology for the communication and dissemination of written information that has evolved over centuries of trial-and-error development is the book. It is no accident that the form and shape of the book of today is as it is. This is because it is a shape and form readily usable by human physiology, whose optimal configuration was determined by centuries of effort and revision. This slow evolution was mirrored by a rate of technical evolution in printing and elsewhere that allowed new advances to be experimented with as part of the overall use requirement and need for the existence of the printed word and some way to contain it. Today, however, technology is advancing at such a rapid rate that evolutionary use requirements have no chance to develop alongside the fast pace of technical progress. One result of this recognition is the establishment of disciplines like human factors engineering that have the stated purpose and goal of systematically determining good and bad human-system interface designs. However, other results of this phenomenon are systems that get developed and placed into public use simply because new technology allowed them to be made. This development can proceed without a full appreciation of how the system might be used and, perhaps even more significantly, what impact the use of this new system might have on the operator within it. The U.S. Army has a term for this type of activity. It is called "stove-piped development". The implication of this term is that a system gets developed in isolation, where the developers are only looking "up" and not "around". They are thus concerned only with how this system may work or be used for its own singular purposes as opposed to how it might be used in the larger community of existing systems and interfaces or, even more importantly, in the larger community of other new systems in concurrent development. Some of the impacts for the Army from this mode of system development are communication systems that work exactly as designed but are unable to interface with other communications systems in other domains for battlefield-wide communications capabilities. Having communications systems that cannot communicate with each other is a distinct problem in its own right.
However, when developments in one industry produce products that humans use or attempt to use with products from totally separate developments or industries, the Army concept of product development resulting from stove-piped design visions can have significant implications for the operation of each system and the human operator attempting to use it. There are many examples that would illustrate the above concept; however, the one explored here is the Army effort to study, understand, and optimize its command and control (C2) operations. This effort is at the heart of a change in the operational paradigm of C2 Tactical Operations Centers (TOCs) that the Army is now undergoing. For the 50 years since World War II the nature, organization, and mode of operation of command organizations within the Army have remained virtually unchanged. Staffs have been organized around a basic four-section structure, and TOCs generally operate only in a totally static mode, with the amount of time required to move them to keep up with a mobile battlefield increasing almost exponentially from lower to higher command levels. However, current initiatives are changing all that, and while new vehicles and hardware systems address individual components of the command structures to improve their operations, these initiatives do not necessarily provide the environment in which the human operator component of the overall system can function in a more effective manner. This dissertation examines C2 from a system-level viewpoint using a new paradigm for systematically examining the way TOCs operate and then translating those observations into validated computer simulations using a methodological framework. This paradigm is called COmputer Modeling Paradigm And Simulation of Systems (COMPASS). COMPASS provides the ability to model TOC operations in a way that includes not only the individuals, work groups, and teams in it, but also all of the other hardware and software systems and subsystems and human-system interfaces that comprise it, as well as the facilities and environmental conditions that surround it. Most of the current literature and research in this area focuses on the concept of C2 itself and its follow-on activities of command, control, and communications (C3); command, control, communications, and computers (C4); and command, control, communications, computers, and intelligence (C4I). This focus tends to address the activities involved with the human processes within the overall system, such as individual and team performance and the commander's decision-making process. While the literature acknowledges the existence of the command and control system (C2S), little effort has been expended to quantify and analyze C2Ss from a systemic viewpoint. A C2S is defined as the facilities, equipment, communications, procedures, and personnel necessary to support the commander (i.e., the primary decision maker within the system) in conducting the activities of planning, directing, and controlling the battlefield within the sector of operations applicable to the system. The research in this dissertation is in two phases. The overall project incorporates sequential experimentation procedures that build on successive TOC observation events to generate an evolving data store that supports the two phases of the project. Phase I consists of the observation of heavy maneuver battalion and brigade TOCs during peacetime exercises.
The term "heavy maneuver" is used to connotate main battle forces such as armored and mechanized infantry units supported by artillery, air defense, close air, engineer, and other so called combat support elements. This type of unit comprises the main battle forces on the battlefield. It is used to refer to what is called the conventional force structure. These observations are conducted using naturalistic observation techniques of the visible functioning of activities within the TOC and are augmented by automatic data collection of such things as analog and digital message traffic, combat reports generated by the computer simulations supporting the wargame exercise, and video and audio recordings where appropriate and available. Visible activities within the TOC include primarily the human operator functions such as message handling activities, decision-making processes and timing, coordination activities, and span of control over the battlefield. They also include environmental conditions, functional status of computer and communications systems, and levels of message traffic flows. These observations are further augmented by observer estimations of such indicators as perceived level of stress, excitement, and level of attention to the mission of the TOC personnel. In other words, every visible and available component of the C2S within the TOC is recorded for analysis. No a priori attempt is made to evaluate the potential significance of each of the activities as their contribution may be so subtle as to only be ascertainable through statistical analysis. Each of these performance activities becomes an independent variable (IV) within the data that is compared against dependent variables (DV) identified according to the mission functions of the TOC. The DVs for the C2S are performance measures that are critical combat tasks performed by the system. Examples of critical combat tasks are "attacking to seize an objective", "seizure of key terrain", and "river crossings'. A list of expected critical combat tasks has been prepared from the literature and subject matter expert (SME) input. After the exercise is over, the success of these critical tasks attempted by the C2S during the wargame are established through evaluator assessments, if available, and/or TOC staff self analysis and reporting as presented during after action reviews. The second part of Phase I includes datamining procedures, including neural networks, used in a constrained format to analyze the data. The term constrained means that the identification of the outputs/DV is known. The process was to identify those IV that significantly contribute to the constrained DV. A neural network is then constructed where each IV forms an input node and each DV forms an output node. One layer of hidden nodes is used to complete the network. The number of hidden nodes and layers is determined through iterative analysis of the network. The completed network is then trained to replicate the output conditions through iterative epoch executions. The network is then pruned to remove input nodes that do not contribute significantly to the output condition. Once the neural network tree is pruned through iterative executions of the neural network, the resulting branches are used to develop algorithmic descriptors of the system in the form of regression like expressions. For Phase II these algorithmic expressions are incorporated into the CoHOST discrete event computer simulation model of the C2S. 
The programming environment is the commercial programming language Micro Saint™ running on a PC microcomputer. An interrogation approach was developed to query these algorithms within the computer simulation to determine whether they allow the simulation to reflect the activities observed in the real TOC to within an acceptable degree of accuracy. The purpose of this dissertation is to introduce the COMPASS concept, a paradigm for developing techniques and procedures to translate as much of the performance of the entire TOC system as possible to an existing computer simulation that would be suitable for analyses of future system configurations. The approach consists of the following steps:
• Naturalistic observation of the real system using ethnographic techniques.
• Data analysis using data mining techniques such as neural networks.
• Development of mathematical models of TOC performance activities.
• Integration of the mathematical models into the CoHOST computer simulation.
• Interrogation of the computer simulation.
• Assessment of the level of accuracy of the computer simulation.
• Validation of the process as a viable system simulation approach.
/ Ph. D.
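To illustrate how a fitted regression-like descriptor might drive a discrete-event model and then be interrogated, here is a minimal sketch using simpy as a stand-in for the Micro Saint task network; the task structure and coefficients are hypothetical placeholders, not values from the dissertation.

```python
# Sketch: a pruned regression-like descriptor drives task durations in a
# discrete-event model of a TOC message-handling loop. simpy stands in for
# Micro Saint; the task and coefficients are hypothetical placeholders.
import random
import simpy

def handle_time(msg_load, stress):
    # Hypothetical descriptor of the kind produced by the Phase I pruning.
    return 1.2 + 0.8 * msg_load + 0.5 * stress

def operator(env, log):
    while True:
        msg_load = random.uniform(0.0, 1.0)   # observed-style IV
        stress = random.uniform(0.0, 1.0)     # observer-estimated IV
        yield env.timeout(handle_time(msg_load, stress))
        log.append(env.now)                   # completion time for interrogation

env = simpy.Environment()
completions = []
env.process(operator(env, completions))
env.run(until=60.0)

# "Interrogation": compare simulated throughput against the observed TOC.
print("messages handled in 60 time units:", len(completions))
```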
22

Apprentissage automatique à partir de traces multi-sources hétérogènes pour la modélisation de connaissances perceptivo-gestuelles / Automatic knowledge acquisition from multisource heterogeneous traces for perceptual-gestural knowledge modeling

Toussaint, Ben-Manson 12 October 2015 (has links)
Perceptual-gestural knowledge is difficult to capture in Intelligent Tutoring Systems. This knowledge is multimodal: it combines theoretical knowledge with perceptual and gestural knowledge. Recording it in Intelligent Tutoring Systems involves the use of multiple devices or sensors covering the different modalities of the interactions that underlie it. The "traces" of these interactions, also referred to as "activity traces", are the raw material for producing tutoring services that cover their different facets. Learning analytics and tutoring services that favor one facet of this knowledge over the others are incomplete. However, because of the diversity of devices, the recorded activity traces are heterogeneous and therefore difficult to model and process. My doctoral project addresses the problem of producing tutoring services suited to this type of knowledge. I am particularly interested in this problem in the context of so-called ill-defined domains. The case study of my research is the Intelligent Tutoring System TELEOS, a simulation platform dedicated to percutaneous orthopedic surgery. The contributions of this thesis are threefold: (1) the formalization of perceptual-gestural interaction sequences; (2) the implementation of tools capable of reifying the proposed conceptual model of their representation; (3) the design and implementation of algorithmic tools fostering the analysis of these sequences from a didactic point of view.
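As a rough illustration of the first contribution, here is a minimal sketch of how heterogeneous multi-source traces might be unified into a single timestamped perceptual-gestural sequence; the device names and payload fields are invented for illustration, not taken from TELEOS.

```python
# Sketch: unify heterogeneous activity traces into one perceptual-gestural
# sequence ordered by time. Device names and fields are hypothetical.
from dataclasses import dataclass, field
from typing import Any

@dataclass(order=True)
class TraceEvent:
    timestamp: float
    modality: str = field(compare=False)   # "perceptual", "gestural", "declarative"
    source: str = field(compare=False)     # originating device or sensor
    payload: dict[str, Any] = field(compare=False, default_factory=dict)

# Raw events arrive as separate per-device streams (simulated here).
eye_tracker = [TraceEvent(0.42, "perceptual", "eye-tracker", {"fixation": "x-ray"})]
haptic_arm = [TraceEvent(0.40, "gestural", "haptic-arm", {"angle_deg": 12.5})]
simulator = [TraceEvent(0.45, "declarative", "simulator", {"action": "validate"})]

# Formalized sequence: one chronologically merged stream that a tutoring
# service can analyze across modalities instead of one facet at a time.
sequence = sorted(eye_tracker + haptic_arm + simulator)
for ev in sequence:
    print(ev.timestamp, ev.modality, ev.source, ev.payload)
```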
23

Individuell marknadsföring : En kvalitativ studie om konsumenters uppfattning kring individanpassad marknadsföring gällande personlig integritet / Individual marketing : A qualitative study on consumers' perceptions of personalized marketing regarding personal privacy

Asp Sandin, Agnes January 2020 (has links)
The digitalization of society has increased sharply over the past decade, and as much as 95% of Sweden's population report that they use the internet daily. This has also meant that the data people leave behind online has grown. This growth, combined with smarter analysis tools for extracting information from these data sets, has enabled a new type of marketing: personalized marketing. It means that companies target their offers and advertisements at the specific consumer with the help of that consumer's own data, also called digital footprints. The digital footprints that companies use can include purchasing behavior, GPS coordinates, and previous search history on, for example, Google. However, there is a risk that companies violate personal privacy when personalized advertising becomes too specific or is based on sensitive information about the consumer. This leads to the study's research question: "What is the consumer's perception of personalized marketing with regard to personal privacy?" The research question has two sub-questions: "Which factors contribute to the consumer perceiving personalized advertising as less privacy-invasive?" and "What do consumers perceive as privacy-invasive?" Together, these sub-questions help answer the main question. In this study, the word consumer refers to people who shop digitally and use various digital services where they are exposed to personalized advertising. The research question is answered using a qualitative method with interviews, where an inductive thematic analysis with deductive elements is used to analyze the material. The study focuses on examining personalized marketing with regard to privacy from a consumer perspective and thus does not address how companies reason about this issue. The purpose of the study is to give companies insight into what information consumers consider acceptable for personalizing advertising, and when they consider it privacy-invasive. The study found that measures taken by companies, such as giving consumers greater control over their personal information and greater transparency about what information is collected and used, lead to increased trust in the company, which in turn contributes to a more positive reception of the advertising. Furthermore, advertising based on the consumer's purchasing behavior and previous searches is perceived as least privacy-invasive, while advertising based on audio recordings, GPS coordinates, and whom the consumer interacts with is perceived as most privacy-invasive. A company behind advertising that is perceived as privacy-invasive can suffer a damaged image, which may lead consumers to choose another company in the future.
24

Klasifikace webových stránek / Web Page Classification

Kolář, Roman January 2008 (has links)
This thesis addresses the problem of automatic web page classification using a classifier based on association rules. Classification is presented as one of the data mining techniques, in the context of mining knowledge from text data. Several text document classification methods are presented, highlighting the benefits of classification methods based on association rules. The main goal of the work is to adapt a selected classification method to relational data and to design a web page classifier that classifies pages with the aid of visual properties (the layout of individual sections on the page), not (only) by textual data. The ARC-BC classification method is presented as the selected method and as one of the more intriguing classifiers, since it combines the accuracy and comprehensibility benefits of the other methods.
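As a rough sketch of associative classification in this spirit (a generic class-association-rule classifier, not the ARC-BC implementation itself), one can mine rules from one-hot page features and classify by the best matching rule; the feature names below are invented for illustration.

```python
# Sketch: a tiny class-association-rule classifier over visual page features.
# Generic associative classification for illustration, not ARC-BC itself;
# the feature names are invented.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

pages = pd.DataFrame(
    [
        # has_sidebar, many_images, long_text, class=shop, class=article
        [True, True, False, True, False],
        [True, True, False, True, False],
        [False, False, True, False, True],
        [False, True, True, False, True],
    ],
    columns=["has_sidebar", "many_images", "long_text", "cls=shop", "cls=article"],
)

freq = apriori(pages, min_support=0.25, use_colnames=True)
rules = association_rules(freq, metric="confidence", min_threshold=0.8)
# Keep only rules whose consequent is a single class label.
class_rules = rules[rules["consequents"].apply(
    lambda c: len(c) == 1 and next(iter(c)).startswith("cls="))]

def classify(features: set) -> str:
    # Pick the matching rule with the highest confidence.
    hits = class_rules[class_rules["antecedents"].apply(lambda a: a <= features)]
    if hits.empty:
        return "unknown"
    best = hits.sort_values("confidence", ascending=False).iloc[0]
    return next(iter(best["consequents"]))

print(classify({"has_sidebar", "many_images"}))  # likely "cls=shop"
```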
25

Systém pro dolování z dat v prostředí Oracle / Data Mining System in Oracle

Krásný, Michal January 2008 (has links)
This MSc project deals with a system for knowledge discovery in databases. It is a client application that uses the services of Oracle Data Mining Server 10g Release 2 (10.2). The application is implemented in Java, and the graphical user interface is built on the NetBeans Rich Client Platform. The theoretical part introduces knowledge discovery in databases, while the practical part describes the functionality of the original system and its deficiencies, documents the solutions to these deficiencies, and proposes improvements for further development. The goal of this project is to modify the system so as to increase the application's usability.
26

Rozšíření funkcionality systému pro dolování z dat na platformě NetBeans / Functionality Extension of Data Mining System on NetBeans Platform

Šebek, Michal January 2009 (has links)
Databases continually grow with new data. A process called Knowledge Discovery in Databases has been defined for analyzing these data, and new complex systems have been developed to support it. This thesis describes the development of one such system. The main goal is to analyze the current state of the system's implementation, which is based on the Java NetBeans Platform and the Oracle database system, and to extend it with data preprocessing algorithms and source data analysis. The implementation of the data preprocessing components and the changes to the system's kernel are described in detail in this thesis.
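As a rough illustration of the kind of preprocessing step such an extension might provide (the abstract does not name specific algorithms, so min-max normalization and equal-width binning are assumed examples):

```python
# Sketch: two common preprocessing steps a data mining system might offer.
# Min-max normalization and equal-width binning are assumed examples; the
# thesis itself does not specify which algorithms were added.
def min_max_normalize(values: list[float]) -> list[float]:
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0               # avoid division by zero
    return [(v - lo) / span for v in values]

def equal_width_bins(values: list[float], n_bins: int) -> list[int]:
    lo, hi = min(values), max(values)
    width = ((hi - lo) / n_bins) or 1.0
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

ages = [18.0, 25.0, 37.0, 52.0, 61.0]
print(min_max_normalize(ages))            # values rescaled to [0, 1]
print(equal_width_bins(ages, 3))          # discretized into 3 bins
```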
27

Tvorba databázové aplikace a řešení pro Business Intelligence / Creation of Database Application and Solutions for Business Intelligence

Městka, Milan January 2012 (has links)
The theme of this master's thesis is the design of software support for business intelligence, realized in cooperation with the company ZZN Pelhřimov a.s. The introduction focuses on a theoretical description of business intelligence and data mining, on the development environment in which the project is designed, and on a characterization of the company. The main part covers data collection and the definition of the individual modules. The conclusion presents several types of analysis of the collected data; based on these analyses, measures can be drawn to improve the current state of the company.
28

Robustness of Machine Learning algorithms applied to gas turbines / Robusthet av maskininlärningsalgoritmer i gasturbiner

Cardenas Meza, Andres Felipe January 2024 (has links)
This thesis demonstrates the successful development of a software sensor for Siemens Energy's SGT-700 gas turbines using machine learning algorithms. Our goal was to enhance measurement robustness and redundancy, enabling early detection of sensor or turbine malfunctions and contributing to predictive maintenance methodologies. The research is based on a real-world case study implementing the Cross-Industry Standard Process for Data Mining (CRISP-DM) methodology in an industrial setting. The thesis details the process from dataset preparation and data exploration to algorithm development and evaluation, providing a comprehensive view of the development process. This work is a step towards integrating machine learning into gas turbine systems. The data preparation process highlights the challenges that arise in industrial applications of data-driven methodologies due to inevitable data quality issues. It provides insight into potential future improvements, such as the constraint programming approach used for dataset construction in this thesis, which remains a valuable tool for future research. The algorithms proposed for the software sensor's development span from basic to more complex methods, including shallow networks, ensemble methods, and recurrent neural networks. Our findings explore the limitations and potential of the proposed algorithms, providing valuable insights into the practical application of machine learning in gas turbines. This includes assessing the reliability of these solutions, their role in monitoring machine health over time, and the importance of clean, usable data in driving accurate and satisfactory estimates of different variables in gas turbines. The research underscores that, while replacing a physical sensor with a software sensor is not yet feasible, integrating these solutions into gas turbine systems for health monitoring is indeed possible. This work lays the groundwork for future advancements and discoveries in the field.
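A minimal sketch of the software-sensor idea follows, estimating one turbine signal from the others with an ensemble regressor and flagging disagreement with the physical sensor; the signal names, coefficients, and data are invented stand-ins for the proprietary SGT-700 measurements.

```python
# Sketch: a software sensor that estimates one turbine signal from others and
# flags disagreement with the physical sensor. Signal names and data are
# invented stand-ins; the real work used proprietary SGT-700 measurements.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 2000
inlet_temp = rng.normal(300.0, 5.0, n)
shaft_speed = rng.normal(6500.0, 80.0, n)
fuel_flow = rng.normal(1.2, 0.05, n)
# Target signal with a plausible (made-up) dependence on the other signals.
exhaust_temp = (0.9 * inlet_temp + 0.02 * shaft_speed + 150.0 * fuel_flow
                + rng.normal(0.0, 2.0, n))

X = np.column_stack([inlet_temp, shaft_speed, fuel_flow])
model = GradientBoostingRegressor(random_state=0).fit(X[:1500], exhaust_temp[:1500])

# At run time, a large residual between the estimate and the physical sensor
# hints at sensor drift or an emerging turbine fault.
residual = exhaust_temp[1500:] - model.predict(X[1500:])
print("alarm samples:", int(np.sum(np.abs(residual) > 10.0)))
```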
