About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

Výběr informačního systému / Information System Selection

Šebesta, Petr January 2009 (has links)
The thesis deals with the problem of enterprise information system selection. Building on theoretical knowledge of information systems and their development, it analyzes the selection of an enterprise information system and the situation on the Czech market. The proposed solution comprises a description of an optimal enterprise information system selection project and a description of the expert system knowledge base created for the pre-selection of an enterprise information system.
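As a hedged illustration of how an expert-system knowledge base might pre-select candidate systems, the sketch below scores candidates as a weighted sum of criteria and keeps those above a threshold. The criteria names, weights, systems, and threshold are hypothetical, not taken from the thesis.

```python
# Illustrative sketch of a rule-style knowledge base for pre-selecting
# enterprise information systems; criteria and weights are hypothetical.

CANDIDATES = {
    "System A": {"industry_fit": 4, "module_coverage": 5, "price_band": 2},
    "System B": {"industry_fit": 3, "module_coverage": 3, "price_band": 5},
}

WEIGHTS = {"industry_fit": 0.5, "module_coverage": 0.3, "price_band": 0.2}

def preselect(candidates, weights, threshold=3.5):
    """Score each candidate as a weighted sum of criteria and keep those above threshold."""
    scored = {
        name: sum(weights[c] * score for c, score in criteria.items())
        for name, criteria in candidates.items()
    }
    return {name: s for name, s in scored.items() if s >= threshold}

if __name__ == "__main__":
    print(preselect(CANDIDATES, WEIGHTS))  # only "System A" passes the cut here
```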
122

An Exploratory Study on The Trust of Information in Social Media

Chih-Yuan Chou (8630730) 17 April 2020 (has links)
This study examined the level of trust in information on social media. Specifically, I investigated the factors of performance expectancy, together with information-seeking motives, that appear to influence the level of trust in information on various social network sites. The study drew on the following theoretical models to build a conceptual research framework for an exploratory study: the elaboration likelihood model (ELM), uses and gratifications theory (UGT), the unified theory of acceptance and use of technology (UTAUT), consumption value theory (CVT), and the stimulus-organism-response (SOR) model. The research investigated the extent to which information quality and source credibility influence the level of trust in information by visitors to social network sites. An inductive content analysis of 189 respondents' responses addressed the proposed research questions and informed the development of a comprehensive framework. The findings of this study contribute to the current research stream on information quality, fake news, and IT adoption as they relate to social media.
123

Sdílení ekonomických informací a znalostí ve vztahu k rozvoji informačně-komunikačních technologií (ICT) / Sharing of Economic Information and Knowledge in the Context of ICT Development

Dlouhý, Vladimír January 2011 (has links)
(in English) The subject of this Diploma thesis is the process of sharing economic information and knowledge relevant to corporate governance. Special attention is given to the effective use and development of information and communication technologies (ICT). At the beginning, the information society and basic terms are defined, such as the information economy, the economics of information, information management, and knowledge management. Furthermore, the strategic importance of enterprise information systems and other solutions for improving business information processes is described in general terms. The final chapters deal with Competitive Intelligence (CI) and Business Intelligence (BI) and contain an overview and analysis of the most important information products of economic information from the commercial and non-commercial sectors [Author's abstract].
124

Multimodal Data Management in Open-world Environment

K M A Solaiman (16678431) 02 August 2023 (has links)
The availability of abundant multimodal data, including textual, visual, and sensor-based information, holds the potential to improve decision-making in diverse domains. Extracting data-driven decision-making information from heterogeneous and changing datasets in real-world data-centric applications requires achieving complementary functionalities of multimodal data integration, knowledge extraction and mining, situationally-aware data recommendation to different users, and uncertainty management in the open-world setting. To achieve a system that encompasses all of these functionalities, several challenges need to be effectively addressed: (1) How to represent and analyze heterogeneous source contents and application context for multimodal data recommendation? (2) How to predict and fulfill current and future needs as new information streams in without user intervention? (3) How to integrate disconnected data sources and learn relevant information to specific mission needs? (4) How to scale from processing petabytes of data to exabytes? (5) How to deal with uncertainties in open-world that stem from changes in data sources and user requirements?

This dissertation tackles these challenges by proposing novel frameworks, learning-based data integration and retrieval models, and algorithms to empower decision-makers to extract valuable insights from diverse multimodal data sources. The contributions of this dissertation can be summarized as follows: (1) We developed SKOD, a novel multimodal knowledge querying framework that overcomes the data representation, scalability, and data completeness issues while utilizing streaming brokers and RDBMS capabilities with entity-centric semantic features as an effective representation of content and context. Additionally, as part of the framework, a novel text attribute recognition model called HART was developed, which leveraged language models and syntactic properties of large unstructured texts. (2) In the SKOD framework, we incrementally proposed three different approaches for data integration of the disconnected sources from their semantic features to build a common knowledge base with the user information need: (i) EARS: a mediator approach using schema mapping of the semantic features and SQL joins was proposed to address scalability challenges in data integration; (ii) FemmIR: a data integration approach for more susceptible and flexible applications that utilizes neural network-based graph matching techniques to learn coordinated graph representations of the data; it introduces a novel graph creation approach from the features and a novel similarity metric among data sources; (iii) WeSJem: this approach allows zero-shot similarity matching and data discovery by using contrastive learning to embed data samples and query examples in a high-dimensional space, using features as a novel source of supervision instead of relevance labels. (3) Finally, to manage uncertainties in multimodal data management for open-world environments, we characterized novelties in multimodal information retrieval based on data drift. Moreover, we proposed a novelty detection and adaptation technique as an augmentation to WeSJem.

The effectiveness of the proposed frameworks, models, and algorithms was demonstrated through real-world system prototypes that solved open problems requiring large-scale human endeavors and computational resources. Specifically, these prototypes assisted law enforcement officers in automating investigations and finding missing persons.
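As an illustration of the kind of embedding-based, zero-shot similarity matching described for WeSJem, the sketch below ranks corpus items by cosine similarity to a query embedding. The encoders, vectors, and item names are stand-ins (random NumPy arrays), not the dissertation's actual features, models, or data.

```python
# Minimal sketch of embedding-based similarity matching: rank stored items by
# cosine similarity to a query vector. Embeddings here are random placeholders.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def rank_candidates(query_vec, corpus):
    """Return corpus items sorted by cosine similarity to the query embedding."""
    return sorted(corpus.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    corpus = {f"item_{i}": rng.normal(size=64) for i in range(5)}  # stand-in embeddings
    query = rng.normal(size=64)                                    # stand-in query embedding
    for name, vec in rank_candidates(query, corpus)[:3]:
        print(name, round(cosine(query, vec), 3))
```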
125

DIGITAL TWIN: FACTORY DISCRETE EVENT SIMULATION

Zachary Brooks Smith (7659032) 04 November 2019 (has links)
Industrial revolutions bring dynamic change to industry through major technological advances (Freeman & Louca, 2002). People and companies must take advantage of industrial revolutions in order to reap their benefits (Bruland & Smith, 2013). Currently, in the 4th industrial revolution, industry is transforming advanced manufacturing and engineering capabilities through digital transformation. Company X's production system was investigated in this research. Detailed evaluation of the production process revealed bottlenecks and inefficiency (Melton, 2005). Using a Digital Twin and Discrete Event Factory Simulation, the researcher gathered factory and production input data to simulate the process and provide a system-level, holistic view of Company X's production system, showing how factory simulation enables process improvement. The National Academy of Engineering supports Discrete Event Factory Simulation as advancing Personalized Learning through its ability to meet the unique problem-solving needs of engineering and manufacturing processes through advanced simulation technology (National Academy of Engineering, 2018). The directed project applied two process optimization experiments to the production system through the simulation tool 3DExperience with the DELMIA application from Dassault Systèmes (Dassault, 2018). The experiments resulted in a 10% improvement in production time and a 10% reduction in labor costs due to the optimization.
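For readers unfamiliar with discrete event simulation, the sketch below runs a toy two-station production line on a priority queue of timed events. The arrival interval, processing times, and line layout are invented for the example and are unrelated to Company X's data; the study itself used the DELMIA tooling rather than hand-written code.

```python
# Toy discrete event simulation of a two-station line: parts arrive at station 1,
# then flow to station 2; each station serves one part at a time (FIFO).
import heapq
import random

def simulate(n_parts=20, mean_t1=5.0, mean_t2=7.0, seed=1):
    random.seed(seed)
    events = []                                   # (time, part_id, station)
    for i in range(n_parts):
        heapq.heappush(events, (i * 2.0, i, 1))   # a part arrives every 2 time units
    free_at = {1: 0.0, 2: 0.0}                    # when each station is next free
    finish = {}
    while events:
        t, part, station = heapq.heappop(events)
        start = max(t, free_at[station])          # wait if the station is busy
        service = random.expovariate(1.0 / (mean_t1 if station == 1 else mean_t2))
        done = start + service
        free_at[station] = done
        if station == 1:
            heapq.heappush(events, (done, part, 2))  # part moves on to station 2
        else:
            finish[part] = done
    return max(finish.values())                   # makespan of the whole batch

if __name__ == "__main__":
    print("makespan:", round(simulate(), 2))      # station 2 is the bottleneck here
```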
126

TOWARDS A TRANSDISCIPLINARY CYBER FORENSICS GEO-CONTEXTUALIZATION FRAMEWORK

Mohammad Meraj Mirza (16635918) 04 August 2023 (has links)
Technological advances have a profound impact on people and the world in which they live. People use a wide range of smart devices, such as the Internet of Things (IoT), smartphones, and wearable devices, on a regular basis, all of which store and use location data. With this explosion of technology, these devices have come to play an essential role in digital forensics and crime investigations. Digital forensic professionals have become more able to acquire and assess various types of data and locations; therefore, location data has become essential for responders, practitioners, and digital investigators dealing with digital forensic cases that rely heavily on digital devices that collect data about their users. When performing any digital/cyber forensic investigation, it is highly beneficial, and often critical, to consider answering the six Ws (who, what, when, where, why, and how) using location data recovered from digital devices, such as where the suspect was at the time of the crime or the deviant act. Such data could therefore help convict a suspect or prove their innocence. However, many digital forensic standards, guidelines, tools, and even the National Institute of Standards and Technology (NIST) Cyber Security Personnel Framework (NICE) lack full coverage of what location data can be, how to use such data effectively, and how to perform spatial analysis. Although current digital forensic frameworks recognize the importance of location data, only a limited number of data sources (e.g., GPS) are considered sources of location in these frameworks. Moreover, most digital forensic frameworks and tools have yet to introduce geo-contextualization techniques and spatial analysis into the digital forensic process, which could aid digital forensic investigations and provide more information for decision-making. As a result, significant gaps remain in the digital forensics community, driven by a lack of understanding of how to properly curate geodata. Therefore, this research was conducted to develop a transdisciplinary framework that addresses the limitations of previous work and explores opportunities for dealing with geodata recovered from digital evidence, improving how geodata are maintained and how the most value can be drawn from them, using an iPhone case study. The findings of this study demonstrated the potential value of geodata in digital forensic investigations when using the created transdisciplinary framework. Moreover, the findings discuss the implications for digital spatial analytical techniques and multi-intelligence domains, including location intelligence and open-source intelligence, that aid investigators and generate an exceptional understanding of device users' spatial, temporal, and spatial-temporal patterns.
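To make one basic geo-contextualization step concrete, the sketch below computes great-circle (haversine) distances between location records recovered from a device and a place of interest, and flags records within a given radius. The coordinates, timestamps, and radius are invented for the example and do not come from the thesis's iPhone case study.

```python
# Illustrative spatial-analysis step: flag recovered device locations that fall
# within a radius of a place of interest (POI). All values are made up.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def within_radius(records, poi, radius_km=0.5):
    """Keep (timestamp, lat, lon) records that fall within radius_km of the POI."""
    return [r for r in records if haversine_km(r[1], r[2], poi[0], poi[1]) <= radius_km]

if __name__ == "__main__":
    device_points = [("2023-01-05T10:12", 40.4237, -86.9212),
                     ("2023-01-05T11:40", 40.4259, -86.9081)]
    print(within_radius(device_points, poi=(40.4240, -86.9210)))
```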
127

PLANT LEVEL IIOT BASED ENERGY MANAGEMENT FRAMEWORK

Liya Elizabeth Koshy (14700307) 31 May 2023 (has links)
The Energy Monitoring Framework, designed and developed by IAC, IUPUI, aims to provide a cloud-based solution that combines business analytics with sensors for real-time energy management at the plant level using wireless sensor network technology.

The project provides a platform where users can analyze the functioning of a plant using sensor data. The data also helps users explore energy usage trends and identify any energy leaks due to malfunctions or other environmental factors in their plant. Additionally, users can check the status of the machinery in their plant and control the equipment remotely.

The main objectives of the project include the following:
- Set up a wireless network using sensors and smart implants with a base station/controller.
- Deploy and connect the smart implants and sensors with the equipment in the plant that needs to be analyzed or controlled to improve its energy efficiency.
- Set up a generalized interface to collect and process the sensor data values and store the data in a database.
- Design and develop a generic database compatible with various companies irrespective of their type and size.
- Design and develop a web application with a generalized structure, so that the database can be deployed at multiple companies with minimal customization. The web app should provide users with a platform to interact with the data, analyze the sensor readings, and initiate commands to control the equipment.

The general structure of the project comprises the following components:
- A wireless sensor network with a base station.
- An edge PC that interfaces with the sensor network to collect the sensor data and send it out to the cloud server; it also interfaces with the sensor network to send out command signals to control the switches/actuators.
- A cloud that hosts a database and an API to collect and store information.
- A web application hosted in the cloud to provide an interactive platform for users to analyze the data.

The project was demonstrated in:
- Lecture Hall (https://iac-lecture-hall.engr.iupui.edu/LectureHallFlask/).
- Test Bed (https://iac-testbed.engr.iupui.edu/testbedflask/).
- A company in Indiana.

The above examples used sensors such as current sensors, temperature sensors, carbon dioxide sensors, and pressure sensors to set up the sensor network. The equipment was controlled using switch nodes compatible with the chosen sensor network protocol. The energy consumption of each piece of equipment was measured over several days. The data was validated, and the system worked as expected, helping the user to monitor, analyze, and control the connected equipment remotely.
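As a rough sketch of the edge-PC role in the architecture above, the code below packages a sensor reading as JSON and could post it to a cloud API. The endpoint URL, field names, and reading values are hypothetical placeholders, not the actual API of the IAC deployment.

```python
# Minimal sketch of an edge gateway sending a sensor reading to a cloud API.
# The URL and payload schema are invented for illustration only.
import json
import time
import urllib.request

API_URL = "https://example-cloud.invalid/api/readings"  # placeholder endpoint

def build_payload(plant_id, sensor_id, value, unit):
    """Package one reading with a timestamp for the cloud database."""
    return {"plant": plant_id, "sensor": sensor_id, "value": value,
            "unit": unit, "ts": time.time()}

def post_reading(payload, url=API_URL):
    """POST the JSON payload to the cloud API and return the HTTP status code."""
    req = urllib.request.Request(url, data=json.dumps(payload).encode(),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    reading = build_payload("lecture-hall", "ct-01", 12.7, "A")
    print(json.dumps(reading))  # dry run; call post_reading(reading) against a real endpoint
```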
128

EXPLORING GRAPH NEURAL NETWORKS FOR CLUSTERING AND CLASSIFICATION

Fattah Muhammad Tahabi (14160375) 03 February 2023 (has links)
Graph Neural Networks (GNNs) have become exceedingly popular and prominent deep learning techniques for analyzing structured graph data, owing to their ability to solve complex real-world problems. Because graphs provide an efficient way to represent abstract concepts, modern research uses them to overcome a limitation of classical graph algorithms, which require prior knowledge of the graph structure before they can be applied. GNNs, an impressive framework for representation learning on graphs, have already produced many state-of-the-art techniques for node classification, link prediction, and graph classification tasks. GNNs can learn meaningful representations of graphs by combining topological structure, node attributes, and neighborhood aggregation to solve supervised, semi-supervised, and unsupervised graph-based problems. In this study, the usefulness of GNNs has been analyzed primarily from two aspects: clustering and classification. We focus on these two techniques, as they are the most popular strategies in data mining for making sense of collected data and for predictive analysis.
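To make the neighborhood-aggregation idea concrete, the toy sketch below applies one GCN-style propagation step (normalized adjacency with self-loops, multiplied by node features and a weight matrix, then ReLU) to a small random graph. The graph, features, and weights are arbitrary and are not drawn from the study.

```python
# Toy sketch of GCN-style neighborhood aggregation: each node's new representation
# mixes its own features with its neighbours' via a normalized adjacency matrix.
import numpy as np

def gcn_layer(adj, features, weight):
    """One propagation step: relu(D^-1/2 (A + I) D^-1/2 X W)."""
    a_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ features @ weight, 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    adj = np.array([[0, 1, 0, 0],                      # a 4-node example graph
                    [1, 0, 1, 1],
                    [0, 1, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
    x = rng.normal(size=(4, 3))                        # node features
    w = rng.normal(size=(3, 2))                        # layer weights
    print(gcn_layer(adj, x, w))                        # 4 nodes x 2 output features
```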
