  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

Building Energy Modeling: A Data-Driven Approach

January 2016
abstract: Buildings consume nearly 50% of the total energy in the United States, which drives the need to develop high-fidelity models for building energy systems. Extensive methods and techniques have been developed, studied, and applied to building energy simulation and forecasting, but most of this work has focused on dedicated modeling approaches for generic buildings. In this study, an integrated, computationally efficient, high-fidelity building energy modeling framework is proposed, with a focus on developing a generalized modeling approach for various types of buildings. First, a number of data-driven simulation models are reviewed and assessed on various types of computationally expensive simulation problems. Motivated by the conclusion that no model outperforms the others when amortized over diverse problems, a meta-learning-based recommendation system for data-driven simulation modeling is proposed. To test the feasibility of the proposed framework on building energy systems, an extended application of the recommendation system for short-term building energy forecasting is deployed on various buildings. Finally, a Kalman filter-based data fusion technique is incorporated into the recommendation system for on-line energy forecasting. Data fusion enables model calibration to update the state estimate in real time, which filters out noise and yields more accurate energy forecasts. The framework is composed of two modules: an off-line model recommendation module and an on-line model calibration module. Specifically, the off-line model recommendation module includes six widely used data-driven simulation models, which are ranked by the meta-learning recommendation system for off-line energy modeling of a given building scenario. Only a selective set of building physical and operational characteristic features is needed to complete the recommendation task.
The on-line calibration module effectively addresses system uncertainties by applying data fusion to the off-line model based on system identification and Kalman filtering methods. The developed data-driven modeling framework is validated on various genres of buildings, and the experimental results demonstrate the desired performance for building energy forecasting in terms of both accuracy and computational efficiency. The framework could easily be implemented in building energy model predictive control (MPC), demand response (DR) analysis, and real-time operation decision support systems. / Dissertation/Thesis / Doctoral Dissertation Industrial Engineering 2016
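The on-line calibration described in this abstract rests on a standard Kalman predict/update cycle. The sketch below is only a rough illustration, not the dissertation's actual model: it fuses a deliberately biased scalar load forecast with noisy measurements, and the random-walk state model, noise variances, and load values are all invented for the example.

```python
import random

def kalman_update(x_est, p_est, z, q=0.01, r=0.25):
    """One predict/update cycle of a scalar Kalman filter.
    x_est: prior state estimate (e.g. forecast energy load),
    p_est: prior estimate variance, z: noisy measurement,
    q: process noise variance, r: measurement noise variance."""
    # Predict: a random-walk state model keeps the estimate but grows uncertainty
    p_pred = p_est + q
    # Update: blend prediction and measurement according to the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_est + k * (z - x_est)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Fuse a deliberately biased off-line forecast with noisy on-line readings
random.seed(0)
true_load = 100.0
x, p = 90.0, 1.0  # off-line forecast starts 10% low
for _ in range(50):
    z = true_load + random.gauss(0.0, 0.5)
    x, p = kalman_update(x, p, z)
# x is now close to the true load; p reflects the remaining uncertainty
```

In the dissertation's setting the state would come from the recommended off-line building model rather than a scalar random walk, but the filtering-out of measurement noise works the same way.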
112

Run-to-run modelling and control of batch processes

Duran Villalobos, Carlos Alberto January 2016
The University of Manchester, Carlos Alberto Duran Villalobos, Doctor of Philosophy in the Faculty of Engineering and Physical Sciences, December 2015. This thesis presents an innovative batch-to-batch optimisation technique that was able to improve the productivity of two benchmark fed-batch fermentation simulators: Saccharomyces cerevisiae and penicillin production. In developing the proposed technique, several important challenges needed to be addressed. For example, the technique relied on a linear Multiway Partial Least Squares (MPLS) model that could adapt from one operating region to another as productivity increased, in order to estimate the end-point quality of each batch accurately. The proposed optimisation technique uses a Quadratic Programming (QP) formulation to calculate the Manipulated Variable Trajectory (MVT) from one batch to the next. The main advantage of the proposed technique over previously published approaches was the increase in yield and the reduced number of batches needed to converge to an optimal MVT. Validity constraints were also included in the batch-to-batch optimisation to restrict the QP calculations to the space described by useful predictions of the MPLS model. The results from experiments on the two simulators showed that the validity constraints slowed the rate of convergence of the optimisation technique and in some cases resulted in a slight reduction in final yield. However, the introduction of the validity constraints did improve the consistency of the batch optimisation. Another important contribution of this thesis was a series of experiments combining a variety of smoothing techniques used in MPLS modelling with the proposed batch-to-batch optimisation technique. From the results of these experiments, it was clear that the MPLS model prediction accuracy did not significantly improve with these smoothing techniques.
However, the batch-to-batch optimisation technique did show improvements when filtering was implemented.
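The batch-to-batch idea can be loosely illustrated without the thesis's MPLS/QP machinery. In the sketch below, a one-component per-input regression stands in for the MPLS quality model, a clipped gradient step stands in for the constrained QP, and the "fermentation" is an invented toy function whose yield peaks when every manipulated input equals 0.8.

```python
import random

def fit_linear(U, y):
    # Per-input least-squares slopes on mean-centred data:
    # a simplified stand-in for a multiway PLS quality model.
    n, m = len(U), len(U[0])
    u_mean = [sum(U[i][j] for i in range(n)) / n for j in range(m)]
    y_mean = sum(y) / n
    w = []
    for j in range(m):
        num = sum((U[i][j] - u_mean[j]) * (y[i] - y_mean) for i in range(n))
        den = sum((U[i][j] - u_mean[j]) ** 2 for i in range(n)) or 1.0
        w.append(num / den)
    return w

def next_mvt(u, w, step=0.1, lo=0.0, hi=1.0):
    # Clipped gradient step toward higher predicted yield:
    # a stand-in for the QP with input (validity) constraints.
    return [min(hi, max(lo, uj + step * wj)) for uj, wj in zip(u, w)]

# Toy "fermentation": end-point yield peaks when every input is 0.8
random.seed(1)
def run_batch(u):
    return -sum((x - 0.8) ** 2 for x in u) + random.gauss(0.0, 0.001)

U = [[random.random() for _ in range(3)] for _ in range(8)]  # initial batches
y = [run_batch(u) for u in U]
u = U[0]
for _ in range(30):  # batch-to-batch loop: fit model, step, run, append
    u = next_mvt(u, fit_linear(U, y))
    U.append(u)
    y.append(run_batch(u))
```

Each completed batch enlarges the data set the quality model is refitted on, which is what lets the optimiser adapt as it moves into new operating regions.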
113

Síntese de controladores ressonantes baseado em dados aplicado a fontes ininterruptas de energia [Data-driven synthesis of resonant controllers applied to uninterruptible power supplies]

Schildt, Alessandro Nakoneczny January 2014
This work addresses a controller tuning method based on data obtained from the plant. The proposal is to tune resonant controllers for application to the frequency inverters found in uninterruptible power supplies, with the goal of tracking a sinusoidal voltage reference. Within this context, the Virtual Reference Feedback Tuning (VRFT) algorithm is used: a data-driven controller identification method that is not iterative and does not require a system model to identify the controller. From data obtained from the plant, together with a reference model defined by the designer, the method estimates the parameters of a previously fixed controller structure by minimizing a cost function defined by the error between the desired and actual outputs. In addition, a current feedback is required in the control loop, whose proportional gain is set by empirical experiment. To demonstrate the method, simulated and practical results are presented for a 5 kVA uninterruptible power supply under linear and nonlinear loads. Performance is evaluated in terms of the quality of the actual output signal obtained with controllers tuned from different reference models, and distinct excitation signals are also used to feed the VRFT algorithm. The experimental results were obtained on a single-phase inverter with a real-time platform based on the dSPACE DS1104 data acquisition board. The results show that, with respect to international standards, the proposed control system tracks the reference well when operating at no load or with a linear load.
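The one-shot, non-iterative character of VRFT can be shown on a toy discrete-time example. The sketch below is only illustrative, using a first-order plant and a PI controller rather than the thesis's resonant controller and inverter: it builds the virtual reference by inverting the reference model, forms the virtual error, and fits the controller gains by least squares, all from a single open-loop data record.

```python
def vrft_fit(u, y, m):
    """One-shot VRFT: fit a PI controller u = kp*e + ki*sum(e)
    from open-loop data (u, y) and a reference-model pole m."""
    # Virtual reference from inverting y_d(t+1) = m*y_d(t) + (1-m)*r(t)
    r = [(y[t + 1] - m * y[t]) / (1.0 - m) for t in range(len(y) - 1)]
    # Virtual error the controller should have seen
    e = [r[t] - y[t] for t in range(len(r))]
    s, acc = [], 0.0
    for et in e:  # running sum of the virtual error (integral term)
        acc += et
        s.append(acc)
    # Least squares for theta = (kp, ki) via the 2x2 normal equations
    a11 = sum(et * et for et in e)
    a12 = sum(et * st for et, st in zip(e, s))
    a22 = sum(st * st for st in s)
    b1 = sum(et * ut for et, ut in zip(e, u))
    b2 = sum(st * ut for st, ut in zip(s, u))
    det = a11 * a22 - a12 * a12
    kp = (b1 * a22 - b2 * a12) / det
    ki = (a11 * b2 - a12 * b1) / det
    return kp, ki

# Open-loop experiment on a toy first-order plant y+ = 0.9*y + 0.2*u
u_data = [1.0 if (t // 20) % 2 == 0 else -1.0 for t in range(200)]
y_data = [0.0]
for t in range(199):
    y_data.append(0.9 * y_data[-1] + 0.2 * u_data[t])
kp, ki = vrft_fit(u_data, y_data, m=0.6)
```

For this plant and reference pole m = 0.6 the ideal controller happens to lie inside the PI class, so the least-squares fit recovers kp = 1.8 and ki = 0.2 exactly; with a real inverter the controller class would be the resonant structure and the fit only approximate.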
114

Curious Cuisine : Bringing culinary creativity home

Nacsa, Júlia January 2016
How could culinary science and technology educate us about food through engagement and reflection? In this project, I set out to uncover opportunities for design intervention within near-future scenarios of cooking and eating in a home environment. My intent has been to use interaction design methodology to form social practices that make the process of making and eating food more pleasurable and inspiring, while developing one's individual knowledge, without being didactic or prescriptive. The hypothesis has been that simplified culinary science, combined with today's data-driven technologies, has the potential to foster creativity and experimentation among hobby cooks. The aim has been to discover the consequences of cloud data and connected technologies for experimentation, which is inherently driven by human intuition. My approach has been to explore what behaviors data-driven systems designed to elicit creativity could possess, and what kind of inspiration the science of flavor could bring to everyday cooking. The result is a set of design principles for how creative cooking explorations can be fostered through tangible and embodied experiences. It is manifested in a concept that creates a 'culinary safe zone' by encouraging experimentation and presenting information on demand, without overshadowing the cook's intuition. The concept, Curious Cuisine, allows non-professional cooks to create their own unique dishes: to explore ingredient pairings and preparation techniques, and to fine-tune flavors.
115

Automated Data-Driven Hint Generation for Learning Programming

Rivers, Kelly 01 July 2017
Feedback is an essential component of the learning process, but in fields like computer science, which have rapidly increasing class sizes, it can be difficult to provide feedback to students at scale. Intelligent tutoring systems can provide personalized feedback to students automatically, but they can take large amounts of time and expert knowledge to build, especially when determining how to give students hints. Data-driven approaches can provide personalized next-step hints automatically and at scale by mining previous students' solutions. I have created ITAP, the Intelligent Teaching Assistant for Programming, which automatically generates next-step hints for students in basic Python programming assignments. ITAP is composed of three stages: canonicalization, where a student's code is transformed into an abstracted representation; path construction, where the closest correct state is identified and a series of edits toward that goal state is generated; and reification, where the edits are transformed back into the student's original context. With these techniques, ITAP can generate next-step hints for 100% of student submissions, and can even chain these hints together to generate a worked example. Initial analysis showed that hints could be used in practice problems in a real classroom environment, but also demonstrated that students' relationships with hints and help-seeking were complex and required deeper investigation. In my thesis work, I surveyed and interviewed students about their experience with help-seeking and using feedback, and found that students wanted more detail in hints than was initially provided. To determine how hints should be structured, I ran a usability study with programmers at varying levels of knowledge, where I found that more novice students needed much higher levels of content and detail in hints than was traditionally given.
I also found that examples were commonly used in the learning process and could serve an integral role in the feedback provision process. I then ran a randomized controlled trial to determine the effect of next-step hints on learning and time-on-task in a practice session, and found that having hints available resulted in students spending 13.7% less time during practice while achieving the same learning results as the control group. Finally, I used the data collected during these experiments to measure ITAP's performance over time, and found that generated hints improved as data was added to the system. My dissertation has contributed to the fields of computer science education, learning science, human-computer interaction, and data-driven tutoring. In computer science education, I have created ITAP, which can serve as a practice resource for future programming students. In the learning sciences, I have replicated the expertise reversal effect by finding that more expert programmers want less detail in hints than novice programmers; this finding is important because it implies that programming teachers may provide novices with less assistance than they need. I have contributed to the literature on human-computer interaction by identifying multiple possible representations of hint messages and analyzing how users react to and learn from these different formats during program debugging. Finally, I have contributed to the new field of data-driven tutoring by establishing that it is possible to always provide students with next-step hints, even without a starting dataset beyond the instructor's solution, and by demonstrating that those hints can be improved automatically over time.
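The find-closest-correct-state idea behind next-step hints can be caricatured in a few lines. The sketch below is a much simpler stand-in for ITAP's AST-based canonicalization and path construction: it picks the most similar known-correct solution by string similarity and reports the first differing line as a hint. The example programs are invented.

```python
import difflib

def next_step_hint(student, correct_solutions):
    """Pick the closest known-correct program (by similarity ratio),
    then report the first differing line as a next-step hint."""
    goal = max(correct_solutions,
               key=lambda s: difflib.SequenceMatcher(None, student, s).ratio())
    s_lines, g_lines = student.splitlines(), goal.splitlines()
    # Walk the line-level diff; the first non-equal opcode is the next step
    for op, i1, i2, j1, j2 in difflib.SequenceMatcher(
            None, s_lines, g_lines).get_opcodes():
        if op == "replace":
            return f"Change line {i1 + 1} to: {g_lines[j1]!r}"
        if op == "delete":
            return f"Remove line {i1 + 1}"
        if op == "insert":
            return f"After line {i1}, add: {g_lines[j1]!r}"
    return "Your code already matches a correct solution."

# Two known-correct solutions mined from (hypothetical) prior students
solutions = [
    "def double(x):\n    return 2 * x",
    "def double(x):\n    y = x + x\n    return y",
]
hint = next_step_hint("def double(x):\n    return x", solutions)
```

Chaining such hints until the student state matches the goal state is what produces a worked example; ITAP does this on canonicalized ASTs rather than raw lines, which makes the matching robust to superficial differences.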
116

Supporting product development with a tangible platform for simulating user scenarios

Ruvald, Ryan January 2017
Motivation: Today's sustainability challenges are increasingly being addressed by product-service systems that satisfy customers' needs while lowering their overall environmental impact. These systems are increasingly complex, containing diverse artifacts and interactions. To provide a holistic solution centered on the human experience, the design of product-service systems is best driven by data gathered through design thinking methods. Problem: In innovation challenges such as the deployment of autonomous electric machines on future construction sites, data-driven design can suffer from a lack of tangible user feedback upon which to base design decisions. Approach: In this study, a scaled-down construction site structured around generally applicable operations was built as a prototype for involving various users in the early-phase development of an HMI for interacting with prototype machines built by Volvo CE, called the HX01. Qualitative data acquisition methods were derived from design thinking approaches to needfinding, including a questionnaire, unstructured interviews, and observations. Results: The prototype site became a 5 m x 5 m semi-portable site with 1:11-scale machines, including excavators, wheeled loaders, and autonomous haulers. The product tested on the site was an augmented reality interface providing a communication platform between workers and the autonomous haulers, aimed at building trust to enable collaboration. Test users and observers provided feedback confirming the effectiveness of the scale-site scenario in conveying the necessary context of a realistic interaction experience. Beyond HMI testing, the site served as a tangible artifact to instigate conversations across domain boundaries.
Conclusions: The tangible experiential scenario platform developed here displayed the capability to go beyond one-way communication of concepts to customers, by including customers as integral participants in the testing of new products and services. For design teams, the site can facilitate deeper learning and validation via a shared contextualization of user feedback. Further implications may include the ability to strengthen risk assessment at design decision gates for new products and to enable the identification of emergent issues in complex future scenarios.
117

Dynamic Data-Driven Visual Surveillance of Human Crowds via Cooperative Unmanned Vehicles

Minaeian, Sara January 2017
Visual surveillance of human crowds in dynamic environments has attracted a great deal of computer vision research in recent years. Moving object detection, which conventionally includes motion segmentation and, optionally, object classification, is the first major task for any visual surveillance application. After detecting the targets, their geo-locations must be estimated to place them in a common reference coordinate system for higher-level decision-making. Depending on the required fidelity of decision, multi-target data association may also be needed at higher levels to differentiate multiple targets across a series of frames. Applying these vision-based algorithms to a crowd surveillance system (the major application studied in this dissertation) using a team of cooperative unmanned vehicles (UVs) introduces new challenges. Since the visual sensors move with the UVs, and thus the targets and the environment are dynamic, the complexity and uncertainty of the video processing increase. Moreover, the limited onboard computation resources demand more efficient algorithms. Responding to these challenges, the goal of this dissertation is to design and develop an effective and efficient visual surveillance system based on the dynamic data-driven application systems (DDDAS) paradigm, to be used by cooperative UVs for autonomous crowd control and border patrol.
The proposed visual surveillance system includes several modules: 1) a motion detection module, in which a new sliding-window-based method for detecting multiple moving objects is proposed to segment the moving foreground using the moving camera onboard the unmanned aerial vehicle (UAV); 2) a target recognition module, in which a customized method based on histograms of oriented gradients (HOG) is applied to classify human targets using the onboard camera of the unmanned ground vehicle (UGV); 3) a target geo-localization module, in which a new moving-landmark-based method is proposed for estimating the geo-location of the detected crowd from the UAV, while a heuristic method based on triangulation is applied for geo-locating detected individuals via the UGV; and 4) a multi-target data association module, in which the affinity score is dynamically adjusted to comply with the changing dispersion of the detected targets over successive frames. In this dissertation, a cooperative team of one UAV and multiple UGVs with onboard visual sensors is used to exploit the complementary characteristics (e.g. different fidelities and view perspectives) of these UVs for crowd surveillance. The DDDAS paradigm is applied across these vision-based modules, unifying the computational and instrumentation aspects of the application system for more accurate or efficient analysis according to the scenario. To illustrate and demonstrate the proposed system, aerial and ground video sequences from the UVs, as well as simulation models, are developed, and experiments are conducted with them. The experimental results on both the developed videos and literature datasets reveal the effectiveness and efficiency of the proposed modules and their promising performance in the considered crowd surveillance application.
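At its simplest, motion segmentation between two frames reduces to thresholded frame differencing. The sketch below is a deliberately minimal stand-in for the dissertation's sliding-window method (which must also compensate for the moving camera): it marks changed pixels between two grayscale frames and draws a bounding box around them. The frames and threshold are invented.

```python
def motion_mask(prev, curr, thresh=20):
    """Mark pixels whose intensity changed by more than `thresh`
    between two grayscale frames (given as lists of rows)."""
    return [[1 if abs(a - b) > thresh else 0 for a, b in zip(rp, rc)]
            for rp, rc in zip(prev, curr)]

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box covering all moving pixels."""
    pts = [(i, j) for i, row in enumerate(mask) for j, v in enumerate(row) if v]
    if not pts:
        return None
    rows = [p[0] for p in pts]
    cols = [p[1] for p in pts]
    return min(rows), min(cols), max(rows), max(cols)

# A 6x6 frame in which a bright 2x2 "target" moved one pixel to the right
prev = [[0] * 6 for _ in range(6)]
curr = [[0] * 6 for _ in range(6)]
for i in (2, 3):
    for j in (1, 2):
        prev[i][j] = 200
    for j in (2, 3):
        curr[i][j] = 200
box = bounding_box(motion_mask(prev, curr))
```

With a camera mounted on a moving UAV the frames must first be registered to a common view before differencing, which is part of what makes the dissertation's problem harder than this static-camera toy.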
118

The use of data in social media marketing : An explorative study of data insights in social media marketing

Grönlund, Sophie, Schytt, Tommy January 2017
Marketing possibilities on the Internet are growing, and so is social media marketing. The budget devoted to marketing activities on social media increases every year, as does the time users spend on social media. With this increasing activity comes a vast amount of data, which creates endless opportunities for companies to optimize their marketing activities. In marketing, the most important thing has always been to know your customers and how to reach them. The Internet, and the data that comes with it, has made it possible for companies to get to know their customers even better and to reach them with more precision, if the data is correctly used. A gap was identified in the literature search: it is not always clear how to utilize social media for marketing, and it is not easy to analyze and interpret the data derived from social media. This has led to a lack of knowledge about how data can be used for social media activities. From the identified gap regarding data usage in social media marketing, a research question was formulated: "How is data used in brands' strategies for social media?" A qualitative research design with semi-structured interviews was used to examine the research question. A purposeful sample of eleven respondents, defined as experts within the research field, from ten different companies was selected. A pilot study was carried out to gain insight into the identified gap, to set a base for the theoretical framework, and to optimize the interview questions. All respondents represented agencies except for the respondent in the pilot study. Academics and business communities are interested in how data is used for marketing purposes, and therefore this thesis elaborates on how data can be used in social media activities. Branding activities are becoming more engaged with customers, so marketers need to keep up to date with new and emerging trends.
Furthermore, the aim was to explore how data is used in social media marketing and how data affects decisions in social media strategies. The results of this study show that data is used to define audiences on social media and to enable a greater reach of messages to those audiences. The audience is defined by data analysis, mostly based on consumer behavior on social media. To achieve reach, marketers use programmatic buying tools, which are based on data and ultimately enable conversions among the audience. Data is also analyzed through opinion mining, where data insights can show what topics customers are engaged in. Data insights can further give direction on how content can encourage engagement among the targeted audience. Lastly, the results show that it is important to know how to analyze, interpret, and use data insights in order to create successful social media activities.
119

Channel attribution modelling using clickstream data from an online store

Neville, Kevin January 2017
In marketing, user behaviour is analysed to discover which channels (for instance TV, social media, etc.) are important for increasing a user's intention to buy a product. The search for better channel attribution models than the common last-click model is a major concern for the marketing industry. In this thesis, a probabilistic model for channel attribution has been developed, and this model is demonstrated to be more data-driven than the conventional last-click model. The modelling includes an attempt to incorporate the time aspect, which has not been done in previous research. Our model is based on studying different sequence lengths and computing conditional probabilities of conversion using logistic regression models. A clickstream dataset from an online store was analysed using the proposed model. This thesis provides evidence that the last-click model is not optimal for conducting these kinds of analyses.
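The core idea of regressing conversion probability on channel exposure can be sketched in plain Python. This is not the thesis's model (which conditions on sequence length and the time aspect); it is a generic logistic regression fitted by gradient descent on invented clickstream touch counts, showing how fitted weights attribute conversions to channels instead of crediting only the last click.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain gradient-descent logistic regression:
    models P(conversion | channel touch counts)."""
    w = [0.0] * len(X[0])
    b = 0.0
    n = len(X)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# Toy clickstreams: columns = touches on (TV, social, search);
# conversions here are driven mainly by "search" touches.
random.seed(2)
X, y = [], []
for _ in range(400):
    x = [random.randint(0, 2) for _ in range(3)]
    p = sigmoid(-2.0 + 0.2 * x[0] + 0.3 * x[1] + 2.0 * x[2])
    X.append(x)
    y.append(1 if random.random() < p else 0)
w, b = fit_logistic(X, y)
```

The fitted weight vector w acts as a data-driven attribution: the "search" channel receives by far the largest weight here, whereas a last-click rule would credit whichever channel happened to be touched last.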
120

Optimal design and operation of heat exchanger network

Salihu, Adamu Girei January 2015
Heat exchanger networks (HENs) are the backbone of heat integration owing to their role in energy and environmental management. This thesis deals with two issues concerning HENs. The first concerns the design of an economically optimal heat exchanger network (HEN), whereas the second focuses on the optimal operation of a HEN in the presence of uncertainties and disturbances within the network. For the first issue, a pinch-technology-based optimal HEN design is first implemented on a 3-stream heat recovery case study to design a simple HEN; then a more complex HEN is designed for a coal-fired power plant retrofitted with a CO2 capture unit, with the objective of minimising the energy penalty imposed on the power plant by its integration with the CO2 capture plant. The benchmark in this case study is the stream data from Khalilpour and Abbas (2011). Improvements over their work include: (1) the use of economic data to evaluate achievable trade-offs between energy, capital, and utility cost in determining the minimum temperature difference; (2) redesign of the HEN based on the new minimum temperature difference; and (3) comparison with the base case design. The results show that the energy burden imposed on the power plant with CO2 capture is significantly reduced through the HEN, maximising utility cost savings. The cost of adding the HEN is recoverable within a short payback period of about 2.8 years. The second issue addresses optimal HEN operation under a range of uncertainties and disturbances in flowrates and inlet stream temperatures, minimising utility consumption at constant target temperatures through a self-optimizing control (SOC) strategy. The new SOC method developed in this thesis is a data-driven SOC method which uses process data collected over time during plant operation to select control variables (CVs).
This contrasts with existing SOC strategies, in which CV selection requires the process model to be linearized for nonlinear processes, leading to unaccounted losses due to linearization errors. The new approach selects CVs such that the necessary conditions of optimality (NCO) are directly approximated by the CVs through a single regression step. This work was inspired by the regression-based globally optimal CV selection of Ye et al. (2013), which requires no model linearization, and the two-step regression-based data-driven CV selection of Ye et al. (2012), which suffers from poor optimality due to regression errors in the two-step procedure. An advantage of this work is that it does not require the evaluation of derivatives, so CVs can be evaluated even with commercial simulators such as HYSYS and UNISIM, among others. The effectiveness of the proposed method is demonstrated on the 3-stream HEN case study and on the HEN for the coal-fired power plant with a CO2 capture unit. The case studies show that the proposed methodology provides better optimal operation under uncertainties than existing model-based SOC techniques.
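The capital-versus-utility trade-off behind choosing a minimum temperature difference can be caricatured in a few lines. All coefficients below are invented for illustration: the capital term shrinks as dTmin grows (a larger driving force needs less exchanger area), while the utility term grows with it (less heat is recovered), so a simple scan finds the cost-minimising dTmin.

```python
def annual_cost(dt_min, q=1000.0, u_coeff=0.5, cap_rate=0.2,
                area_cost=30.0, util_penalty=40.0):
    """Toy annualised cost of a heat-recovery match as a function of
    dTmin (all coefficients invented)."""
    area = q / (u_coeff * dt_min)          # A = Q / (U * dT), with dT ~ dTmin
    capital = cap_rate * area_cost * area  # annualised capital charge
    utility = util_penalty * dt_min        # lost recovery grows with dTmin
    return capital + utility

# Scan candidate dTmin values (in K) and keep the cheapest
best = min(range(2, 41), key=annual_cost)
```

A real pinch study evaluates this trade-off on composite curves built from all the stream data, but the shape is the same: cost rises steeply at very small dTmin (area blows up) and climbs again at large dTmin (utility demand grows), with an economic optimum in between.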
