391

Modeling, optimization and hardware-in-loop simulation of hybrid electric vehicles

Tara, Ehsan 07 February 2013 (has links)
This thesis investigates modeling and simulation of hybrid electric vehicles with particular emphasis on transient modeling and real-time simulation. Three different computer models, i.e. a steady-state model, a fully-detailed transient model and a reduced-intensity transient model, are developed for a hybrid drive-train in this study. The steady-state model, which has low computational intensity, is used to determine the optimal battery size and chemistry for a plug-in hybrid drive-train. Simulation results using the developed steady-state model show the merits of NiMH and Li-ion battery technologies. Based on the obtained results and the falling cost of Li-ion batteries, this battery chemistry is used throughout this research. A fully-detailed transient model is developed to simulate the vehicle behaviour under different driving conditions. This model includes the dynamics of the power-train components such as the engine, the power-electronic converters and the vehicle controllers at all levels. The developed transient model produces an accurate representation of the drive-train, including the switching behaviour of the power-electronic converters. A reduced-intensity transient model (also referred to as a dynamic average model) is developed for real-time hardware-in-loop simulation of the vehicle. By reducing the computational demand of the detailed transient model using averaging techniques, the reduced-intensity model is implemented on a real-time simulator and interfaced to an external subsystem such as an actual battery. The setup can be used to test existing and emerging battery technologies, which may not have an accurate mathematical model. Extensive tests are performed to verify the accuracy and validity of the results obtained from the developed hardware-in-loop simulation setup.
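The abstract does not give the averaging equations themselves. As a rough illustration of what a dynamic average-value model does, the sketch below replaces the switching detail of a generic DC-DC (buck) converter stage with its duty-cycle-averaged equations, the kind of reduction that lets a drive-train model run at a coarse real-time step. All component values and names are hypothetical and not taken from the thesis.

```python
"""Illustrative dynamic average-value model of a DC-DC (buck) converter stage.

Hypothetical parameters; a generic averaging sketch, not the model developed in
the thesis. Replacing the switched input voltage with its duty-cycle average
removes the switching ripple, so the solver step no longer has to resolve
individual switching events.
"""

V_IN = 300.0      # DC-link voltage [V] (assumed)
L = 1.0e-3        # inductor [H]
C = 470e-6        # output capacitor [F]
R_LOAD = 5.0      # resistive load [ohm]
DT = 100e-6       # solver step [s], far coarser than a switching-level model needs

def step_average_model(i_l, v_c, duty):
    """Advance the averaged state one time step (forward Euler)."""
    di_l = (duty * V_IN - v_c) / L          # averaged inductor voltage / L
    dv_c = (i_l - v_c / R_LOAD) / C         # capacitor current / C
    return i_l + DT * di_l, v_c + DT * dv_c

if __name__ == "__main__":
    i_l, v_c = 0.0, 0.0
    for _ in range(int(0.05 / DT)):         # simulate 50 ms
        duty = 0.4                          # constant duty cycle for the demo
        i_l, v_c = step_average_model(i_l, v_c, duty)
    print(f"averaged output voltage after 50 ms: {v_c:.1f} V")
```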
392

Exploring real-time image space painterly rendering

Krazanowski, Michael P. 19 April 2011 (has links)
Artists have long used brush strokes to generate abstractions and interpretations of the world they view; however, this process is very time consuming. Many recent and diverse techniques attempt to mimic, on a computer, the process that an artist would go through to generate a painting, in an area of research affectionately nicknamed "painterly rendering". These applications emulate the effects of an artist's handiwork with dramatically less time invested in the process. I present a method to adapt painterly rendering for real-time simulations or video games, such that the images may appear to have been painted by hand. The work described in this document focuses on three problem areas posed by a real-time solution for painterly rendering: brush stroke generation, temporal coherence between frames and performance. The solution presented here addresses these three fundamental issues so that the methods can be layered on top of real-time applications using current-generation consumer hardware. / Graduate
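A common building block in image-space painterly rendering is to orient brush strokes along image edges and to jitter their positions so that grid patterns do not show; one simple way to keep strokes temporally coherent is to reuse the same jitter from frame to frame. The sketch below, a minimal stand-in rather than the method of this thesis, shows only that placement-and-orientation step on a single greyscale frame; the grid spacing, jitter range and stroke representation are arbitrary choices.

```python
"""Minimal image-space stroke placement sketch (not the thesis pipeline).

Strokes are placed on a jittered grid and oriented perpendicular to the local
intensity gradient, so they follow edges. Using a fixed random seed keeps the
jitter identical from frame to frame, which is one simple way to avoid
per-frame stroke 'popping' (temporal incoherence).
"""
import numpy as np

def place_strokes(gray, spacing=8, seed=0):
    """Return (x, y, angle) for one brush stroke per grid cell."""
    h, w = gray.shape
    # Gradients of the luminance image (central differences in the interior).
    gy, gx = np.gradient(gray.astype(float))
    rng = np.random.default_rng(seed)        # fixed seed -> frame-coherent jitter
    strokes = []
    for cy in range(spacing // 2, h, spacing):
        for cx in range(spacing // 2, w, spacing):
            jx = np.clip(cx + rng.integers(-spacing // 3, spacing // 3 + 1), 0, w - 1)
            jy = np.clip(cy + rng.integers(-spacing // 3, spacing // 3 + 1), 0, h - 1)
            # Stroke runs perpendicular to the gradient (i.e. along the edge).
            angle = np.arctan2(gy[jy, jx], gx[jy, jx]) + np.pi / 2
            strokes.append((int(jx), int(jy), float(angle)))
    return strokes

if __name__ == "__main__":
    frame = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))   # synthetic test frame
    print(len(place_strokes(frame)), "strokes placed")
```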
393

Going Real-Time in no time? : A quantitative and qualitative study of how Sweden's largest advertisers use social media and Real-Time Marketing.

Miesenberger de Morais, Daniel, Kjellström, Oscar January 2015 (has links)
Background: Our daily lives and personal interactions increasingly feature social media, and social networks in particular. Social media and its impact on society at large, on corporate as well as individual behavior, have been the focus of many research papers. Personal use of social networks is regularly mapped by researchers aiming to clarify concepts such as engagement, trust and interactivity, and by companies seeking to maximize the return on investment of their marketing efforts. The aim of our study is (i) to map how companies and/or organizations operating on the Swedish market use their company pages on social media as a way of communicating with their followers and audience, and (ii) to identify whether "Real-Time Marketing", in our own definition, is used or not. Methods: The research methods used included both quantitative and qualitative content analysis. Data were collected from the Facebook and Twitter pages of five companies operating on the Swedish market that are among Sweden's largest advertisers. The data, in total 1981 updates, were analyzed in order to map out patterns and to summarize the relevant companies' communicative activities. Subsequently, the data were evaluated qualitatively to gain a deeper understanding of these activities, including the companies' use of semiotics and of social media as a way to communicate with their audience and customers. Results: The results show that companies that are among Sweden's largest advertisers mainly use Facebook and Twitter as a means of publishing information with the intent of enhancing their own brands. Our quantitative analysis found that the second and third most frequent types of update were advertising and customer service. The qualitative analysis shows that the companies adapt their semiotics to a more informal use of language, but fail to engage in dialogue and to stimulate their audience to increased interaction with the brand. The results also demonstrate that only one update featured the technique of "Real-Time Marketing", which suggests that "Real-Time Marketing", in our definition, is scarcely used on the Swedish market. / Background: It is increasingly taken for granted that social media play a larger role, not only in our private lives but also for companies seeking to market themselves and to create dialogues with their customers and stakeholders. Corporate marketing and communication have long been studied, and today those studies continue with a focus on social media. The purpose of this thesis is to examine how Sweden's largest advertisers use their social media pages to communicate with their followers and audience, and to try to identify whether there is any occurrence of "Real-Time Marketing" according to our definition. Method: Our research method combines quantitative content analysis and qualitative text analysis. Data were collected from the Facebook and Twitter pages of five companies, which are among those that invested the most money in marketing during 2013. The coded and analysed data set amounts to 1981 updates and tweets, which are used to examine patterns and trends in the companies' communication. After the quantitative collection, a qualitative study was carried out to gain a broader understanding of, among other things, the semiotics and the way in which the companies wish to conduct a dialogue with their customers and followers.
Results: The results of the thesis show that these Swedish companies above all use Facebook and Twitter to publish and spread material that strengthens their own brands. Through our analysis we also find that the most frequent updates published concern advertising and customer service. Furthermore, we see how the companies adapt their language to a more informal register, but they often fail to open up a dialogue with their customers and thereby do not meet their own and their followers' need for increased interaction with the brands. It also turns out that only one update could be classified as "Real-Time Marketing", which leads us to conclude that this technique is used to a very limited extent on the Swedish market.
394

Correlation and real time classification of physiological streams for critical care monitoring.

Thommandram, Anirudh 01 December 2013 (has links)
This thesis presents a framework for the deployment of algorithms that support the correlation and real-time classification of physiological data streams, producing clinically meaningful alerts through a blend of domain expert knowledge and rule-based pattern recognition. Its relevance is demonstrated via a real-world case study within the context of neonatal intensive care, providing real-time classification of neonatal spells. Events are first detected in individual streams independently; then synchronized based on timestamps; and finally assessed to determine the start and end of a multi-signal episode. The episode is then processed through a classifier based on clinical rules to determine a classification. The output of the algorithms has been shown, in a single case study with 24 hours of patient data, to detect clinically significant relative changes in heart rate, blood oxygen saturation levels and pauses in breathing in the respiratory impedance signal, with accuracies of 97.8%, 98.3% and 98.9%, respectively. The accuracy for correlating the streams and determining spell classifications is 98.9%. Future research will focus on the clinical validation of these algorithms and the application of the framework to the detection and classification of signals in other clinical contexts.
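The abstract outlines a general pattern: detect events per stream, align them by timestamp, and pass the combined episode through clinical rules. The sketch below illustrates that pattern in a much-simplified form; the thresholds, merging window and classification rule are invented placeholders, not the validated clinical criteria used in the study.

```python
"""Simplified multi-stream event correlation and rule-based classification.

Placeholder thresholds and rules for illustration only; they are not the
clinical criteria used in the thesis.
"""
from dataclasses import dataclass

@dataclass
class Event:
    stream: str      # e.g. "HR", "SpO2", "RESP"
    start: float     # seconds
    end: float

def detect_events(stream, samples, predicate, fs=1.0):
    """Flag contiguous runs of samples where `predicate` holds."""
    events, run_start = [], None
    for i, value in enumerate(samples + [None]):        # sentinel closes the last run
        hit = value is not None and predicate(value)
        if hit and run_start is None:
            run_start = i / fs
        elif not hit and run_start is not None:
            events.append(Event(stream, run_start, i / fs))
            run_start = None
    return events

def merge_episodes(events, max_gap=10.0):
    """Group events from different streams whose times lie within max_gap seconds."""
    events = sorted(events, key=lambda e: e.start)
    episode, groups = [events[0]], []
    for ev in events[1:]:
        if ev.start - max(e.end for e in episode) <= max_gap:
            episode.append(ev)
        else:
            groups.append(episode)
            episode = [ev]
    groups.append(episode)
    return groups

def classify(episode):
    """Toy rule: a multi-signal episode involving all three streams is a 'spell'."""
    streams = {e.stream for e in episode}
    return "spell" if {"HR", "SpO2", "RESP"} <= streams else "isolated event"

if __name__ == "__main__":
    hr   = detect_events("HR",   [150, 150, 95, 90, 150], lambda v: v < 100)
    spo2 = detect_events("SpO2", [97, 96, 84, 83, 95],    lambda v: v < 85)
    resp = detect_events("RESP", [1, 0, 0, 0, 1],         lambda v: v == 0)
    for group in merge_episodes(hr + spo2 + resp):
        print(classify(group), [(e.stream, e.start, e.end) for e in group])
```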
395

Towards a portable and inexpensive lab-on-a-chip device for point of care applications

Olanrewaju, Ayokunle Oluwafemi 11 1900 (has links)
Ongoing work in the laboratory of Professor Chris Backhouse is aimed at developing a portable and inexpensive lab-on-a-chip instrument. A system capable of molecular biology protocols including sample preparation (SP), polymerase chain reaction (PCR), and melting curve analysis (MCA) would meet the requirements for point-of-care genetic analysis. The SP, PCR, and MCA modules were designed and tested on a standalone basis and then integrated for analysis of raw clinical samples. An automated XY stage was developed for magnetic bead-based DNA purification. In addition, an LED/CCD-based optical detection module was employed for real-time PCR and MCA. Data analysis algorithms and protocols were implemented to remove noise and interpret the data. This work culminated in proof-of-principle on-chip SP-PCR-MCA to detect β2m DNA from human buccal cells in a modular and inexpensive system. / Biomedical Engineering
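Melting curve analysis identifies a PCR product by the temperature at which its fluorescence falls most sharply, i.e. the peak of -dF/dT. The sketch below shows that standard calculation on synthetic data (light smoothing, a numerical derivative and a peak search); it is a generic illustration, not the noise-removal and interpretation algorithms implemented for the instrument described here.

```python
"""Generic melting-curve analysis: melt temperature from the peak of -dF/dT.

Synthetic data; this illustrates the standard calculation, not the specific
data-analysis pipeline developed for the instrument in the thesis.
"""
import numpy as np

def melt_temperature(temps_c, fluorescence, smooth_window=5):
    """Return the temperature at the maximum of -dF/dT after light smoothing."""
    kernel = np.ones(smooth_window) / smooth_window
    f_smooth = np.convolve(fluorescence, kernel, mode="same")   # moving average
    neg_dfdt = -np.gradient(f_smooth, temps_c)                  # -dF/dT
    return temps_c[int(np.argmax(neg_dfdt))]

if __name__ == "__main__":
    temps = np.linspace(65.0, 95.0, 301)                 # degrees Celsius
    true_tm = 84.0                                       # assumed melt temperature
    # Sigmoidal fluorescence drop around true_tm plus measurement noise.
    rng = np.random.default_rng(1)
    fluor = 1.0 / (1.0 + np.exp((temps - true_tm) / 0.8)) + rng.normal(0, 0.01, temps.size)
    print(f"estimated Tm: {melt_temperature(temps, fluor):.1f} C")
```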
396

Regulating the anterior medial prefrontal cortex : exploratory investigation of real-time fMRI training

Smith, Rachelle Marie 11 1900 (has links)
The feasibility of using real-time functional magnetic resonance imaging (fMRI) feedback on the level of activation in the rostromedial prefrontal cortex (rMPFC) to learn improved regulation of this brain area was examined in a group of five young adults. Subjects received real-time feedback from the target brain region while engaging in a blocked-design task involving alternating blocks of attempted up-regulation and down-regulation of the target brain region. A transient negative emotional state was induced prior to each scanning session. Subjects completed six scanning sessions (a pre-training session, four feedback sessions and a post-training session; no feedback was provided for the pre- and post-training sessions). The guideline strategy provided to subjects, engaging in emotional awareness during up-regulation and bodily awareness during down-regulation, was found to consistently regulate the region in the pre-training session prior to the fMRI feedback sessions. This finding is in line with the previously proposed role of the rMPFC in emotional awareness. In contrast to previous real-time fMRI findings, greater recruitment of the region was observed in the pre-training session than in the post-training session, with a non-significant negative trend across feedback sessions. These results suggest that there may be limits to how well the feedback techniques successfully employed for other brain regions extend to as yet unexplored brain regions.
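In many real-time fMRI neurofeedback designs the feedback value is simply the mean signal of the target region expressed as a percent change from a recent baseline block. The sketch below shows that bare calculation on hypothetical numbers; it is a generic illustration, not the analysis pipeline used in this study.

```python
"""Generic ROI percent-signal-change feedback (hypothetical numbers).

Real-time fMRI neurofeedback commonly presents the mean signal of the target
region as a percent change relative to a preceding baseline block; this sketch
shows only that calculation, not the pipeline used in the thesis.
"""
import numpy as np

def roi_feedback(roi_timeseries, baseline_volumes=10):
    """Percent signal change of the latest volume vs. the baseline-block mean."""
    baseline = float(np.mean(roi_timeseries[:baseline_volumes]))
    latest = float(roi_timeseries[-1])
    return 100.0 * (latest - baseline) / baseline

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 10 baseline volumes around 1000 a.u., then an up-regulation block ~1.5% higher.
    series = np.concatenate([
        1000.0 + rng.normal(0, 5, 10),
        1015.0 + rng.normal(0, 5, 10),
    ])
    print(f"feedback: {roi_feedback(series):+.2f}% signal change")
```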
397

Real-time dynamic infrared scene generation fidelity enhancement

Sills, Timothy Glenn. Unknown Date (has links)
Computing equipment is fundamental to modern-day simulation. Visualisation systems are often the most important component. These produce a representation of the real world in the form of pixels. These pixels are presented to viewers and/or devices under test. / The real world is mathematically treated as a continuous domain. Visualisation systems produce pixels that collectively provide a digital representation of the real world. Hence, there are difficulties with using visualisation systems to represent the real world. / Sampling processes are employed for the production of pixels in visualisation systems, no matter what the graphics architecture. Considering scene content, if the sampling frequency does not meet or exceed the Nyquist rate (twice the highest spatial frequency present in the scene), aliasing, or spectral folding, is produced. This aliasing may be both spatial and temporal, and can be analysed in both the spatial and spatial-frequency domains. Spatial aliasing manifests itself in the form of image artefacts, including scene-dependent noise. Temporal aliasing manifests itself in the form of pixel scintillation. Both forms are detrimental to simulation, with the degree of detriment depending on the application. For virtual urban warfare simulation, soldiers may experience motion sickness, depending on the quantity and strength of the image artefacts. For simulation of imaging missile engagements, erroneous performance results may be produced due to false cueing information from inadequate object representation. / The Defence Science & Technology Organisation (DSTO) is developing an imaging infrared missile simulation capability. A core component is the production of infrared scenes using a visualisation system. Such a system is designed to generate the best possible scenes for the visible band of the electromagnetic spectrum, making generation of the infrared equivalent somewhat difficult. For example, colour-shading calculations are designed for the visible domain and encoded into hardware. This makes it difficult to generate infrared scenes, since the colour-shading calculations might have to be written from scratch, performed outside the graphics hardware, then applied to pixels. This is, however, a secondary problem. / The primary DSTO requirement is that the missile simulation capability provide realistic object representations at long distances. Compared with the equivalent within the visual band of the electromagnetic spectrum, the seeker must be presented with scenes that are characterised by a much larger dynamic range, using objects with smaller features of interest. The visualisation system's sample frequency is therefore insufficient for accurate generation of infrared scenes, since the objects are positioned at distances beyond what may be considered the normal operating range. The resultant undersampling produces significant spatial and temporal aliasing, resulting in spatial artefacts and pixel scintillation that increase with object range. This problem must be addressed before any other, since undersampling has the potential to render infrared scenes totally erroneous at longer distances. / This research describes the work completed to address the undersampling problem of infrared scene generation on visualisation systems. The overall aim was to address the problem for two types of objects: hard-body objects such as airframes, and dynamic objects such as engine exhaust plumes and off-board aircraft countermeasures. The problem was addressed for both types of objects in both a pre- and post-pixelation manner, i.e. before and after the generation of spatial and temporal artefacts. The outcome has been successful, resulting in new anti-aliasing schemes for infrared scene generation on commercial visualisation systems. / Thesis (PhD Information Technology)--University of South Australia, 2004.
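The scintillation problem described above can be demonstrated with a small numerical comparison: a sub-pixel target rendered by point sampling either hits or misses a pixel centre as it moves, so its rendered energy flickers, whereas an area-weighted rendering spreads the energy over the pixels the target overlaps and varies smoothly. The sketch below contrasts the two on a one-dimensional pixel row; it only illustrates the undersampling effect and is not one of the anti-aliasing schemes developed in the thesis.

```python
"""1-D illustration of sub-pixel target scintillation vs. area-weighted rendering.

A target 0.4 pixels wide slides across a row of pixels. Point sampling at pixel
centres makes its total rendered energy flicker between zero and the full target
energy; area weighting keeps it constant. This demonstrates the undersampling
effect only and is not an anti-aliasing scheme from the thesis.
"""
import numpy as np

N_PIXELS = 16
TARGET_WIDTH = 0.4      # in pixels (smaller than one pixel)
INTENSITY = 1.0         # radiant intensity per unit width (arbitrary units)

def render_total(target_left, area_weighted):
    """Total rendered energy of the target for one frame."""
    row = np.zeros(N_PIXELS)
    target_right = target_left + TARGET_WIDTH
    for p in range(N_PIXELS):
        if area_weighted:
            # Energy proportional to the overlap of the target with pixel [p, p+1).
            overlap = max(0.0, min(target_right, p + 1) - max(target_left, p))
            row[p] = INTENSITY * overlap
        else:
            # Point sampling: the pixel reports the target only if its centre lies inside it.
            centre = p + 0.5
            row[p] = INTENSITY * TARGET_WIDTH if target_left <= centre < target_right else 0.0
    return row.sum()

if __name__ == "__main__":
    for offset in np.arange(3.0, 4.01, 0.2):
        print(f"target at {offset:4.1f}px  point-sampled={render_total(offset, False):.2f}"
              f"  area-weighted={render_total(offset, True):.2f}")
```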
398

Toward real-time control of surface irrigation

Khatri, K. L. January 2007 (has links)
[Abstract]: The performance of surface irrigation is a function of the field design, the infiltration characteristic of the soil, and the irrigation management practice. However, the complexity of the interactions makes it difficult for irrigators to identify optimal design or management practices. The infiltration characteristic of the soil is the most crucial of all the factors affecting the performance of surface irrigation, and both spatial and temporal variations in the infiltration characteristic are a major physical constraint to achieving higher irrigation application efficiencies. Real-time optimisation and control has the potential to overcome these spatial and temporal variations and return highly significant improvements in performance. Calculation of the infiltration parameters from irrigation advance data is now the preferred method. If the process is to be included in a real-time control system it must be done accurately, reliably and rapidly, and with a minimum of field data. Substantial work has been directed towards developing methods to estimate the infiltration characteristics of soil from irrigation advance data. However, none of the existing methods are entirely suitable for use in real-time control. The greatest limitation is that they are data intensive and/or unreliable, and provide soil infiltration properties only after an irrigation event. A simple real-time control system for furrow irrigation is proposed that: predicts the infiltration characteristics of the soil in real time using data measured during an irrigation event, simulates the irrigation, and determines the optimum time to cut off for that irrigation. The basis of the system is a new method for the Real-time Estimation of the Infiltration Parameters (REIP) under furrow irrigation, developed during this research study, that uses a model infiltration curve and a scaling process to predict the infiltration characteristics for each furrow and each irrigation event. The underlying hypothesis for the method is that the shape of the infiltration characteristic for a particular field or soil is relatively constant (across the field and with time), despite variations in the magnitude of the infiltration rate or amount. A typical furrow in the field is selected for evaluation (known as the model furrow) and its infiltration parameters (a, k, f0) in the Kostiakov–Lewis equation are determined by a model such as INFILT or IPARM using inflow, advance and runoff data. Subsequently the infiltration parameters for this model furrow can be scaled to give the cumulative infiltration curves for the whole field. In this process a scaling factor (F) is formulated from a rearrangement of the volume-balance equation and is calculated for each furrow/event using the model infiltration parameters and a single advance point. The performance of each furrow can then be simulated and optimised using an appropriate simulation model to determine the preferred time to cut-off. Using this new method, infiltration parameters were calculated for two different fields, T and C. The SIRMOD simulation model was then used to simulate irrigation performance (application efficiency, requirement efficiency and uniformity) under different model strategies. These strategies were framed to assess the feasibility of, and demonstrate the gains from, the real-time control strategy.
The infiltration evaluation results revealed that the infiltration curves produced by the proposed method were of similar shape and hence gave a distribution of cumulative depths of infiltration for the whole field that was statistically equivalent to that given using the complete set of advance data for each furrow. The advance trajectories produced by the proposed method also matched favourably with the measured advances. The simulation results showed, firstly, that the scaled infiltration gave predictions of the irrigation performance similar to the actual performance. They also indicated that by adopting the simple real-time control system, irrigation application efficiencies for the two fields could be improved from 76% for field T and 39% for field C (under usual farm management) to 83% and 70%, respectively. Savings of 1239 m³ in the total volume of water applied per irrigation over the 7.1 ha area of the two fields were indicated, water which can be used beneficially to grow more crop. The proposed real-time control system is shown to be feasible. It requires few data for its operation and provides the infiltration characteristics for each furrow without significant loss of accuracy. The irrigation performance is improved greatly over that achieved under current farmer management, and a substantial reduction in the volume of water applied per irrigation is achievable.
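The Kostiakov–Lewis form referred to above gives cumulative infiltration as Z = k*tau^a + f0*tau, with tau the infiltration opportunity time. The sketch below shows the shape of the scaling idea only: the model furrow's curve is multiplied by a per-furrow factor F that is supplied directly rather than derived from the volume-balance rearrangement and single advance point described in the abstract, and the parameter values are placeholders.

```python
"""Kostiakov-Lewis cumulative infiltration and a simple per-furrow scaling.

Z(tau) = k * tau**a + f0 * tau, with tau the infiltration opportunity time.
The parameters below are placeholders, and the scaling factor F is supplied
directly; the thesis derives F from a volume-balance rearrangement using a
single advance point, which is not reproduced here.
"""

def cumulative_infiltration(tau_min, a, k, f0):
    """Cumulative infiltrated volume per unit furrow length after tau_min minutes."""
    return k * tau_min**a + f0 * tau_min

def scaled_infiltration(tau_min, model_params, scale_factor):
    """Scale the model furrow's curve to another furrow/event by a factor F."""
    a, k, f0 = model_params
    return scale_factor * cumulative_infiltration(tau_min, a, k, f0)

if __name__ == "__main__":
    model = (0.35, 0.012, 0.00015)        # placeholder (a, k, f0) for the model furrow
    for tau in (30, 60, 120, 240):        # minutes of opportunity time
        z_model = cumulative_infiltration(tau, *model)
        z_furrow = scaled_infiltration(tau, model, scale_factor=1.3)
        print(f"tau={tau:4d} min  model Z={z_model:.4f}  scaled Z={z_furrow:.4f} m^3/m")
```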
399

Integrated position and attitude determination for augmented reality systems

Scott-Young, Stephen Unknown Date (has links) (PDF)
One of the most challenging tasks for augmented reality systems is position and attitude determination in outdoor, unprepared environments. Augmented reality, a technology that overlays digital information on views of the real world, requires accurate and precise position and attitude determination to operate effectively. For small (often indoor) areas, careful preparation of the environment can allow augmented reality systems to work successfully. In large outdoor environments, however, such preparation is often impractical, time-consuming and costly. This thesis investigates the development of a position and attitude determination component for augmented reality systems capable of operating in outdoor unprepared environments. The hypothesis tested in this investigation is that the integration of Global Positioning System (GPS), Dead Reckoning (DR) and map matching techniques enables the continuous and accurate real-time visual alignment of three-dimensional data with objects in the perspective view of a user operating in outdoor unprepared environments.
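As a rough illustration of the kind of integration investigated here, the sketch below dead-reckons a two-dimensional position from speed and heading, resets the estimate whenever a GPS fix arrives, and projects the result onto the nearest point of a single stored line segment as a minimal stand-in for map matching. The reset-style fusion, the numbers and the one-segment map are simplifying assumptions, not the system developed in the thesis.

```python
"""Toy GPS / dead-reckoning / map-matching integration (simplifying assumptions).

Dead reckoning propagates position from speed and heading between GPS fixes;
a fix simply resets the estimate, and the result is projected onto the nearest
point of one stored line segment as a minimal stand-in for map matching. This
illustrates the idea only, not the system developed in the thesis.
"""
import math

def dead_reckon(pos, speed_mps, heading_rad, dt_s):
    """Advance an (east, north) position from speed and heading."""
    return (pos[0] + speed_mps * dt_s * math.sin(heading_rad),
            pos[1] + speed_mps * dt_s * math.cos(heading_rad))

def snap_to_segment(pos, seg_a, seg_b):
    """Project a point onto the segment seg_a-seg_b (toy map matching)."""
    ax, ay = seg_a
    bx, by = seg_b
    px, py = pos
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))                  # clamp to the segment ends
    return (ax + t * abx, ay + t * aby)

if __name__ == "__main__":
    road = ((0.0, 0.0), (100.0, 0.0))          # assumed east-west road segment
    pos = (0.0, 0.0)
    for second in range(1, 11):
        pos = dead_reckon(pos, speed_mps=2.0, heading_rad=math.radians(95.0), dt_s=1.0)
        if second % 5 == 0:                    # a GPS fix every 5 s resets the estimate
            pos = (2.0 * second, 0.5)          # hypothetical fix lying just off the road
        mx, my = snap_to_segment(pos, *road)
        print(f"t={second:2d}s  raw=({pos[0]:6.2f}, {pos[1]:6.2f})  matched=({mx:6.2f}, {my:6.2f})")
```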
400

Scheduling and management of real-time communication in point-to-point wide area networks

Pope, Cheryl Lynn January 2003 (has links)
Applications with timing requirements, such as multimedia and live multi-user interaction, are becoming more prevalent in wide area networks. The desire to provide more predictable performance for such applications in packet-switched wide area networks is evident in the channel management provided by Asynchronous Transfer Mode (ATM) networks and in the extensions to the Internet protocols proposed by the Internet Engineering Task Force (IETF) working groups on integrated and differentiated services. The ability to provide guarantees on the performance of traffic flows, such as packet delay and loss characteristics, relies on an accurate model of the traffic arrival and service at each node in the network. This thesis surveys the work on bounding packet delay under various proposed queuing disciplines and proposes a method for more accurately characterising the traffic arrival and the worst-case backlog experienced by packets. The methods are applied to the first-in first-out (FIFO) queuing discipline to derive equations for the worst-case backlog and queuing delay in multihop networks. Simulation results show a significant improvement in the accuracy of the delay bounds over existing bounds published in the literature. An improvement of two orders of magnitude can be realised for a ten-hop path, and the improvement increases exponentially with the length of the path for variable-rate network traffic. The equations derived in the thesis also take into consideration the effect of jitter on delay, thereby removing the requirement for rate controllers or traffic shaping within the network. In addition to providing more accurate delay bounds, the problem of providing fault tolerance to channels with guaranteed quality of service (QoS) is also explored. This thesis introduces a method for interleaving the resource requirements of backup channels to reduce the overall resource reservations required to provide guaranteed fault recovery with the same QoS as the original failed channel. An algorithm for selecting recovery paths that can meet a channel's QoS requirements during recovery is also introduced. / Thesis (Ph.D.)--Computer Science, 2003.
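For context on the kind of bound being tightened: in the standard single-node result, a flow constrained by a token bucket (burst sigma, sustained rate rho) and served by a rate-latency node (rate R >= rho, latency T) experiences a worst-case delay of sigma/R + T, and summing such bounds hop by hop is what becomes very pessimistic over long paths. The snippet below evaluates only that textbook additive bound with hypothetical numbers; the tighter multi-hop FIFO bounds derived in the thesis are not reproduced.

```python
"""Textbook per-hop delay bound for a token-bucket flow, summed over a path.

A (sigma, rho) token-bucket flow through a rate-latency server (R, T) has a
worst-case delay of sigma / R + T at that hop. Summing this per hop is the
naively additive bound that long paths make very pessimistic; the thesis
derives tighter multi-hop FIFO bounds, which are not reproduced here.
All numbers below are hypothetical.
"""

def per_hop_delay_bound(sigma_bits, rho_bps, service_rate_bps, latency_s):
    """Worst-case delay at one rate-latency hop, assuming service_rate >= rho."""
    if service_rate_bps < rho_bps:
        raise ValueError("the service rate must be at least the sustained rate")
    return sigma_bits / service_rate_bps + latency_s

def additive_path_bound(sigma_bits, rho_bps, hops):
    """Sum of per-hop bounds over a list of (service_rate_bps, latency_s) hops."""
    return sum(per_hop_delay_bound(sigma_bits, rho_bps, r, t) for r, t in hops)

if __name__ == "__main__":
    sigma, rho = 12_000.0, 1.0e6            # 12 kbit burst, 1 Mbit/s sustained rate
    path = [(10.0e6, 0.002)] * 10           # ten identical 10 Mbit/s hops, 2 ms latency each
    bound = additive_path_bound(sigma, rho, path)
    print(f"naively additive 10-hop delay bound: {bound * 1000:.1f} ms")
```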
