631

Simulation Analysis of a Ventilated Building Integrated Photovoltaic Air-Gap Duct System For Natural Ventilation of a Building

Zarmehr, Arash 01 January 2019 (has links)
We introduce a low-cost, low-maintenance wind-catcher duct system that attaches to BIPV systems, increasing airflow velocity and decreasing air temperature, and thereby raising the electricity output of the PV system. Our results demonstrate that the design can further enhance energy performance by using the increased airflow from the duct system to naturally ventilate an attic. Similar benefits were observed for different variations of the design in a parametric analysis that identified the optimal configuration for increasing airflow velocity and decreasing temperature. Building-integrated photovoltaics (BIPV) are becoming more popular and widely used to increase sustainability and decrease overall energy costs. Improving BIPV efficiency will benefit a wide range of applications in architecture and mechanical engineering. BIPV provides savings in electricity costs, lowers pollution, and reduces material costs by utilizing renewable energy. Because BIPV functions as the outer layer of a structure, it influences the heating and cooling loads of a building through the change in thermal resistance. This thesis presents a BIPV ventilation air-gap system and its effects on heating and cooling loads. We use a computational fluid dynamics (CFD) model to analyze various ventilation strategies in the BIPV air gap, as well as the impact of using that air to naturally ventilate the attic for better building energy performance. One outcome of this investigation is a novel attachment to BIPV that modifies the air gap into a miniature wind-catcher to increase performance. This design enhances traditional air-gap architecture by both increasing the velocity of natural airflow and decreasing outer-layer and attic temperatures. Parametric analysis indicates that attic natural ventilation (NV) airflow can be increased by 2.8 mph and temperature lowered by 11.2%. This work is a first step toward a better overall BIPV system using a novel wind-catcher air-duct nozzle design.
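
To make concrete why a cooler air gap raises electricity output, the sketch below applies the standard linear temperature-derating model for PV modules. This is a generic illustration, not the thesis's CFD model; the rated power and temperature coefficient are typical crystalline-silicon values assumed for the example.

```python
# Minimal sketch of the standard linear temperature-coefficient model for PV
# output; illustrates why the wind-catcher's cooler air-gap flow helps.
# Coefficient values are typical crystalline-silicon figures, not numbers
# taken from the thesis.

def pv_power(irradiance_w_m2, cell_temp_c,
             p_rated_w=300.0,          # rated power at STC (assumed)
             temp_coeff_per_c=-0.004,  # ~-0.4 %/degC, typical c-Si
             stc_irradiance=1000.0, stc_temp_c=25.0):
    """Estimate PV power with the linear temperature-derating model."""
    derate = 1.0 + temp_coeff_per_c * (cell_temp_c - stc_temp_c)
    return p_rated_w * (irradiance_w_m2 / stc_irradiance) * derate

# A ventilated air gap that drops cell temperature from 65 C to 55 C:
print(pv_power(900, 65))  # unventilated
print(pv_power(900, 55))  # ventilated: ~5% more output
```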
632

The molten salt synthesis of core-shell heterostructure cathode materials for solid oxide fuel cells

Levitas, Benjamin 26 August 2022 (has links)
Solid oxide fuel cells (SOFCs) are high-temperature electrochemical energy conversion devices that convert chemical energy directly to electricity. Several technical and cost drivers exist to lower the operating temperature of SOFCs from the state-of-the-art 1000 °C to less than 800 °C. However, one of the main hurdles to SOFCs becoming commercially viable is the sluggish reaction kinetics of the oxygen reduction reaction (ORR) at the cathode as the operating temperature is reduced. La0.6Sr0.4Co0.2Fe0.8O3-δ (LSCF) perovskite is a state-of-the-art cathode material, yet it suffers from low surface catalytic activity for the ORR. One way to mitigate this shortcoming has been to infiltrate nanoparticle electrocatalysts, such as La0.8Sr0.2MnO3-δ (LSM) perovskite, onto the surface of LSCF. However, infiltration often requires multiple rounds to achieve uniform coverage of the substrate, numerous chemical agents for optimization, and additional decomposition and calcination steps. These pitfalls can be eliminated by instead using a molten salt to synthesize core-shell heterostructures. In this work, molten salt synthesis (MSS) is investigated as a potential method to synthesize core-shell LSCF-LSM SOFC cathodes. The influence of multiple molten salt chemistries on the successful synthesis of LSM and the stability of LSCF is studied, and a thermodynamic rationale is determined. The MSS of core-shell LSCF-LSM nanoparticles is demonstrated for the first time, and parameters are explored to maximize core-shell yield. Core-shell heterostructure cathodes are then fabricated and their performance is characterized using electrochemical impedance spectroscopy (EIS).
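
As a rough illustration of the EIS characterization mentioned above, the sketch below computes the impedance of a simple R_ohm + (R_ct parallel C_dl) equivalent circuit, a common first-pass model for cathode spectra. The circuit values are assumptions for illustration, not measurements from this work.

```python
import numpy as np

# Minimal sketch: impedance of an R_ohm + (R_ct || C_dl) equivalent circuit,
# the kind of model often fit to SOFC cathode EIS data. Values are
# illustrative, not measurements from this thesis.
R_ohm = 0.5    # ohmic (electrolyte) resistance, ohm*cm^2 (assumed)
R_ct = 2.0     # charge-transfer / polarization resistance (assumed)
C_dl = 1e-3    # double-layer capacitance, F/cm^2 (assumed)

freqs = np.logspace(5, -2, 200)           # 100 kHz down to 10 mHz
omega = 2 * np.pi * freqs
Z = R_ohm + R_ct / (1 + 1j * omega * R_ct * C_dl)

# In a Nyquist plot (Re Z vs -Im Z) the semicircle diameter is R_ct;
# a better ORR catalyst shell (e.g. LSM on LSCF) shrinks that diameter.
print(Z.real.min(), Z.real.max())  # spans ~R_ohm ... R_ohm + R_ct
```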
633

Analysis And Optimization Of A Solar Thermal Collector With Integrated Storage

Bonadies, Monica 01 January 2010 (has links)
Solar energy, a topic popular in the United States during the oil embargo of the 1970s, has become relevant once more with the current focus on reducing greenhouse emissions. Solar thermal energy in particular has become popular because it uses existing steam turbine technology to produce electricity, with the benefit of using solar energy rather than coal or nuclear heat sources to produce steam. Solar thermal can also be used at lower temperatures to heat water for pools or for residential use. While this energy source has its benefits, it has the problem of being opportunistic: the energy must be used as it is captured. With the integration of storage, a solar thermal system becomes more viable. In this work, a low-temperature (50-70 °C) thermal storage unit with a solar thermal collector is run experimentally and then studied using both analytical and numerical methods. From these methods, suggestions for future developments of the storage unit are made. The prototype collector and storage combination worked best during the winter months, when humidity was low. Furthermore, the heat exchanger design within the storage unit was found to work well for charging (heating) the unit, but not for discharging the storage to heat water. The best modeling method for the storage unit was FLUENT, which allows the suggested changes to the prototype to be simulated before the next prototype is constructed.
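
A hedged illustration of the storage behavior discussed above: the sketch below integrates a lumped-capacitance energy balance for charging the unit. All parameter values are stand-ins rather than the prototype's measured properties, and the real unit would call for the FLUENT-style CFD treatment the abstract recommends.

```python
# Minimal lumped-capacitance sketch of charging the thermal storage unit:
# dT/dt = (Q_collector - U*A*(T - T_amb)) / (m * c_p).
# All parameter values are illustrative assumptions, not the prototype's.

m_cp = 4186.0 * 200.0   # 200 kg of water-equivalent storage, J/K (assumed)
UA = 5.0                # overall loss coefficient times area, W/K (assumed)
T_amb = 25.0            # ambient temperature, degC
Q_in = 1500.0           # collector heat input while charging, W (assumed)

T = 30.0                # initial storage temperature, degC
dt = 60.0               # one-minute time step, s
for step in range(8 * 60):            # simulate 8 hours of charging
    dT = (Q_in - UA * (T - T_amb)) / m_cp
    T += dT * dt
print(f"storage temperature after 8 h: {T:.1f} C")
```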
634

The Development Of A Human-centric Fuzzy Mathematical Measure Of Human Engagement In Interactive Multimedia Systems And Applications

Butler, Chandre 01 January 2010 (has links)
The utilization of fuzzy mathematical modeling to quantify Human Engagement is an innovative approach within Interactive Multimedia applications (mainly video-based games designed to entertain or train participants on intended topics of interest) that can yield measurable and repeatable results. These results can then be used to generate a cogent definition of Human Engagement. This research applies proven quantification techniques and Industrial/Systems Engineering methodologies to nontraditional environments such as Interactive Multimedia. Its outcomes provide the foundation, initial steps, and preliminary validation for the development of a systematic fuzzy theoretical model for quantifying Human Engagement. Why is there a need for Interactive Multimedia applications in commercial and educational environments, including K-20 educational systems and industry? In the latter case, the debate over education reform has drawn from referenced areas within the Industrial Engineering community, including quality, continuous improvement, benchmarking and metrics development, data analysis, and scientific/systemic justification requirements. In spite of these applications, the literature does not reflect a consistent and broad application of these techniques to the evaluation and quantification of Human Engagement in Interactive Multimedia. It is strongly believed that until an administratively accepted definition of Human Engagement is created, the benefits of Interactive Multimedia may not be fully realized. The influence of gaming on society is quite apparent: for example, increased governmental appropriations for Simulations & Modeling development, as well as the estimated multi-billion-dollar consumer PC/console game market, are evidence of the Interactive Multimedia opportunity. This body of work identifies factors that address the actual and perceived levels of Human Engagement in Interactive Multimedia systems and Virtual Environments, along with the degrees to which those factors must exist in order to quantify and measure Human Engagement. Finally, the research quantifies the inputs and produces a model yielding a numeric value that defines the level of Human Engagement as evaluated within the interactive multimedia application area. This definition of Human Engagement can then serve as the basis for study in other application areas of interest.
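
As an illustration of what fuzzy quantification of engagement can look like, the sketch below maps hypothetical observable factors onto membership degrees and aggregates them into a single score. The factor names, membership breakpoints, and weights are invented for the example and are not the model developed in this research.

```python
# Minimal sketch of fuzzy quantification of engagement: map observable
# factors onto [0, 1] membership degrees and aggregate them into one score.
# Factor names, breakpoints, and weights are hypothetical illustrations.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def engagement(session_minutes, actions_per_min, gaze_on_screen_frac):
    high_duration = tri(session_minutes, 5, 30, 60)
    high_activity = tri(actions_per_min, 2, 10, 25)
    high_focus = tri(gaze_on_screen_frac, 0.4, 0.9, 1.01)
    # weighted-average aggregation (one common defuzzification choice)
    weights = (0.3, 0.4, 0.3)
    degrees = (high_duration, high_activity, high_focus)
    return sum(w * d for w, d in zip(weights, degrees))

print(f"engagement score: {engagement(25, 12, 0.85):.2f}")  # in [0, 1]
```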
635

Tuning The Properties Of Nanomaterials As A Function Of Surface And Environment

Karakoti, Ajay 01 January 2010 (has links)
Nanotechnology has shaped research and development across various disciplines of science and technology by redefining interdisciplinary research. It has put materials science at the forefront of technology by allowing researchers to engineer materials with properties spanning electronics to biomedicine, using materials as diverse as ceramics and plain carbon. These exceptional properties are achieved by reducing particle dimensions to such small domains that the boundary between individual atoms, ions, or clusters of particles becomes very small. As a result, the behavior of the particles shifts from continuum physics to quantum physics, and hence the properties of nanoparticles can be tuned through their size, shape, and dimensionality. One of the most apparent changes upon decreasing particle size is the increase in the surface-area-to-volume ratio. Nanoparticles therefore have a greater tendency to interact with the environment in which they are present, and the environment can likewise affect the properties of the nanomaterials. The environment here means the solid, liquid, or gaseous material in immediate contact with the external surface of the nanoparticles. To control the physico-chemical properties of nanoparticles, it is important to control both their surface characteristics and their immediate environment. This thesis emphasizes the role of tuning the surface of nanoparticles and/or the environment around them to control their properties.
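
A worked example of the surface-area-to-volume scaling noted above: for a sphere SA/V = 3/r, so the ratio grows by a factor of 1000 for every thousandfold reduction in radius.

```python
import math

# Worked example of the surface-to-volume scaling described above: for a
# sphere, SA/V = 3/r, so the ratio grows a thousandfold going from a
# 1 mm grain to a 1 um particle, and another thousandfold down to 1 nm.
for radius_m in (1e-3, 1e-6, 1e-9):
    area = 4 * math.pi * radius_m**2
    volume = (4 / 3) * math.pi * radius_m**3
    print(f"r = {radius_m:.0e} m  ->  SA/V = {area / volume:.2e} per m")
```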
636

Harmony: An Architecture For Network-Centric Heterogeneous Terrain Database Re-generation

Graniela, Benito 01 January 2011 (has links)
Homogeneous proprietary online terrain databases are prolific, as is the one-directional generation and update process for these terrain databases. Architectures and common ontologies that enable consistent and harmonious outcomes between distributed, multi-directional, heterogeneous terrain databases are lacking. Further, as technological change empowers end-users, expectations for immediate terrain database updates are constantly increasing. As an example, a variety of incompatible synthetic environment representations are used for military Modeling and Simulation applications. Regeneration and near-real-time update of compiled synthetic environments in a distributed, heterogeneous runtime environment is an issue relevant to the correlation of geospecific representations optimized for live, virtual, constructive, and distributed simulation applications. Military systems of systems such as the Future Combat Systems are emblematic of the regeneration challenge. The battlefields of the future will need constant updates of diverse synthetic representations of the real-world environment, driven by near-real-time data from the battlefield as well as other constantly evolving intelligence and remote sensing sources. Since the Future Combat Systems will use embedded training, they will need to maintain a representation correlated with the actual battlefield as well as with many other systems. To achieve this correlation, constant updates to the heterogeneous synthetic environment representations in the Future Combat Systems platforms will be required. One approach to overcoming the implicit bandwidth and communication limitations is to limit updates to changes only. Today's traditional military Terrain Database (TDB) generation systems convert standard geographical source data products into many different target formats using what is referred to as the pipeline flow paradigm. In the pipeline paradigm, TDBs are generated centrally upstream and flow downstream into numerous divergent and distributed formats, with updates centrally managed and distributed. This paradigm does not account for updates occurring on target formats, so such updates are not reflected upstream in the source data that originally generated the TDB. Since target-format changes are not incorporated into the upstream geographical source data, adjacent streams of dependent target formats derived from the same source data may not receive the changes either. The outcome of change in the pipeline TDB generation paradigm is correlation and interoperability errors between target formats, and between them and the original upstream data source. An alternative paradigm is needed that addresses data synchronization of geographical source data and target formats while accommodating bandwidth limitations. This dissertation proposes a "partial bi-directional TDB regeneration" paradigm, which envisions network-based TDB updates between reliable partners. Partial bi-directional TDB regeneration is attractive because it reduces the volume of changes by updating only the affected target-format data elements. This research presents an implementation of distributed, partial, bi-directional TDB regeneration through agent theory and ontologies over a network.
Agent theory and ontologies are used to interpret data changes in external target formats and implement the necessary transformations on the internal TDB generation system's data elements to achieve consistency among all correlated representations. In this approach a variety of agents exist, and their behavior and knowledge are customized based on ontologies that describe the target format. Such a system is expected to provide a TDB generation paradigm that addresses the implicit issues of distribution, time, expertise, monetary and labor constraints, and update frequency, while addressing the explicit issue of correlation between the external target formats over time and, at the same time, reducing the bandwidth requirements associated with traditional TDB generation systems.
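
As a toy illustration of the "changes only" idea underpinning partial bi-directional regeneration, the sketch below diffs a downstream target-format edit against the source data and propagates only the changed elements. The data model and field names are hypothetical; the dissertation's agent- and ontology-based design is far richer.

```python
# Minimal sketch of the "changes only" update idea at the heart of partial
# bi-directional regeneration: diff a target-format edit against the source
# data and propagate just the changed elements. The data model and field
# names are hypothetical, not the dissertation's agent/ontology design.

source_tdb = {"tile_042": {"elevation": 113.0, "feature": "road"},
              "tile_043": {"elevation": 98.5, "feature": "forest"}}

# An edit made downstream in one target format (e.g. a new road segment):
target_edit = {"tile_043": {"elevation": 98.5, "feature": "road"}}

def delta(edit, source):
    """Return only the elements whose content actually changed."""
    return {k: v for k, v in edit.items() if source.get(k) != v}

changes = delta(target_edit, source_tdb)
source_tdb.update(changes)            # propagate upstream to the source
print(changes)                        # only tile_043 crosses the network
```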
637

Homologous Pairing Through DNA-Driven Harmonics -- A Simulation Investigation

Calloway, Richard J 01 January 2011 (has links)
The objective of this research is to determine whether a better understanding of the "molecule of life," deoxyribonucleic acid (DNA), can be obtained through contemporary Molecular Dynamics (MD) modeling and simulation (M&S). It is difficult to overstate the significance of the DNA molecule. The now-completed Human Genome Project stands out as the most significant testimony yet to the importance of understanding DNA. The Human Genome Project (HGP) enumerated many areas of application of genomic research, including molecular medicine, energy sources, environmental applications, and agriculture and livestock breeding, to name just a few. (Science, 2008) In addition to the fact that DNA contains the informational blueprints for all life, it also exhibits other remarkable characteristics, most of which are either poorly understood or remain complete mysteries. One of those completely mysterious characteristics is the ability of DNA molecules to spontaneously segregate with other DNA molecules of similar sequence. This ability, known as "homologous pairing," has been observed for years in living organisms; it is completely reproducible in a laboratory and defies explanation. What is the underlying mechanism that facilitates long-range attraction between two double-helix DNA molecules containing similar nucleotide sequences? The fact that we cannot answer this question indicates we are missing a fundamental piece of information concerning the DNA biomolecule. The research herein used the Nanoscale Molecular Dynamics simulator, NAMD (Phillips et al., 2005), to investigate the following hypotheses: H(Simulate Observed Closure NULL) := Current MD force field models, when used to model DNA molecule segments, contain sufficient variable terms and parameters to describe and reproduce directed segregating movement (closure of the segments) as previously observed by the Imperial College team between two Phi X 174 DNA molecules. H(Resonance NULL) := Current MD force field models, when used to model DNA molecule segments in a condensed-phase solvent, contain sufficient variable terms and parameters to reproduce theorized molecular resonation in the form of frequency content found in the water between the segments. H(Harmonized Resonance NULL) := Current MD force field models of DNA molecule segments in a condensed-phase solvent produce theorized molecular resonation in the form of frequency content above and beyond the expected normal frequency levels found in the water between the segments. H(Sequence Relationship NULL) := The specific frequencies and amplitudes of the harmonized resonance postulated in H(Harmonized Resonance NULL) are a direct function of DNA nucleotide sequence. H(Resonance Causes Closure NULL) := Interacting harmonized resonation produces an aggregate force between the two macro-molecule segments, resulting in simulation of the same directed motion and segment closure as observed by the Imperial College team between two Phi X 174 DNA molecules. After nearly six months of molecular dynamics simulation for H(Simulate Observed Closure NULL) and H(Resonance Causes Closure NULL), no evidence of closure between two similarly sequenced DNA segments was found. Several contributing factors that potentially affected this result are described in detail in the Results section.
Simulations investigating H(Resonance NULL), H(Harmonized Resonance NULL), and the emergent hypothesis H(Sequence Relationship NULL), on the other hand, revealed a rich selection of periodic pressure variation in the solvent between the simulated DNA molecules. About 20% of the power in the Fourier coefficients returned by Fast Fourier Transforms performed on the pressure data was characterized as statistically significant, and it was located in less than 2% of the coefficients by count. This unexpected result occurred consistently in five different system configurations, with considerable system-to-system variation in both frequency and magnitude. After careful analysis, and given the extent of our experiments, the data were found to support H(Resonance NULL) and H(Harmonized Resonance NULL). Regarding the emergent hypothesis H(Sequence Relationship NULL), further analysis of the aggregate data set looked for correlation between nucleotide sequence and frequency/magnitude. Some of the results may be related to sequence, but they were insufficient to prove it; overall, the conflicting results were inconclusive, so the hypothesis was neither accepted nor rejected. Of particular interest to future researchers, the computational simulations performed herein were NOT able to reproduce what is known to happen in a laboratory environment: the DNA segregation observed in vitro during the Imperial College investigation did not occur in our simulation. Until this discrepancy is resolved, MM simulation should not yet be considered a suitable tool for further investigation of homologous chromosome pairing. Chapter 5 describes specific follow-on research, prioritized by need, that addresses several new questions.
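
A minimal sketch of the spectral analysis described above: Fourier-transform a pressure trace and ask how few coefficients carry a disproportionate share of the power. The signal here is synthetic (planted tones in noise) and stands in for the MD pressure data; the timestep is an assumed MD-scale value.

```python
import numpy as np

# Minimal sketch of the spectral analysis described above: FFT a pressure
# trace sampled between the DNA segments and measure how few coefficients
# carry a disproportionate share of the power. The signal is synthetic
# (planted tones in noise), standing in for the MD output.

rng = np.random.default_rng(0)
n, dt = 4096, 2e-15                      # samples and MD timestep (assumed)
t = np.arange(n) * dt
signal = (0.8 * np.sin(2 * np.pi * 1.0e12 * t)      # planted 1 THz tone
          + 0.5 * np.sin(2 * np.pi * 2.3e12 * t)    # planted 2.3 THz tone
          + rng.normal(0, 1.0, n))                  # solvent-like noise

coeffs = np.fft.rfft(signal)
power = np.abs(coeffs) ** 2
top = np.argsort(power)[::-1]                       # coefficients by power
k = max(1, int(0.02 * power.size))                  # top 2% by count
share = power[top[:k]].sum() / power.sum()
print(f"top {k} of {power.size} coefficients carry {share:.0%} of the power")
```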
638

Indian Sign Language Numbers Recognition Using Intel RealSense Camera

Mudduluru, Sravani 01 May 2017 (has links) (PDF)
Gesture-based interaction with devices has been a significant area of research in computer science for many years. The main idea of these kinds of interactions is to ease the user experience by providing a high degree of freedom and a more interactive, natural way of communicating with technology. Significant application areas of gesture recognition include video gaming, human-computer interaction, virtual reality, smart home appliances, medical systems, and robotics, among several others. With the availability of devices such as the Kinect, Leap Motion, and Intel RealSense cameras, depth as well as color information has become accessible to the public at affordable cost. The Intel RealSense camera is a USB-powered controller with modest system requirements (Windows 8 and above). It is one such camera that can be used to track human body information, similar to the Kinect and Leap Motion, and it was designed specifically to provide finer-grained information about different parts of the human body, such as the face and hands. The camera was designed to give users more natural and intuitive interactions with smart devices through features such as 3D avatar creation, high-quality 3D prints, high-quality graphic gaming visuals, and virtual reality. The main aim of this study is to analyze hand-tracking information and build a training model in order to decide whether this camera is suitable for sign language recognition. In this study, we extracted the information of 22 joint labels per hand and trained models to identify the Indian Sign Language (ISL) numbers 0-9. We found that the multi-class SVM model showed the highest accuracy, 93.5%, when compared with decision tree and KNN models.
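
A hedged sketch of the classification setup: 22 joints with three coordinates each, flattened into a 66-dimensional feature vector and fed to a multi-class SVM over the ten digit classes. Random data stands in for the RealSense captures, and the hyperparameters are illustrative rather than the tuned values from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Minimal sketch of the classification setup described above: 22 hand joints
# x 3 coordinates flattened into a 66-dim feature vector, with a multi-class
# SVM over the ten ISL digit classes. Random data stands in for RealSense
# captures; hyperparameters are illustrative, not the thesis's tuned values.

rng = np.random.default_rng(42)
n_samples, n_joints = 500, 22
X = rng.normal(size=(n_samples, n_joints * 3))   # x, y, z per joint
y = rng.integers(0, 10, size=n_samples)          # digit labels 0-9

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale")   # one-vs-one multi-class
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")  # ~chance on noise
```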
639

Energy modelling in the South African electric power industry

Ben-Yaacov, Giora Zeev 27 September 2023 (has links) (PDF)
The subject of this thesis is the development of a modelling system for planning the South African electricity supply industry. A power system planning process was developed with the object of establishing a long-range development plan that would enable the selection and timing of individual projects. This planning process is represented by several sub-models, which include the prediction of future demands, the assessment of the generating capacity and the proportions of the different types of generating plant on the system (the plant "mix"), the analysis of transmission network performance, and the simulation of the financial processes. One characteristic of the modelling system presented is its three-stage structure. The first stage is the load model for the long-term prediction of power and energy demand. The second stage comprises the two expansion models for generation and transmission; with the aid of these two models, several expansion strategies are pre-selected. In the third stage, using the financial model, the total capital requirements are established. Future system demand is predicted from the analysis of historical data and by formulating the relationship between electricity demand and economic growth. Two major calculations, linear programming optimisation and loss-of-load probability analysis, are used in the computer models that aid the policy makers in selecting the optimum size and mix of the generating plants and in scheduling the operation and maintenance of the generating units. Transmission system analysis programs have been developed in such a way that they use a common power system database, which enables the planning engineers to store and maintain their power system data. The calculation routines include load flow, fault analysis, stability studies, single- and three-phase travelling waves, and power- and high-frequency transmission line parameters. The costing and financial models include the analysis of the costs of all new expansion equipment, the costs of operation and maintenance of the generating units, transmission equipment, and fuel, and the simulation of the financial environment and accounting processes. On the basis of experience gained so far, the author concludes that the modelling system presented is capable of providing a useful tool for planning the expansion of the South African electricity supply industry.
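
As an illustration of the loss-of-load probability analysis named above, the sketch below builds a capacity-outage probability table by convolving two-state (up/down) outage distributions for a small hypothetical fleet. Unit sizes, forced-outage rates, and the demand level are assumptions for the example.

```python
import numpy as np

# Minimal sketch of a loss-of-load probability (LOLP) calculation, one of
# the two core computations named above. A capacity-outage probability
# table is built by convolving each unit's two-state (up/down) outage
# distribution; unit sizes and forced-outage rates are illustrative.

units = [(600, 0.08), (600, 0.08), (300, 0.05), (300, 0.05)]  # (MW, FOR)

cap_step = 100                                   # MW resolution of the table
total = sum(mw for mw, _ in units)
outage_prob = np.zeros(total // cap_step + 1)
outage_prob[0] = 1.0                             # start: zero MW on outage
for mw, forced_outage_rate in units:
    shifted = np.zeros_like(outage_prob)
    shifted[mw // cap_step:] = outage_prob[: outage_prob.size - mw // cap_step]
    outage_prob = (1 - forced_outage_rate) * outage_prob \
        + forced_outage_rate * shifted

demand = 1200                                    # MW peak demand (assumed)
reserve = total - demand
lolp = outage_prob[reserve // cap_step + 1:].sum()  # outages exceeding reserve
print(f"LOLP at {demand} MW demand: {lolp:.4f}")
```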
640

Topics in perturbation analysis for stochastic hybrid systems

Lima Fleck, Julia 21 June 2016 (has links)
Control and optimization of Stochastic Hybrid Systems (SHS) constitute increasingly active fields of research. However, the size and complexity of SHS frequently render the use of exhaustive verification techniques prohibitive. In this context, Perturbation Analysis techniques, and in particular Infinitesimal Perturbation Analysis (IPA), have proven particularly useful for this class of systems. This work focuses on applying IPA to two different problems: Traffic Light Control (TLC) and control of cancer progression, both of which are viewed as dynamic optimization problems in an SHS environment. The first part of this thesis addresses the TLC problem for a single intersection modeled as an SHS. A quasi-dynamic control policy is proposed based on partial state information, defined by detecting whether vehicle backlogs are above or below certain controllable threshold values. At first, the threshold parameters are controlled while assuming fixed cycle lengths, and online gradient estimates of a cost metric with respect to these controllable parameters are derived using IPA techniques. These estimators are subsequently used to iteratively adjust the threshold values so as to improve overall system performance. This quasi-dynamic analysis of the TLC problem is subsequently extended to parameterize the control policy by green and red cycle lengths as well as queue content thresholds. The IPA estimators necessary to simultaneously control the light cycles and thresholds are rederived and incorporated into a standard gradient-based scheme to further improve system performance. In the second part of this thesis, the problem of controlling cancer progression is formulated within a Stochastic Hybrid Automaton (SHA) framework. Leveraging the fact that the cell-biologic changes necessary for cancer development may be schematized as a series of discrete steps, an integrative closed-loop framework is proposed for describing the progressive development of cancer and determining optimal personalized therapies. First, the problem of cancer heterogeneity is addressed through a novel Mixed Integer Linear Programming (MILP) formulation that integrates somatic mutation and gene expression data to infer the temporal sequence of events from cross-sectional data. This formulation is tested using both simulated data and real breast cancer data with matched somatic mutation and gene expression measurements from The Cancer Genome Atlas (TCGA). Second, the use of basic IPA techniques for optimal personalized cancer therapy design is introduced and a methodology applicable to stochastic models of cancer progression is developed. A case study of optimal therapy design for advanced prostate cancer is performed. Given the importance of accurate modeling in conjunction with optimal therapy design, an ensuing analysis is performed in which sensitivity estimates with respect to several model parameters are evaluated and critical parameters are identified. Finally, the tradeoff between system optimality and robustness (or, equivalently, fragility) is explored so as to generate valuable insights on modeling and control of cancer progression.
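
A minimal sketch of the IPA-driven control loop used in the TLC part: run the system, take a sample-path derivative of the cost with respect to a queue threshold, and adjust the threshold against the gradient. The noisy quadratic cost below is a stand-in for the traffic simulation; in the thesis the derivative comes from IPA applied to the SHS sample path.

```python
import numpy as np

# Minimal sketch of the IPA-driven control loop described above: simulate,
# take a sample-path derivative of the cost with respect to the queue
# threshold, and nudge the threshold against the gradient. The toy cost
# (a noisy quadratic) stands in for the traffic simulation.

rng = np.random.default_rng(1)

def sample_cost_and_gradient(theta):
    """One 'simulation run': sample cost and its per-path derivative."""
    w = 12.0 + rng.normal(0, 2.0)        # unknown optimal threshold, noisy
    cost = (theta - w) ** 2
    dcost_dtheta = 2.0 * (theta - w)     # exact derivative along this path
    return cost, dcost_dtheta

theta = 4.0                              # initial queue threshold (vehicles)
step = 0.05
for k in range(500):
    _, grad = sample_cost_and_gradient(theta)
    theta -= step * grad                 # stochastic gradient descent step
print(f"converged threshold: {theta:.2f}  (true optimum near 12)")
```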
