171

Continuum electrostatics of biomolecular systems

Xin, W. (Weidong) 08 April 2008 (has links)
Electrostatic interactions are very important in biomolecular systems. Electrostatic forces have received a great deal of attention due to their long-range nature and the trade-off between desolvation and interaction effects. It remains a challenging task to study and to predict the effects of electrostatic interactions in biomolecular systems. Computer simulation techniques that account for such interactions are an important tool for the study of biomolecular electrostatics. This study is largely concerned with the role of electrostatic interactions in biomolecular systems and with developing novel models to estimate the strength of such interactions. First, a novel formulation based upon continuum electrostatics is presented to compute the electrostatic potential in and around two biomolecules in a solvent with ionic strength. Many, if not all, current methods rely on the (non)linear Poisson-Boltzmann equation to include ionic strength. The present formulation, however, describes ionic strength through the inclusion of explicit ions, which considerably extends its applicability and validity range. The method relies on the boundary element method (BEM) and results in two very similar coupled integral equations, valid on the dielectric boundaries of the two molecules, respectively. This method can be employed to estimate the total electrostatic energy of two protein molecules at a given distance and orientation in an electrolyte solution with zero to moderately high ionic strength. Second, to enable the study of interactions between biomolecules and membranes, an alternative model partly based upon the analytical continuum electrostatics (ACE) method has also been formulated. It is desirable to develop a method for calculating the total solvation free energy that includes both electrostatic and non-polar energies. The difference between this model and other continuum methods is that, instead of determining the electrostatic potential, the total electrostatic energy of the system is calculated by integrating the energy density of the electrostatic field. This novel approach is employed for the calculation of the total solvation free energy of a system consisting of two solutes, one of which can be an infinite slab representing a membrane surface.
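The energy-density route described above can be illustrated with the simplest possible case. The sketch below (Python; all values assumed, and it is not the thesis' BEM or ACE implementation) recovers the Born solvation free energy of a single ion by numerically integrating the field energy density outside the ion and comparing against the analytic Born expression.

```python
import numpy as np

# Illustrative sketch only (all values assumed; not the thesis' BEM or ACE code):
# recover the Born solvation free energy of a single ion by integrating the energy
# density of the electrostatic field, u(r) = 0.5 * eps0 * eps_r * E(r)**2, outside the ion.

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
Q = 1.602176634e-19       # one elementary charge, C
A = 2.0e-10               # assumed ionic radius, m

def field_energy(eps_r, r_max=1.0e-7, n=200_000):
    """Integrate the field energy density over spherical shells from the ion surface outward."""
    r = np.linspace(A, r_max, n)
    e_field = Q / (4.0 * np.pi * EPS0 * eps_r * r**2)
    u = 0.5 * EPS0 * eps_r * e_field**2            # energy density, J/m^3
    return np.trapz(u * 4.0 * np.pi * r**2, r)     # shell volume element 4*pi*r^2 dr

dG_numeric = field_energy(eps_r=80.0) - field_energy(eps_r=1.0)   # solvent minus vacuum
dG_born = Q**2 / (8.0 * np.pi * EPS0 * A) * (1.0 / 80.0 - 1.0)    # analytic Born result
print(f"energy-density integration: {dG_numeric:.3e} J")
print(f"analytic Born expression  : {dG_born:.3e} J")
```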
172

Microfabrication of Plasmonic Biosensors in CYTOP Integrating a Thin SiO2 Diffusion and Etch-barrier Layer

Hanif, Raza January 2011 (has links)
A novel process for the fabrication of Long Range Surface Plasmon Polariton (LRSPP) waveguide-based biosensors is presented herein. The biosensor structure comprises Au stripe waveguide devices embedded in thick CYTOP claddings with a SiO2 solvent-diffusion-barrier and etch-stop layer. The SiO2 layer is introduced to improve the end quality of the Au waveguide structures, which previously deformed during deposition of the upper cladding, and to limit the over-etching of CYTOP when creating micro-fluidic channels. The e-beam evaporation method is adapted to deposit a thin SiO2 layer on the bottom CYTOP cladding. A new micro-fluidic design pattern is introduced, and micro-fluidic channels were created over selected Au waveguides through O2 plasma etching. The presented data and figures include refractive index measurements of different materials, thickness measurements, microscope images, and AFM images. Optical power cutback measurements were performed on fully CYTOP-cladded symmetric LRSPP waveguides. The end-fire coupling method was used to excite LRSPP modes with cleaved polarization-maintaining (PM) fibre. The measured mode power attenuation (MPA) was 6.7 dB/mm with index-matching liquid applied at the input and output fibre-waveguide interfaces. The results were compared with theoretical calculations and simulations. Poor coupling efficiency and scattering due to the SiO2 layer are suspected causes of the off-target measurements.
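As a hedged illustration of the cutback analysis mentioned above (the numbers are invented for the example and are not the thesis data), the mode power attenuation follows from a straight-line fit of insertion loss against waveguide length: the slope gives the MPA in dB/mm and the intercept the combined fibre-waveguide coupling loss.

```python
import numpy as np

# Hedged illustration of the cutback method (numbers invented, not the thesis data):
# fit insertion loss against waveguide length; the slope is the mode power attenuation
# (MPA) in dB/mm and the intercept the combined fibre-waveguide coupling loss.

lengths_mm = np.array([2.0, 3.5, 5.0, 6.5])              # assumed cleaved die lengths
insertion_loss_db = np.array([16.4, 26.5, 36.6, 46.7])   # assumed measured losses

slope, intercept = np.polyfit(lengths_mm, insertion_loss_db, 1)
print(f"MPA ≈ {slope:.1f} dB/mm")                  # ~6.7 dB/mm was the reported value
print(f"coupling loss ≈ {intercept / 2:.1f} dB per facet")
```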
173

Ordered equilibrium structures in systems with long-range interactions

Antlanger, Moritz 12 October 2015 (has links)
The central paradigm in the emerging field of metamaterials is that the properties of a material are in certain cases governed by the well-ordered spatial arrangement of its constituent particles rather than by the properties of those particles themselves. Such highly ordered patterns can act as waveguides for acoustic, elastic, or electromagnetic waves and can give rise to novel material properties, opening up new avenues in materials design. The central problem of how to produce the required ordered particle arrangements, e.g., via self-assembly, has received significant attention from both the experimental and theoretical sides. In theoretical studies, the interactions between particles are modeled via potential functions whose shape and range have a profound impact on the formed structures. These potentials are often short-ranged, i.e., they are characterized by a rapid decay with distance. In this thesis, we focus on systems featuring long-range interactions, where particles interact over significantly larger distances than the mean inter-particle separation. Typical examples of such potentials are charged or multipolar interactions. In our approach, we first determine the ordered structures formed by the particles at vanishing temperature by minimizing the relevant thermodynamic potential. We observe a surprising plethora of structural archetypes as well as novel phase-transition scenarios. Then, we investigate the stability of these structures at low temperatures using Monte Carlo simulations.
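A minimal sketch of the general workflow described above, under strong simplifying assumptions (a small two-dimensional cluster of like charges in a harmonic trap rather than the systems treated in the thesis): Metropolis Monte Carlo with a crude annealing schedule drives the long-range-interacting particles towards an ordered low-temperature configuration.

```python
import numpy as np

# Minimal sketch under strong simplifying assumptions (not the thesis code): a small
# 2D cluster of like charges with long-range 1/r repulsion in a harmonic trap, driven
# towards an ordered low-temperature configuration by Metropolis Monte Carlo annealing.

rng = np.random.default_rng(0)
N = 30

def energy(pos):
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    coulomb = np.sum(1.0 / d[np.triu_indices(N, k=1)])   # pairwise long-range repulsion
    trap = 0.5 * np.sum(pos**2)                          # confining potential
    return coulomb + trap

pos = rng.uniform(-3.0, 3.0, size=(N, 2))
E = energy(pos)
for temperature in (1.0, 0.3, 0.1, 0.03, 0.01):          # crude annealing schedule
    for _ in range(20_000):
        i = rng.integers(N)
        trial = pos.copy()
        trial[i] += rng.normal(scale=0.1, size=2)
        dE = energy(trial) - E
        if dE < 0.0 or rng.random() < np.exp(-dE / temperature):
            pos, E = trial, E + dE
print(f"final energy per particle: {E / N:.3f}")
```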
174

Long range weather prediction and seasonal model performance from the public’s perspective

Bergman, Viktor January 2020 (has links)
Long-range weather prediction describes weather forecasts with a range longer than 14 days, but shorter than that of the climate models used to project climate change due to global warming. Long-range forecasting relies on sources of predictability that, for example, change slowly, such as sea surface temperatures, or vary in predictable patterns, such as the El Niño-Southern Oscillation. Many sectors of industry and society that today use weather forecasting in their day-to-day decision making, such as agriculture, energy, or any other weather-sensitive sector, potentially have much to benefit from accurate and reliable long-range forecasts. Among the potential users is, of course, the general public. The purpose of this study is to give an introduction to long-range weather prediction and to attempt to evaluate the performance of ECMWF's SEAS5 seasonal model, one of the most well-known and established S2S models, using earlier research and ECMWF's publicly available verification charts. This was done from the public's perspective of what would be considered a "good" forecast, mainly for near-surface air temperature but also precipitation, during winter and summer in Europe, in terms of skill (accuracy) and reliability (in the sense that an event forecast with 60% probability is also observed around 60% of the time). Overall, SEAS5 does not quite reach "good" skill levels for near-surface temperature, if "good" is defined as synoptically significant. The skill level varies considerably, however, depending on region and season, with southern/southeastern Europe during summer and parts of northern Europe during winter being notably "good" cases. Reliability is generally good, even if it is difficult to determine how it varies spatially. However, precipitation shows very little skill and low reliability, regardless of season or region.
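The reliability notion used above (an event forecast with 60% probability should be observed about 60% of the time) can be checked with a simple binning procedure. The sketch below uses synthetic forecasts rather than SEAS5 output.

```python
import numpy as np

# Sketch of a reliability check using synthetic forecasts (not SEAS5 output): bin the
# issued probabilities for an event and compare each bin with the observed frequency.
# For a reliable system the two agree, e.g. events forecast at 60% occur about 60% of the time.

rng = np.random.default_rng(1)
p_forecast = rng.uniform(0.0, 1.0, 5000)        # issued event probabilities
occurred = rng.random(5000) < p_forecast        # toy outcomes from a perfectly reliable system

bins = np.linspace(0.0, 1.0, 11)
for lo, hi in zip(bins[:-1], bins[1:]):
    mask = (p_forecast >= lo) & (p_forecast < hi)
    if mask.any():
        print(f"forecast {lo:.1f}-{hi:.1f}: observed frequency {occurred[mask].mean():.2f}")
```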
175

Long-Range High-Throughput Wireless Communication Using Microwave Radiation Across Agricultural Fields

Paul Christian Thieme (8151186) 19 December 2019 (has links)
Over the past three decades, agricultural machinery has made the transition from purely mechanical systems to hybrid machines, reliant on both mechanical and electronic systems. As this transformation continues, the most modern agricultural machinery uses networked systems that require a network connection to function to their full potential. In rural areas, providing this network connection has proven difficult. Obstacles, distance from access points, and incomplete cellular coverage are all challenges to be overcome. "Off the shelf" commercial-grade Wi-Fi equipment, including several Ubiquiti products such as the Bullet M2 transceiver and the PowerBeam point-to-point linking system, as well as antennas by Terrawave, Crane, and Hawking, was installed in a purpose-built system that could be implemented on a production farm. This system consisted of a tower-mounted access point using an antenna with a 65° beamwidth, and the tests covered distances up to 1150 meters in an agricultural setting with corn and soybeans. Some client stations were stationary, while a mobile platform, a tractor following a path around the farm, carried both 8 dBi and 15 dBi gain antennas. Through all tests, throughput never dropped below 5 Mb/s, and the latency of successful connections never exceeded 20 ms. Packets were rarely dropped and never accounted for a significant portion of all packet transmission attempts. Environmental factors such as immediate precipitation, crop heights, recent rainfall, and ambient temperature had little or no effect on wireless network characteristics. As a result, it was shown that as long as line-of-sight was maintained, reliable wireless connectivity could be achieved using microwave radiation despite varying conditions. Network throughput was only marginally affected by the change in free space path loss due to increased distance between the access point and the client, as well as by travel of the mobile client outside the beamwidth of the access point. By enabling this coverage, it is hoped that the implementation of new agricultural technology utilizing a live network connection will progress more rapidly.
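The free space path loss mentioned above follows the standard Friis-type relation FSPL(dB) = 20·log10(d) + 20·log10(f) + 20·log10(4π/c). The short sketch below evaluates it for assumed link parameters comparable to the field test; it is illustrative only and not the thesis' link-budget calculation.

```python
import math

# Illustrative free-space path loss estimate (assumed link parameters, not the thesis'
# link budget): FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c).

def fspl_db(distance_m: float, freq_hz: float) -> float:
    c = 299_792_458.0
    return (20.0 * math.log10(distance_m)
            + 20.0 * math.log10(freq_hz)
            + 20.0 * math.log10(4.0 * math.pi / c))

freq = 2.4e9                        # 2.4 GHz band, as used by the Bullet M2
for d in (100, 500, 1150):          # distances comparable to the field test
    print(f"{d:>5} m: FSPL ≈ {fspl_db(d, freq):.1f} dB")
```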
176

HALO-Based Research Conducted by the LIM: previous Campaigns and Plans for the Future

Schmidt, Jörg, Wendisch, Manfred, Wolf, Kevin, Ehrlich, André, Nitzsche, Gunda 13 November 2017 (has links)
This article gives an overview of the activities of the Leipzig Institute of Meteorology (LIM) within the HALO (High Altitude and Long Range Aircraft) Scientific Priority Program (SPP 1294, funded by the DFG). HALO offers unique possibilities for atmospheric research and Earth observation. It can carry a scientific payload of up to 3 t, cover a range of 10,000 km, and reach a ceiling of 15 km. The LIM contributes to the instrumentation of HALO with the Spectral Modular Airborne Radiation measurement sysTem (SMART). SMART was deployed during the first HALO mission, TECHNO, in 2010. During the five subsequent HALO campaigns, SMART measurements provided valuable insights into cloud properties and the Earth's radiative budget. Three further missions, scheduled for the coming years, will make use of SMART measurements as well.
177

Towards Improving Drought Forecasts Across Different Spatial and Temporal Scales

Madadgar, Shahrbanou 03 January 2014 (has links)
Recent water scarcities across the southwestern U.S., with severe effects on the living environment, motivate the development of new methodologies to achieve reliable drought forecasting at the seasonal scale. Reliable forecasting of hydrologic variables, in general, is a prerequisite for appropriate planning of water resources and for developing effective allocation policies. This study aims at developing new techniques with specific probabilistic features to improve the reliability of hydrologic forecasts, particularly drought forecasts. The drought status in the future is determined by certain hydrologic variables that are estimated by hydrologic models of rather simple to complex structure. Since the predictions of hydrologic models are prone to different sources of uncertainty, several techniques have been examined in recent years that generally attempt to combine the predictions of single or multiple hydrologic models to generate an ensemble of hydrologic forecasts addressing the inherent uncertainties. However, the imperfect structure of hydrologic models usually leads to systematic bias in hydrologic predictions, which further appears in the forecast ensembles. This study proposes a post-processing method that is applied to the raw forecast of hydrologic variables and develops the entire distribution of the forecast around the initial single-value prediction. To establish the probability density function (PDF) of the forecast, a group of multivariate distribution functions, the so-called copula functions, are incorporated in the post-processing procedure. The performance of the new post-processing technique is tested on 2500 hypothetical case studies and on the streamflow forecast of the Sprague River Basin in southern Oregon. Verified by several deterministic and probabilistic verification measures, the traditional Quantile Mapping post-processing technique cannot generate forecasts of the same quality as the copula-based method. The post-processing technique is then extended to study drought forecasts across different spatial and temporal scales. In the proposed drought forecasting model, the drought status in the future is evaluated based on the drought status of past seasons, while the correlations between the drought variables of consecutive seasons are preserved by copula functions. The main benefit of the new forecast model is its probabilistic treatment of future droughts. It develops the conditional probability of drought status in the forecast season and generates the PDF and cumulative distribution function (CDF) of future droughts given the past status. The conditional PDF returns the most probable drought in the future along with an assessment of the uncertainty around that value. Using the conditional CDF for the forecast season, the model can generate maps of drought status across the basin with a particular chance of occurrence in the future. In a different analysis of the conditional CDF developed for the forecast season, the chance of a particular drought in the forecast period can be approximated given the drought status of earlier seasons. The forecast methodology developed in this study shows promising results in hydrologic forecasting, and its probabilistic features are encouraging for future studies.
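As a hedged sketch of the copula idea described above (a Gaussian copula with assumed parameters stands in for whichever copula family and fitted margins the thesis actually uses), the conditional distribution of next season's drought variable given the current season's value can be sampled as follows.

```python
import numpy as np
from scipy import stats

# Hedged sketch of the copula idea (a Gaussian copula with assumed parameters stands in
# for whichever copula family and fitted margins the thesis uses): sample the conditional
# distribution of next season's drought variable given the current season's value.

rho = 0.6                                    # assumed season-to-season dependence
current_index = -1.2                         # assumed current standardized drought index

u_now = stats.norm.cdf(current_index)        # probability-integral transform of current value
z_now = stats.norm.ppf(u_now)                # latent Gaussian coordinate

# Gaussian copula conditional: Z_next | Z_now = z_now  ~  N(rho * z_now, 1 - rho**2)
z_next = stats.norm.rvs(loc=rho * z_now, scale=np.sqrt(1.0 - rho**2),
                        size=10_000, random_state=2)

next_margin = stats.gamma(a=2.0, scale=1.5)  # assumed fitted margin for next season's variable
next_vals = next_margin.ppf(stats.norm.cdf(z_next))   # map latent samples through the margin

print(f"median forecast: {np.median(next_vals):.2f}")
print(f"P(next-season value < 1.0 | current) ≈ {np.mean(next_vals < 1.0):.2f}")
```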
178

Convergence of Large Deviations Probabilities for Processes with Memory - Models and Data Study

Massah, Mozhdeh 17 April 2019 (has links)
A commonly used tool in data analysis is the sample mean. Assuming a uni-modal distribution, the mean provides valuable information about which value is typically found in an observation. It is also one of the simplest and therefore most robust statistics to compute, and it suffers much less from sampling effects in the tails of the distribution than estimates of higher moments. In the context of a time series, the sample mean is a time average. Due to correlations among successive data points, the information stored in a time series can be much less than the information stored in a sample of independently drawn data points of equal size, since correlation always implies redundancy. Hence, how close the sample estimate of a time average is to the true mean value of the process depends on the correlations in the data. In this thesis, we study the probability that a single time average deviates by more than some threshold value from the true process mean. This is called the Large Deviation Probability (LDP), and it is a function of the time interval over which the average is taken: the longer the time interval, the smaller this probability. However, it is the precise functional form of this decay that is the focus of this thesis. The LDP is proven to decay exponentially for independently and identically distributed data. On the other hand, we show in this thesis that this result does not apply to long-range correlated data, for which the LDP decays slower than exponentially. For intermittent series the exponential decay breaks down severely and the LDP follows a power law. These findings are outlined in the methodological explanations in chapter 3, after an overview of the theoretical background in chapter 2. In chapter 4, the theoretical and numerical results for the models studied in chapter 3 are compared to two types of empirical data sets, both known from the literature to be long-range correlated. The Earth surface temperature records of two stations from two climatic zones are modelled, and the error bars for the finite-time averages are estimated. Knowing that the data are long-range correlated, from the estimated scaling exponent of the so-called fluctuation function, the LDP estimation leads to noticeably enlarged error bars of time averages, based on the results in chapter 3. The same analysis is applied to heart inter-beat data in chapter 5. The contradiction to the classical large deviation principle is even more severe in this case, induced by the long-range correlations and additional inherent non-stationarity. It is shown that the inter-beat intervals can be well modelled by bounded fractional Brownian motion. The theoretical and numerical LDP, both for the model and the data, surprisingly indicates no clear decay of the LDP on the time scales under study.
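The empirical LDP described above can be estimated directly from a series by block-averaging. The sketch below uses i.i.d. synthetic data as a reference; for long-range correlated or intermittent series, the thesis' point is that this probability decays much more slowly than the exponential behaviour seen here.

```python
import numpy as np

# Sketch of an empirical LDP estimate from synthetic i.i.d. data (not the thesis series):
# the probability that a finite-time average deviates from the process mean by more than
# a threshold, as a function of the averaging window T. For long-range correlated or
# intermittent data the thesis finds a much slower, non-exponential decay.

rng = np.random.default_rng(3)
x = rng.normal(size=2_000_000)          # i.i.d. reference series
eps = 0.2                               # deviation threshold

for T in (10, 30, 100, 300, 1000):
    blocks = x[: (len(x) // T) * T].reshape(-1, T).mean(axis=1)   # non-overlapping time averages
    ldp = np.mean(np.abs(blocks - x.mean()) > eps)
    print(f"T = {T:>4}: P(|time average - mean| > {eps}) ≈ {ldp:.4f}")
```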
179

Assessing and documenting the use of indigenous knowledge systems in weather and seasonal climate forecasting : a case study of Moletjie villages, Limpopo Province, South Africa

Chokoe, Serole Angela January 2022 (has links)
Thesis (M.Sc. (Geography and Environmental Studies)) -- University of Limpopo, 2022 / The purpose of this study was to assess and document the use of the Indigenous Knowledge System (IKS) in weather and seasonal climate forecasting in Moletjie, Limpopo Province. The objectives of this study were to document the biological, meteorological, and astrological indicators used to forecast local weather and climate conditions. In addition, the study sought to determine the effectiveness of the indicators used in forecasting local weather and climate conditions and to represent the spatial distribution of indigenous weather knowledge using a Participatory Geographic Information System (P-GIS). Both qualitative and quantitative research approaches were used. Surveys and in-depth interviews were used to obtain primary data. Secondary data collected from formally published material (books and journals) served to complement the primary data. Purposive sampling was used to obtain data from participants with knowledge of Indigenous Knowledge (IK) in seasonal and weather forecasting. The distribution of indigenous indicators used by Moletjie smallholder farmers was mapped using ArcGIS 10.6. Results show the different types of human, biological, meteorological, and astronomical indicators utilized by communities to forecast weather and climate; these were captured, characterized, and documented. Smallholder farmers considered indigenous knowledge indicators to be more accurate and reliable for their forecasting than scientific projections, which were described as inaccurate and received late. The research found that the plants and birds used as indicators are in danger of extinction at an alarming rate because of climate change and anthropogenic activities. The study derives key insights from how smallholder farmers in the Moletjie villages use IK weather forecasts to make farming decisions that ensure farm productivity. It also demonstrates that the smallholder farmers trust modern technology and have the confidence and readiness to use scientific forecasting and climate projections, which are expected to add value when integrated with IK. / National Research Foundation (NRF)
180

Low-power high-resolution image detection

Merchant, Caleb 09 August 2019 (has links)
Many image processing algorithms exist that can accurately detect humans and other objects such as vehicles and animals. Many of these algorithms require large amounts of processing, often calling for hardware acceleration with powerful central processing units (CPUs), graphics processing units (GPUs), field programmable gate arrays (FPGAs), etc. Implementing an algorithm that can detect objects such as humans at longer ranges makes these hardware requirements even more demanding, as the number of pixels necessary to detect objects at both close and long ranges is greatly increased. Comparing the performance of different low-power implementations can be used to determine a trade-off between performance and power. An image differencing algorithm is proposed along with selected low-power hardware that is capable of detecting humans at ranges of 500 m. Multiple versions of the detection algorithm are implemented on the selected hardware and compared for run-time performance on a low-power system.
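A minimal sketch of a frame-differencing detector of the kind referred to above (thresholds, connectivity, and minimum region size are assumed; this is not the thesis' implementation): subtract a background frame, threshold the absolute difference, and report connected changed regions as candidate detections.

```python
import numpy as np
from scipy import ndimage

# Minimal frame-differencing sketch (thresholds and sizes assumed; not the thesis'
# implementation): subtract a background frame, threshold the absolute difference, and
# report connected changed regions as candidate detections.

def detect_changes(background, frame, threshold=25, min_pixels=20):
    """Return bounding boxes (row0, col0, row1, col1) of regions that changed."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16)) > threshold
    labels, _ = ndimage.label(diff)                  # connected-component labelling
    boxes = []
    for region in ndimage.find_objects(labels):
        if region is not None and diff[region].sum() >= min_pixels:
            boxes.append((region[0].start, region[1].start, region[0].stop, region[1].stop))
    return boxes

# Toy usage: a bright blob appears in an otherwise static grayscale scene.
bg = np.zeros((240, 320), dtype=np.uint8)
fr = bg.copy()
fr[100:120, 150:160] = 200
print(detect_changes(bg, fr))    # expect one box around the inserted blob
```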
