11

Demystifying The Hosting Infrastructure of The Free Content Web: A Security Perspective

Alqadhi, Mohammed 01 January 2024 (has links) (PDF)
This dissertation examines the security of free content websites, a crucial component of the internet that presents significant security challenges due to its susceptibility to exploitation by malicious actors. While prior research has highlighted the security disparities between free and premium content websites, it has not examined the underlying causes. This study addresses that gap by analyzing the security infrastructure of free content websites. The research begins with an analysis of the content management systems (CMSs) these websites employ and the role CMSs play in their security. Data from 1,562 websites spanning free and premium categories is collected to identify CMS usage and its association with malicious activities, using metrics that include unpatched vulnerabilities, total vulnerabilities, malicious counts, and percentiles. The findings reveal widespread CMS usage, even among websites with custom code, underscoring how a small number of unpatched vulnerabilities in popular CMSs can lead to significant maliciousness. The study then explores the global distribution of free content websites, considering hosting network scale, cloud service provider utilization, and country-level distribution. Notably, both free and premium content websites are predominantly hosted in medium-scale networks, which are known for their high concentration of malicious websites. The research also examines the geographical distribution of these websites across countries, correlating the occurrence of malicious websites with the National Cyber Security Index (NCSI), a measure of a country's cybersecurity maturity. The United States emerges as the primary host for most investigated websites, and countries with higher rates of malicious websites tend to have lower NCSI scores, primarily due to weaker privacy policy development. In conclusion, this dissertation uncovers correlations in the infrastructure, distribution, and geographical aspects of free content websites, offering valuable insights for mitigating their associated threats.
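As a concrete illustration of the country-level analysis described above, the sketch below correlates each country's rate of malicious free content websites with its NCSI score using a rank correlation. All data values, the country set, and the choice of Spearman correlation are assumptions for illustration only; the dissertation's actual data are not reproduced in this abstract.

```python
# Hypothetical sketch of the country-level analysis: correlate the rate
# of malicious free-content websites per country with that country's
# National Cyber Security Index (NCSI) score. All values are invented
# placeholders, not figures from the dissertation.
from scipy.stats import spearmanr

# country -> (malicious_site_rate, ncsi_score); placeholder values
countries = {
    "US": (0.12, 71.4),
    "DE": (0.08, 90.9),
    "RU": (0.31, 59.7),
    "NL": (0.10, 83.1),
    "CN": (0.27, 47.4),
}

rates = [rate for rate, _ in countries.values()]
ncsi = [score for _, score in countries.values()]

# A negative coefficient would match the finding that countries hosting
# more malicious websites tend to score lower on the NCSI.
rho, p_value = spearmanr(rates, ncsi)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```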
12

Human optokinetic nystagmus : a stochastic analysis

Waddington, Jonathan January 2012 (has links)
Optokinetic nystagmus (OKN) is a fundamental gaze-stabilising response in which eye movements attempt to compensate for the retinal slip caused by self-motion. The OKN response consists of a slow following movement (the slow phase, SP) made in the direction of stimulus motion, interrupted by fast eye movements (quick phases, QPs) made primarily in the opposite direction. The timing and amplitude of these slow and quick phases are notably variable, but this variability is poorly understood. In this study I performed principal component analysis on OKN parameters in order to investigate how the eigenvectors and eigenvalues of the underlying components contribute to the correlation between OKN parameters over time. I found three categories of principal components that could explain the variance within each cycle of OKN, and only parameters from within a single cycle contributed highly to any given component. Differences found in the correlation matrices of OKN parameters appear to reflect changes in the eigenvalues of components, while eigenvectors remain predominantly similar across participants and trials. I have developed a linear, stochastic model of OKN based on these results and demonstrated that OKN can be described as a first-order Markov process, with three sources of noise affecting SP velocity, QP triggering, and QP amplitude. I have used this model to make some important predictions about the optokinetic reflex: the transient response of SP velocity, the existence of signal-dependent noise in the system, the target position of QPs, and the threshold at which QPs are generated. Finally, I investigate whether the significant variability within OKN may represent adaptive control of explicit and implicit parameters.
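A minimal simulation sketch of the kind of model described, a first-order Markov process with noise on slow-phase velocity, quick-phase triggering, and quick-phase amplitude, might look like the following. All parameter values and the specific noise structure are illustrative assumptions, not the fitted values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not the thesis's fitted values)
dt = 0.01          # time step (s)
v_sp = 10.0        # mean slow-phase velocity (deg/s), follows the stimulus
sigma_v = 1.0      # noise on slow-phase velocity
threshold = 5.0    # eye position at which a quick phase triggers (deg)
sigma_trig = 0.5   # noise on quick-phase triggering
target = -5.0      # nominal quick-phase target position (deg)
sigma_qp = 0.8     # noise on quick-phase amplitude

pos = 0.0
trace = []
for _ in range(2000):
    # Slow phase: drift in the stimulus direction with velocity noise.
    pos += (v_sp + sigma_v * rng.standard_normal()) * dt
    # Quick-phase triggering: a noisy position threshold.
    if pos > threshold + sigma_trig * rng.standard_normal():
        # Quick phase: reset toward a target position, with amplitude noise.
        pos = target + sigma_qp * rng.standard_normal()
    trace.append(pos)
# The next state depends only on the current position plus noise,
# i.e. a first-order Markov process as described in the abstract.
```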
13

Extension and reformulation of the SEA model by taking into account the distribution of modal energies

Maxit, Laurent 06 March 2000 (has links) (PDF)
In this thesis, an approach is proposed for extending the domain of validity of the SEA (Statistical Energy Analysis) method. It rests on a dual modal formulation and on a reformulation of the SEA model that does not assume equipartition of modal energies. The dual modal formulation, described for the general case of coupled three-dimensional continuous systems, consists of a non-standard modal decomposition involving a dual stress-displacement formulation. The resulting modal equations are consistent with the assumed SEA model and are characterized from the modes of the uncoupled subsystems. The SmEdA model that follows from this reformulation of SEA improves the quality of the prediction, particularly when modal overlap is low or when the subsystems are excited locally. One of the strengths of the proposed approach is that it can easily be combined with a standard SEA procedure: the SmEdA model can be applied only to those subsystem couplings where an improvement in the prediction is expected, with the SEA model used for the remaining couplings. Applying the SmEdA model to industrial structures is possible through the use of finite element models of the subsystems. When the equipartition hypothesis is assumed to hold, the approach also yields a new technique for computing SEA coupling loss factors. It requires only the computation, by finite elements, of the modes of the uncoupled subsystems; the SEA factors are then obtained by identifying the coefficients of the modal equations, without solving them.
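For context, the classical SEA power balance that the thesis relaxes, and the modal-level balance that SmEdA writes in its place, can be sketched as follows. The notation is assumed for illustration (E: energies, η: loss factors, n: modal densities, β: modal coupling factors); see the thesis for the exact derivation.

```latex
% Classical SEA power balance for subsystem i (equipartition assumed):
% injected power = power dissipated + net power exchanged with the others.
\Pi_i^{\mathrm{inj}} = \omega\,\eta_i E_i
  + \sum_{j \neq i} \omega\left(\eta_{ij} E_i - \eta_{ji} E_j\right),
\qquad n_i\,\eta_{ij} = n_j\,\eta_{ji}.

% SmEdA drops the equipartition hypothesis and writes the balance per
% mode: mode p of one subsystem exchanges power with each mode q of the
% coupled subsystem through a modal coupling factor \beta_{pq} computed
% from the modes of the uncoupled subsystems.
\Pi_p^{\mathrm{inj}} = \omega_p\,\eta_p E_p
  + \sum_{q} \beta_{pq}\left(E_p - E_q\right).
```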
14

Distribution Tables and Federal Tax Policy: A Scoring Index as a Method for Evaluation

Fichtner, Jason J. 18 November 2005 (has links)
Distribution tables have become ubiquitous in the tax policy debates surrounding major legislative initiatives to change tax law at the federal level, and the fairness of any proposed change to federal tax policy has become one of the most highlighted components of tax policy discussions. The presentation of tax data within distribution tables can hide or omit important information that is required to effectively evaluate the merits of any tax legislation, and many producers of distribution tables show only the information necessary to present their policy preferences in the best possible light. The different economic assumptions and presentations of data used by the various groups that release distribution tables have the inherent consequence of providing the public with numerous tables that are often used as political ammunition to influence and shape debate. The purpose of this research is to contribute to the tax policy research literature by exploring the limitations and biases inherent in specific designs of tax distribution tables and in specific methodological approaches to tax distribution analysis, through a systematic examination of how different designs and methodologies provide an incomplete picture of a proposed change to federal tax policy. By comparing distribution tables as used by different groups to provide alternative perspectives on various tax proposals, the research shows how tax distribution tables often yield misleading results about the impact of proposed tax legislation in order to influence and shape the issues surrounding a proposed change to federal tax policy. A method for evaluating tax distribution tables is proposed that highlights the deficiencies of design and methodology characterizing their present use: an index of questions, termed the "Tax Distribution Table Scoring Index" (TDTSI), offered as a new tool of policy analysis. The TDTSI will assist in balancing the different perspectives presented via tax distribution tables by identifying the biases and limitations associated with different methodologies and presentations of data. / Ph. D.
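A scoring index of this kind can be made concrete with a short sketch. The criteria below are invented placeholders, not the TDTSI's actual questions, which are defined in the dissertation; only the general shape (yes/no answers aggregated into a score) is illustrated.

```python
# Hypothetical sketch of a scoring index like the TDTSI described above.
# The criteria are invented placeholders for illustration only.
CRITERIA = [
    "Reports the income measure used (e.g., cash vs. economic income)?",
    "Discloses incidence assumptions for each tax?",
    "Shows absolute tax change, not only percentage change?",
    "Breaks results out by income percentile rather than broad classes?",
    "States the baseline against which the proposal is measured?",
]

def tdtsi_score(answers: list[bool]) -> float:
    """Score a distribution table as the fraction of criteria satisfied."""
    if len(answers) != len(CRITERIA):
        raise ValueError("one yes/no answer per criterion")
    return sum(answers) / len(answers)

# Example: a table satisfying 3 of the 5 placeholder criteria scores 0.6
print(tdtsi_score([True, False, True, True, False]))
```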
15

Design of deep beams according to the truss analogy: a parametric study

Bondsman, Benjamin, Al, Barzan, Hedlund, Felix January 2019 (has links)
No description available.
16

Air conditioning design in a multifunctional sports hall

Navrátil, Radek Unknown Date (has links)
The diploma thesis deals with the design of air-conditioning equipment for a multifunctional hall building in Brno and also includes an experimental analysis of air flow. It consists of three parts. The first part elaborates the theory of air distribution. The second part analyzes air flow for different types of air diffusers and, for one diffuser, develops a simplified model. The last part focuses on the design of the air conditioning equipment for the assigned building, the multifunctional sports hall.
17

Automated Leaf-Level Hyperspectral Imaging of Soybean Plants using an UAV with a 6 DOF Robotic Arm

Jialei Wang (11147142) 19 July 2021 (has links)
Nowadays, soybean is one of the most consumed crops in the world. As the human population continuously increases, new phenotyping technology is needed to help plant scientists breed soybean with high-yield, stress-tolerant, and disease-tolerant traits. Hyperspectral imaging (HSI) is one of the most commonly used technologies for phenotyping. Current HSI techniques include HSI towers and remote sensing from an unmanned aerial vehicle (UAV) or satellite, and they suffer from several noise sources such as changes in lighting conditions, leaf angle, and other environmental factors. To reduce the noise in HS images, a portable, leaf-level, high-resolution HSI device called LeafSpec was developed for corn leaves in 2018. Because the previous design required a sliding action along the leaf that could damage a soybean leaf, a new LeafSpec design was built to meet the requirements of scanning soybean leaves: the new device protects the leaf between two sheets of glass, and the scanning action is automated with motors and servos. After HS images have been collected, the current modeling method averages all the plant pixels into one spectrum, which loses information because of the non-uniformity of the leaf. Two modeling methods were compared for predicting the nitrogen content of soybean plants: one uses the mean normalized difference vegetation index (NDVI) of the leaf, the other uses the NDVI heatmap of the entire leaf. The heatmap model shows a significant increase in prediction accuracy, with R² rising from 0.805 to 0.871. It can therefore be concluded that the changes occurring within the leaf can be used to train a better prediction model.

Although the LeafSpec device provides high-resolution, leaf-level HS images to researchers for the first time, it suffers from two major drawbacks: the intensive labor needed to gather the image data and slow throughput. A new idea is proposed to use a UAV carrying a 6 degree of freedom (DOF) robotic arm with a LeafSpec device as an end-effector to automatically gather soybean leaf HS images. A new UAV was designed and built to carry the large payload of the robotic arm and LeafSpec.
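The difference between the two modeling methods can be made concrete with a short sketch: the mean-NDVI method reduces the leaf to a single number, while the heatmap method keeps a per-pixel NDVI map. The band positions (~670 nm red, ~800 nm NIR) and the function below are common conventions assumed for illustration, not the LeafSpec implementation.

```python
import numpy as np

def ndvi_heatmap(cube: np.ndarray, wavelengths: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI from a hyperspectral cube of shape (rows, cols, bands).

    NDVI = (NIR - Red) / (NIR + Red), computed per pixel. Band choices
    are assumptions; the study's pipeline may use different bands.
    """
    red = cube[:, :, np.argmin(np.abs(wavelengths - 670.0))]
    nir = cube[:, :, np.argmin(np.abs(wavelengths - 800.0))]
    denom = nir + red
    # Guard against division by zero on dark/background pixels.
    with np.errstate(divide="ignore", invalid="ignore"):
        ndvi = np.where(denom != 0, (nir - red) / denom, 0.0)
    return ndvi

# The mean-NDVI feature is then ndvi_heatmap(...).mean(): one number,
# discarding the within-leaf variation that the heatmap model exploits.
```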
18

Synergistic use of promoter prediction algorithms: A choice for small training dataset?

Oppon, Ekow CruickShank January 2000 (has links)
Philosophiae Doctor - PhD / This chapter outlines basic gene structure and how gene structure is related to promoter structure in both prokaryotes and eukaryotes and their transcription machinery. An in-depth discussion is given of the variation in promoter types among prokaryotes and eukaryotes, as well as among three prokaryotic organisms, namely E. coli, B. subtilis, and Mycobacteria, with emphasis on M. tuberculosis. The simplest definition that can be given for a promoter is: it is a segment of deoxyribonucleic acid (DNA) sequence located upstream of the 5' end of the gene where the RNA polymerase enzyme binds prior to transcription (synthesis of an RNA chain representative of one strand of the duplex DNA). However, promoters are more complex than defined above. For example, not all sequences upstream of genes can function as promoters, even though they may have features similar to some known promoters (from section 1.2). Promoters are therefore specific sections of DNA sequence that are recognized by specific proteins, and they thus differ from other sections of DNA sequence that are transcribed or translated. The information for directing RNA polymerase to the promoter has to be in the section of DNA sequence defining the promoter region. Transcription in prokaryotes is initiated when the enzyme RNA polymerase forms a complex with sigma factors at the promoter site. Before transcription, RNA polymerase must form a tight complex with the sigma/transcription factor(s) (figure 1.1). The 'tight complex' is then converted into an 'open complex' by melting of a short region of DNA within the sequence involved in the complex formation. The final step in transcription initiation involves joining of the first two nucleotides in a phosphodiester linkage (nascent RNA), followed by the release of the sigma/transcription factors. RNA polymerase then continues with the transcription by making a transition from initiation to elongation of the nascent transcript.
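The "synergistic use" in the title refers to combining the outputs of several promoter prediction algorithms. One minimal way such predictions could be combined, simple majority voting over per-position calls, is sketched below; the predictor names, the 0/1 call interface, and the voting rule are all assumptions for illustration, as real tools emit richer output.

```python
# Minimal sketch: combine per-position promoter calls from several
# predictors by majority vote. The predictors and their binary-call
# interface are hypothetical.
def majority_vote(calls: dict[str, list[int]]) -> list[int]:
    """calls maps predictor name -> 0/1 call per candidate position."""
    n_predictors = len(calls)
    length = len(next(iter(calls.values())))
    combined = []
    for i in range(length):
        votes = sum(pred[i] for pred in calls.values())
        # Call a promoter where a strict majority of predictors agree.
        combined.append(1 if votes * 2 > n_predictors else 0)
    return combined

calls = {
    "predictorA": [1, 0, 1, 1, 0],
    "predictorB": [1, 0, 0, 1, 0],
    "predictorC": [0, 0, 1, 1, 1],
}
print(majority_vote(calls))  # [1, 0, 1, 1, 0]
```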
19

Early Detection of Dicamba and 2,4-D Herbicide Injuries on Soybean with LeafSpec, an Accurate Handheld Hyperspectral Leaf Scanner

Zhongzhong Niu (13133583) 22 July 2022 (has links)
Dicamba (3,6-dichloro-2-methoxybenzoic acid) and 2,4-D (2,4-dichlorophenoxyacetic acid) are two widely used herbicides for broadleaf weed control in soybeans. However, off-target application of dicamba and 2,4-D can cause severe damage to sensitive vegetation and crops. Early detection and assessment of off-target damage caused by these herbicides are necessary to help plant diagnostic labs and state regulatory agencies collect more information on on-site conditions and develop solutions to resolve the issue in the future. In 2021, a study was conducted to detect damage to soybean leaves caused by dicamba and 2,4-D using LeafSpec, an accurate handheld hyperspectral leaf scanner. High-resolution, single-leaf hyperspectral images of 180 greenhouse soybean plants exposed to nine different herbicide treatments were taken 1, 7, 14, 21, and 28 days after herbicide spraying. Pairwise PLS-DA models based on spectral features were able to distinguish leaf damage caused by herbicides with two different modes of action, specifically dicamba and 2,4-D, as early as 2 hours after herbicide spraying. In the spatial distribution analysis, texture and morphological features were selected for separating the dosages of herbicide treatments. Compared to the mean-spectrum method, new models built upon spectral, texture, and morphological features improved the overall accuracy to over 70% for all evaluation dates, and the combined features can classify the correct dosage of the right herbicide as early as 7 days after spraying. Overall, this work demonstrates the potential of using spectral and spatial features of LeafSpec hyperspectral images for early and accurate detection of dicamba and 2,4-D damage in soybean plants.
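A pairwise PLS-DA classifier of the kind described can be sketched with scikit-learn by regressing a binary class label on the spectra and thresholding the continuous prediction. The data, band count, and number of latent components below are placeholders, not the study's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# X: leaf spectra, one row per leaf (random placeholders here);
# y: 0 = dicamba, 1 = 2,4-D. The component count is an assumption.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 300))   # 60 leaves x 300 spectral bands
y = np.repeat([0, 1], 30)

pls = PLSRegression(n_components=10)
pls.fit(X, y)

# PLS-DA: regress the class label on the spectra, then threshold the
# continuous prediction at 0.5 to assign a class.
y_hat = (pls.predict(X).ravel() > 0.5).astype(int)
print("training accuracy:", (y_hat == y).mean())
```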
20

Developing Methods for Measuring in Vivo Lipid Metabolism

Denton, Russell L. 17 February 2023 (has links) (PDF)
Lipid metabolism is critically important to the normal function of individual cells and to signaling between tissues of the human body. There is a great need to quantify changes in lipid metabolism because of its implications in both normal healthy conditions and during disease development. In our first study, we developed a liquid chromatography-mass spectrometry based workflow to assess in vivo metabolism of murine brain lipids, spanning sample preparation, data acquisition, and analysis software development. For sample preparation, we maintained the mice at 5% deuterium oxide (D2O) in body water for the duration of the experiment. As mouse metabolism proceeds, enzymes incorporate deuterium atoms (D) from D2O into lipid C-H bonds, and these newly synthesized D-labeled lipids display shifts in their isotopic envelopes. We measured these isotopic envelopes by mass spectrometry and calculated the envelope shifts to deduce several metabolic metrics, which we used to make inferences about changes in metabolism: the n-value, fraction new, rate, and asymptote for each lipid. We deduced the n-value by replacing one hydrogen at a time with a deuterium in the lipid's theoretical chemical formula; the number of deuterium atoms in the theoretical D-labeled formula that agrees best with its respective empirical spectrum is the n-value. A large part of this effort was assessing the reproducibility and quality control of n-values derived from empirical spectra. We compared these n-values to two sets of ground-truth n-values that we generated: one set by referencing biochemical pathways and published n-values, the other deduced with a linear algebra approach. Both sets of ground-truth n-values correlate well with the n-values from empirical spectra. Using these n-values, we calculated the fraction new for each lipid, which indicates what percentage of a lipid's pool is newly synthesized at a given time. For a given lipid, we calculated the fraction new at each time point and for each biological replicate, plotted these values against time in days, and from this fraction-new-versus-time curve deduced the lipid's asymptote and rate constant. In our second study, we added the dimension of drift time to the data acquisition and analysis using ion mobility spectrometry, so that we could further separate lipid isomers and prevent spectral convolution. Preliminary results suggest that lipid isomers may have distinct metabolic regulation.
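The asymptote and rate constant described in the final step suggest a standard rise-to-asymptote fit of fraction new against time. A sketch under that assumption follows; the functional form f(t) = A(1 - e^(-kt)) and all data values are placeholders, not the study's fitted results.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed turnover model: fraction new rises to an asymptote A with
# rate constant k, i.e. f(t) = A * (1 - exp(-k * t)).
def fraction_new(t, A, k):
    return A * (1.0 - np.exp(-k * t))

# Placeholder time course (days) and fraction-new values for one lipid
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0, 16.0])
f = np.array([0.00, 0.18, 0.31, 0.48, 0.61, 0.66])

# Fit the curve; p0 gives a rough initial guess for (A, k).
(A, k), _ = curve_fit(fraction_new, t, f, p0=[0.7, 0.3])
print(f"asymptote = {A:.2f}, rate constant = {k:.2f} per day")
```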
