About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Analysis of Discrete Fractional Operators and Discrete Fractional Rheological Models

Uyanik, Meltem 01 May 2015
This thesis comprises two main parts: monotonicity results for discrete fractional operators, and discrete fractional rheological constitutive equations. In the first part of the thesis, we introduce and prove new monotonicity concepts in discrete fractional calculus. In the remainder, we extend previous results on fractional rheological models to the discrete fractional setting; as in earlier work, the discrete approach is expected to yield a clearer understanding than its continuous counterpart. In the first chapter, we briefly summarize the main results. In the second chapter, we present fundamental definitions and formulas of discrete fractional calculus. In the third chapter, we introduce two new monotonicity concepts for nonnegative or nonpositive functions defined on discrete domains, and then prove monotonicity criteria based on the sign of the fractional difference operator of a function. In the fourth chapter, we turn to the rheological models: we give a brief introduction to rheological models such as Maxwell and Kelvin-Voigt, and then construct and solve discrete fractional rheological constitutive equations. Finally, we close the thesis with conclusions and future work.
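The fractional difference operators mentioned in this abstract are built from fractional sums. As an illustrative sketch (not taken from the thesis; the function names and the delta-difference convention are assumptions), the ν-th order fractional sum can be evaluated directly from its defining formula via the generalized falling factorial t^(ν) = Γ(t+1)/Γ(t+1−ν):

```python
import math

def falling(t, nu):
    # Generalized falling factorial t^(nu) = Gamma(t+1) / Gamma(t+1-nu).
    return math.gamma(t + 1.0) / math.gamma(t + 1.0 - nu)

def frac_sum(f, a, nu, t):
    """Delta fractional sum of order nu > 0:
    (Delta_a^{-nu} f)(t) = (1/Gamma(nu)) * sum_{s=a}^{t-nu} (t-s-1)^(nu-1) f(s),
    for t in {a+nu, a+nu+1, ...}."""
    total = 0.0
    s = a
    while s <= t - nu + 1e-9:          # s runs over a, a+1, ..., t-nu
        total += falling(t - s - 1.0, nu - 1.0) * f(s)
        s += 1
    return total / math.gamma(nu)

# Order nu = 1 recovers the ordinary sum: sum_{s=0}^{4} s = 10.
print(frac_sum(lambda s: float(s), a=0, nu=1.0, t=5))   # 10.0
```

For ν = 1 the kernel collapses to 1 and the ordinary sum is recovered, which is a convenient sanity check on any implementation.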
72

Computational Methods for the Optimal Reconstruction of Material Properties in Complex Multiphysics Systems

Bukshtynov, Vladislav 04 1900
In this work we propose and validate a computational method for reconstructing constitutive relations (material properties) in complex multiphysics phenomena from incomplete and noisy measurements; the method is applicable to a range of problems arising in nonequilibrium thermodynamics and continuum mechanics. The parameter estimation problem is solved as PDE-constrained optimization using a gradient-based technique in the optimize-then-discretize framework. The reconstructed material properties taken as examples here are the transport coefficients characterizing diffusion processes, such as the viscosity and the thermal conductivity, and we focus on problems in which these coefficients depend on the state variables of the system. The proposed method allows one to reconstruct a smooth constitutive relation defined over a broad range of the dependent variable. This research is motivated by questions arising in the computational analysis and optimization of advanced welding processes, which involve modelling complex alloys in the liquid phase at high temperatures. / Doctor of Philosophy (PhD)
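The gradient-based parameter-estimation loop described in this abstract can be caricatured in a few lines. The sketch below is a hypothetical scalar analogue, not the thesis's adjoint-based method: it recovers a constant conductivity k in the model problem -k u'' = 1 on (0,1) with u(0) = u(1) = 0 (exact solution u(x) = x(1-x)/(2k)) from synthetic measurements, by Gauss-Newton iteration on the least-squares misfit:

```python
def u_model(x, k):
    # Exact solution of -k u''(x) = 1 on (0,1), u(0) = u(1) = 0.
    return x * (1.0 - x) / (2.0 * k)

xs = [i / 10.0 for i in range(1, 10)]          # measurement points
data = [u_model(x, 2.0) for x in xs]           # synthetic data, k_true = 2

k = 0.5                                        # initial guess
for _ in range(20):
    r = [u_model(x, k) - d for x, d in zip(xs, data)]     # residuals
    j = [-x * (1.0 - x) / (2.0 * k * k) for x in xs]      # du/dk sensitivities
    k -= sum(ri * ji for ri, ji in zip(r, j)) / sum(ji * ji for ji in j)
print(round(k, 6))                             # -> 2.0
```

The real problem replaces the scalar k with a state-dependent constitutive relation and the analytic sensitivity with an adjoint PDE solve, but the structure (residual, sensitivity, update) is the same.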
73

General Nonlinear-Material Elasticity in Classical One-Dimensional Solid Mechanics

Giardina, Ronald Joseph, Jr 05 August 2019
We will create a class of generalized ellipses and explore their ability to define a distance on a space and generate continuous, periodic functions. Connections between these continuous, periodic functions and the generalizations of trigonometric functions known in the literature shall be established, along with connections between these generalized ellipses and some spectrahedral projections onto the plane, more specifically the well-known multifocal ellipses. The superellipse, or Lamé curve, will be a special case of the generalized ellipse. Applications of these generalized ellipses shall be explored with regard to some one-dimensional systems of classical mechanics. We will adopt the Ramberg-Osgood relation for stress and strain, ubiquitous in engineering mechanics, and define a general internal bending moment for which this expression, and several others, are special cases. We will then apply this general bending moment to some one-dimensional Euler beam-columns along with the continuous, periodic functions we developed with regard to the generalized ellipse. This will allow us to construct new solutions for critical buckling loads of Euler columns and deflections of beam-columns under very general engineering material requirements, without some of the usual assumptions associated with the Ramberg-Osgood relation.
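A point on the Lamé curve (the superellipse special case named in this abstract) can be generated from a trigonometric parametrization. The sketch below is illustrative only; the exponent convention and function names are assumptions, not the thesis's generalized-ellipse construction:

```python
import math

def lame_point(theta, a=1.0, b=1.0, p=2.5):
    # Parametric point on the Lame curve |x/a|^p + |y/b|^p = 1.
    c, s = math.cos(theta), math.sin(theta)
    sgn = lambda v: (v > 0) - (v < 0)
    return (a * sgn(c) * abs(c) ** (2.0 / p),
            b * sgn(s) * abs(s) ** (2.0 / p))

x, y = lame_point(0.7, a=2.0, b=1.0, p=4.0)
# The point satisfies the curve equation: |x/a|^p + |y/b|^p reduces to
# cos^2 + sin^2 = 1 for any p > 0.
print(round(abs(x / 2.0) ** 4 + abs(y) ** 4, 10))   # 1.0
```

Setting p = 2 recovers the ordinary ellipse, and p ≠ 2 yields the "squarer" (p > 2) or "pointier" (p < 2) superellipses.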
74

Penalized mixed-effects ordinal response models for high-dimensional genomic data in twins and families

Gentry, Amanda E. 01 January 2018
The Brisbane Longitudinal Twin Study (BLTS) is conducted in Australia and funded by the US National Institute on Drug Abuse (NIDA). Adolescent twins were sampled as part of this study and surveyed about their substance use as part of the Pathways to Cannabis Use, Abuse and Dependence project. The methods developed in this dissertation were designed to analyze a subset of the Pathways data that includes demographics, cannabis use metrics, personality measures, and imputed genotypes (SNPs) for 493 complete twin pairs (986 subjects). The primary goal was to determine what combination of SNPs and additional covariates may predict cannabis use, measured on an ordinal scale as "never tried," "used moderately," or "used frequently." To conduct this analysis, we extended the ordinal Generalized Monotone Incremental Forward Stagewise (GMIFS) method for mixed models. This extension allows an unpenalized set of covariates to be coerced into the model, as well as flexibility for user-specified correlation patterns between twins in a family. The proposed methods are applicable to high-dimensional (genomic or otherwise) data with an ordinal response and a specific, known covariance structure within clusters.
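The core of a forward stagewise method is simple: repeatedly nudge the single coefficient with the steepest descent direction by a tiny fixed increment. The sketch below is a simplified linear-regression analogue (data and names are hypothetical), not the ordinal mixed-effects extension the dissertation develops:

```python
# Toy data: y depends essentially only on the first of three predictors.
X = [[1.0, 0.2, -0.5], [2.0, -0.1, 0.3], [3.0, 0.4, 0.1],
     [4.0, 0.0, -0.2], [5.0, -0.3, 0.4]]
y = [2.1, 4.0, 6.2, 7.9, 10.1]           # roughly y = 2 * x1

p = len(X[0])
beta = [0.0] * p
eps = 0.01                                # incremental step size

for _ in range(2000):
    resid = [yi - sum(b * xij for b, xij in zip(beta, xi))
             for xi, yi in zip(X, y)]
    # Negative gradient of the squared error for each coefficient.
    grads = [sum(r * xi[j] for r, xi in zip(resid, X)) for j in range(p)]
    j_best = max(range(p), key=lambda j: abs(grads[j]))
    beta[j_best] += eps if grads[j_best] > 0 else -eps   # tiny monotone step
print([round(b, 2) for b in beta])
```

The coefficient path traced by these increments is what makes the method attractive for penalized high-dimensional fitting: coefficients enter one small step at a time rather than all at once.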
75

Fast Algorithms for Analyzing Partially Ranked Data

McDermott, Matthew 01 January 2014
Imagine your local creamery administers a survey asking their patrons to choose their five favorite ice cream flavors. Any data collected by this survey would be an example of partially ranked data, as the set of all possible flavors is only ranked into subsets of the chosen flavors and the non-chosen flavors. If the creamery asks you to help analyze this data, what approaches could you take? One approach is to use the natural symmetries of the underlying data space to decompose any data set into smaller parts that can be more easily understood. In this work, I describe how to use permutation representations of the symmetric group to create and study efficient algorithms that yield such decompositions.
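The first, simplest piece of such a symmetry-based decomposition is the projection onto the "symmetric" component: the part of the data every subset shares (its mean), leaving a deviation part that carries the interesting structure. The sketch below uses hypothetical survey counts over 2-flavor subsets; the full decomposition in the thesis uses further representations of the symmetric group:

```python
from itertools import combinations

flavors = ["vanilla", "chocolate", "mint", "mango"]
# Hypothetical survey counts: how many patrons chose each 2-flavor subset.
counts = {frozenset(c): float(n)
          for c, n in zip(combinations(flavors, 2), [5, 3, 4, 2, 1, 3])}

mean = sum(counts.values()) / len(counts)
trivial = {k: mean for k in counts}                    # symmetric part
deviation = {k: v - mean for k, v in counts.items()}   # structured part

# The two parts reconstruct the data, and the deviation part sums to zero.
print(round(sum(deviation.values()), 10))              # 0.0
```

Projecting away successively larger symmetric pieces in this fashion, using permutation representations, is what yields the efficient decomposition algorithms the abstract describes.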
76

NETWORK ANALYTICS FOR THE MIRNA REGULOME AND MIRNA-DISEASE INTERACTIONS

Nalluri, Joseph Jayakar 01 January 2017
miRNAs are non-coding RNAs approximately 22 nucleotides in length that inhibit gene expression at the post-transcriptional level. By virtue of this gene regulation mechanism, miRNAs play a critical role in several biological processes and patho-physiological conditions, including cancers. miRNA behavior is the result of a multi-level complex interaction network involving miRNA-mRNA, TF-miRNA-gene, and miRNA-chemical interactions; hence the precise patterns through which a miRNA regulates a given disease are still elusive. Herein, I have developed integrative genomics methods and a pipeline to (i) build a miRNA regulomics and data analytics repository, and (ii) model these interactions as networks and use optimization techniques, motif-based analyses, network inference strategies, and influence diffusion concepts to predict miRNA regulations and their role in diseases, especially cancers. By these methods, we are able to determine the regulatory behavior of miRNAs, potential causal miRNAs in specific diseases, and potential biomarkers/targets for drug and medicinal therapeutics.
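One standard influence-diffusion primitive on interaction networks is random walk with restart, which scores every node by its proximity to a seed set. The sketch below is a generic illustration on a tiny hypothetical miRNA-gene graph, not the pipeline developed in the dissertation:

```python
# Toy miRNA-gene interaction network (undirected adjacency lists; hypothetical).
graph = {
    "miR-21":  ["PTEN", "PDCD4"],
    "miR-155": ["PTEN"],
    "PTEN":    ["miR-21", "miR-155"],
    "PDCD4":   ["miR-21"],
}
restart, seeds = 0.3, {"miR-21"}

p = {n: (1.0 if n in seeds else 0.0) for n in graph}
for _ in range(100):
    nxt = {n: 0.0 for n in graph}
    for n, mass in p.items():
        for nb in graph[n]:                       # spread mass to neighbors
            nxt[nb] += (1.0 - restart) * mass / len(graph[n])
    for s in seeds:                               # teleport back to the seeds
        nxt[s] += restart / len(seeds)
    p = nxt
print(round(sum(p.values()), 6))                  # 1.0 (probability conserved)
```

Nodes topologically close to the seed miRNA end up with high stationary probability, which is the usual basis for ranking candidate disease genes or miRNAs.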
77

Portfolio Optimization under Value at Risk, Average Value at Risk and Limited Expected Loss Constraints

Gambrah, Priscilla S.N January 2014
In this thesis we investigate portfolio optimization under Value at Risk, Average Value at Risk, and Limited Expected Loss constraints in a framework where stocks follow a geometric Brownian motion. We solve the problem of minimizing Value at Risk and Average Value at Risk, and the problem of finding maximal expected wealth under Value at Risk, Average Value at Risk, Limited Expected Loss, and Variance constraints. Furthermore, in a model where the stocks follow an exponential Ornstein-Uhlenbeck process, we examine portfolio selection under Value at Risk and Average Value at Risk constraints. In both the geometric Brownian motion (GBM) and exponential Ornstein-Uhlenbeck (O.U.) models, the risk-reward criterion is employed and the optimal strategy is found. Second, the Value at Risk, Average Value at Risk, and Variance are minimized subject to an expected return constraint. By running numerical experiments we illustrate the effect of Value at Risk, Average Value at Risk, Limited Expected Loss, and Variance constraints on the optimal portfolios. Furthermore, in the exponential O.U. model we study the effect of mean reversion on the optimal strategies. Lastly, we compare the leverage in a portfolio where the stocks follow the GBM model to that of a portfolio where the stocks follow the exponential O.U. model. / Master of Science (MSc)
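The two risk measures at the heart of this thesis are easy to estimate by Monte Carlo. As a hedged sketch (the drift, volatility, and confidence level below are arbitrary assumptions, and the thesis works with analytic rather than simulated solutions), here is VaR and AVaR of the terminal loss of one GBM stock:

```python
import math, random

random.seed(42)
mu, sigma, w0, T, alpha = 0.05, 0.2, 100.0, 1.0, 0.95

# Simulate terminal wealth W_T under geometric Brownian motion and
# record the loss L = W_0 - W_T (positive values are losses).
losses = []
for _ in range(100_000):
    z = random.gauss(0.0, 1.0)
    wT = w0 * math.exp((mu - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
    losses.append(w0 - wT)

losses.sort()
idx = int(alpha * len(losses))
var = losses[idx]                              # Value at Risk at level alpha
avar = sum(losses[idx:]) / len(losses[idx:])   # Average VaR (expected shortfall)
print(var < avar)                              # True: AVaR dominates VaR
```

AVaR averages the whole tail beyond the VaR quantile, which is why it is always at least as large as VaR and, unlike VaR, is a coherent risk measure.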
78

GIS-integrated mathematical modeling of social phenomena at macro- and micro- levels—a multivariate geographically-weighted regression model for identifying locations vulnerable to hosting terrorist safe-houses: France as case study

Eisman, Elyktra 13 November 2015
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression analysis produced the most accurate result to accommodate non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism.
This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
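Geographically-weighted regression fits a separate local regression at each focal point, weighting nearby observations by a distance kernel so that coefficients may vary over space. The sketch below is a one-predictor caricature on a line of locations (all data hypothetical), not the dissertation's multivariate model:

```python
import math

# Toy non-stationary data: the local slope of y on x drifts with location u.
locs = [u / 10.0 for u in range(11)]
x = [1.0 if i % 2 == 0 else -1.0 for i in range(11)]
y = [(2.0 + u) * xi for u, xi in zip(locs, x)]      # true slope at u is 2 + u

def gwr_slope(u0, h=0.15):
    # Kernel-weighted least-squares slope (no intercept) at focal point u0.
    w = [math.exp(-((u - u0) ** 2) / (2 * h * h)) for u in locs]
    num = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    den = sum(wi * xi * xi for wi, xi in zip(w, x))
    return num / den

print(round(gwr_slope(0.5), 4))    # 2.5: recovers the local coefficient
```

A global (unweighted) regression would return a single compromise slope and hide exactly the non-stationary coefficient behavior that motivated the GWR choice in this research.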
79

Software Internationalization: A Framework Validated Against Industry Requirements for Computer Science and Software Engineering Programs

Vũ, John Huân 01 March 2010
View John Huân Vũ's thesis presentation at http://youtu.be/y3bzNmkTr-c. In 2001, the ACM and IEEE Computing Curriculum stated that it was necessary to address "the need to develop implementation models that are international in scope and could be practiced in universities around the world." Increasing connectivity through the internet, the move towards a global economy, and the growing use of technology make software internationalization an increasingly important concern for developers. However, there has been a "clear shortage in terms of numbers of trained persons applying for entry-level positions" in this area. Eric Brechner, Director of Microsoft Development Training, suggested five new courses to add to the computer science curriculum due to the growing "gap between what college graduates in any field are taught and what they need to know to work in industry." He concludes that "globalization and accessibility should be part of any course of introductory programming," stating: "A course on globalization and accessibility is long overdue on college campuses. It is embarrassing to take graduates from a college with a diverse student population and have to teach them how to write software for a diverse set of customers. This should be part of introductory software development. Anything less is insulting to students, their family, and the peoples of the world." There is very little research into how the subject of software internationalization should be taught to meet the major requirements of the industry. The research question of the thesis is thus, "Is there a framework for software internationalization that has been validated against industry requirements?" The answer is no. Such a framework "would promote communication between academia and industry ... that could serve as a common reference point in discussions." Since no such framework for software internationalization currently exists, one is developed here.
The contribution of this thesis includes a provisional framework to prepare graduates to internationalize software and a validation of the framework against industry requirements. The requirement of this framework is to provide a portable and standardized set of requirements for computer science and software engineering programs to teach future graduates.
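A first lesson any internationalization framework must teach is separating user-visible strings from code so each locale can translate and reorder them. As a generic, hedged illustration (the catalogs, keys, and function below are hypothetical, not part of the thesis's framework):

```python
# Hypothetical message catalogs: translatable strings live outside the code,
# and named placeholders let each locale reorder them freely.
CATALOGS = {
    "en": {"greeting": "Hello, {name}! You have {n} new messages."},
    "de": {"greeting": "Hallo, {name}! Sie haben {n} neue Nachrichten."},
    "vi": {"greeting": "Chào {name}! Bạn có {n} tin nhắn mới."},
}

def translate(locale_code, key, **params):
    # Fall back to English when a locale or key is missing.
    catalog = CATALOGS.get(locale_code, CATALOGS["en"])
    return catalog.get(key, CATALOGS["en"][key]).format(**params)

print(translate("de", "greeting", name="An", n=3))
# Hallo, An! Sie haben 3 neue Nachrichten.
```

Production systems typically replace the dictionaries with gettext-style catalogs and add plural rules, locale-aware number and date formatting, and bidirectional-text handling, which is exactly the breadth of material a curriculum framework has to cover.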
