51

Accelerated Life Test Modeling Using Median Rank Regression

Rhodes, Austin James 01 November 2016 (has links)
Accelerated life tests (ALT) appeal to practitioners seeking to maximize the information gleaned from reliability studies while navigating resource constraints due to time and specimen costs. A popular approach to accelerated life testing is to design test regimes in which experimental specimens are exposed to variable stress levels over time. Such ALT experiments allow the practitioner to observe lifetime behavior across various stress levels and to infer product life at use conditions from a greater number of failures than would otherwise be observed in a constant-stress experiment. The downside to accelerated life tests, however, particularly those that utilize non-constant stress levels across time on test, is that the corresponding lifetime models depend heavily on assumptions about the time-varying stress. Although these assumptions drive inference at product use conditions, few if any statistical methods exist for assessing their validity. One assumption prevalent in both literature and practice is the cumulative exposure model, which assumes that, at a given time on test, specimen life is driven solely by the integrated stress history and that current lifetime behavior is independent of the path of the stress trajectory. This dissertation challenges such black-box ALT modeling procedures and focuses on the cumulative exposure model in particular. For a simple step-stress accelerated life test, using two constant stress levels across time on test, we propose a four-parameter Weibull lifetime model that utilizes a threshold parameter to account for the stress transition. To circumvent regularity conditions imposed by maximum likelihood procedures, we use median rank regression to fit and assess our lifetime model. We improve the model fit using a novel incorporation of desirability functions and ultimately evaluate our proposed methods in an extensive simulation study.
Finally, we provide an illustrative example to highlight the implementation of our method, comparing it to a corresponding Bayesian analysis. / Ph. D.
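The median rank regression the abstract relies on can be sketched for the basic two-parameter Weibull case (the dissertation's four-parameter threshold model is more involved). A minimal illustration, assuming complete (uncensored) failure data and Bernard's approximation for the median ranks; the function name is ours:

```python
import math

def median_rank_regression(times):
    """Fit a two-parameter Weibull F(t) = 1 - exp(-(t/eta)**beta)
    by median rank regression on complete failure data."""
    t = sorted(times)
    n = len(t)
    # Bernard's approximation to the median rank of the i-th ordered failure
    mr = [(i - 0.3) / (n + 0.4) for i in range(1, n + 1)]
    # Linearize the CDF: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)
    x = [math.log(ti) for ti in t]
    y = [math.log(-math.log(1.0 - m)) for m in mr]
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    beta = sxy / sxx                    # slope is the shape parameter
    eta = math.exp(xbar - ybar / beta)  # intercept is -beta*ln(eta)
    return beta, eta
```

Regressing the linearized ranks rather than maximizing a likelihood is what lets the approach sidestep the regularity conditions mentioned above.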
52

Hypothalamic Rax+ tanycytes contribute to tissue repair and tumorigenesis upon oncogene activation in mice

Mu, W., Li, S., Guo, X., Wu, H., Chen, Z., Qiao, L., Helfer, Gisela, Lu, F., Liu, C., Wu, Q.-F. 22 March 2021 (has links)
Hypothalamic tanycytes in the median eminence (ME) are emerging as a crucial cell population that regulates endocrine output, energy balance, and the diffusion of blood-borne molecules. Tanycytes have recently been considered potential somatic stem cells in the adult mammalian brain, but their regenerative and tumorigenic capacities are largely unknown. Here we found that Rax+ tanycytes in the ME of mice are largely quiescent but quickly enter the cell cycle upon neural injury for self-renewal and regeneration. Mechanistically, Igf1r signaling in tanycytes is required for tissue repair under injury conditions. Furthermore, Braf oncogenic activation is sufficient to transform Rax+ tanycytes into actively dividing tumor cells that eventually develop into a papillary craniopharyngioma-like tumor. Together, these findings uncover the regenerative and tumorigenic potential of tanycytes. Our study offers insights into the properties of tanycytes, which may help to manipulate tanycyte biology for regulating hypothalamic function and to investigate the pathogenesis of clinically relevant tumors.
53

Robust Adaptive Signal Processors

Picciolo, Michael L. 21 April 2003 (has links)
Standard open-loop linear adaptive signal processing algorithms derived from the least squares minimization criterion require estimates of the N-dimensional input interference and noise statistics. Often, estimated statistics are biased by contaminant data (such as outliers and non-stationary data) that do not fit the dominant distribution, which is often modeled as Gaussian. In particular, the convergence of sample covariance matrices used in block-processed adaptive algorithms, such as the Sample Matrix Inversion (SMI) algorithm, is known to be affected significantly by outliers, causing undue bias in subsequent adaptive weight vectors. The convergence measure of effectiveness (MOE) of the benchmark SMI algorithm is known to be relatively fast (on the order of K = 2N training samples) and independent of the (effective) rank of the external interference covariance matrix, making it a useful method in practice for non-contaminated data environments. Novel robust adaptive algorithms are introduced here that outperform the SMI algorithm in contaminated data environments, while some retain its valuable convergence independence feature. Convergence performance is shown to be commensurate with SMI in non-contaminated environments as well. The robust algorithms are based on the Gram-Schmidt Cascaded Canceller (GSCC) structure, for which novel building-block algorithms are derived and analyzed using the theory of robust statistics. Coined M-cancellers after Huber's M-estimates, these novel cascaded cancellers combine robustness and statistical estimation efficiency in order to provide good adaptive performance in both contaminated and non-contaminated data environments. Additionally, a hybrid processor is derived by combining the Multistage Wiener Filter (MWF) and Median Cascaded Canceller (MCC) algorithms. Both simulated data and measured Space-Time Adaptive Processing (STAP) airborne radar data are used to show performance enhancements.
The STAP application area is described in detail in order to further motivate research into robust adaptive processing. / Ph. D.
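As background for the benchmark the abstract compares against, a minimal sketch of the SMI weight computation (not the proposed robust M-cancellers). The function name and the optional diagonal-loading safeguard are our additions, assuming complex baseband training snapshots:

```python
import numpy as np

def smi_weights(X, s, diagonal_load=0.0):
    """Sample Matrix Inversion (SMI) adaptive weights.

    X : (K, N) complex array of K training snapshots (interference + noise)
    s : (N,) steering vector for the desired signal
    A small diagonal load is a common practical safeguard when K is
    close to N and the sample covariance is ill-conditioned.
    """
    K, N = X.shape
    R_hat = (X.conj().T @ X) / K           # sample covariance estimate
    R_hat = R_hat + diagonal_load * np.eye(N)
    w = np.linalg.solve(R_hat, s)          # w = R_hat^{-1} s
    return w / (s.conj() @ w)              # normalize to unit gain on s
```

With contaminant-free Gaussian training data this estimator converges at the K = 2N rate cited above; a single strong outlier snapshot, however, perturbs R_hat directly, which is exactly the weakness the M-cancellers target.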
54

Tilt-Compensated Magnetic Field Sensor

Bingaman, Adam Neal 22 June 2010 (has links)
Motion and tilt have long hindered the accuracy, reliability, and response of magnetic detection systems. Perturbations in the magnetic field reading resulting from motion degrade the output signal, compromising the performance and reliability of the magnetometer system. The purpose of this document is to describe the development, construction, and testing of a tilt-stabilized three-axis magnetic field sensor. The sensor is implemented as a three-axis general-purpose magnetic field sensor, with the additional capability of being used as a compass. Design and construction of the system hardware are discussed, along with software development and implementation. Finite impulse response filters are designed and implemented in hardware to filter the acquired magnetic signals. Various designs of median filters are simulated and tested for smoothing inclination signal irregularities and noise. Trigonometric conversions necessary for tilt compensation are calculated in software using traditional methods as well as the Coordinate Rotation Digital Computer (CORDIC) algorithm, and both calculation methods are compared for execution time and efficiency. Successful incorporation of all design aspects leads to detection and output of the stable Earth magnetic field, sinusoidal signals, and aperiodic signatures while the magnetometer system is subject to significant tilt motion. Optimized system execution time leads to a maximum detectable signal bandwidth of 410 Hz. Azimuth angle calculation is incorporated and successfully tested with minimal error, allowing the system to be used as a compass. Results of the compensated system tests are compared to non-compensated results to show system performance, including tilt-compensation effectiveness, noise attenuation, and operational speed. / Master of Science
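The tilt compensation described above can be sketched with the standard accelerometer-based de-rotation. This is a generic illustration, not the thesis's CORDIC implementation; the axis conventions (x forward, y right, z down) are an assumption:

```python
import math

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Tilt-compensated compass heading in radians, in [0, 2*pi).

    The accelerometer reading (ax, ay, az) supplies roll and pitch;
    the magnetometer reading (mx, my, mz) is then rotated back into
    the horizontal plane before the azimuth is taken.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # De-rotate the magnetic field vector into the horizontal plane
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh) % (2.0 * math.pi)
```

The trigonometric calls here are precisely what the thesis evaluates CORDIC against: CORDIC computes the same atan2, sin, and cos with shifts and adds, trading accuracy for speed on hardware without a floating-point unit.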
55

Utilization of Musculoskeletal Sonography in Detecting Physiologic Changes of the Median Nerve in a Working Animal Model

Volz, Kevin R. 11 July 2013 (has links)
No description available.
56

An Evaluation of Coating Material Dependent Toxicity of Silver Nanoparticles

Silva, Thilini Upekshika 01 December 2011 (has links) (PDF)
Silver nanoparticles (AgNPs) synthesized using different coating materials may exhibit different toxicity effects. This study evaluated coating-material-dependent toxicity using three AgNP synthesis methods with different coating materials (citrate, polyvinylpyrrolidone, and branched polyethyleneimine; the coated AgNPs are denoted citrate-AgNPs, PVP-AgNPs, and BPEI-AgNPs respectively). Two acute aquatic toxicity tests were performed: a 48-hour D. magna test and the MetPLATE E. coli test. Significantly different toxicity was observed in the D. magna test, with median lethal concentrations (LC50) of 2.7, 11.2, and 0.57 µg/L for citrate-AgNPs, PVP-AgNPs, and BPEI-AgNPs respectively. Median effective concentrations (EC50) for the MetPLATE tests were 1.27, 1.73, and 0.31 mg/L respectively, again with significantly different toxicity. Silver ion fractions of 2.4-19.2% were detected in the tested nanoparticle suspensions. The study suggests the observed toxicity results from the combined action of the ionic and nanoparticle fractions in the suspensions.
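The LC50 values quoted above come from standard dose-response fitting. A minimal sketch of one common variant, a logit-log10 linear fit, is below; the thesis does not state which fitting method was used, and the function name is ours:

```python
import math

def lc50_from_logit_fit(concentrations, mortality_fractions):
    """Estimate LC50 by a simple logit-log linear fit.

    Model: logit(p) = b0 + b1*log10(c); at p = 0.5 the logit is 0,
    so LC50 = 10**(-b0/b1). A sketch only: real assays typically use
    probit or maximum-likelihood fits with confidence intervals.
    """
    pts = [(math.log10(c), math.log(p / (1.0 - p)))
           for c, p in zip(concentrations, mortality_fractions)
           if 0.0 < p < 1.0]   # 0% and 100% responses have no finite logit
    n = len(pts)
    xbar = sum(x for x, _ in pts) / n
    ybar = sum(y for _, y in pts) / n
    b1 = (sum((x - xbar) * (y - ybar) for x, y in pts)
          / sum((x - xbar) ** 2 for x, _ in pts))
    b0 = ybar - b1 * xbar
    return 10.0 ** (-b0 / b1)
```

The "median" in median lethal concentration refers to the 50% response point this fit solves for, which is why LC50 comparisons across coatings are meaningful even when the dose-response slopes differ.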
57

Comparison of grey scale median (GSM) measurement in carotid plaque with ultrasound using two different image settings

Teodorescu, Crina January 2021 (has links)
Plaque in the carotid artery can have different appearances depending on its content and morphology: plaques can be echolucent or echogenic. Echolucent plaques are the most problematic because they consist of lipids and inflammatory cells that give a soft consistency; they can rupture and lead to embolization, with stroke and cardiovascular incidents as a result. For measuring plaque echogenicity in research, a grey scale median (GSM) value is used, measured with an ultrasound device with a GSM setting. The lower the GSM value, the more echolucent the plaque.
The purpose of this study is to compare two GSM values for the same plaque, one measured with the standard ultrasound setting and the other with the GSM setting, to determine whether there is good agreement between the values. If so, plaque echogenicity could in the future be determined for research purposes by measuring the GSM value with an ultrasound device at standard settings. The results showed good agreement between a large proportion of the GSM values, but for some values there was a significant difference, so the study was not sufficiently convincing for the expected outcome.
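The GSM measurement itself can be sketched as the median gray level of the plaque region after linear normalization. A minimal illustration, assuming the conventional blood = 0 / adventitia = 190 anchor points from the GSM literature (exact anchors vary by protocol):

```python
import statistics

def gsm(plaque_pixels, blood_ref, adventitia_ref):
    """Grey scale median of a plaque ROI.

    Gray levels are first linearly normalized so that the blood
    reference maps to 0 and the adventitia reference to 190, then
    clamped to the 0-255 range; the GSM is the median of the
    normalized plaque pixels.
    """
    scale = 190.0 / (adventitia_ref - blood_ref)
    normalized = [max(0.0, min(255.0, (p - blood_ref) * scale))
                  for p in plaque_pixels]
    return statistics.median(normalized)
```

The normalization step is what makes GSM comparable across machines and gain settings, which is exactly the property the study probes by comparing two image settings.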
58

Extension of the 'generalized' p-median problem

Futlik, Alisa 15 October 2018 (has links)
This master's thesis deals with MINISUM models on a graph. The properties of the generalized p-median problem are investigated alongside those of the ordinary p-median problem. In the course of the investigation, the following discrepancy comes to the fore: although the generalized p-median problem has an infinite number of potential solutions, and the optimal location for such a problem may lie both at a vertex and on an edge of the graph, the median is often searched for exclusively among the vertices of the graph. This creates the risk that the optimal location is never considered when solving the problem. The goal of this thesis is to extend the generalized p-median problem so that a problem with an infinite number of possible solutions becomes a finite problem that can be solved optimally with a discrete method. In the first step, the potential locations along the edges (the so-called fictitious vertices) are determined. They are treated like the vertices of the graph and taken into account when searching for the cost-minimal location. This covers all potential locations, and the problem thereby acquires a finite number of possible solutions. Another challenge lies in the unconventional formulation of the cost parameter, which is additionally taken into account in the generalized p-median problem. The cost is a logarithmic cost function that depends on the distribution of demand over the medians. This variable is called the allocation and must be determined before the optimization problem is formulated. The allocation is responsible for determining the costs and enters the model only indirectly.
Finally, the functionality of the new model is verified and compared with the original model, the reformulated warehouse location problem. Indeed, by placing medians on edges, the extended model saves additional costs. This thesis shows in principle how the generalized p-median problem can be extended, and provides proof that the method works.
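The discretization idea, inserting "fictitious vertices" along edges so that the infinite problem becomes finite, can be sketched as follows. This toy version solves the resulting finite p-median by brute-force enumeration (not the thesis's warehouse-location reformulation) and uses plain linear costs, so it illustrates only the discretization step; all names are ours:

```python
import itertools

def p_median_with_edge_points(n, edges, demand, p, step=1.0):
    """Brute-force p-median allowing facility sites on edge interiors.

    Edge interiors are discretized every `step` length units into
    fictitious vertices; the finite problem is then solved by
    enumerating all p-subsets (tiny instances only).
    edges: (u, v, length) on vertices 0..n-1; demand: weight per vertex.
    """
    adj = {}  # augmented graph as a weighted adjacency dict
    def add(u, v, w):
        best = min(w, adj.get(u, {}).get(v, float("inf")))
        adj.setdefault(u, {})[v] = best
        adj.setdefault(v, {})[u] = best
    next_id = n
    for u, v, length in edges:
        prev, pos = u, 0.0
        while pos + step < length - 1e-9:      # insert fictitious vertices
            pos += step
            add(prev, next_id, step)
            prev, next_id = next_id, next_id + 1
        add(prev, v, length - pos)
    nodes = list(adj)
    INF = float("inf")
    # All-pairs shortest paths (Floyd-Warshall) on the augmented graph
    d = {a: {b: (0.0 if a == b else adj[a].get(b, INF)) for b in nodes}
         for a in nodes}
    for k in nodes:
        for a in nodes:
            for b in nodes:
                if d[a][k] + d[k][b] < d[a][b]:
                    d[a][b] = d[a][k] + d[k][b]
    # Enumerate every p-subset of candidate sites (vertices + edge points)
    best_sol = (INF, None)
    for sites in itertools.combinations(nodes, p):
        cost = sum(demand[c] * min(d[c][s] for s in sites) for c in range(n))
        if cost < best_sol[0]:
            best_sol = (cost, sites)
    return best_sol
```

Note that with linear costs Hakimi's theorem guarantees a vertex-optimal solution, so the edge points never strictly win here; it is the logarithmic, allocation-dependent cost of the generalized problem that makes interior points genuinely competitive, which is why the thesis needs this discretization at all.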
59

A GIS-Based Localization of Regional Sorting Centers: A Case Study of the Swedish Red Cross

Kaltsidis, Alexandros January 2022 (has links)
The Swedish Red Cross (SRK) plays an important humanitarian role by selling donated clothes to raise money for people in need. A nationwide network of 251 second-hand stores lets donors leave their clothes and buyers purchase them at competitive prices. However, some of these clothes remain unsold and end up being shipped to textile recycling centers. The organization plans to build Regional Sorting Facilities, where careful sorting will take place and the clothes will be stored until they are redistributed to other stores within the country. This project aims to find the optimal number and location of these facilities such that the transportation cost from stores to facilities is minimized. SRK's Logistics Department operationalizes this aim as the following objective: place a minimum number of facilities such that at least 50% of the stores, or 50% of the produced revenues, are reached within 90 minutes of driving time. Thus, modern GIS software is used in a location-allocation analysis to solve the p-median problem. The core of the methodology in this thesis is the well-known vertex substitution heuristic algorithm (Teitz & Bart). Empirical evaluations of seven scenarios, optimally placing an increasing number of facilities from one to seven, reveal that five facilities are sufficient to meet the operational objective with the minimal number of resources/facilities. The solutions for all scenarios are analyzed in terms of Key Performance Indicators and illustrated on maps.
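The Teitz & Bart vertex substitution heuristic at the core of the methodology can be sketched as follows. A minimal single-start version, assuming a precomputed shortest-path (or drive-time) distance matrix; GIS packages wrap the same idea with multiple restarts:

```python
import random

def teitz_bart(dist, demand, p, seed=0):
    """Teitz & Bart vertex-substitution heuristic for the p-median problem.

    dist   : n x n distance (or drive-time) matrix as a list of lists
    demand : per-vertex weight (e.g. store revenue)
    Starts from a random p-subset of vertices and swaps a facility for a
    non-facility whenever that lowers the total weighted distance, until
    no single swap improves the solution (a local optimum).
    """
    n = len(dist)
    def cost(sites):
        return sum(demand[i] * min(dist[i][s] for s in sites)
                   for i in range(n))
    rng = random.Random(seed)
    sites = set(rng.sample(range(n), p))
    best = cost(sites)
    improved = True
    while improved:
        improved = False
        for out in list(sites):
            for cand in range(n):
                if cand in sites:
                    continue
                trial = (sites - {out}) | {cand}
                c = cost(trial)
                if c < best - 1e-12:       # accept the improving swap
                    sites, best, improved = trial, c, True
                    break
            if improved:
                break
    return sorted(sites), best
```

Because the heuristic only guarantees a local optimum, practical runs (including the GIS tools used in the thesis) restart it from several random initial subsets and keep the best result.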
60

Representation Yesterday and Today: The Changing Link between Public Opinion and Policy Outcomes over Time

Irvine, Michael 01 January 2016 (has links)
Who gets represented in America? How does representation change over time? This thesis attempts to answer both questions, which are necessarily linked to one another. I investigate long-term trends in representation and temporary fluctuations in group influence by using a probit model to examine the link between socioeconomic groups’ policy preferences and outcomes in year-groups roughly corresponding to presidential terms. I find evidence for the suggestion in the literature that American policymaking contains a strong bias in favor of the status quo, but I depart from the literature in finding little evidence for a suggested link between income and political influence. I find evidence of declining policy activity in the 1990s and 2000s relative to the 1980s but little evidence of a long-term trend towards less policy output. In general, I find little evidence of long-term trends in representation, including the idea that our policy outcomes are becoming more correlated with the views of minority groups such as African-Americans and Hispanics.
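The probit link between group preferences and policy outcomes can be sketched by its log-likelihood. A toy single-covariate illustration only; the thesis's actual specification, covariates, and data are not reproduced here, and the function name is ours:

```python
import math

def probit_loglik(beta0, beta1, x, y):
    """Log-likelihood of a simple probit model:
    P(policy adopted | x) = Phi(beta0 + beta1 * x),
    where x could be a group's support level for the proposal and
    Phi is the standard normal CDF.
    """
    def phi(z):  # standard normal CDF via the error function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    ll = 0.0
    for xi, yi in zip(x, y):
        p = min(max(phi(beta0 + beta1 * xi), 1e-12), 1.0 - 1e-12)
        ll += math.log(p) if yi else math.log(1.0 - p)
    return ll
```

Maximizing this likelihood within each year-group yields the group-specific coefficients whose changes over time the thesis interprets as shifts in representation; a strongly negative fitted intercept is one way the status quo bias noted above would show up.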
