111.
Caput mortuum
Rollins, David Glenn, 01 May 2016
Caput Mortuum is a visual representation of my own spiritual quest for enlightenment using alchemy. Ancient alchemists sought perfection in all things and visualized a personal spiritual hierarchy that resided within all physical matter. The lowest tier of this scale represents the dull and lifeless material while the highest could touch the Divine. Reshaping the material world revealed the ordinary item's latent potential, aiding in its own transformation as well as the alchemist's into more perfect beings.
Inspired by this idea, I seek to bring ultimate perfection to every piece I create. By manipulating and altering books and book forms I replicate the physical work alchemists performed, each time changing myself with the book, elevating our spiritual beings in order to bring perfection from within.
112.
Perceiving meter in romantic, post-minimal, and electro-pop repertoires
Skretta, James Edward, 01 December 2015
No description available.
113.
Experiments in flowing and freely expanding dusty plasmas
Meyer, John Kenneth, 01 May 2015
I study a dusty plasma produced in a DC glow discharge device. The chamber is a stainless steel cylinder 0.6 m in diameter and 0.9 m long. A stainless steel disk 3.2 cm in diameter acts as the anode and the walls act as the cathode. The discharge current is set between 1 and 10 mA and the anode voltage between 250 and 300 V. Dust is initially on a tray beneath the anode and naturally becomes trapped in the anode glow at high discharge current. A secondary cloud can be made at a different location using a biased mesh. I make experimental observations of the dynamics of the secondary cloud as well as the unique interaction of the dust with a wire loop near the anode.
First, I describe the interaction of the secondary cloud with a wire when the cloud is released to flow back to the primary cloud. A detached bow shock is observed as the cloud encounters the obstacle, and an elongated teardrop-shaped void forms downstream of the obstacle.
Second, a continuous flow is set up using a biased ring. The potentials of the ring and anode create a converging-diverging electrostatic potential structure that accelerates dust particles into a thin stream in the diverging section. The interaction of this stream with a wire obstacle is described.
Finally, the potentials of the mesh and anode are simultaneously switched to floating to observe the expansion of the secondary cloud in the afterglow plasma. The rate of expansion is shown to depend inversely on the background pressure in the range of 100-200 mTorr. The expansion shows a separation in the cloud and a possible Yukawa-like expansion in which the center of the cloud does not respond initially to the removal of confinement.
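The stated inverse dependence of the expansion rate on background pressure can be sketched as a one-parameter least-squares fit of v = c/p. The pressures and rates below are hypothetical placeholders, not values measured in the experiment:

```python
# Fit hypothetical expansion-rate data to v = c / p (inverse pressure dependence).
# All data points are illustrative only, not measurements from the experiment.

def fit_inverse(pressures, rates):
    """Least-squares fit of rates = c / pressures for the single parameter c.

    Minimizing sum((v_i - c/p_i)^2) over c gives
    c = sum(v_i / p_i) / sum(1 / p_i^2).
    """
    num = sum(v / p for p, v in zip(pressures, rates))
    den = sum(1.0 / p**2 for p in pressures)
    return num / den

pressures = [100.0, 125.0, 150.0, 175.0, 200.0]  # mTorr (hypothetical)
rates = [20.0, 16.0, 13.3, 11.4, 10.0]           # mm/s (hypothetical, roughly 2000/p)

c = fit_inverse(pressures, rates)
print(round(c, 1))  # best-fit constant, close to 2000 for this synthetic data
```

A fit of this form is one simple way to check an inverse-pressure scaling against data before trying more elaborate expansion models.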
114.
Examining the effects of paper-based and computer-based modes of assessment on mathematics curriculum-based measurement
Hensley, Kiersten Kenning, 01 May 2015
The computer-to-pupil ratio has changed drastically over the past decades, from 125:1 in 1983 to less than 2:1 in 2009 (Gray, Thomas, and Lewis, 2010), allowing teachers and students to integrate technology throughout the educational experience. Educational assessment has adapted to this increased use of technology: trends include a movement from paper-based to computer-based testing for all types of assessments, from large-scale assessments to teacher-created classroom tests. Computer-based testing offers many benefits over paper-based testing, but it is necessary to determine whether results are comparable, especially in situations where computer-based and paper-based tests are used interchangeably.
The main purpose of this study was to expand upon the base of research comparing paper-based and computer-based testing, specifically with elementary students and mathematical fluency. The study was designed to answer the following research questions: (1) Are there differences in fluency-based performance on math computation problems presented on paper versus on the computer? (2) Are there differential mode effects on computer-based tests based on sex, grade level, or ability level?
A mixed-factorial design with both within- and between-subject variables was used to investigate differences in performance between paper-based and computer-based tests of mathematical fluency. Participants completed both paper- and computer-based tests, as well as the Group Math Assessment and Diagnostic Evaluation as a measure of general math ability. Overall findings indicate that performance on paper- and computer-based tests of mathematical fluency is not comparable, and that student grade level may be a contributing factor in the difference.
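A minimal sketch of the within-subject portion of such a comparison is a paired t statistic on each student's paper and computer scores; the study's actual mixed-factorial analysis is more involved, and the scores below are invented purely for illustration:

```python
import math

def paired_t(x, y):
    """Paired t statistic for two score lists from the same participants."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n)

# Hypothetical fluency scores (problems correct) for eight students.
paper = [42, 38, 51, 45, 40, 47, 39, 44]
computer = [39, 36, 49, 41, 38, 45, 37, 42]

t = paired_t(paper, computer)
print(round(t, 2))  # positive t: paper scores exceed computer scores here
```

Comparing the statistic against a t distribution with n - 1 degrees of freedom would then give the significance of the mode effect.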
115.
Maxillary central incisor crown-root relationships in class I normal occlusions and class III open and deep malocclusions
Fuller, Jessica Kay, 01 May 2015
Introduction: The purposes of this thesis were threefold: (1) to examine a new crown-to-root angle based on anatomic points, the labial crown-root angle (LCRA), proposed in a recent University of Iowa thesis by Bauer, T.J. (2014), and to correlate it with collum angle (CA) values for Class I normal occlusions and Class III malocclusions; (2) to establish mean values of CA and LCRA for Class I normal occlusions and Class III open and deep bite malocclusions and to compare the groups statistically; (3) to assess the significance of the correlation between overbite measures and both CA and LCRA in the Class III sample. Only one previous study addressed the increased CA in Class III malocclusions; however, that study did not include a normal-occlusion control sample. Methods: 46 Class I normal samples, 20 Class III open bite samples, and 22 Class III deep bite samples that met the inclusion criteria were measured cephalometrically. Relevant landmarks were placed, analyzed for reliability, and recorded for the measurements of interest. Results: A strong positive correlation was found between CA and LCRA across all samples (Pearson's correlation coefficient = 0.82, p < .0001). The mean CA for the Class III deep bite group (9.32±4.46) was significantly different from the Class I normal core (3.38±1.70), Class I expanded (3.60±1.94), and Class III open bite (5.24±3.99) groups (ANOVA). The mean LCRA for the Class III deep bite group (39.67±5.64) was also significantly different from the Class I normal core (31.97±4.25), Class I expanded (33.53±5.65), and Class III open bite (35.55±5.65) groups (ANOVA). There was no significant correlation between CA and overbite within the Class III open (p=0.8029) and Class III deep bite (p=0.2089) groups, or between LCRA and overbite within the Class III open (p=0.7529) and Class III deep bite (p=0.1864) groups. Conclusions: LCRA and CA were highly correlated in Class III patients.
Patients with Class III deep bites had statistically higher means for CA and LCRA than patients with Class I normal occlusions and Class III open bite malocclusions. There was no significant correlation between the measures for overbite and either CA or LCRA values in Class III patients.
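The CA-LCRA association reported above can be illustrated with a plain Pearson correlation. The angle values below are synthetic, chosen only to mimic a strong positive relationship of the kind reported (r = 0.82):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic angle pairs (degrees), not measurements from the thesis sample.
ca = [3.4, 3.6, 5.2, 9.3, 4.1, 7.8, 2.9, 6.5]          # collum angle
lcra = [32.0, 33.5, 35.6, 39.7, 33.0, 38.1, 31.5, 36.9]  # labial crown-root angle

r = pearson_r(ca, lcra)
print(round(r, 2))  # strongly positive for this synthetic data
```

A significance test against the t distribution (or a library routine such as scipy.stats.pearsonr) would then supply the p-value that accompanies the coefficient.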
116.
Uniquely clean elements, optimal sets of units and counting minimal sets of units
Borchers, Brian Edward, 01 July 2015
Let R be a ring. We say x ∈ R is clean if x = e + u, where u is a unit and e is an idempotent (e^2 = e); R is clean if every element of R is clean. I will give the motivation for clean rings, which comes from Fitting's Lemma for vector spaces. This leads into the ABCD lemma, which is the foundation of a paper by Camillo, Khurana, Lam, Nicholson and Zhou. Semi-perfect rings are a well-known class of rings, and I will show a relationship between clean rings and semi-perfect rings that allows me to utilize what is already known about semi-perfect rings. I will also use the Fundamental Theorem of Torsion-free Modules over Principal Ideal Domains to work with finite-dimensional vector spaces. These finite-dimensional vector spaces are in fact strongly clean, which means they are clean and the idempotent and unit commute: since L = e + u, this gives Le = eL. Several types of rings are clean; I prove this for duo von Neumann regular rings, a weakening of commutative von Neumann regular rings. The goal of my research is to determine how many ways there are to write matrices or other ring elements as sums of units and idempotents. To do this, I have developed a method that is self-contained, drawing from but not requiring the entire literature of Nicholson. We also examine sets other than the idempotents, such as the upper-triangular and row-reduced elements, and examine the possibility or exclusion that an element may be represented as the sum of an upper-triangular (resp. row-reduced) element and a unit. These and other element properties highlight some of the complexity of examining an additive property when the underlying properties are multiplicative.
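Counting decompositions of this kind can be made concrete by brute force in a small ring. The sketch below works in M_2(F_2), the 2x2 matrices over the two-element field, and counts for each element the number of ways to write it as idempotent + unit; this is an illustrative computation, not the method developed in the thesis:

```python
from itertools import product

# Work in M_2(F_2): all 2x2 matrices with entries mod 2 (16 elements),
# represented as tuples of tuples so they are hashable.
def mat_mul(a, b):
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(2)) % 2
                       for j in range(2)) for i in range(2))

def mat_add(a, b):
    return tuple(tuple((a[i][j] + b[i][j]) % 2 for j in range(2)) for i in range(2))

elements = [((a, b), (c, d)) for a, b, c, d in product((0, 1), repeat=4)]
idempotents = [e for e in elements if mat_mul(e, e) == e]
units = [u for u in elements
         if (u[0][0] * u[1][1] - u[0][1] * u[1][0]) % 2 == 1]  # nonzero det in F_2

# For each ring element x, count the decompositions x = e + u.
counts = {x: sum(1 for e in idempotents for u in units if mat_add(e, u) == x)
          for x in elements}

print(min(counts.values()) > 0)  # every element of M_2(F_2) is clean
```

The same exhaustive approach extends to replacing the idempotents with another subset (e.g. the upper-triangular matrices) to probe which elements admit such sums.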
117.
Essays in industrial organization
Erickson, Philip Joseph, 01 May 2016
The motivation of this thesis is the study of markets in which consumers are under-informed about the quality of any given product and in which the quality of consumers also matters to producers. This study has resulted in a primary application paper, comprising the first chapter, which focuses on the market for training lawyers, as well as a second technical chapter exploring theory that can prove useful in analyzing these markets. The first chapter is based on the observation that the number of lawyers being produced at high cost, combined with the relative lack of job options, has recently created significant concern. To partially explain this phenomenon, I propose a game of incomplete information modeling the strategic interaction between law schools as they compete for potential students. The information asymmetries come from the fact that any given law school is better informed about the quality of its education than its potential students are. Using a change in market information structure generated by student placement reporting requirements, I use the model to estimate the dynamic effect of increased information on distributions of tuition rates, incoming student ability, class sizes, and the rate at which law schools open and potentially close. Using these estimates, I show that there have not necessarily been too many law schools or students, but rather an equilibrium-enforced mismatch between students and their optimal schooling choices. The new information has acted as a forced-collusion mechanism that partially overcomes this mismatch, which has differentially decreased school welfare, strictly increased student welfare, and resulted in a positive total welfare gain of $685 million. The second chapter provides a thorough exploration of the microeconomic foundations of the multi-variate linear demand function for differentiated products that is widely used in industrial organization.
A key finding is that strict concavity of the quadratic utility function is critical for the demand system to be well defined. Otherwise, the true demand function may be quite complex: multi-valued, non-linear, and income-dependent. The solution of the first-order conditions for the consumer problem, which we call a local demand function, may have quite pathological properties. We uncover failures of the duality relationships between substitute products and complementary products, as well as the incompatibility between high levels of complementarity and concavity. The two-good case emerges as a special case with strong but non-robust properties. A key implication is that any conclusion derived from a linear demand specification that does not satisfy the Law of Demand ought to be regarded with some suspicion.
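The role of strict concavity is easiest to see in the two-good case: for quadratic utility U(q) = a'q - (1/2)q'Bq, the first-order conditions give p = a - Bq, so demand q = B^{-1}(a - p) is well defined only when B is invertible (and positive definite for strict concavity). A numeric sketch with hypothetical parameters:

```python
def demand_two_goods(a, B, p):
    """Solve B q = a - p for the two-good linear demand system.

    Returns None when B is singular, i.e. when the quadratic utility
    fails strict concavity and demand is not single-valued.
    """
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    if det == 0:
        return None
    r0, r1 = a[0] - p[0], a[1] - p[1]  # a - p
    q0 = (B[1][1] * r0 - B[0][1] * r1) / det  # Cramer's rule
    q1 = (B[0][0] * r1 - B[1][0] * r0) / det
    return (q0, q1)

a = (10.0, 8.0)                    # hypothetical utility intercepts
B_good = ((2.0, 1.0), (1.0, 2.0))  # positive definite: strictly concave utility
B_bad = ((2.0, 2.0), (2.0, 2.0))   # singular: concavity fails

print(demand_two_goods(a, B_good, (4.0, 2.0)))  # (2.0, 2.0): unique bundle
print(demand_two_goods(a, B_bad, (4.0, 2.0)))   # None: demand ill-defined
```

With the singular B the first-order conditions have either no solution or a continuum of them, which is exactly the multi-valued pathology the chapter describes.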
118.
"The harder heroism of the hospital": Union veterans and the creation of disability, 1862-1910
Donovan, Brian Edward, 01 January 2015
The unprecedented size and scope of the American Civil War fundamentally redefined the relationship between state and citizen. Through its conscription laws, the Union government empowered itself to standardize and evaluate the bodies of its citizens; the concurrent General Law pension system extended this standardization into the realm of disability. The government served as both national physician and national accountant, distributing millions of dollars a year to men it deemed unable to earn up to their potential due to wounds and diseases contracted in the Union's defense. Moreover, since so many disabilities were the result of disease - and therefore invisible to the naked eye - the state also asserted its power to certify to the taxpayers that these veterans were indeed among the "deserving poor," not idlers or parasites. This became especially important as pension-related expenses ballooned to the second-largest line item on the budget, and the "veteran vote" became the most important single-issue bloc in American politics.
Veterans were themselves voters, however, and could negotiate at least some of the terms of their disability through the political process. This established that disability is discursively constructed - it is a social position, not a permanent physical impairment. Veterans' organizations might sweep socially problematic old soldiers up into Homes, but veterans always retained their influence at the ballot box. Thus, the same political process which enabled the state to seize unprecedented powers of surveillance also kept these new powers at least somewhat in check.
119.
Formation of haloacetic acids and N-nitrosodimethylamine via the chlorination of carbon nanotubes
Nelson, Kyle Jeffery, 01 May 2015
Recent investigations have shown that engineered nanomaterials such as carbon nanotubes (CNTs) are a source of and precursor for disinfection byproduct (DBP) formation. The aim of this study was to extend previous research on CNTs by investigating the potential for other classes of CNTs to generate DBPs during chlorination. We examined particular types of CNTs with surface groups analogous to suspected model precursors for DBP formation. Specifically, we conducted experiments to determine the formation of haloacetic acids (HAAs) and N-nitrosodimethylamine (NDMA) via the chlorination of carbon nanotubes.
Polymer-coated CNTs generated the greatest total HAA concentration, up to 170 μg-HAA/mg-CNT. Results showed that the presence of surface oxide groups (e.g., surface carboxylic acid groups) promotes HAA formation, and we observed a reasonably strong correlation between the extent of HAA formation and the concentration of oxygen on the CNT surface. Results also showed that CNTs behave similarly to model precursors for di- and trichloroacetic acid formation (DCAA and TCAA, respectively).
Nitrogen-containing CNTs have been shown to be a source of NDMA. Surprisingly, CS PEG, which does not contain N, produces NDMA when reacted with ethylenediamine (EDA): EDA sorbs to the CNT surface and thereby contributes N, the likely source of N for NDMA formation. At lower EDA concentrations, NDMA production is limited by the available EDA. Conversely, at higher EDA concentrations, NDMA production is limited by the available chlorine, which is consumed in competing reactions with EDA and the CNT surface.
120.
A framework for emerging topic detection in biomedicine
Madlock-Brown, Charisse Renee, 01 December 2014
Emerging topic detection algorithms have the potential to assist researchers in maintaining awareness of current trends in biomedical fields, a feat not easily achieved with existing methods. Though topic detection algorithms exist for news cycles, several aspects of the biomedical domain make applying them directly to scientific literature problematic.
This dissertation offers a framework for emerging topic detection in biomedicine. The framework includes a novel set of weightings based on the historical importance of each identified topic. Features such as journal impact factor and funding data are used to develop a fitness score that identifies which topics are likely to burst in the future. Bursts were characterized by discipline over an extended planning horizon to establish what a typical burst trend looks like in this space and thus how to recognize important or emerging trends. Cluster analysis was used to create an overlapping hierarchical structure of the scientific literature at the discipline level. This allows the granularity of emerging topic detection to be adjusted for different users (e.g., discipline level or research-area level). Using cluster analysis also allows the identification of terms that may not appear in annotated taxonomies because they are new or were not considered relevant when the taxonomy was last updated. Weighting topics by historical frequency allows better identification of bursts associated with less well-known areas, which are therefore more surprising, and the fitness score allows the early identification of bursty terms. This framework will benefit policy makers, clinicians, and researchers.
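The historical-frequency weighting described above can be sketched as a simple burst score: current-period frequency relative to the historical mean, damped for terms that have always been common. The function, damping formula, and topic counts are illustrative assumptions, not the dissertation's actual algorithm:

```python
import math

def burst_score(history, current):
    """Ratio of current frequency to historical mean, damped by the log of
    total historical volume so perennially common terms score lower."""
    mean = sum(history) / len(history)
    weight = 1.0 / math.log(2 + sum(history))  # historical-importance damping
    return (current / (mean + 1e-9)) * weight

# Hypothetical yearly term counts in a corpus, followed by the current count.
topics = {
    "crispr": ([0, 1, 2, 5], 60),      # rare historically, now bursting
    "p53": ([80, 85, 90, 88], 95),     # always common, mild growth
    "microbiome": ([5, 8, 12, 20], 45),  # steady emerging trend
}

scores = {t: burst_score(h, c) for t, (h, c) in topics.items()}
top = max(scores, key=scores.get)
print(top)  # the rare-then-bursting topic scores highest here
```

The damping term is what makes a burst in a historically obscure term outrank comparable growth in a term that has always dominated the literature.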