  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Rex Francorum et rex Angul-Saxonum: a comparison of Einhard’s Vita Karoli Magni and Asser's De Rebus Gestis Ælfredi

Hund, Helen Ann 05 1900 (has links)
Einhard’s Vita Karoli Magni and Asser’s De Rebus Gestis Ælfredi document the lives of two of the most fascinating kings to influence Western civilization – Charlemagne and Alfred the Great. The two biographies were written approximately seventy years apart by clerics who were closely connected to each ruler’s court. Einhard and Asser reinvented and popularized the genre of secular biography for the medieval Christian world. Their descriptions of Christian kings aided the development of a specifically European identity which incorporated classical, Germanic, and Christian traditions. The two Vitae are superb examples of a recurring theme in medieval European history, which historian Patrick Wormald calls “the parallel development and the interdependence of Frankish and Anglo-Saxon history.” This thesis compares some of the common themes and distinctive traditions of ninth- and tenth-century Frankish and Anglo-Saxon society as they are presented in the Vita Karoli Magni and the De Rebus Gestis Ælfredi. / Thesis (M.A.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of History
192

Solution to large facility layout problems using group technology

Jaganathan, Jai Kannan Janaki 05 1900 (has links)
In this work, a systematic methodology for constructing cellular layouts for large problems using the group technology (GT) technique has been developed. Previous research in this field has addressed heuristics that can solve only medium-size problems. A mathematical model that uses reduced intercellular movement count as the criterion for cell formation is developed. The model includes details such as machine sequence, production volume, and machine revisits in the formation of cells. A performance measure used to evaluate the cells formed is proposed, after some modifications to the existing method of Nair and Narendran (1998). Once the cell configurations are evaluated, separate layouts are developed for each cell depending on the amount of flow between machines within the cell, and the best configuration is selected as the one with the least material handling cost. These steps define the proposed approach. The validity of the approach was verified using small, medium, and large case studies. From the case study results, it is concluded that the proposed methodology can be used to solve large facility layout problems using GT. The developed model proved efficient irrespective of problem size, even after the inclusion of machine sequence, production volume, and machine revisits along with the performance measure for the cells formed. By restricting the number of cell configurations to between an upper and a lower limit, the model eliminates unwanted configurations that increase the complexity of the problem. For a large facility layout problem, the proposed method can therefore be used to obtain the actual number of intercellular movements between cells and to select the best configuration for a given production plan, using reduced material handling cost as the criterion. / Thesis (M.S.)--Wichita State University, College of Engineering, Dept. of Industrial and Manufacturing Engineering
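The cell-formation criterion described above — counting volume-weighted movements between cells — can be sketched as a small computation. The routings, volumes, and cell assignments below are illustrative placeholders, not data or the actual model from the thesis.

```python
def intercellular_moves(routings, volumes, cell_of):
    """Volume-weighted count of moves between cells.

    routings: {part: [machine, ...]} visit sequence (revisits allowed)
    volumes:  {part: units produced}
    cell_of:  {machine: cell id}
    """
    total = 0
    for part, seq in routings.items():
        for a, b in zip(seq, seq[1:]):
            if cell_of[a] != cell_of[b]:   # consecutive operations in different cells
                total += volumes[part]
    return total

# Hypothetical data: two parts, four machines, one machine revisit on P1.
routings = {"P1": ["M1", "M2", "M3", "M1"], "P2": ["M2", "M4"]}
volumes = {"P1": 10, "P2": 5}
config_a = {"M1": 0, "M2": 0, "M3": 1, "M4": 1}
config_b = {"M1": 0, "M2": 1, "M3": 1, "M4": 1}

moves_a = intercellular_moves(routings, volumes, config_a)
moves_b = intercellular_moves(routings, volumes, config_b)
print(moves_a, moves_b)   # the lower count indicates the better cell configuration
```

In the thesis, configurations ranked this way are then given per-cell layouts and compared on material handling cost; that layout step is not reproduced here.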
193

Mechanical generation of concrete syntax rules for the Schütz semantic editor

Johnson, Christopher Wayne 05 1900 (has links)
Schütz is a programmer's editor designed by Dr. Rodney Bates. Through its language definition language, users may adapt Schütz to any programming language. Describing a programming language in the language definition language involves writing largely parallel rule sets in three different syntaxes—the concrete syntax, the abstract syntax, and the format syntax. In this thesis, we present a method for mechanically generating the concrete syntax rule set, given the rule sets of the other two syntaxes; we also prove that the concrete syntax rule sets thus generated are correct and unambiguous, given the same traits in the rule sets of the other two syntaxes. / Thesis (M.S.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of Computer Science
194

Inconsistency in the line spread test as an objective measurement of thickened liquids

Kim, Yoen Hee 05 1900 (has links)
There is no objective measurement in clinical use to confirm the consistency of thickened liquids in dysphagia management. This research was designed to examine the line spread test (LST) as an objective measurement, following Paik et al.’s (2004) LST tool model with its acceptable ranges for thickened liquids. The original method was to measure nectar and honey consistencies of thickened liquids mixed by four naïve adults, four SLPs, four dietary staff members, four licensed nurses, and four registered nurses. In the initial three naïve adults’ trials, 10 out of 12 cases for nectar consistency failed to meet the acceptable range, and 1 out of 12 cases for honey consistency failed. From this preliminary data, the clinical validity of the LST tool was questioned. The research method was modified to measure LST values of pure drinks, with the LST performed by the investigator, four SLPs, and six naïve adults. The objectives of this modified research were to obtain LST values of pure base drinks and to determine what caused the discrepancy in LST values. It was found that using different plates (glass and Plexiglas) caused significant differences in the LST values of all pure base drinks; overall, values measured on a Plexiglas plate were lower than those on a glass plate. No significant difference was found for the person who performed the LST or the time of day it was performed. This research found that the LST tool was not a reliable method for measuring the consistency of thickened liquids: depending on which plate was used, the LST values obtained varied for all base drinks, and measurements obtained in this study did not agree with values reported by Paik et al. (2004). / Thesis (M.A.)--Wichita State University, College of Health Professions, Dept. of Communication Sciences and Disorders
195

Calculation of quantum entanglement using a genetic algorithm

Lesniak, Joseph 05 1900 (has links)
Quantum entanglement is a multifaceted property that has attracted much attention, since it is the basis for such applications as quantum cryptography, quantum teleportation, and quantum computing. The calculation of quantum entanglement has therefore gained importance, and as systems that use entanglement have evolved, the calculation has become much more complex. A general method was developed for calculating the entanglement of an n-qubit or n-qudit system, based on the relative entropy of entanglement and using a genetic algorithm technique. The method was tested on a two-qubit system, for which some exact values are known, and compared both to exact calculations using the entanglement of formation and to another approximate method based on a quantum neural network. Advantages and disadvantages of the method and future work are discussed. / Thesis (M.S.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of Physics
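The genetic-algorithm ingredient of the abstract's method can be sketched generically. Evaluating the actual objective — the relative entropy between a given state and the nearest separable state — requires density-matrix machinery not reproduced here, so a simple quadratic stands in for it; population size, mutation scale, and generation count are all illustrative choices, not the thesis's settings.

```python
import random

def evolve(objective, dim, pop_size=40, generations=200, sigma=0.1, seed=0):
    """Minimize `objective` over a real vector with a basic genetic algorithm."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[:pop_size // 2]            # keep the fitter half (elitism)
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)          # one-point crossover
            child = [g + rng.gauss(0, sigma)     # Gaussian mutation
                     for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = parents + children
    return min(pop, key=objective)

# Stand-in objective: sum of squares, minimized at the zero vector.
best = evolve(lambda x: sum(g * g for g in x), dim=4)
best_value = sum(g * g for g in best)
print(best_value)   # small residual near the optimum
```

In the thesis's setting, the chromosome would instead parameterize a candidate separable state and the objective would be the relative entropy with respect to the target state.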
196

A macro level environmental performance comparison: dry machining process vs wet machining process

Lodhia, Prashant 05 1900 (has links)
The conventional metal cutting process utilizes cutting fluids to provide lubrication, cooling, and easy chip removal. The long-term effects of cutting fluid disposal into the environment are becoming increasingly evident, and research has also demonstrated health hazards for manufacturing workers who come into direct contact with cutting fluids. The formulation of stringent rules and restrictions on the use and disposal of cutting fluids has increased the cost associated with cutting fluid use to between 7% and 17% of total manufacturing cost. The dry machining process considered in this thesis, which uses diamond-coated tools for the machining of aluminum, eliminates cutting fluids from the cutting process. The improvement in tool life in dry machining and the complete elimination of cutting fluids from the process are known benefits of diamond-coated tools. Even though these advantages have been documented, no one to date has considered the environmental impacts of the entire life cycle of either wet or dry machining. In this research, a macro-level life cycle analysis (LCA), a systems-analysis tool for evaluating environmental impacts over the life span of a product or process, is used to compare the environmental performance of the conventional (wet) machining process using uncoated carbide tools with that of the dry machining process using diamond-coated carbide tools. A cost analysis of these machining processes is also included to provide more informed guidelines for the local Wichita aircraft manufacturers and for the manufacturing industry in general. The results indicate that dry machining surpasses the wet machining process in terms of environmental impacts at a macro level. / Thesis (M.S.)--Wichita State University, College of Engineering, Dept. of Industrial and Manufacturing Engineering
197

ANN model to predict burr height and thickness

Manjunatha, Nikethan Narigudde 05 1900 (has links)
The drilling of metals produces undesirable projections, called burrs, at the surface of the hole, which are very costly to remove from the workpiece. Any refinement of the drilling process that decreases burr size significantly helps in reducing deburring cost. This study focuses on the burrs formed in drilling AL6061-T6 at the exit side of the workpiece, as they are usually larger and more complicated in shape. Two models are developed using back-propagation neural networks to predict burr height and burr thickness separately as functions of point angle, chisel edge angle, and lip clearance angle. The results of this research show that the height and thickness of the burr can be controlled by proper selection of a drill bit with suitable geometric parameters, and the optimal geometric parameters that yield minimum burr height and thickness are also suggested. Thus, the model assists in identifying a suitable drill bit that yields minimum burr height and thickness and, as a result, helps in reducing deburring cost. / Thesis (M.S.)--Wichita State University, College of Engineering, Dept. of Industrial and Manufacturing Engineering
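A back-propagation network of the kind described — three drill-geometry inputs mapped to one burr dimension — can be sketched in a few dozen lines. The network size, learning rate, and the training data below are all invented for illustration; the thesis trains on drilling experiments that are not reproduced here.

```python
import math, random

H = 4   # hidden units (illustrative choice, not the thesis architecture)
random.seed(1)
w1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """One hidden tanh layer, linear output."""
    h = [math.tanh(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

# Synthetic data: normalized (point angle, chisel edge angle, lip clearance
# angle) -> burr height, with an assumed trend; NOT measured values.
data = [([p, c, l], 0.2 + 0.5 * p - 0.3 * l)
        for p in (0.0, 0.5, 1.0) for c in (0.0, 1.0) for l in (0.0, 1.0)]

lr = 0.05
for _ in range(2000):                  # stochastic gradient descent epochs
    for x, y in data:
        h, out = forward(x)
        err = out - y
        grads = [err * w2[j] * (1 - h[j] * h[j]) for j in range(H)]
        for j in range(H):             # output-layer update
            w2[j] -= lr * err * h[j]
        b2 -= lr * err
        for j in range(H):             # hidden-layer update
            for i in range(3):
                w1[j][i] -= lr * grads[j] * x[i]
            b1[j] -= lr * grads[j]

mse = sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)
print(round(mse, 5))   # training error after fitting the synthetic trend
```

A second network of the same shape, trained on thickness measurements, would give the companion burr-thickness model.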
198

The cross and catholics: magic or religion?

McHenry, Bryon K. 05 1900 (has links)
The goal of this thesis is to understand the difference between Catholic “popular religion” and the so-called “official religion” of the Roman Catholic Church as it relates to the use of charms, and whether such usage is magic or religion. The “official religion” comprises beliefs and practices endorsed by the officials of the Holy See in Vatican City; “popular religion” is religion practiced without regard to “official” teachings or doctrines. I will look at criteria that qualify a religion in its practice as “popular” or “official” and then apply them to what I have observed in Wichita. It was concluded that charms used by Catholics in Wichita functioned as religious artifacts and not as magical charms. These religious artifacts were used in the manner prescribed by the Catholic Church and hence are “official.” / Thesis (M.A.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of Anthropology.
199

Development of Middle Jurassic microbial buildups in the Bighorn Basin of northern Wyoming

Ploynoi, Manwika 05 1900 (has links)
This research examines the development of microbial buildups in the Middle Jurassic Lower Sundance Formation in the Bighorn Basin of northwestern Wyoming. Previous studies of Jurassic microbial deposits have focused on development along the Tethys Sea; this research examines buildups along the western side of the North American continent, in present-day Wyoming. The microbial buildups were studied through outcrop measurements and hand-sample and petrographic analysis. In addition, stable carbon and oxygen isotopic analyses were run to characterize the paleoclimate and the chemical composition of seawater during the Middle Jurassic. The studied outcrop is located in northern Wyoming near Cody, along the east side of Cedar Mountain, and offers an excellent semi-circular exposure of microbial buildups. The buildups are about 2.27 meters in height, 2.9 meters in width, and 30 centimeters in thickness, and are composed of several thrombolitic heads. The microbial buildups are associated with the Gypsum Spring and Lower Sundance Formations; the microbial units were deposited along the contact between these two formations. The lithology of the Gypsum Spring is dominated by massive white gypsum in the lower unit and red shale in the upper unit. These evaporites represent hypersaline conditions under a seasonally arid, warm climate over an extended period. The lithology of the Lower Sundance Formation is dominated by basal oolitic grainstone at its contact with the Gypsum Spring Formation, together with green shale and carbonate rocks highly abundant in fossils, especially ostracods, crinoids, and pelecypods. Based on petrographic analysis, the microbial buildups exhibit thrombolitic, or clotted, features.
Thrombolite structures are produced by sediment trapping, binding, and/or precipitation resulting from the growth and metabolic activity of microorganisms, principally cyanobacteria. The thrombolite, or bindstone, is composed of highly clotted-looking algae such as Girvanella (Cyanobacteria), Cayeuxia (Chlorophyta), and Solenopora (Rhodophyta), along with other encrusting microorganisms such as foraminifera, ostracods, bryozoans, and cyanobacteria. The isotopic composition of belemnites can be used as a proxy for the chemical composition of marine water because belemnites are believed not to have fractionated; as a result, belemnite calcite constitutes the best standard for the geochemistry of Jurassic seawater and provides a reasonable approximation of seawater chemistry during deposition of a rock unit. The carbon isotopic values of belemnites from the Sundance Formation are 2.11 to 2.54‰, and the oxygen isotopic values are -2.34 to -2.36‰. The carbon isotopic values of carbonate rock samples, including microbial thrombolite, from the Lower Sundance Formation range from 1.5 to 2.5‰, and the oxygen isotopic values range from -5.5 to -9‰. Overall, the isotopic values of carbonate rocks from the Lower Sundance Formation of the study section are more negative than the seawater standard represented by the belemnite values, indicating that the entire stratigraphic column has been altered post-depositionally. The negative values suggest that the rocks have been diagenetically altered by meteoric water since their deposition during the Jurassic; as a result, the carbon and oxygen isotopes cannot be used to determine original seawater chemistry. However, the isotopic values from the belemnites can be used to determine paleotemperature. The paleotemperature of Middle Jurassic (Late Bathonian to Early Callovian) seawater based on the belemnite samples from Wyoming is calculated to be about 15-17°C. / Thesis (M.S.)--Wichita State University, College of Liberal Arts and Sciences, Dept. of Geology.
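The paleotemperature step can be illustrated with one widely used calcite-water equation (Anderson and Arthur, 1983). The abstract does not state which equation or seawater δ¹⁸O value the thesis used to reach 15-17°C, so the sketch below only shows the form of the calculation and its sensitivity to the assumed seawater value.

```python
def paleotemp_c(delta_calcite, delta_water):
    """Anderson & Arthur (1983) calcite-water paleotemperature equation.

    delta_calcite: d18O of the calcite (per mil, PDB)
    delta_water:   assumed d18O of the ambient seawater (per mil, PDB)
    Returns temperature in degrees C.
    """
    d = delta_calcite - delta_water
    return 16.0 - 4.14 * d + 0.13 * d * d

# Belemnite d18O from the study section is about -2.35 per mil.
for dw in (0.0, -1.0):   # -1 per mil is a common ice-free-world assumption
    print(dw, round(paleotemp_c(-2.35, dw), 1))
```

The spread between the two outputs shows why the assumed seawater composition must be stated alongside any belemnite-based temperature estimate.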
200

Study of rear impact in light trucks and potential injuries to the occupants

Patel, Dhruv Vikram 05 1900 (has links)
According to the National Highway Traffic Safety Administration (NHTSA), each year about 400,000 trucks are involved in motor vehicle crashes. Eighteen percent of these accidents are rear-end crashes, and about 5 percent of the resulting injuries are fatal. Whiplash is the most common neck injury in rear impacts, consuming billions of dollars in insurance costs. However, due to the relatively low number of deaths or injuries in rear impact crashes, NHTSA does not conduct any rear impact testing of bumpers. The main objective of this thesis is to study the effects of low-speed rear impact on light trucks and the potential injuries to the occupants. The Federal Motor Vehicle Safety Standards (FMVSS) include rear impact testing for fuel leakage, but only a voluntary low-speed rear bumper impact test. In this thesis, a low-speed rear impact simulation of a light truck was performed to understand bumper deformation. A Chevy light truck was impacted into a flat barrier at 5 mph using the finite element code LS-Dyna, and the simulation was analyzed and validated against the bumper impact test. A parametric study was then performed to quantify the effect of various parameters on the rear-end impact of the truck. Four vehicle models were selected from the public-domain National Crash Analysis Center (NCAC) library: a Geo Metro, a Chevy truck, a Ford Taurus, and a Ford single-unit truck, chosen according to vehicle weight. The Chevy truck served as the target vehicle, and the other three models served as bullet vehicles. The target vehicle was impacted at speeds of 5, 10, and 15 mph, and accelerations were extracted at the center of gravity of the target vehicle (the Chevy truck). The acceleration pulses from LS-Dyna were then used in the multibody analysis code Mathematical Dynamic Model (Madymo). The seat model was built with characteristics similar to the Chevy truck seat.
A Hybrid III dummy model was positioned in the seat, and the model was driven with the acceleration pulses corresponding to the truck impacts at 5, 10, and 15 mph. This model was used to study injuries to the neck, and the occupant's neck response was compared with and without a head restraint. The resulting dummy responses yielded injury values, which were compared with standard critical values for injury. The results of this study can be used to assess the effect of the weight of the impacting vehicle in low-speed rear crashes of trucks; the impact response of the occupants and the potential neck loads and injuries are also byproducts of this study. / Thesis (M.S.)--Wichita State University, College of Engineering, Dept. of Mechanical Engineering.
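One standard way to turn head and T1 acceleration pulses into a whiplash measure is the Neck Injury Criterion, NIC = 0.2·a_rel + v_rel², with a commonly cited threshold of about 15 m²/s² (Boström et al.). The abstract does not say which injury criteria the thesis applied, and the square pulse below is synthetic, not output from its LS-Dyna or Madymo models.

```python
def nic_max(a_t1, a_head, dt):
    """Peak NIC (m^2/s^2) from two x-acceleration histories in m/s^2.

    a_rel is the T1-minus-head relative acceleration; v_rel is its
    time integral (simple rectangular integration with step dt).
    """
    v_rel, peak = 0.0, 0.0
    for at, ah in zip(a_t1, a_head):
        a_rel = at - ah
        v_rel += a_rel * dt
        peak = max(peak, 0.2 * a_rel + v_rel * v_rel)
    return peak

dt = 0.001                          # 1 ms samples (illustrative)
a_t1 = [60.0] * 50 + [0.0] * 50    # torso accelerates first...
a_head = [0.0] * 50 + [60.0] * 50  # ...head lags, then catches up
peak = nic_max(a_t1, a_head, dt)
print(peak, peak > 15.0)           # peak well above the ~15 m^2/s^2 threshold
```

With real simulation output, the same function would be run on the extracted pulses for each impact speed, with and without the head restraint, to compare the configurations.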
