451

Supervised Aggregation of Classifiers Using Artificial Prediction Markets

Unknown Date (has links)
Prediction markets have been demonstrated to be accurate predictors of the outcomes of future events. They have been used successfully to predict the outcomes of sporting events, political elections, and even business decisions, and their accuracy has outperformed that of other prediction methods such as polling. In an attempt to reproduce this predictive capability, a machine learning model of prediction markets is developed herein for classification. The model is a novel classifier aggregation technique that generalizes linear aggregation techniques. This prediction market aggregation technique is shown to match or outperform Random Forest on both artificial and real data sets. The notion of specialization is also developed and explored, leading to a new kind of classifier referred to as a specialized classifier. Specialized classifiers are shown to improve the accuracy of prediction market aggregation, in some cases even to perfection. / A Thesis submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester, 2009. / November 5, 2009. / Machine Learning, Aggregation, Random Forest / Includes bibliographical references. / Adrian Barbu, Professor Directing Thesis; Anke Meyer-Baese, Committee Member; Tomasz Plewa, Committee Member.
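The market mechanism itself is not spelled out in the abstract, so the following is only a minimal sketch of the general idea of budget-based classifier aggregation: each classifier wagers a fraction of its accumulated budget on its predicted class probabilities, the aggregate prediction is the budget-weighted market price, and budgets are redistributed toward classifiers that priced the true outcome well. The function names, fixed betting fraction, and payoff rule are all illustrative assumptions, not the author's exact algorithm.

```python
import numpy as np

def market_aggregate(probas, budgets):
    """Aggregate class-probability vectors by budget-weighted averaging.

    probas:  (n_classifiers, n_classes) predicted class probabilities
    budgets: (n_classifiers,) current wealth of each market participant
    """
    w = budgets / budgets.sum()
    return w @ probas                       # (n_classes,) market "price"

def update_budgets(probas, budgets, true_class, bet_fraction=0.1):
    """Once the outcome is revealed, shift wealth toward classifiers that
    put probability mass on the true class (a multiplicative update that
    conserves total wealth)."""
    stakes = bet_fraction * budgets         # amount each participant wagers
    price = (budgets / budgets.sum()) @ probas[:, true_class]
    payoff = stakes * probas[:, true_class] / max(price, 1e-12)
    return budgets - stakes + payoff

# Toy usage: two classifiers, three classes, class 0 is the truth.
probas = np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.4, 0.3]])
budgets = np.array([1.0, 1.0])
print(market_aggregate(probas, budgets))    # equal-weight average initially
budgets = update_budgets(probas, budgets, true_class=0)
print(budgets)                              # first classifier gains wealth
```

Repeated over a training set, the budgets drift toward the better classifiers, which is why such a market generalizes a fixed linear (weighted) aggregation.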
452

Effects of Vertical Mixing Closures on North Atlantic Overflow Simulations

Unknown Date (has links)
We explore the effect of various vertical mixing closures on resolving the physical process known as overflow, in which cold, dense water spills out of an ocean basin. This process is responsible for the majority of the ocean's dense-water transport, and it creates many of the dense-water currents that form part of what is known as the Ocean Conveyor Belt. One of the main regions where this happens is the North Atlantic, in the Denmark Strait and the Faroe Bank Channel. To simulate this process, two ocean models are used: the Parallel Ocean Program (POP) and the hybrid-coordinate Parallel Ocean Program (HyPOP). Using these models, differences are observed among three main vertical mixing schemes: constant, Richardson number, and KPP. Though not included in this thesis, the research also explores three different vertical gridding schemes: z-grid, sigma-coordinate, and isopycnal grids. The goal is to determine which combination gives the most acceptable results for resolving the overflow process. This is motivated by the large role the process plays in the ocean, as well as the difficulty of modeling it: if an ocean model cannot accurately simulate overflow, a large portion of the simulated ocean will be incorrect, and one cannot hope to obtain reasonable results from long simulations. / A Thesis submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Master of Science. / Fall Semester, 2009. / November 6, 2009. / Overflow, Ocean Modeling, Vertical Mixing, Viscosity, Diffusion / Includes bibliographical references. / Max Gunzburger, Professor Directing Thesis; Gordon Erlebacher, Committee Member; Janet Peterson, Committee Member.
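For readers unfamiliar with the closures being compared, the sketch below illustrates one of them: a Richardson-number-dependent vertical viscosity of the Pacanowski-Philander type. The constants and the synthetic profile are assumptions for illustration; they are not values or code from the thesis, and the constant and KPP closures are not shown.

```python
import numpy as np

def richardson_number(u, rho, z, g=9.81, rho0=1027.0):
    """Gradient Richardson number Ri = N^2 / (du/dz)^2, where
    N^2 = -(g/rho0) * d(rho)/dz is the buoyancy frequency squared."""
    du_dz = np.gradient(u, z)
    n2 = -(g / rho0) * np.gradient(rho, z)
    return n2 / np.maximum(du_dz**2, 1e-12)

def richardson_viscosity(ri, nu0=1e-2, nu_b=1e-4, alpha=5.0, n=2):
    """Pacanowski-Philander-style closure: strong mixing where shear
    dominates stratification (small Ri), a small background value nu_b
    where the water column is strongly stratified."""
    ri = np.maximum(ri, 0.0)            # treat unstable columns as Ri = 0
    return nu0 / (1.0 + alpha * ri)**n + nu_b

# Illustrative profile: surface-intensified shear over stable stratification.
z = np.linspace(-500.0, 0.0, 51)        # depth (m), negative downward
u = 0.5 * np.exp(z / 100.0)             # velocity (m/s)
rho = 1027.0 - 0.5 * np.exp(z / 100.0)  # density (kg/m^3)
nu = richardson_viscosity(richardson_number(u, rho, z))
print(nu.min(), nu.max())   # Ri grows with depth here, so nu peaks at the surface
```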
453

Parallel Grid Generation and Multi-Resolution Methods for Climate Modeling Applications

Unknown Date (has links)
Spherical centroidal Voronoi tessellations (SCVTs) are used in many applications in a variety of fields, one being climate modeling, where they are a natural choice for spatial discretizations on the surface of the Earth. New modeling techniques have recently been developed that allow the simulation of ocean and atmosphere dynamics on arbitrarily unstructured meshes, including SCVTs. Creating ultra-high-resolution SCVTs can be computationally expensive. A newly developed algorithm couples current algorithms for the generation of SCVTs with existing computational geometry techniques to compute SCVTs and spherical Delaunay triangulations in parallel. Using this new algorithm, computing spherical Delaunay triangulations shows a speed-up on the order of 4000 over other well-known algorithms when using 42 processors. As mentioned previously, newly developed numerical models allow the simulation of ocean and atmosphere systems on arbitrary Voronoi meshes, providing a multi-resolution modeling framework. A multi-resolution grid allows modelers to give areas of interest higher resolution in the hope of increasing accuracy. However, one method of providing higher resolution lowers the resolution in other areas of the mesh, which could potentially increase error. To determine the effect of multi-resolution meshes on numerical simulations in the shallow-water context, a standard set of shallow-water test cases is explored using the Model for Prediction Across Scales (MPAS), a new modeling framework jointly developed by Los Alamos National Laboratory and the National Center for Atmospheric Research. An alternative approach to multi-resolution modeling is adaptive mesh refinement (AMR). AMR typically uses information about the simulation to determine optimal locations for degrees of freedom; however, standard AMR techniques are not well suited for SCVT meshes. To address this issue, a framework is developed to allow AMR simulations on SCVT meshes within MPAS. The research contained in this dissertation ties together a newly developed parallel SCVT generator with a numerical method for use on arbitrary Voronoi meshes. Simulations are performed within the shallow-water context, and new algorithms and frameworks are described and benchmarked. / A Dissertation submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Summer Semester, 2011. / June 14, 2011. / spherical centroidal Voronoi tessellation, grid generation, high performance computing, spherical Delaunay triangulation, adaptive mesh refinement, shallow-water equations, ocean modeling / Includes bibliographical references. / Max Gunzburger, Professor Directing Thesis; Doron Nof, University Representative; Janet Peterson, Committee Member; Gordon Erlebacher, Committee Member; Michael Navon, Committee Member; John Burkardt, Committee Member; Todd Ringler, Committee Member.
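As a point of reference for the SCVT generation discussed above, here is a minimal serial sketch of Lloyd's method on the sphere: each generator moves to the mean of its Voronoi cell (estimated here by Monte Carlo sampling rather than exact integration) and is projected back onto the unit sphere. This is only the basic serial idea, not the parallel algorithm developed in the dissertation, and all parameters are illustrative.

```python
import numpy as np

def lloyd_scvt(generators, n_iters=50, n_samples=100_000, seed=0):
    """Approximate a spherical centroidal Voronoi tessellation (SCVT) by
    Lloyd iteration; random sample points stand in for exact cell
    integrals, and a uniform density is assumed."""
    rng = np.random.default_rng(seed)
    gens = generators / np.linalg.norm(generators, axis=1, keepdims=True)
    for _ in range(n_iters):
        # Uniform sample on the unit sphere (normalized Gaussians).
        pts = rng.normal(size=(n_samples, 3))
        pts /= np.linalg.norm(pts, axis=1, keepdims=True)
        # Nearest generator (max dot product) = Voronoi cell membership.
        owner = np.argmax(pts @ gens.T, axis=1)
        # Move each generator to its cell's mean, projected to the sphere.
        for k in range(len(gens)):
            cell = pts[owner == k]
            if len(cell):
                c = cell.mean(axis=0)
                gens[k] = c / np.linalg.norm(c)
    return gens

gens = lloyd_scvt(np.random.default_rng(1).normal(size=(40, 3)))
print(gens.shape)   # 40 roughly evenly spaced points on the sphere
```

A nonuniform density function, used to concentrate resolution in regions of interest, would enter this sketch as weights in the cell means.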
454

Model checking for general parametric regression models

Li, Lingzhu 19 August 2019 (has links)
Model checking for regressions has drawn considerable attention in the last three decades. Compared with global smoothing tests, local smoothing tests, which are more sensitive to high-frequency alternatives, can only detect local alternatives distinct from the null model at a much slower rate when the dimension of the predictor is high. When the number of covariates is large, the nonparametric estimations used in local smoothing tests lack efficiency, and the corresponding tests then have trouble maintaining the significance level and detecting the alternatives. To tackle this issue, we propose two methods under a high but fixed dimension framework. Further, we investigate a model checking test under divergent dimension, where the numbers of covariates and unknown parameters diverge with the sample size n. The first proposed test is constructed upon a typical kernel-based local smoothing test using a projection method. By employing projection and integration, the resulting test statistic has a closed form that depends only on the residuals and the distances between sample points. A merit of the developed test is that the distance is easy to compute compared with the kernel estimation, especially when the dimension is high. Moreover, the test inherits some features of local smoothing tests owing to its construction. Although it is ultimately similar in spirit to an Integrated Conditional Moment test, it leads to a test with a weight function that helps to collect more information from the samples than the Integrated Conditional Moment test. Simulations and real data analysis demonstrate the power of the test. The second test, which is a synthesis of local and global smoothing tests, aims at solving the slow convergence rate caused by nonparametric estimation in local smoothing tests. A significant feature of this approach is that it allows nonparametric estimation-based tests, under the alternatives, to share the merits of existing empirical process-based tests. The proposed hybrid test can detect local alternatives at the fastest possible rate, like the empirical process-based tests, and simultaneously retains the sensitivity to high-frequency alternatives of the nonparametric estimation-based ones. This feature is achieved by utilizing an indicative dimension from the field of dimension reduction. As a by-product, we give a systematic study of a residual-related central subspace for model adaptation, showing when alternative models can be indicated and when they cannot. Numerical studies are conducted to verify its application. Since data volumes are ever increasing, the numbers of predictors and unknown parameters may diverge as the sample size n goes to infinity. Model checking under divergent dimension, however, is almost uncharted in the literature. In this thesis, an adaptive-to-model test is proposed to handle the divergent dimension, based on the two previously introduced tests. Theoretical results show that, to obtain the asymptotic normality of the parameter estimator, the number of unknown parameters should be of order o(n^{1/3}). Also, as a spinoff, we demonstrate the asymptotic properties of estimators for the residual-related central subspace and the central mean subspace under different hypotheses.
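To fix ideas, here is a hedged sketch of a statistic in the general family the first test belongs to: a residual-based U-statistic in which pairs of residuals are weighted by a kernel of the distance between their covariate vectors. The exact weight function, standardization, and critical-value calibration in the thesis differ; everything below is illustrative.

```python
import numpy as np

def distance_test_statistic(x, resid, h=1.0):
    """T = (1/(n(n-1))) * sum_{i != j} e_i e_j K(||x_i - x_j|| / h) with a
    Gaussian kernel. Under a correctly specified model, residuals at nearby
    covariates are uncorrelated and T is near zero; systematic lack of fit
    makes nearby residuals co-move and inflates T."""
    n = len(resid)
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)
    k = np.exp(-d2 / (2.0 * h**2))
    np.fill_diagonal(k, 0.0)                 # drop the i == j terms
    return (resid @ k @ resid) / (n * (n - 1))

# Toy check: fit a linear model to data with a hidden quadratic term.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 3))
y = x @ np.array([1.0, -0.5, 0.2]) + 0.5 * x[:, 0]**2 + 0.1 * rng.normal(size=200)
beta = np.linalg.lstsq(x, y, rcond=None)[0]
resid = y - x @ beta
print(distance_test_statistic(x, resid))     # clearly positive here
```

In practice the null distribution would be approximated, e.g., by a wild bootstrap on the residuals, to obtain critical values.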
455

Grave tales : an osteological assessment of health and lifestyle from 18th and 19th century burial sites around Cape Town

Friedling, Louise J January 2007 (has links)
Includes bibliographical references (p. 249-270). / Two unwalled 18th and 19th century colonial burial sites, Cobern Street and Marina Residence, were assessed osteologically and dentally to reconstruct the life histories and activity patterns of the poorer people living at the Cape. This was done to add to the history and knowledge of the descendants of these people, as little other information exists on them. Questions pertaining to diet, stress, activity patterns and trauma were investigated. Visual (standard and novel macroscopic methods, e.g. the distal humeri method), metric (femoral neck method) and histological (proximal anterior femur) techniques were tested and employed to estimate age and sex, as the skeletal material was fragmentary and incomplete. Only adults were assessed and analysed (n = 86 and n = 75 for Cobern Street and Marina Residence respectively), as the infant, juvenile and sub-adult skeletal material was too badly preserved and fragmentary to attempt reconstruction. Mortality profiles reveal that the two study sites differed in community dynamics. The people led hard, active lives, as seen from their muscle-marking and degenerative joint disease patterns. Osteoarthritis was not only very frequent within the groups but was also found in much of the younger adult skeletal material. Stress and trauma were relatively low within the two populations. Dental disease was relatively high within the two study groups, a result of a carbohydrate-rich diet and poor oral hygiene. Thus the food they consumed, as well as the activities they were involved in, had a huge impact on their lives. The first possible cases of syphilis, tuberculosis and Paget's disease at the Cape were found within these two study groups.
456

Toolkits for Automatic Web Service and Graphic User Interface Generation

Unknown Date (has links)
Over the past decade, Web Services have played a prominent role on the Internet and in the business world. My interest is focused on developing KWATT, a set of toolkits for automatic web service and graphical user interface (GUI) generation. The standalone KWATT Service Generator (KSG) is a C++ application that generates web services, with minimal user intervention, from Tcl, Python, and Ruby scripts uploaded by the end user with KGT (KWATT GUI Tools). The KSG parser parses the scripts and extracts information about procedures and user-defined control statements embedded as comments. The KSG creates all necessary C++ wrappers, along with the code stubs required by gSOAP, a C++ interface to the SOAP protocol. Initially conceived to translate VTK front-end Tcl scripts into Web Services, the architecture is sufficiently general to accommodate a wide range of input languages. The work is extended by considering the automatic creation of graphical user interfaces to allow interaction between an end user and the web service generated by the KSG. The KWATT GUI Generator (KGG) was developed to achieve this. The KGG is a web service that runs inside a Java-based open-source server, and it performs the four major steps of GUI generation. First, the KGG receives the scripts from KGT after the corresponding web service has been generated successfully; comment lines inserted into the scripts provide hints to the XML generator about the interface widgets. Second, the structure of the GUI is encoded into an XML file by parsing those scripts with the XML generator. Third, the KGG extracts information from the generated XML file and passes it to a plugin. Finally, the plugin generates the user interface in the corresponding language, which is sent back to the user by the KGG. / A Thesis submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Master of Science. / Summer Semester, 2009. / July 27, 2009. / GUI, Web Service, SOAP / Includes bibliographical references. / Gordon Erlebacher, Professor Directing Thesis; Ming Ye, Committee Member; Xiaoqiang Wang, Committee Member.
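The abstract does not give the KWATT hint syntax, so the `#@gui` convention below is an invented stand-in. The sketch shows the flavor of the second step: scanning a script for comment hints and encoding the GUI structure as XML for a downstream plugin.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical hint syntax: "#@gui <widget> key=value key=value ..."
HINT = re.compile(r"#@gui\s+(\w+)\s+(.*)")

def script_to_gui_xml(script_text):
    """Scan a script for GUI hint comments and build an XML interface
    description that a plugin could render into concrete widgets."""
    gui = ET.Element("gui")
    for line in script_text.splitlines():
        m = HINT.search(line)
        if not m:
            continue
        widget = ET.SubElement(gui, m.group(1))
        for key_val in m.group(2).split():
            key, _, val = key_val.partition("=")
            widget.set(key, val)
    return ET.tostring(gui, encoding="unicode")

script = """
#@gui slider name=radius label=Radius min=0 max=10
#@gui checkbox name=smooth label=Smoothing
proc render {radius smooth} { ... }
"""
print(script_to_gui_xml(script))
# <gui><slider name="radius" label="Radius" min="0" max="10" /> ... </gui>
```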
457

Inverse Problems in Polymer Characterization

Unknown Date (has links)
This work applies inverse methods to various polymer characterization problems. In the first topic, a new approach is proposed to infer comonomer content using the Crystaf method, while quantifying the associated uncertainty. In the second topic, a comparison is carried out between various rheological probes (methods) to determine their sensitivity in detecting and measuring long-chain branching (LCB). In the last topic, open-source software is implemented to infer the continuous and discrete relaxation modulus. / A Dissertation submitted to the Department of Scientific Computing in partial fulfillment of the requirements for the degree of Doctor of Philosophy. / Summer Semester, 2014. / May 29, 2014. / Characterization, Inverse methods, Polymer, Uncertainty Quantification / Includes bibliographical references. / Sachin Shanbhag, Professor Directing Dissertation; William Oates, University Representative; Anke Meyer-Baese, Committee Member; Peter Beerli, Committee Member; Jim Wilgenbusch, Committee Member.
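The third topic, inferring a relaxation modulus, is a classic ill-posed inverse problem. As a hedged sketch (not the thesis software's actual method or interface), a discrete relaxation spectrum can be fit by nonnegative least squares on a fixed logarithmic grid of relaxation times:

```python
import numpy as np
from scipy.optimize import nnls

def discrete_spectrum(t, G, n_modes=8):
    """Fit G(t) ~ sum_i g_i * exp(-t / tau_i) with the tau_i fixed on a
    logarithmic grid and nonnegative weights g_i. A serious implementation
    would add regularization, since the problem is ill-posed."""
    taus = np.logspace(np.log10(t.min()), np.log10(t.max()), n_modes)
    A = np.exp(-t[:, None] / taus[None, :])   # design matrix of Maxwell modes
    g, _ = nnls(A, G)
    return taus, g

# Synthetic relaxation data from a known two-mode spectrum.
t = np.logspace(-2, 2, 60)
G = 2.0 * np.exp(-t / 0.1) + 0.5 * np.exp(-t / 10.0)
taus, g = discrete_spectrum(t, G)
print([(round(tau, 3), round(gi, 3)) for tau, gi in zip(taus, g) if gi > 1e-6])
```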
458

On the Hadamard theorem and maximal determinants in combinatorial investigations

Unknown Date (has links)
The Hadamard theorem plays a brief but important role in the study of maximal determinants in Chapter II. The proof presented follows closely the paper by Everett and Ryser, but an attempt has been made to clarify and expand their work wherever possible. / Typescript. / "June, 1960." / "Submitted to the Graduate Council of Florida State University in partial fulfillment of the requirements for the degree of Master of Science." / Advisor: Marion F. Tinsley, Professor Directing Paper. / Includes bibliographical references (leaf 23).
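For context, Hadamard's inequality states that |det A| is at most the product of the Euclidean norms of the rows of A, with equality exactly when the rows are mutually orthogonal; for an n x n matrix with entries ±1 this gives the bound n^(n/2) that drives the maximal-determinant problem. A quick numerical illustration (not material from the thesis):

```python
import numpy as np

def hadamard_bound(A):
    """Hadamard's inequality: |det A| <= product of the row norms of A."""
    return np.prod(np.linalg.norm(A, axis=1))

# An order-4 Hadamard matrix has mutually orthogonal +-1 rows, so the
# bound 4^(4/2) = 16 is attained exactly.
H2 = np.array([[1, 1], [1, -1]])
H4 = np.kron(H2, H2)                 # Sylvester construction
print(abs(np.linalg.det(H4)))        # 16.0
print(hadamard_bound(H4))            # 16.0
```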
459

Bioinformatics Analysis and Annotation of Microtubule Binding and Associated Proteins (MAPs) - Creating a Database of MAPs

Shenoy, Narmada 08 1900 (has links)
A Thesis Submitted to the Faculty of the School of Informatics, Indiana University, Indianapolis By Narmada Shenoy In Partial Fulfillment of the Requirements for the Degree of Master of Science August 2005 / Microtubules have many roles in the cytoskeletal infrastructure. This infrastructure underlies vital processes of cellular life such as motility, division, morphology, and intracellular organization and transport. These different roles are carried out through the creation of different microtubule (MT) systems (such as basal bodies, centrioles, flagella, kinetochores, and mitotic spindles). The changing roles require the cytoskeleton to be both dynamic and static in nature. Guiding these processes is a network of proteins that direct cellular behavior through their ability to bind microtubules (MTs) in a spatially and temporally specific manner. The identification and characterization of the suite of microtubule binding and associated proteins (MAPs) involved in MT systems is important for understanding the biological form and function of each MT system. This research involved the analysis and annotation of four MAPs – Ensconsin in humans, Hook (homolog 3) in humans, Protein Regulator of Cytokinesis 1 (PRC1) in humans, and Anaphase Spindle Elongation protein (ASE1) in yeast. A bioinformatics approach was used for the annotation and analysis, and a protocol for the analysis and annotation of MAPs was developed. During the process, some limitations of the bioinformatics tools and procedures were encountered; these were overcome, the initial protocol was improved upon, and a modified protocol of analysis was developed. A database was designed and built to hold annotated information on the MAPs. We seek to disseminate this database and its functionality as a web resource to the scientific community. It will provide an excellent forum for researchers to obtain relevant information on MT binding and associated proteins (MAPs). Infection by parasitic protozoa causes incalculable morbidity and mortality to humans and agricultural animals. In this research, we have also focused on MAPs in parasitic organisms of the Apicomplexan and Trypanosomatid genera, and the protocol for analysis incorporates steps to analyze MAPs from these organisms as well. Malaria (a potentially life-threatening disease) is caused by Plasmodium, an Apicomplexan parasite transmitted to people by the female Anopheles mosquito, which feeds on human blood. African Sleeping Sickness is an acute disease caused by Trypanosoma brucei that typically leads to death within weeks or months if not treated. Microtubule-associated proteins (MAPs) and their alteration of the unique microtubule (MT) systems play major roles in these organisms throughout their life cycles and are required for their pathogenic mechanisms. Each parasite contains unique MT systems that will test our annotation process as well as prepare the database for the addition of other novel MT systems, such as those found in plants. Additionally, these single-celled organisms have a multistage life cycle that provides annotation challenges similar to those encountered when one considers multicellular organisms. Therefore, a researcher working on any MT system within the database will find useful information regardless of the organism being studied. This will leave us with a subset of MAPs from parasitic organisms in our database that are potential drug targets.
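The abstract describes the database only at a high level, so the schema below is purely an illustrative guess at a minimal MAP annotation store, sketched with Python's built-in sqlite3; all table and column names are assumptions, not the thesis database's actual design.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE organism (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL            -- e.g. 'Homo sapiens', 'Trypanosoma brucei'
);
CREATE TABLE map_protein (
    id          INTEGER PRIMARY KEY,
    organism_id INTEGER REFERENCES organism(id),
    name        TEXT NOT NULL,    -- e.g. 'PRC1', 'Hook homolog 3'
    mt_system   TEXT,             -- e.g. 'mitotic spindle', 'flagellum'
    annotation  TEXT              -- curated functional notes
);
""")
conn.execute("INSERT INTO organism (name) VALUES ('Homo sapiens')")
conn.execute(
    "INSERT INTO map_protein (organism_id, name, mt_system, annotation) "
    "VALUES (1, 'PRC1', 'mitotic spindle', "
    "'protein regulator of cytokinesis 1')")
for row in conn.execute(
        "SELECT o.name, p.name, p.mt_system FROM map_protein p "
        "JOIN organism o ON o.id = p.organism_id"):
    print(row)   # ('Homo sapiens', 'PRC1', 'mitotic spindle')
```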
460

Polarographic trace analysis of copper, lead and cadmium

Jepson, Donald Paul 01 January 1973 (has links)
The area of study for this project was the development of an accurate, reliable, and inexpensive method of trace analysis for copper, cadmium, and lead in foods. At this time, the Association of Official Analytical Chemists Manual, eleventh edition, (4) has official and recognized methods of analysis only for copper (25.023), cadmium (25.018), and lead (25.041). In general, these procedures are long and tedious. The digestions involved often require many hours and are quite dangerous due to leakage of metallic and organometallic fumes from the digestion apparatus. The official methods of analysis for these elements also require a very large outlay of money for special equipment and reagents. The purpose of this project was: (A) to determine the feasibility of using polarographic detection for copper, cadmium, and lead in trace amounts; (B) to find a new method of sample decomposition to replace the present digestion methods; (C) to study the selection of media prior to polarographic detection; and (D) to develop a method of simultaneous polarographic detection of copper, cadmium, and lead.
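Goal (D), simultaneous quantification, ultimately rests on the diffusion-limited polarographic current being proportional to concentration, so calibration reduces to a line fit. A hedged sketch of the standard-additions method with made-up numbers (the thesis's actual procedure and data are not reproduced):

```python
import numpy as np

def standard_additions(added_conc, currents):
    """Standard-additions method: the measured current obeys
    i = m * (C0 + C_added), so the unknown sample concentration is
    C0 = intercept / slope of the fitted line."""
    slope, intercept = np.polyfit(added_conc, currents, 1)
    return intercept / slope

# Illustrative data for one metal (e.g. Cd): spikes in ppm, current in uA.
added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
current = np.array([0.82, 1.23, 1.61, 2.03, 2.41])
print(standard_additions(added, current))   # ~= 1.0 ppm in the sample
```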
