21

Production Of Chips And Crisp From Jerusalem Artichoke

Baltacioglu, Cem 01 February 2012 (has links) (PDF)
Jerusalem artichoke has been cultivated in various regions without any special breeding technique. On the other hand, it does not have a wide range of uses in the food industry. Hence, its use as a potato substitute in some products is believed to increase its economic value. As a first attempt, chips and crisps produced from Jerusalem artichoke were analyzed for texture, sensory properties, color, oil content and moisture content. Jerusalem artichoke chips were fried in a bench-top deep-fat fryer for 120 s, 180 s and 240 s at 160°C, 170°C, 180°C and 190°C. When a microwave oven was used, samples were cooked for 60 s, 75 s, 90 s, 105 s, 120 s, 135 s and 150 s at 600 W and 900 W. Rheological properties of Jerusalem artichoke puree were investigated, and xanthan gum (2% wb) and sodium metabisulphite (1% wb) were added to obtain the desired puree consistency and color. After production of the puree, Jerusalem artichoke flour was produced; water was added to this flour to obtain a dough again. The rheological behaviour of the original puree and of doughs prepared from the flour with 1:4.5 and 1:5.0 (flour:water) ratios were quite similar. The experimental results show that as frying temperature and treatment time increased, the moisture content and lightness of the Jerusalem artichoke products decreased, while a* and b* values, hardness, fracturability and oil content increased. The best frying conditions for Jerusalem artichoke appear to be 180°C with about 240 s treatment time for the chips, and the same temperature for 180 s for the crisps. As microwave power level and treatment duration increased, the moisture content and lightness of the microwave-cooked Jerusalem artichoke products decreased, while a* and b* values increased. Hardness and fracturability values of the products first increased with time and then decreased. With the microwave oven, the best results were obtained at about 105 s treatment time at 600 W for the chips and about 60 s at 900 W for the crisps. Since microwave cooking significantly reduced the treatment time and uses no oil, it can be recommended as an alternative to conventional deep-fat frying.
22

Heat flow variability at the Costa Rica subduction zone as modeled by bottom-simulating reflector depths imaged in the CRISP 3D seismic survey

Cavanaugh, Shannon Lynn 09 November 2012 (has links)
3D seismic reflection data were acquired by the R/V Langseth and used to extract heat flow information from bottom-simulating reflector (BSR) depths across the southern Costa Rica convergent margin. These data are part of the CRISP Project, which will seismically image the Middle America subduction zone in 3D. The survey covered an area of approximately 55 x 11 km northwest of the Osa Peninsula, Costa Rica. For the analysis presented here, seismic data were processed using a post-stack time migration. The BSR, a reverse-polarity seismic reflection marking the base of the gas hydrate phase boundary, is imaged clearly within the slope-cover sediments of the margin wedge. If pressure is taken into account, in deep-water environments the BSR acts as a temperature gauge revealing subsurface temperatures across the margin. Two heat flow models were used in this analysis. In the first (the Hornbach model), BSR depth is predicted using a true 3D diffusive heat flow model combined with Integrated Ocean Drilling Program (IODP) thermal conductivity data, and the results are compared with actual BSR depth observations to constrain where heat flow anomalies exist. In the second model, heat flow values are estimated directly from the heat flow equation. Uniform heat flow in the region should produce a deeper BSR downslope toward the trench due to higher pressure; however, results indicate the BSR is deepest, at over 325 meters below the seafloor (mbsf), further landward, and shoals to less than 100 mbsf near the trench, suggesting elevated heat flow toward the toe of the accretionary prism. Heat flow values also reflect this relation. In addition to this survey-wide trend, local heat flow anomalies appear as both circular patterns and linear trends extending across the survey, which can be related to mounds, thrust faults, folds, double BSRs, and seafloor erosion imaged in the seismic data. I suggest that these areas of higher local heat flow represent sites where advection of heat by deep, upward-migrating, thermogenically sourced fluids and/or gases may be taking place. These heat flow trends have implications not only for earthquake nucleation but also for methane hydrate reserve stability.
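The heat flow equation step in the second model can be made concrete. Given a BSR depth, the temperature at the base of the hydrate stability zone follows from a methane-hydrate phase-boundary curve evaluated at hydrostatic pressure, and 1D conductive heat flow is then q = k * (T_bsr - T_seafloor) / z_bsr. The sketch below uses a published seawater phase-boundary fit (Dickens and Quinby-Hunt, 1994) and assumed values for thermal conductivity and bottom-water temperature; the study's own IODP-based calibrations may differ.

```python
import numpy as np

RHO_W, G = 1030.0, 9.81  # assumed seawater density (kg/m^3) and gravity (m/s^2)

def t_bsr_celsius(total_depth_m):
    """Methane-hydrate phase-boundary temperature at hydrostatic pressure,
    from the seawater fit of Dickens & Quinby-Hunt (1994):
    1/T[K] = 3.79e-3 - 2.83e-4 * log10(P[MPa])."""
    p_mpa = RHO_W * G * total_depth_m / 1e6
    return 1.0 / (3.79e-3 - 2.83e-4 * np.log10(p_mpa)) - 273.15

def heat_flow_mw(water_depth_m, bsr_mbsf, t_seafloor_c=2.0, k_w_mk=1.0):
    """1D conductive heat flow (mW/m^2) implied by a BSR at bsr_mbsf below
    the seafloor. t_seafloor_c and k_w_mk are illustrative assumptions."""
    t_bsr = t_bsr_celsius(water_depth_m + bsr_mbsf)
    return 1e3 * k_w_mk * (t_bsr - t_seafloor_c) / bsr_mbsf

# A deep landward BSR implies modest heat flow; a shallow BSR near the
# trench implies elevated heat flow (water depths here are illustrative).
print(heat_flow_mw(water_depth_m=2000.0, bsr_mbsf=325.0))  # ~58 mW/m^2
print(heat_flow_mw(water_depth_m=3000.0, bsr_mbsf=100.0))  # ~220 mW/m^2
```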
23

Design and Implementation of Calculated Readout by Spectral Parallelism (CRISP) in Magnetic Resonance Imaging (MRI)

So, Simon Sai-Man January 2010 (has links)
CRISP is a data acquisition and image reconstruction technique that offers theoretical increases in signal-to-noise ratio (SNR) and dynamic range over traditional methods in magnetic resonance imaging (MRI). The incoming broadband MRI signal is de-multiplexed into multiple narrow frequency bands using analog filters. The signal from each narrowband channel is then individually captured and digitized. The original signal is recovered by recombining all the channels via weighted addition, where the weights correspond to the frequency responses of the narrowband filters. With ideal bandpasses and bandwidth-dependent noise after filtering, the SNR increase is proportional to sqrt(N), where N is the number of bandpasses. In addition to the SNR improvement, free induction decay (FID) echoes in CRISP experience a slower decay rate. In situations where resolution is limited by digitization noise, CRISP is able to capture data further out into the higher-frequency regions of k-space, which leads to a relative increase in resolution. The conversion from one broadband MR signal into multiple narrowband channels is realized using a comb, or bank, of active analog bandpass filters. A custom CRISP RF receiver chain is implemented to downconvert and demodulate the raw MR signal prior to narrowband filtering, and to digitize the signals from each filter channel simultaneously. Results are presented demonstrating that the CRISP receiver chain can acquire 2D MR images (without narrowband filters) with SNR similar to that of images obtained with a clinical system. Acquiring 2D CRISP images (with narrowband filters) was not possible due to the lack of phase lock between rows in k-space. The RMS noise of narrowband, broadband and unfiltered 1D echoes is compared.
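A minimal numerical sketch of the recombination idea under the idealizations stated above: non-overlapping ideal bandpasses, per-channel noise added after filtering with power proportional to channel bandwidth, and recombination weights equal to the filter responses. This is a frequency-domain toy model, not the thesis's analog receiver chain; the printed residuals differ by roughly sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(7)
n, sigma, n_bands = 8192, 1.0, 16
t = np.arange(n)
fid = np.exp(-t / 1500) * np.cos(2 * np.pi * 0.05 * t)  # toy FID-like echo

def crisp_acquire(sig):
    """De-multiplex into n_bands ideal bandpasses, add bandwidth-dependent
    noise per channel after filtering, recombine by weighted addition."""
    spec = np.fft.rfft(sig)
    edges = np.linspace(0, spec.size, n_bands + 1, dtype=int)
    recon = np.zeros(spec.size, dtype=complex)
    for lo, hi in zip(edges[:-1], edges[1:]):
        weight = np.zeros(spec.size)
        weight[lo:hi] = 1.0                              # ideal bandpass response
        chan = np.fft.irfft(spec * weight, n)            # narrowband channel
        chan = chan + (sigma / np.sqrt(n_bands)) * rng.standard_normal(n)
        recon += np.fft.rfft(chan) * weight              # weights = filter responses
    return np.fft.irfft(recon, n)

broadband = fid + sigma * rng.standard_normal(n)         # conventional receiver
crisp = crisp_acquire(fid)
print("broadband residual RMS:", np.std(broadband - fid))
print("CRISP residual RMS:   ", np.std(crisp - fid))     # ~sqrt(n_bands) smaller
```

With n_bands = 16 the CRISP residual comes out roughly four times smaller, matching the sqrt(N) prediction for this idealized noise model.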
24

The Misplaced Role of “Utilitarianism” in John Stuart Mill’s Utilitarianism

Wright, David 2012 August 1900 (has links)
This thesis aims to provide the appropriate historical context for interpreting John Stuart Mill's Utilitarianism. The central question considered here concerns two views of Mill's intentions for Utilitarianism: whether the work should be read as Mill arguing for his own version of utilitarianism, or as an ecumenical document expressing and defending the views of many utilitarians. The first view, labeled the orthodox view and defended by Roger Crisp, is probably the most commonly held view of how to interpret the document. The second view, labeled the revisionist view, is defended by Daniel Jacobson in a recent article. By examining Mill's place in the history of utilitarianism, his journals, correspondence, and other writings leading up to and after the publication of Utilitarianism, this thesis argues in support of the revisionist position. Furthermore, it is argued that certain portions of the book deserve special consideration apart from other chapters, and this is taken to have implications for future research in Mill's thought. This thesis has four chapters, including the first, introductory chapter, which outlines the motivations guiding the orthodox and revisionist views. The second chapter provides a general exposition of Utilitarianism, as well as an outline of the primary evidence supporting the orthodox and revisionist positions. The third chapter is a defense of the revisionist position; it highlights the specific biographical context in which Utilitarianism was composed, as well as evidence from Mill's writings, correspondence, and journals suggesting that he saw the need to write a general defense of the principle of utility and elaborate his theory of justice. This chapter also includes a historiographical analysis of Mill's biographers, which suggests that Utilitarianism is not viewed by them as especially central to his considered views on utilitarianism. Finally, the chapter includes a section on the early reception of and criticisms offered against Utilitarianism, which partly explains why the book has come to be interpreted as it has. The final chapter reviews the evidence for the revisionist position and explains the implications for Mill scholarship in light of the findings of this study.
25

Mineração de dados em base de germoplasma / Data mining in a germplasm database

Hiragi, Gilberto de Oliveira 03 1900 (has links)
Dissertação (mestrado)—Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2008. / ABSTRACT (translated from the Portuguese): The germplasm banks of SIBRARGEN (Brazilian Genetic Resources Information System) function as a large catalogue of plant species and their accessions (characteristic types within a group, or variability within a species), containing more than 100,000 catalogued accessions. These banks include the identification of each accession (passport), a description of its genotypic aspects (characterization) and a description of its phenotypic aspects (evaluation), and they allow researchers in the field to run SQL queries, but such queries retrieve only the stored data matching the boolean expressions used as search criteria. They do not support the discovery of new knowledge or the construction of predictive or descriptive models. This research proposes a data mining methodology, derived from the CRISP-DM reference model, to help researchers from outside computer science (for example, biologists or agronomists) explore these databases, facilitating the tasks prescribed in the following CRISP-DM phases: business understanding, data understanding, data preparation, modeling, evaluation of the generated models, and deployment. To materialize the proposed methodology and automate its use by non-computer-scientists, the HaDog tool (Hiragi Approach for Data Mining of Germplasm) was implemented in Java with an Oracle® 10g Release 2 database; it is accessible through a Web interface available to researchers credentialed for access to SIBRARGEN. The proposed germplasm-mining methodology was evaluated experimentally through two case studies conducted with the support of researchers from Embrapa Genetic Resources and Biotechnology: determining accessions representative of a species or group of species, and proposing targeted collecting expeditions, both typical problems of interest to the curator (the researcher responsible for the germplasm bank of a species). This experimental evaluation showed that it is possible to introduce domain specialists to data mining techniques over the germplasm database without requiring them to engage in programming activities. The experimental results obtained so far show that HaDog can be an important facilitator for mining the SIBRARGEN databases, aimed principally at the discovery of new knowledge by the specialists.
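Since the methodology is organized around the CRISP-DM phases named above, a minimal runnable sketch of such a phase pipeline may help fix ideas. The phase names follow the reference model; the data, the threshold model and the accuracy target are toy stand-ins, not HaDog's logic:

```python
# A minimal, runnable sketch of a CRISP-DM-style loop. Phase names come from
# the reference model; the concrete steps below are invented illustrations.

def business_understanding():
    return {"goal": "flag representative accessions", "min_accuracy": 0.9}

def data_understanding(raw):
    return {"n_records": len(raw), "fields": sorted(raw[0])}

def data_preparation(raw):
    return [r for r in raw if r["trait"] is not None]  # drop incomplete records

def modeling(rows):
    cut = sum(r["trait"] for r in rows) / len(rows)    # trivial threshold model
    return lambda r: r["trait"] >= cut

def evaluation(model, rows):
    return sum(model(r) == r["label"] for r in rows) / len(rows)

raw = [{"trait": 0.2, "label": False}, {"trait": 0.9, "label": True},
       {"trait": None, "label": False}, {"trait": 0.7, "label": True}]

spec = business_understanding()
print(data_understanding(raw))
rows = data_preparation(raw)
model = modeling(rows)
score = evaluation(model, rows)
print("deploy" if score >= spec["min_accuracy"] else "iterate", score)
```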
26

Dolování znalostí z rozsáhlých statistických souborů lékařských dat / Knowledge Mining from Large Statistical Sets of Medical Data

Badelita, Elvyn-George January 2015 (has links)
This thesis deals with knowledge mining from large sets of medical data using machine learning methods and algorithms. The theoretical part covers machine learning and its taxonomy, the basic data types encountered in data mining, the most important classification and prediction methods, criteria for assessing the quality of prediction methods, data mining methodologies, and frequently used systems. The practical part focuses on a statistical and informatic survey of the provided medical data, its appropriate transformation, the subsequent design and implementation of experiments using machine learning methods to acquire new knowledge and uncover hidden information, and finally the interpretation of the results together with conclusions for the target groups.
27

Evolutionary Analysis of the CAP Superfamily of Proteins using Amino Acid Sequences and Splice Sites

January 2016 (has links)
Here I document the breadth of the CAP (Cysteine-Rich Secretory Proteins (CRISP), Antigen 5 (Ag5), and Pathogenesis-Related 1 (PR-1)) protein superfamily and trace some of the major events in the evolution of this family, with particular focus on vertebrate CRISP proteins. Specifically, I sought to study the origin of the CAP subfamilies using both amino acid sequence data and gene structure data, more precisely the positions of exon/intron borders within their genes. Counter to current scientific understanding, I find that the wide variety of CAP subfamilies present in mammals, where they were originally discovered and characterized, has distinct homologues in the invertebrate phyla, contrary to the common assumption that these are vertebrate protein subfamilies. In addition, I document the fact that primitive eukaryotic CAP genes contained only one exon, likely inherited from prokaryotic SCP-domain-containing genes, which were, by nature, free of introns. As evolution progressed, an increasing number of introns were inserted into CAP genes, reaching 2 to 5 in the invertebrate world and 5 to 15 in the vertebrate world. Lastly, phylogenetic relationships between these proteins appear to be traceable not only by amino acid sequence homology but also by the preservation of exon number and exon borders within their genes. / Masters Thesis, Biology, 2016
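The gene-structure comparison described above can be illustrated with a small sketch: each gene is reduced to a set of intron positions (aligned amino-acid coordinate plus phase), and the overlap between two such sets serves as a conservation signal. All positions below are invented for the demonstration, not taken from the CAP alignment:

```python
def shared_borders(a, b):
    """Jaccard overlap of intron sets; each intron is a (position, phase)
    pair in aligned amino-acid coordinates."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Hypothetical intron maps (positions and phases are invented).
crisp_like = {(41, 0), (77, 1), (130, 2), (162, 0), (201, 1)}
ag5_like = {(41, 0), (77, 1), (158, 2)}
pr1_like = {(99, 0)}

for name, gene in [("Ag5-like", ag5_like), ("PR-1-like", pr1_like)]:
    print(name, "vs CRISP-like:", round(shared_borders(crisp_like, gene), 2))
```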
28

Crossing the Chasm: Deploying Machine Learning Analytics in Dynamic Real-World Scenarios

January 2016 (has links)
The dawn of the Internet of Things (IoT) has opened the opportunity for mainstream adoption of machine learning analytics. However, most research in machine learning has focused on the discovery of new algorithms or fine-tuning the performance of existing algorithms. Little exists on the process of taking an algorithm from the lab environment into the real world, culminating in sustained value. Real-world applications are typically characterized by dynamic non-stationary systems with requirements around feasibility, stability and maintainability. Not much has been done to establish standards around the unique analytics demands of real-world scenarios. This research explores why so few published algorithms enter production and, furthermore, why fewer still end up generating sustained value. The dissertation proposes a 'Design for Deployment' (DFD) framework for building machine learning analytics that can be deployed to generate sustained value. The framework emphasizes and elaborates the often neglected but immensely important latter steps of an analytics process: 'Evaluation' and 'Deployment'. A representative evaluation framework is proposed that incorporates the temporal shifts and dynamism of real-world scenarios. Additionally, the recommended infrastructure allows analytics projects to pivot rapidly when a particular venture does not materialize. Deployment needs and apprehensions of the industry are identified, and the gaps are addressed through a 4-step process for sustainable deployment. Lastly, the need for analytics as a functional area (like finance and IT) is identified to maximize the return on machine learning deployment. The framework and process are demonstrated in semiconductor manufacturing: a highly complex operation involving hundreds of optical, electrical, chemical, mechanical, thermal, electrochemical and software processes, which makes it a highly dynamic non-stationary system. Due to the 24/7 uptime requirements of manufacturing, high reliability and fail-safe operation are a must. Moreover, the ever-growing data volumes mean that the system must be highly scalable. Lastly, due to the high cost of change, a sustained value proposition is a must for any proposed change. Hence the context is ideal for exploring the issues involved. Enterprise use cases are used to demonstrate the robustness of the framework in addressing the challenges encountered in the end-to-end process of productizing machine learning analytics in dynamic real-world scenarios. / Doctoral Dissertation, Computer Science, 2016
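One way a representative evaluation framework can incorporate temporal shifts is rolling-origin (forward-chaining) evaluation: train only on the past, test on the block that follows, and slide the origin forward. The sketch below is a generic illustration of that idea on a synthetic drifting process, not the dissertation's DFD framework:

```python
import numpy as np

def rolling_origin_eval(X, y, fit, score, n_splits=4):
    """Train on the past only, test on the next block, slide forward.
    Growing scores across splits signal non-stationary drift."""
    edges = np.linspace(0, len(y), n_splits + 2, dtype=int)[1:]
    scores = []
    for train_end, test_end in zip(edges[:-1], edges[1:]):
        model = fit(X[:train_end], y[:train_end])
        scores.append(score(model, X[train_end:test_end], y[train_end:test_end]))
    return scores

# Toy drifting process: the input/output relation changes slowly over time.
rng = np.random.default_rng(1)
n = 2000
X = rng.standard_normal((n, 1))
drift = np.linspace(1.0, 3.0, n)                 # coefficient drifts upward
y = drift * X[:, 0] + 0.1 * rng.standard_normal(n)

fit = lambda X, y: np.linalg.lstsq(X, y, rcond=None)[0]
score = lambda w, X, y: float(np.mean((X @ w - y) ** 2))  # test MSE
print(rolling_origin_eval(X, y, fit, score))     # MSE grows as drift accumulates
```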
29

Performance Evaluation of Military Training Exercises Using Data Mining

Dubey, Rohini January 2016 (has links)
Attaining training objectives is the measure of successful training, as objectives define the purpose of instructional events. Applying training objectives is challenging in large and complex military training exercises. Training in the military domain focuses not merely on completing the exercises: effectively achieving the training objectives is their goal. It has been recognized that performance toward this goal is strengthened by instructional processes and materials crafted to address specific training objectives. Simulation is an effective and realistic learning tool that can be used in training, but since simulation generates enormous amounts of data, analyzing this data, which may contain hidden information, is a challenging task; data mining offers a solution to this problem. The aim of this project is to propose a framework that instructors can follow to evaluate trainees' performance so that their fulfillment of the training objectives can be improved. The proposal studied in this project is to learn from previous training experiences using data mining techniques, improving the effectiveness of training by predicting trainee performance. To select a good prediction model for estimating the learning outcomes of trainees, different classification techniques were compared. The CRISP-DM model serves as the basis for the framework proposed in this dissertation. The proposed framework is then applied to a dataset obtained from the Swedish military for exercises that involved shooting at targets.
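The model-selection step described above, comparing several classification techniques before settling on a prediction model, might look like the following scikit-learn sketch. The synthetic dataset stands in for the non-public Swedish military exercise data; the candidate classifiers are common choices, not necessarily the ones compared in the thesis:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for trainee features and pass/fail outcomes.
X, y = make_classification(n_samples=600, n_features=12, n_informative=6,
                           random_state=0)

candidates = {
    "decision tree": DecisionTreeClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    "k-NN": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name:20s} mean accuracy = {scores.mean():.3f}")
```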
30

Návrh systému pro doporučování pracovních příležitostí / Design of a system for recommending job opportunities

Paulavets, Anastasiya January 2014 (has links)
This thesis deals with recommender systems in the field of e-recruitment. Its main objective is to design a job recommender system for the career portal UNIjobs.cz. First, the theoretical background of recommender systems is provided. The following part discusses the specific properties of job recommender systems, as well as existing approaches to recommendation in the e-recruitment environment. The last part of the thesis is dedicated to designing a recommender system for the UNIjobs.cz career portal; the output of that part is the main contribution of the thesis.
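As a flavor of the content-based approaches such systems draw on, the sketch below matches a candidate profile to postings by TF-IDF cosine similarity. The postings, the profile text and the overall design are invented illustrations; they do not describe the system actually proposed for UNIjobs.cz:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented postings and candidate profile for demonstration only.
postings = [
    "junior java developer spring sql",
    "data analyst sql python statistics",
    "marketing specialist social media",
]
profile = ["python statistics machine learning"]

vec = TfidfVectorizer()
job_vectors = vec.fit_transform(postings)                  # index the postings
sims = cosine_similarity(vec.transform(profile), job_vectors)[0]
for rank, i in enumerate(sims.argsort()[::-1], start=1):   # best match first
    print(rank, postings[i], round(float(sims[i]), 2))
```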
