  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Using Decision Tree Voting to Select a Polyhedral Model Loop Transformation

Ruvinskiy, Ray January 2013 (has links)
Algorithms in fields like image manipulation, sound and signal processing, and statistics frequently employ tight loops. These loops are computationally intensive and CPU-bound, making their performance highly dependent on efficient utilization of the CPU pipeline and memory bus. Recent years have seen CPU pipelines become more and more complicated, with features such as branch prediction and speculative execution. At the same time, clock speeds have stopped their prior exponential growth due to heat dissipation issues, and multiple cores have become prevalent. These developments have made it more difficult for developers to reason about how their code executes on the CPU, which in turn makes it difficult to write performant code. An automated method to take code and optimize it for the most efficient execution would therefore be desirable. The Polyhedral Model allows the generation of alternative transformations for a loop nest that are semantically equivalent to the original. The transformations vary the degree of loop tiling, loop fusion, loop unrolling, parallelism, and vectorization. However, selecting the transformation that most efficiently utilizes the architecture remains challenging. Previous work uses regression models to select a transformation, taking as features hardware performance counter values collected during a sample run of the program being optimized. Due to inaccuracies in the resulting regression model, the transformation it selects as best often yields unsatisfactory performance. As a result, previous work resorts to a five-shot technique: running the top five transformations suggested by the model and selecting the best one based on their actual runtimes. For long-running benchmarks, however, five runs may take an excessive amount of time.
I present a variation on the previous approach that does not need the five-shot selection process to achieve performance comparable to the best five-shot results reported in previous work. With the transformations in the search space ranked in reverse runtime order, the transformation selected by my classifier is, on average, in the 86th percentile. Several key factors contribute to the performance improvements attained by my method: formulating the problem as a classification problem rather than a regression problem, using static features in addition to dynamic performance counter features, performing feature selection, and using ensemble methods to boost the performance of the classifier. Decision trees are constructed from pairs of features (performance counters and structural features that can be determined statically from the source code). The trees are then evaluated according to the number of benchmarks for which they select a transformation that performs better than two baseline variants: the original program and the expected runtime if a randomly selected transformation were applied. The top 20 trees vote to select the final transformation.
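The voting stage described above can be sketched as follows; the counter names, thresholds, and candidate transformations here are illustrative placeholders, not the features or transformations actually used in the thesis.

```python
# Sketch of majority voting among per-feature-pair decision trees.
# Each "tree" maps a feature vector to a preferred loop transformation;
# the top trees vote and the plurality winner is applied.
from collections import Counter

def vote(trees, features, top_k=20):
    """Collect one ballot per tree (up to top_k) and return the winner."""
    ballots = [tree(features) for tree in trees[:top_k]]
    return Counter(ballots).most_common(1)[0][0]

# Toy stand-in trees: each inspects a pair of features (hypothetical
# counter names) and names a transformation.
trees = [
    lambda f: "tile+vectorize" if f["l1_misses"] > 0.3 else "unroll",
    lambda f: "tile+vectorize" if f["loop_depth"] >= 2 else "fuse",
    lambda f: "unroll" if f["branch_misses"] > 0.1 else "tile+vectorize",
]

features = {"l1_misses": 0.4, "loop_depth": 3, "branch_misses": 0.05}
print(vote(trees, features))  # → tile+vectorize (all three trees agree)
```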
22

Prognosis of Glioblastoma Multiforme Using Textural Properties on MRI

Heydari, Maysam 11 1900 (has links)
This thesis addresses the challenge of prognosis, in terms of survival prediction, for patients with Glioblastoma Multiforme brain tumors. Glioblastoma is the most malignant brain tumor, with a median survival time of no more than a year. Accurate assessment of prognostic factors is critical in deciding among different treatment options and in designing stratified clinical trials. This thesis is motivated by two observations. First, clinicians often refer to properties of glioblastoma tumors seen on magnetic resonance images when assessing prognosis; yet clinical data, along with histological and, most recently, molecular and gene expression data, have been more widely and systematically studied and used in prognosis assessment than image-based information. Second, patient survival times are often used along with clinical data to conduct population studies on brain tumor patients, typically using Recursive Partitioning Analysis; however, researchers validate and assess the predictive power of these models only by measuring the statistical association between survival groups and survival times. In this thesis, we propose a learning approach that uses historical training data to produce a system that predicts patient survival. We introduce a classification model for predicting a patient's survival class that uses texture-based features extracted from magnetic resonance images as well as other patient properties. Our prognosis approach is novel in being the first to use image-extracted textural characteristics of glioblastoma scans in a classification model whose accuracy can be reliably validated by cross-validation. We show that our approach is a promising new direction for prognosis in brain tumor patients.
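The cross-validation protocol the abstract leans on for validation can be sketched generically; the stand-in majority-class model below is not the thesis's texture-based classifier, just the simplest fit/predict pair that makes the loop runnable.

```python
# Minimal k-fold cross-validation accuracy estimate.
# `fit` takes a training set and returns a predict function.
def k_fold_accuracy(samples, labels, fit, k=5):
    n = len(samples)
    correct = 0
    for i in range(k):
        test_idx = set(range(i, n, k))  # every k-th sample held out
        train = [(s, y) for j, (s, y) in enumerate(zip(samples, labels))
                 if j not in test_idx]
        predict = fit(train)
        for j in test_idx:
            correct += predict(samples[j]) == labels[j]
    return correct / n

# Toy model: always predict the majority training label.
def majority_fit(train):
    labels = [y for _, y in train]
    majority = max(set(labels), key=labels.count)
    return lambda s: majority

samples = list(range(10))
labels = [0] * 7 + [1] * 3
print(k_fold_accuracy(samples, labels, majority_fit))  # → 0.7
```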
23

Metodologias para mapeamento de suscetibilidade a movimentos de massa

Riffel, Eduardo Samuel January 2017 (has links)
Mapping areas predisposed to adverse events that threaten and damage society is a demand of great importance, mainly because of the role it plays in planning and in environmental, territorial, and risk management. This work therefore seeks to contribute to the qualification of methodologies and morphometric parameters for mapping mass movement susceptibility using GIS and remote sensing. One of its objectives is to apply and compare susceptibility mapping methodologies, among them Shalstab and the decision tree, the latter still little used in this field. Seeking consensus with the literature, the information on adverse events was organized through classification, for which disaster-related concepts such as susceptibility, vulnerability, hazard, and risk were reviewed. A study was also carried out in the municipality of Três Coroas - RS, relating recorded mass movement occurrences to the CPRM risk zones. From morphometric parameters, landslide occurrence patterns were identified, along with the contribution of factors such as land use, occupation, and slope. Finally, two susceptibility mapping methods, the Shalstab model and the decision tree, were compared. Morphometric parameters extracted from SRTM images and landslide samples identified in high-spatial-resolution satellite imagery served as model inputs. The accuracy analysis favored the decision tree, although the difference was small and both methods can represent the susceptibility map satisfactorily. Shalstab, however, showed more limitations because it requires data of higher spatial resolution. Applying GIS and remote sensing methodologies contributed to better prevention of damage caused by mass movements; nevertheless, consistent landslide inventories are needed for greater reliability in applying the models.
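For context on the physically based side of the comparison: models such as Shalstab build on the infinite-slope stability relation. A minimal cohesionless form is sketched below with illustrative, uncalibrated parameter values; this is not the full Shalstab formulation, which couples slope stability to a steady-state hydrologic model.

```python
# Simplified infinite-slope factor of safety (cohesionless case).
# FS = (1 - m * gamma_w / gamma_s) * tan(phi) / tan(theta), where m is
# the saturated fraction of the soil column; FS < 1 flags instability.
# Unit weights (kN/m^3) and angles below are illustrative only.
import math

def factor_of_safety(slope_deg, phi_deg, saturation_ratio,
                     gamma_w=9.81, gamma_s=18.0):
    theta = math.radians(slope_deg)   # terrain slope
    phi = math.radians(phi_deg)       # soil friction angle
    return ((1 - saturation_ratio * gamma_w / gamma_s)
            * math.tan(phi) / math.tan(theta))

# A steep, fully saturated slope is flagged as unstable:
print(factor_of_safety(35, 30, saturation_ratio=1.0) < 1.0)  # → True
```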
24

Real-Time Power System Topology Monitoring Supported by Synchrophasor Measurements

January 2015 (has links)
abstract: This dissertation introduces a real-time topology monitoring scheme for power systems intended to provide enhanced situational awareness during major system disturbances. The scheme requires accurate real-time topology information to be effective and is supported by advances in transmission line outage detection based on data mining of phasor measurement unit (PMU) measurements. A network flow analysis scheme is proposed to track changes in user-defined minimal cut sets within the system. This work introduces a new algorithm to update a previous network flow solution after the loss of a single system branch; it provides the significantly decreased solution time desired in a real-time environment. This method of topology monitoring can give system operators visual indications of potential problems in the system caused by changes in topology. This work also presents a method of determining all singleton cut sets within a given network topology, called the one line remaining (OLR) algorithm. During operation, if a singleton cut set exists, the system cannot withstand the loss of any one line and still remain connected. The OLR algorithm activates after the loss of a transmission line and determines whether any singleton cut sets were created; these cut sets are found using properties of power transfer distribution factors and minimal cut sets. The topology analysis algorithms proposed in this work are supported by line outage detection using PMU measurements, aimed at providing accurate real-time topology information. This process uses a decision tree (DT) based data-mining approach to characterize a lost tie line in simulation; the trained DT is then used to analyze PMU measurements to detect line outages. Applied to real PMU measurements, the trained decision tree detected the loss of a 500 kV line with no misclassifications.
The work presented has the objective of enhancing situational awareness during significant system disturbances in real time. This dissertation presents all parts of the proposed topology monitoring scheme and justifies and validates the methodology using a real system event. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2015
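A singleton cut set in the sense above is, graph-theoretically, a bridge: a single branch whose removal disconnects the network. The generic depth-first-search bridge finder below sketches that check on a toy 4-bus system; it is a standard graph algorithm, not the PTDF-based OLR algorithm itself.

```python
# Find all bridges (singleton cut sets) of an undirected graph with
# Tarjan's low-link technique: an edge (u, v) is a bridge iff no back
# edge from v's subtree reaches u or above.
def bridges(n, edges):
    adj = [[] for _ in range(n)]
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    disc = [-1] * n          # discovery times
    low = [0] * n            # lowest discovery time reachable
    out, timer = [], [0]

    def dfs(u, parent_edge):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v, i in adj[u]:
            if i == parent_edge:
                continue
            if disc[v] == -1:
                dfs(v, i)
                low[u] = min(low[u], low[v])
                if low[v] > disc[u]:      # subtree of v cannot bypass (u, v)
                    out.append(edges[i])
            else:
                low[u] = min(low[u], disc[v])

    for s in range(n):
        if disc[s] == -1:
            dfs(s, -1)
    return out

# 4-bus toy system: a triangle 0-1-2 plus a radial line 2-3.
print(bridges(4, [(0, 1), (1, 2), (0, 2), (2, 3)]))  # → [(2, 3)]
```

Only the radial line is reported: losing any triangle edge leaves the system connected, so the triangle contributes no singleton cut sets.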
26

Sistema de información para la toma de decisiones, usando técnicas de análisis predictivo para la Empresa IASACORP International S.A.

Espinoza Espinoza, Bertha Yrene, Gutiérrez Rivera, Natalia Elizabeth January 2015 (has links)
Companies today handle an amount of information that would have been unimaginable years ago; the capacity to collect it is remarkable. For many companies, this information has consequently become difficult to manage. Every day, companies of every sector, type, and size make decisions, most of them strategic decisions that can affect the proper functioning of the business. This is where one of the most cited tools in IT comes in: Business Intelligence, a term that refers to using a company's data to facilitate decision making, exploit its information, and, better still, model or predict future scenarios. This work will allow the Marketing area of Iasacorp International to obtain information on the behavior and buying habits of its customers through data mining techniques such as decision trees and predictive analysis, supporting decisions on sales strategies for the product lines the company manages (jewelry, apparel accessories, hair accessories, etc.) and on upcoming purchases. As outlined above, implementing this type of information system gives the company competitive advantages, allowing management to analyze and understand its information better and, consequently, make better business decisions.
27

Classificação da exatidão de coordenadas obtidas com a fase da portadora L1 do GPS / Accuracy's classification of GPS L1 carrier phase obtained coordinates

Mauro Menzori 20 December 2005 (has links)
Fixing the double-difference integer ambiguities when processing Global Positioning System (GPS) carrier phase data is one of the crucial steps in static relative positioning. The fixed solution is also used as a quality indicator, lending confidence to the positioning result. It is, however, purely statistical information, based on the precision of the measurements and dissociated from the accuracy of the coordinates produced in the solution. For a single baseline, the accuracy of the measured coordinates is always inaccessible, whether the solution is fixed or float. Moreover, there is greater risk in accepting a float solution, even one with good, though unknown, accuracy. For these reasons many contractors of carrier phase GPS surveys reject float solutions and require a new data collection, with the consequent expense of time and money. This thesis was developed to improve this situation. To that end, it investigates the behavior of the accuracy of measurements obtained with the GPS L1 carrier phase, monitoring the variable factors present in this type of measurement, which made it possible to classify the accuracy of results. First, a systematic analysis of the variables in the data was performed on GPS data collected during 2003, 2004, and 2005 at two continuous monitoring stations at USP. Next, a structured database was built and used as a reference to induce a decision tree, adopted as a paradigm. Finally, this tree made it possible to infer the accuracy of positioning solutions obtained with the L1 carrier. The procedure was validated by classifying the accuracy of results from several baselines collected under different conditions at various sites in the state of São Paulo and across Brazil.
28

Využitie genetických algoritmov pri tvorbe rozhodovacích stromov / Applying genetic algorithms for decision trees induction

Šurín, Lukáš January 2015 (has links)
Decision trees are a recognized and widely used technique for processing and analyzing data. They are typically built with well-known induction algorithms such as ID3, C4.5, C5.0, CART, CHAID, and MARS. The predictive power of the resulting trees is not always satisfactory, often leaving room for improvement, and inducing trees under complex criteria is hard and sometimes impossible. This thesis deals with decision trees, namely their construction. We exploit that room for improvement with a metaheuristic, genetic algorithms, which are used in all kinds of optimization. The work also includes an implementation of the newly proposed algorithm as a plug-in for the Weka environment. A comparison of the proposed decision tree induction method with the well-known C4.5 algorithm is an integral part of this thesis.
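The selection/crossover/mutation loop at the heart of such an approach can be sketched as follows. For brevity the individuals here are one-node decision stumps rather than full trees, and the fitness function, operators, and data are illustrative stand-ins for the thesis's actual design.

```python
# Minimal genetic-algorithm loop for inducing threshold classifiers.
import random

def evolve(data, generations=30, pop_size=20, seed=1):
    rng = random.Random(seed)

    # Individual = (feature_index, threshold); predicts 1 iff x[i] > t.
    # Fitness = training accuracy.
    def fitness(ind):
        i, t = ind
        return sum((x[i] > t) == y for x, y in data) / len(data)

    n_features = len(data[0][0])
    pop = [(rng.randrange(n_features), rng.uniform(0, 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a[0], b[1])            # crossover: feature from a, threshold from b
            if rng.random() < 0.2:          # mutation: jitter the threshold
                t = min(1.0, max(0.0, child[1] + rng.gauss(0, 0.1)))
                child = (child[0], t)
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)

# Toy data: label is 1 exactly when the first feature exceeds 0.5.
data = [((x / 10, (7 * x % 10) / 10), x / 10 > 0.5) for x in range(10)]
best, acc = evolve(data)
print(acc)  # high accuracy on this separable toy problem
```

A full system in the thesis's spirit would evolve whole tree structures with subtree crossover and node mutation, but the surrounding loop is the same.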
29

Category management mléčných produktů / Category management in dairy category

Cieluch, Petr January 2008 (has links)
This thesis describes category management from both a theoretical and a practical point of view. It works through a concrete project between a supplier (Danone) and a customer (Jednota České Budějovice) on the traditional market, using all the tools connected with category management in terms of assortment analysis and shelf layouts.
30

Decision Tree Model to Support the Successful Selection of a Database Engine for Novice Database Administrators

Monjaras, Alvaro, Bendezu, Enrique, Raymundo, Carlos 09 May 2019 (has links)
The full text of this work is not available in the UPC Academic Repository due to restrictions of the publisher where it appeared. / There are currently several types of databases whose different ways of manipulating data affect transaction performance when dealing with the stored information. It is very important for companies to manage information fast, so that no operation is lost to poor database performance, while also operating fast and preserving the integrity of the information. Likewise, each database category is designed to serve a specific use case or cases in which it manages information quickly. In this paper we study and analyze SQL, NoSQL, and in-memory databases to understand the use cases they fit, and we run performance tests to build a decision tree that can help in choosing which database category to use to maintain good performance. The precision of the tests was 96.26% for relational databases, 91.83% for NoSQL databases, and 93.87% for in-memory databases.
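A tree of the kind described might reduce to rules like the following; the questions and thresholds are hypothetical stand-ins for illustration, not the splits actually learned from the paper's performance tests.

```python
# Illustrative decision rules for picking a database category.
# The workload attributes and thresholds below are invented examples.
def choose_engine(workload):
    if workload.get("needs_acid_transactions"):
        return "SQL"
    if workload.get("latency_budget_ms", 100) < 1:
        return "In-Memory"
    if workload.get("schema_flexible"):
        return "NoSQL"
    return "SQL"  # conservative default

print(choose_engine({"needs_acid_transactions": True}))  # → SQL
print(choose_engine({"latency_budget_ms": 0.5}))         # → In-Memory
print(choose_engine({"schema_flexible": True}))          # → NoSQL
```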
