11

TRANSFORMS IN SUFFICIENT DIMENSION REDUCTION AND THEIR APPLICATIONS IN HIGH DIMENSIONAL DATA

Weng, Jiaying 01 January 2019 (has links)
The big data era poses great challenges as well as opportunities for researchers to develop efficient statistical approaches to analyze massive data. Sufficient dimension reduction (SDR) is an important tool in modern data analysis and has received extensive attention in both academia and industry. In this dissertation, we introduce inverse regression estimators based on Fourier transforms, which improve on existing SDR methods in two respects: (1) they avoid slicing the response variable, and (2) they extend readily to high dimensional data. For the ultra-high dimensional problem, we investigate both eigenvalue decomposition and minimum discrepancy approaches to obtain optimal solutions, and we develop a novel and efficient optimization algorithm to obtain sparse estimates. We derive asymptotic properties of the proposed estimators and demonstrate their efficiency gains over traditional estimators. The oracle properties of the sparse estimates are also derived. Simulation studies and real data examples illustrate the effectiveness of the proposed methods. The wavelet transform is another tool that effectively captures localized high-frequency information. In parallel with the proposed Fourier transform methods, we also develop a wavelet transform version of the approach and derive the asymptotic properties of the resulting estimators.
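The core idea behind such Fourier-transform inverse regression can be sketched as follows (an illustrative sketch only, not the dissertation's exact estimator): instead of slicing the response, estimate the Fourier moments m(w) = E[(X - mu) exp(i w Y)] over a grid of frequencies (the grid and its range below are assumptions), form a candidate matrix from their outer products, and take its leading eigenvectors as estimated directions.

```python
import numpy as np

def fourier_sdr_directions(X, y, d, omegas=None):
    """Sketch of Fourier-transform inverse regression (illustrative only):
    estimate SDR directions from the Fourier moments
    m(w) = E[(X - mu) * exp(i * w * y)] instead of slice means."""
    n, p = X.shape
    if omegas is None:
        omegas = np.linspace(0.1, 2.0, 20)        # frequency grid (an assumption)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(Sigma))  # L @ L.T = Sigma^{-1}
    Z = (X - mu) @ L                              # standardized predictors, cov(Z) = I
    M = np.zeros((p, p))
    for w in omegas:
        m = (Z * np.exp(1j * w * y)[:, None]).mean(axis=0)   # complex p-vector
        M += np.real(np.outer(m, m.conj()))
    # Leading eigenvectors of the candidate matrix span the estimated subspace.
    vals, vecs = np.linalg.eigh(M)
    B_z = vecs[:, np.argsort(vals)[::-1][:d]]
    return L @ B_z                                # back to the original X scale
```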
12

A NEW INDEPENDENCE MEASURE AND ITS APPLICATIONS IN HIGH DIMENSIONAL DATA ANALYSIS

Ke, Chenlu 01 January 2019 (has links)
This dissertation comprises three connected topics. First, we propose a novel class of independence measures for testing independence between two random vectors, based on the discrepancy between the conditional and the marginal characteristic functions. If one of the variables is categorical, our asymmetric index extends the classical ANOVA to a kernel ANOVA that can test the more general hypothesis of equal distributions among groups. The index is also applicable when both variables are continuous. Second, we develop a sufficient variable selection procedure based on the new measure in a large-p-small-n setting. Our approach incorporates marginal information between each predictor and the response as well as joint information among predictors. As a result, our method is more capable of selecting all truly active variables than marginal selection methods. Furthermore, our procedure can handle both continuous and discrete responses with mixed-type predictors. We establish the sure screening property of the proposed approach under mild conditions. Third, we focus on a model-free sufficient dimension reduction approach using the new measure. Our method does not require strong assumptions on predictors or responses. An algorithm is developed to find dimension reduction directions using sequential quadratic programming. We illustrate the advantages of the new measure and its two applications in high dimensional data analysis through numerical studies across a variety of settings.
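For the categorical case, the flavor of such a characteristic-function index can be sketched as follows (a hypothetical illustration, not the dissertation's exact measure): with a Gaussian integration weight, the integrated discrepancy between each group's empirical characteristic function and the pooled one reduces to Gaussian-kernel averages; the weight, bandwidth, and group-proportion weighting below are assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    """k(u, v) = exp(-||u - v||^2 / (2 sigma^2)) for all pairs of rows."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def cf_anova_statistic(X, groups, sigma=1.0):
    """Sketch of a characteristic-function 'kernel ANOVA' statistic for a
    categorical variable `groups` and continuous X: sum over groups of the
    integrated discrepancy between the group and pooled empirical
    characteristic functions, which a Gaussian weight turns into
    Gaussian-kernel averages (illustrative choice)."""
    X = np.asarray(X, dtype=float)
    if X.ndim == 1:
        X = X[:, None]
    groups = np.asarray(groups)
    n = len(X)
    K = gaussian_kernel(X, X, sigma)
    pooled = K.mean()                            # average kernel over the pooled sample
    stat = 0.0
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        within = K[np.ix_(idx, idx)].mean()      # within-group average
        cross = K[idx, :].mean()                 # group-vs-pooled average
        stat += (len(idx) / n) * (within - 2.0 * cross + pooled)
    return stat   # population version is zero iff every group follows the marginal distribution
```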
13

Application of Influence Function in Sufficient Dimension Reduction Models

Shrestha, Prabha 28 September 2020 (has links)
No description available.
14

The Free Basic Water Policy, Planning for Social Justice and the Water Needs of HIV/AIDS Affected Households in South African Townships

Tsiri, Makgabo Hendrick 14 November 2006 (has links)
Student Number: 0104363F - MSc research report - School of Architecture and Planning - Faculty of Engineering and the Built Environment / South Africa is a constitutional state, and the constitution is the supreme law of the country (RSA, 1996). Any state laws, policies and programmes that are inconsistent with the constitution are invalid and have no legitimate standing. In the preamble of its constitution, post-apartheid South Africa committed itself to recognising past injustices and to planning for a society based on social justice, in order to improve the quality of life of all citizens and free the potential of each person. The Bill of Rights is a cornerstone of democracy in South Africa, as it encompasses all human rights, especially socioeconomic rights, whose fulfilment will contribute towards the realisation of an equal and united society based on social justice. The post-apartheid South African government adopted the Free Basic Water policy as a way of adhering to the constitutional requirement that everyone has the right to sufficient water. Access to clean, sufficient water has been identified as a crucial requirement for care and prevention in HIV/AIDS affected households. Amid the socioeconomic inequalities, scarce water resources and high HIV/AIDS prevalence confronting post-apartheid South Africa today, the Free Basic Water policy guarantees every household of eight, irrespective of its socioeconomic status and health concerns, 6 kl (6,000 litres) of water free every month. Local government has been blamed for not being responsive to these special water needs of poor HIV/AIDS affected households, especially in township areas, where water is mostly provided on a cost-recovery basis. However, little attention has been paid to the difficulties faced by local government authorities in this regard. The report argues for collaboration between planners and other major stakeholders to develop a group-conscious water policy that can guide planning for a society based on social justice. The research recommends that such a policy should not only be concerned with justice and fairness in the distribution of basic needs to those with special needs; more importantly, it should also account for the sustainability of water resources, since South Africa is regarded as a water-scarce country.
15

Minimização de conjuntos de casos de teste para máquinas de estados finitos / Test suite minimization for finite state machines

Mello Neto, Lúcio Felippe de 09 May 2008 (has links)
Model-based testing aims at deriving test suites from formal specifications, such as Finite State Machines. Test suites can be obtained either from classical test derivation methods or from some ad-hoc approach. It is desirable to produce a test suite that detects all possible faults of an implementation and has a small size, so that its application is feasible. For practical reasons, applying the entire generated test suite may not be possible. Therefore, a subset of test cases should be selected, i.e., a test suite minimization should be performed. However, it is important that the minimization reduces the test application cost while keeping the effectiveness in revealing faults. In this work, an algorithm is proposed for the minimization of test suites generated from Finite State Machines. The algorithm is based on sufficiency conditions, so that completeness with respect to fault detection is maintained. The algorithm was used in two different contexts: with randomly generated test suites, to verify the minimization obtained, and to reduce the effort of obtaining a test suite with full fault coverage.
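The general idea of shrinking a test suite while preserving fault-detection completeness can be illustrated with a simple greedy reduction (a hypothetical sketch, not the sufficiency-condition algorithm proposed in this work): given a record of which faults each test case detects, drop any test whose detected faults are already detected by the rest of the suite.

```python
def minimize_suite(detects):
    """Greedy test-suite minimization sketch (not the thesis's algorithm):
    `detects` maps each test case id to the set of faults it detects.
    A test is dropped whenever the remaining suite still detects every fault."""
    all_faults = set().union(*detects.values())
    kept = dict(detects)
    # Try to remove the tests that detect the fewest faults first.
    for test in sorted(detects, key=lambda t: len(detects[t])):
        remaining = {t: f for t, f in kept.items() if t != test}
        if remaining and set().union(*remaining.values()) == all_faults:
            kept = remaining        # `test` was redundant
    return list(kept)

# Example with a hypothetical fault-detection matrix:
suite = {"t1": {"f1", "f2"}, "t2": {"f2"}, "t3": {"f3"}, "t4": {"f1", "f3"}}
print(minimize_suite(suite))        # ['t1', 't4'] still detects f1, f2 and f3
```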
16

Leibniz e Hobbes: causalidade e princípio de razão suficiente / Leibniz and Hobbes: causality and principle of sufficient reason

Hirata, Celí 31 August 2012 (has links)
The aim of this thesis is to examine the relationship between the Hobbesian doctrine of causality and the principle of sufficient reason in Leibniz, indicating the closeness and distance between them. If, on the one hand, the German philosopher is clearly influenced by Hobbes in the formulation of his principle, on the other hand it is through this very principle that he criticizes some of the most decisive aspects of Hobbes's philosophy, such as his materialism and necessitarianism, as well as his conception of divine justice and his thesis that God cannot be known by natural light. In some texts of his youth, Leibniz proves that nothing is without reason by identifying the sufficient reason with the totality of requisites, a demonstration that practically reproduces the one by which Hobbes argues that every effect has a necessary cause. However, in opposition to Hobbes, who reduces reality to bodies in motion, Leibniz uses the concept of sufficient reason to demonstrate that only an incorporeal principle can endow bodies with motion. It is also through the principle of sufficient reason, and its distinction from the principle of contradiction, that Leibniz argues that events in the world are not absolutely necessary but contingent. Finally, it is by means of this principle that the author of the Theodicy argues that God can be known by natural reason and that divine justice consists in his goodness guided by his wisdom, in contrast to the Hobbesian definition of justice based on power. Thus, if Leibniz appropriates certain elements of the Hobbesian doctrine of causality, it is in order to subordinate the mechanical efficient causality defended by Hobbes to an essentially teleological determination of reality.
17

Distributed Statistical Learning under Communication Constraints

El Gamal, Mostafa 21 June 2017 (has links)
"In this thesis, we study distributed statistical learning, in which multiple terminals, connected by links with limited capacity, cooperate to perform a learning task. As the links connecting the terminals have limited capacity, the messages exchanged between the terminals have to be compressed. The goal of this thesis is to investigate how to compress the data observations at multiple terminals and how to use the compressed data for inference. We first focus on the distributed parameter estimation problem, in which terminals send messages related to their local observations using limited rates to a fusion center that will obtain an estimate of a parameter related to the observations of all terminals. It is well known that if the transmission rates are in the Slepian-Wolf region, the fusion center can fully recover all observations and hence can construct an estimator having the same performance as that of the centralized case. One natural question is whether Slepian-Wolf rates are necessary to achieve the same estimation performance as that of the centralized case. In this thesis, we show that the answer to this question is negative. We then examine the optimality of data dimensionality reduction via sufficient statistics compression in distributed parameter estimation problems. The data dimensionality reduction step is often needed especially if the data has a very high dimension and the communication rate is not as high as the one characterized above. We show that reducing the dimensionality by extracting sufficient statistics of the parameter to be estimated does not degrade the overall estimation performance in the presence of communication constraints. We further analyze the optimal estimation performance in the presence of communication constraints and we verify the derived bound using simulations. Finally, we study distributed optimization problems, for which we examine the randomized distributed coordinate descent algorithm with quantized updates. In the literature, the iteration complexity of the randomized distributed coordinate descent algorithm has been characterized under the assumption that machines can exchange updates with an infinite precision. We consider a practical scenario in which the messages exchange occurs over channels with finite capacity, and hence the updates have to be quantized. We derive sufficient conditions on the quantization error such that the algorithm with quantized update still converge."
18

Uma contribuição para determinação de um conjunto essencial de operadores de mutação no teste de programas C. / A contribution for the determination of a sufficient mutant operators set for C-program testing.

Barbosa, Ellen Francine 06 November 1998 (has links)
Mutation Analysis – one of the error-based test criteria – has been found to be effective at revealing faults. However, its high cost, due mainly to the large number of mutants generated, has motivated the proposition of several alternative approaches for its application. In this perspective, a relevant study resulted in the determination of a sufficient mutant operator set for Fortran, indicating that it is possible to achieve a large cost reduction in mutation testing while preserving a high mutation score. Some studies have also shown that the reduction in effectiveness is not significant. This work aims to investigate pragmatic alternatives for applying the Mutation Analysis criterion and, in this context, proposes a procedure for the determination of a sufficient mutant operator set for C, based on the operators implemented in the Proteum testing tool. To apply and validate the proposed procedure, two different groups of programs are used. For both groups, the sufficient mutant operator set yields very significant cost reductions, with only a very small decrease in the mutation score. Strategies to evolve and refine a sufficient mutant operator set for different application domains are also investigated.
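One way to picture such a reduction is as a greedy cost/benefit selection over operators (a hypothetical sketch, not the procedure proposed in this work): repeatedly pick the operator that exposes the most not-yet-covered faults per mutant it generates.

```python
def select_operator_set(op_faults, op_cost, target):
    """Hypothetical greedy selection (not the procedure proposed in this work):
    repeatedly pick the mutation operator that exposes the most not-yet-covered
    faults per mutant it generates, until `target` faults are covered.
    op_faults: operator -> set of faults its mutants expose
    op_cost:   operator -> number of mutants it generates"""
    covered, selected = set(), []
    while len(covered) < target:
        def gain(op):
            new = len(op_faults[op] - covered)
            return new / op_cost[op] if new else 0.0
        candidates = [op for op in op_faults if op not in selected]
        best = max(candidates, key=gain, default=None)
        if best is None or gain(best) == 0.0:
            break                                # no operator adds coverage
        selected.append(best)
        covered |= op_faults[best]
    return selected, covered

# Example with made-up fault sets and mutant counts:
ops = {"ORRN": {"f1", "f2", "f3"}, "VTWD": {"f2"}, "SSDL": {"f3", "f4"}}
cost = {"ORRN": 120, "VTWD": 40, "SSDL": 60}
print(select_operator_set(ops, cost, target=4))
```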
20

Food for thought: Self-sufficient households towards a sustainable food supply

Palokangas, Timo, Eriksson, William, Persson, Madeleine, Norman, Rebecca January 2016 (has links)
This bachelor thesis examines to what extent a co-create community called Bobygget in Herrljunga, Sweden, can be food self-sufficient. To obtain a more comprehensive result, the difference between a vegetarian and a non-vegetarian diet, as well as how time spent on farming and the available cultivation area affect the outcome, are studied. Moreover, difficulties regarding food self-sufficiency are brought up and discussed. Linear programming is used to maximise the amount of calories obtained from farming at Bobygget. The results show that the degree of self-sufficiency at Bobygget reaches 21% with vegetarian food and 27% with non-vegetarian food. Under the preconditions regarding the available area at Bobygget, the maximum work time per adult is 9 minutes per day for vegetarian food and 13 minutes per day for non-vegetarian food. Difficulties concerning self-sufficiency, including time consumption and basic farming knowledge, are identified. Possible solutions, such as starting modestly with few crops and a small area, considering contract farming, and creating a knowledge base for Bobygget, are presented.
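The linear program described above can be sketched with made-up crop data (the crops, calorie yields, labour figures, and limits below are illustrative assumptions, not the thesis's data): choose how much area to plant with each crop so as to maximise calories, subject to the available cultivation area and working time.

```python
from scipy.optimize import linprog

# Made-up crop data: calories produced and labour required per m^2 planted.
crops        = ["potato", "carrot", "beans"]
kcal_per_m2  = [700.0, 300.0, 450.0]    # objective coefficients (assumed values)
hours_per_m2 = [0.15, 0.10, 0.20]       # seasonal labour per m^2 (assumed values)

total_area_m2 = 400.0                   # available cultivation area (assumption)
total_hours   = 60.0                    # available working time (assumption)

# linprog minimises, so negate the calorie coefficients to maximise them.
res = linprog(
    c=[-k for k in kcal_per_m2],
    A_ub=[[1.0, 1.0, 1.0], hours_per_m2],   # total-area and total-time constraints
    b_ub=[total_area_m2, total_hours],
    bounds=[(0, None)] * len(crops),        # planted area cannot be negative
)
print(dict(zip(crops, res.x)), "total kcal:", -res.fun)
```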
