  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

// F - E - G // : Agential complexities: what do I do now and what will you make of it? // If the carpet is pulled from under our feet, where will we land?

Bellugi Klima, Sarah January 2023 (has links)
// F - E - G // is the title of my independent degree project for the Master's programme New Performative Practices at Stockholm University of the Arts. The project was intended as a means of presenting and understanding my artistic practice and its various aspects. It was a public presentation in the university's theatre hall. The format chosen was a 50-minute solo performance with a 10-minute participatory section at the beginning of the show. I was the performer, as well as editor of the music and recordings, writer of the texts, and project coordinator in collaboration with SKH's producer and staff. My aim with the project was to fit together the various themes and practices that I had been exploring during the two years of the course: a somatic, sci-fi-based practice; clown technique; the use of voice; the relationship of the performer to the audience; finding honesty and repeatability in performance; the use of chance and randomness in creation and performing; the use of language in guiding others toward physical and personal insights and experiences; the kind of social impact performance art can have; Nonviolent Communication (NVC) as a choice of life; and, finally, reconnecting to my ageing dancing body. I call the method chosen to explore these various interests and curiosities a personal-experiential method: the ideas were tried out physically and concretely, over a span of time that allowed for reflection and introspection. The outcome of the project was rich in personal learnings and challenges, as well as a milestone in my development as an artist and a person. I hope it served as a moment of poetry and enjoyment for those who visited and participated in the performances: that they might recognise themselves in similar situations and feelings, and that they might be relieved or comforted, or somehow get the chance to witness their experience through the dramaturgy of the performance.
For more on the project, please find a brief expansion of this abstract in the uploaded PDF file. For a more comprehensive view of my artistic practice during the two years at SKH, please visit my Research Catalogue site at https://www.researchcatalogue.net/shared/82db91825a5ca2dcf19ae6b16f28d33a (to navigate the site once the exposition is open, hover the cursor over the top left of the page for a menu).
52

Heuristics: Bias Vs. Smart Instrument. An Exploration of the Hot Hand

Cooper, Jehangir 23 August 2013 (has links)
No description available.
53

With or without context : Automatic text categorization using semantic kernels

Eklund, Johan January 2016 (has links)
In this thesis, text categorization is investigated in four dimensions of analysis: theoretically as well as empirically, and as a manual as well as a machine-based process. In the first four chapters we look at the theoretical foundation of subject classification of text documents, with a particular focus on classification as a procedure for organizing documents in libraries. A working hypothesis in the theoretical analysis is that classification of documents is a process that involves translations between statements in different languages, both natural and artificial. We further investigate the close relationships between structures in classification languages and the order relations and topological structures that arise from classification. A classification algorithm that receives special focus in the subsequent chapters is the support vector machine (SVM), which in its original formulation is a binary classifier in linear vector spaces, but has been extended to handle classification problems in which the categories are not linearly separable. To this end the algorithm utilizes a category of functions called kernels, which induce feature spaces by means of high-dimensional and often non-linear maps. For the empirical part of this study we investigate the classification performance of semantic kernels generated by different measures of semantic similarity. One category of such measures is based on latent semantic analysis and random indexing, which generate term vectors using co-occurrence data from text collections. Another semantic measure used in this study is pointwise mutual information. In addition to the empirical study of semantic kernels, we also investigate the performance of a term weighting scheme called divergence from randomness, which has hitherto received little attention in automatic text categorization.
The result of the empirical part of this study shows that the semantic kernels generally outperform the “standard” (non-semantic) linear kernel, especially for small training sets. A conclusion that can be drawn with respect to the investigated datasets is therefore that semantic information in the kernel in general improves its classification performance, and that the difference between the standard kernel and the semantic kernels is particularly large for small training sets. Another clear trend in the result is that the divergence from randomness weighting scheme yields a classification performance surpassing that of the common tf-idf weighting scheme.
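The semantic-kernel construction behind the empirical study can be sketched in a few lines of numpy. This is an illustration under assumed data, not the thesis's actual setup: `X` is a toy document–term matrix, and `S` is a hypothetical term-similarity matrix (which in the thesis would come from latent semantic analysis, random indexing, or pointwise mutual information). As long as `S` is positive semi-definite, K = X S Xᵀ is a valid kernel Gram matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy document-term matrix (e.g. tf-idf weights): 6 documents, 5 terms.
X = rng.random((6, 5))

# Hypothetical term-similarity matrix S; built as M M^T so it is
# positive semi-definite, which makes K = X S X^T a valid kernel.
M = rng.random((5, 5))
S = M @ M.T

K = X @ S @ X.T  # semantic kernel Gram matrix between the 6 documents

# Sanity checks: a valid Gram matrix is symmetric and positive semi-definite.
symmetric = bool(np.allclose(K, K.T))
min_eig = float(np.linalg.eigvalsh(K).min())
```

A precomputed matrix like `K` can then be handed to any SVM implementation that accepts precomputed kernels, for example scikit-learn's `SVC(kernel="precomputed")`.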
54

Hurstův exponent a náhodnost v časových řadách / Hurst Exponent and Randomness in Time Series

Zeman, Martin January 2010 (has links)
The main goal of this thesis is to test the ability of the Hurst exponent to recognise certain processes with a deterministic signal as nonrandom, and to test the randomness of the daily returns of three stocks traded on the Prague Stock Exchange (BCPP). Critical values for determining the critical region of a randomness hypothesis test were established for this purpose. A further goal of the thesis is to describe the estimation of the Hurst exponent by means of Rescaled Range Analysis, and to outline some problems that accompany this estimation when the Hurst exponent is used as a randomness indicator. Within the framework of Rescaled Range Analysis, another method was constructed that proved successful in recognising some series containing a deterministic signal.
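A minimal sketch of Hurst exponent estimation by Rescaled Range (R/S) Analysis, assuming dyadic window sizes and a log-log regression; the thesis's exact procedure and critical values are not reproduced here. For i.i.d. noise the estimate should land near H = 0.5 (small-sample bias typically pushes it slightly above).

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent of series x via rescaled range analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = chunk - chunk.mean()
            z = np.cumsum(dev)        # cumulative deviation from the mean
            r = z.max() - z.min()     # range of the cumulative deviations
            s = chunk.std()           # standard deviation of the chunk
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(size)
            rs_vals.append(np.mean(rs))
        size *= 2
    # The slope of log(R/S) against log(window size) estimates H.
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return float(h)

rng = np.random.default_rng(42)
h_white = hurst_rs(rng.standard_normal(4096))  # i.i.d. noise: H near 0.5
```

A persistent (trending) series would yield H well above 0.5, and an anti-persistent one H below 0.5, which is what makes the exponent a candidate randomness indicator.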
55

Turing machine algorithms and studies in quasi-randomness

Kalyanasundaram, Subrahmanyam 09 November 2011 (has links)
Randomness is an invaluable resource in theoretical computer science. However, pure random bits are hard to obtain. Quasi-randomness is a tool that has been widely used for eliminating or reducing the randomness required by randomized algorithms. In this thesis, we study some aspects of quasi-randomness in graphs. Specifically, we provide an algorithm and a lower bound for two different kinds of regularity lemmas. Our algorithm for FK-regularity is derived using a spectral characterization of quasi-randomness. We use a similar spectral connection to answer an open question about quasi-random tournaments. We then provide a "Wowzer"-type lower bound (on the number of parts required) for the strong regularity lemma. Finally, we study the derandomization of complexity classes using Turing machine simulations. 1. Connections between quasi-randomness and graph spectra. Quasi-random (or pseudo-random) objects are deterministic objects that behave almost like truly random objects. These objects have been widely studied in various settings (graphs, hypergraphs, directed graphs, set systems, etc.). In many cases, quasi-randomness is very closely related to the spectral properties of the combinatorial object under study. In this thesis, we discover spectral characterizations of quasi-randomness in two different cases to solve open problems. A Deterministic Algorithm for Frieze-Kannan Regularity: The Frieze-Kannan regularity lemma asserts that any given graph of large enough size can be partitioned into a number of parts such that, across parts, the graph is quasi-random. It was previously unknown whether a deterministic algorithm could produce a partition satisfying the conditions of the Frieze-Kannan regularity lemma in sub-cubic time. In this thesis, we answer this question by designing an O(n^ω)-time algorithm for constructing such a partition, where ω is the exponent of fast matrix multiplication.
Even Cycles and Quasi-Random Tournaments: Chung and Graham provided several equivalent characterizations of quasi-randomness in tournaments. One of them concerns the number of "even" cycles, where even is defined in the following sense: a cycle is said to be even if, when walking along it, an even number of edges point in the wrong direction. Chung and Graham showed that if close to half of the 4-cycles in a tournament T are even, then T is quasi-random. They asked whether the same statement is true if, instead of 4-cycles, we consider k-cycles for an even integer k. We resolve this open question by showing that for every fixed even integer k ≥ 4, if close to half of the k-cycles in a tournament T are even, then T must be quasi-random. 2. A Wowzer-type lower bound for the strong regularity lemma. The regularity lemma of Szemerédi asserts that one can partition every graph into a bounded number of quasi-random bipartite graphs. Alon, Fischer, Krivelevich and Szegedy obtained a variant of the regularity lemma that allows arbitrary control over this measure of quasi-randomness. However, their proof only guarantees a partition where the number of parts is given by the Wowzer function, the iterated version of the Tower function. We show here that a bound of this type is unavoidable by constructing a graph H with the property that, even if one wants only very mild control over the quasi-randomness of a regular partition, any such partition of H must have a number of parts given by a Wowzer-type function. 3. How fast can we deterministically simulate nondeterminism? We study an approach to derandomizing complexity classes using Turing machine simulations. We look at the problem of deterministically counting the exact number of accepting computation paths of a given nondeterministic Turing machine. We provide a deterministic algorithm which runs in time roughly O(√S), where S is the size of the configuration graph.
The best of the previously known methods required time linear in S. Our result implies a simulation of probabilistic time classes like PP, BPP and BQP in the same running time. This is an improvement over the currently best known simulation by van Melkebeek and Santhanam.
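The even-cycle property for 4-cycles can be checked by brute force on a small random tournament. This is a toy illustration of the characterization, not the thesis's proof technique; the counting convention (each undirected 4-cycle is visited several times as an ordered tuple) is my own, and it leaves the even fraction unchanged because rotations and reversals preserve parity when k is even.

```python
import itertools
import random

def even_cycle_fraction(adj, k=4):
    """Fraction of k-cycles (over ordered vertex tuples) that are 'even':
    walking the cycle, an even number of edges point against the walk."""
    n = len(adj)
    even = total = 0
    for tup in itertools.permutations(range(n), k):
        wrong = 0
        for i in range(k):
            u, v = tup[i], tup[(i + 1) % k]
            if not adj[u][v]:  # the tournament edge points v -> u
                wrong += 1
        total += 1
        even += (wrong % 2 == 0)
    return even / total

# Build a random tournament: exactly one directed edge per vertex pair.
random.seed(1)
n = 9
adj = [[False] * n for _ in range(n)]
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < 0.5:
            adj[u][v] = True
        else:
            adj[v][u] = True

frac = even_cycle_fraction(adj, k=4)  # close to 1/2 for a random tournament
```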
56

Compressed wavefield extrapolation with curvelets

Lin, Tim T. Y., Herrmann, Felix J. January 2007 (has links)
An explicit algorithm for the extrapolation of one-way wavefields is proposed which combines recent developments in information theory and theoretical signal processing with the physics of wave propagation. Because of excessive memory requirements, explicit formulations for wave propagation have proven to be a challenge in 3-D. By using ideas from "compressed sensing", we are able to formulate the (inverse) wavefield extrapolation problem on small subsets of the data volume, thereby reducing the size of the operators. According to compressed sensing theory, signals can successfully be recovered from an incomplete set of measurements when the measurement basis is incoherent with the representation in which the wavefield is sparse. In this new approach, the eigenfunctions of the Helmholtz operator are recognized as a basis that is incoherent with curvelets, which are known to compress seismic wavefields. By casting the wavefield extrapolation problem in this framework, wavefields can successfully be extrapolated in the modal domain via a computationally cheaper operation. A proof of principle for the compressed sensing method is given for wavefield extrapolation in 2-D. The results show that our method is stable and produces results identical to the direct application of the full extrapolation operator.
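The abstract's recovery claim — a sparse signal recovered from incomplete, incoherent measurements — can be illustrated generically. The paper pairs curvelets with Helmholtz eigenfunctions; as a stand-in, the sketch below uses orthogonal matching pursuit with a random Gaussian measurement matrix. All sizes and names here are illustrative assumptions, not the paper's operators.

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal matching pursuit: recover a k-sparse x from b = A x."""
    residual = b.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # Greedily pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares refit on the current support.
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(3)
n, m, k = 128, 48, 4  # signal length, number of measurements, sparsity
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)  # incoherent measurement matrix
b = A @ x_true                                # incomplete set of measurements
x_rec = omp(A, b, k)
err = float(np.linalg.norm(x_rec - x_true))
```

With m well above the k log n threshold suggested by compressed sensing theory, recovery from the underdetermined system is essentially exact.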
57

Vliv situačních a osobnostních faktorů na ochotu platit za bezcenné informace / The influence of situational and personality factors on willingness to pay for worthless information

Frollová, Nikola January 2018 (has links)
This thesis presents recent relevant studies of cognitive biases and the misperception of randomness. There is a widespread belief that past performance automatically predicts future performance, even in cases where the result is an act of randomness. This became the inspiration for the main topic of my research. Based on recent relevant literature, I study whether it is possible to influence people to buy valueless information by evoking a feeling of loss. I also try to answer the question of which personality factors stand behind this behaviour. The results show that the loss manipulation had no effect on buying valueless information. However, personality factors do appear to be connected with this phenomenon to a certain extent. I identified one factor, called 'Irrational Thinking', which partly explains why 71% of the participants were interested in the valueless transaction.
58

Aplicação de um jogo digital e análise de conceitos da teoria cinética dos gases / Application of a digital game and analysis of kinetic concepts of gases theory

Figueiredo, Márcia Camilo [UNESP] 04 March 2016 (has links)
Fundação Araucária de Apoio ao Desenvolvimento Científico e Tecnológico do Paraná (FAADCT/PR) / This research investigated whether undergraduate chemistry students enunciate and understand the concepts of randomness and irreversibility present in the Kinetic Theory of Gases, through stages built into a digital game and through its application once completed. The participants were twenty-one students in the chemistry teaching degree programme of a federal technological university in Paraná, separated into groups 01, 02 and 03. For data collection, each group, at certain points, answered questionnaires, prepared drawings and took part in semi-structured interviews. The research was guided by a qualitative approach and by several studies conducted by Piaget. To process and analyse the data, we adopted the principles of content analysis. From the content of the drawings produced in the stages of the digital game, it was found that the participants of groups 01 and 02 sought, at some stage of the game, to illustrate an approximately homogeneous distribution of the system. In the drawings for the game levels, we found that the majority (fourteen) of the participants took into account the experience gained during the game, because at some level they changed the way they predicted the particle trajectories in the system; among the twenty-one subjects, only eight illustrated, across all four levels of the game, the first predictions of collisions on the left side, reaching the established pattern of analysis. From the content of the participants' statements concerning the concept of irreversibility, it was found that this knowledge is not well constructed in the cognitive structures of the members of groups 01 and 02, because five gave non-elucidative answers in all four stages of the game and two could not elucidate the concept in three stages. Regarding the concept of randomness, we found that the participants used different words in each application context of the stages and levels of the digital game, presenting distinct kinds of discourse: scientific, close to scientific, common sense, non-elucidative and elucidative with respect to the game. It appears that the digital teaching resources used can help students understand and grasp content of a microscopic and submicroscopic character. The stages and levels of the digital game may therefore help the subjects to grasp scientifically the concepts of the kinetic theory of gases, as well as concepts in other areas of knowledge.
59

Vícekriteriální a robustní zobecnění úlohy prodavače novin / Multicriteria and robust extension of news-boy problem

Šedina, Jaroslav January 2018 (has links)
This thesis studies a classic single-period stochastic optimization problem called the newsvendor problem. A news-boy must decide how many items to order under random demand. The simple model is extended in the following ways: endogenous demand in the additive and multiplicative manner, an objective function composed of the expected value and Conditional Value at Risk (CVaR) of profit, a multicriteria objective with price-dependent demand, a multiproduct extension under dependent and independent demands, and distributional robustness. In most cases, the optimal solution is provided. The thesis concludes with a numerical study that compares the results of two models after applying the Sample Average Approximation (SAA) method. This study is conducted on real data.
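For the basic single-product, risk-neutral newsvendor, the SAA step reduces to an empirical quantile: the optimal order quantity is the critical-fractile quantile of the sampled demand. A minimal sketch under an assumed normal demand model; the prices, costs, and distribution below are illustrative, not taken from the thesis, and salvage value is assumed to be zero.

```python
import numpy as np

def saa_newsvendor(demand_samples, price, cost):
    """Sample Average Approximation of the classic newsvendor problem:
    the SAA-optimal order quantity is the critical-ratio quantile of
    the empirical demand distribution."""
    critical_ratio = (price - cost) / price  # underage vs. total cost
    return float(np.quantile(demand_samples, critical_ratio))

rng = np.random.default_rng(0)
demand = rng.normal(loc=100.0, scale=20.0, size=100_000)  # assumed demand
q = saa_newsvendor(demand, price=10.0, cost=4.0)
# Analytic optimum for N(100, 20) at ratio 0.6 is about 105.07,
# and the SAA solution converges to it as the sample grows.
```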
60

Investigating the applicability of execution tracing techniques for root causing randomness-related flaky tests in Python / En undersökning av exekveringsspårning och dess tillämplighet för att orsaksbestämma skakiga tester i Python relaterade till slumpmässighet

Erik, Norrestam Held January 2021 (has links)
Regression testing is an essential part of developing and maintaining software. It helps verify that changes to the software have not introduced any new bugs, and that the functionality still works as intended. However, for this verification to be valid, the executed tests must be assumed to be deterministic, i.e. produce the same output under the same circumstances. Unfortunately, this is not always the case. A test that exhibits non-deterministic behavior is said to be flaky. Flaky tests can severely inhibit the benefits of regression testing, as developers must figure out whether a failing test is due to a bug in the system under test (SUT) or test flakiness. Moreover, the non-deterministic nature of flaky tests poses several problems. Not only are the failures difficult to reproduce and debug, but developers are more likely to ignore the outcome of flaky tests, potentially leading to overlooked bugs in the SUT. The aim of this thesis was to investigate the applicability of execution tracing techniques as a means of providing root cause analysis for flaky tests in the randomness and network categories. This involved reproducing and studying flakiness, as well as implementing and evaluating a prototype with the ability to analyze runtime behavior in flaky tests. To gain a better understanding of reproducibility and common traits among flaky tests in the selected categories, a pre-study was conducted. Based on the outcome of the pre-study and findings in related literature, the network category was dropped entirely, and two techniques were chosen to be implemented. The implementation process resulted in the FlakyPy tool, a plugin for pytest that provides root cause analysis aimed at randomness flakiness. When run against a dataset of 22 flaky tests, the tool was able to identify potential root causes in 15 of these. This serves as an indication that execution tracing has the potential of detecting possible root causes in flaky randomness tests in Python. 
However, more research is needed to evaluate how developers perceive the usefulness of such tools.
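A crude illustration of the execution-tracing idea (my own minimal sketch, not FlakyPy's implementation): install a global trace function with `sys.settrace` and record any call that enters a frame belonging to the `random` module. Note that C-level entry points such as `random.random()` create no Python frame and would be missed; the example therefore uses `random.randint()`, which is implemented in Python.

```python
import random
import sys

def trace_random_calls(test_fn):
    """Run test_fn and record Python-level calls into the `random` module,
    a crude marker that the test depends on a source of randomness."""
    hits = []

    def tracer(frame, event, arg):
        if event == "call" and frame.f_globals.get("__name__") == "random":
            hits.append(frame.f_code.co_name)
        return None  # no per-line tracing needed

    sys.settrace(tracer)
    try:
        test_fn()
    finally:
        sys.settrace(None)  # always restore normal execution
    return hits

def randomness_dependent_test():
    # A test whose outcome depends on a random draw is a flakiness candidate.
    value = random.randint(0, 1)
    assert value in (0, 1)

hits = trace_random_calls(randomness_dependent_test)
```

A root-cause report in the spirit of the thesis could then point the developer at the recorded call sites as the likely origin of the non-determinism.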
