371 |
Formalisation mathématique, univers compositionnels et interprétation analytique chez Milton Babbitt et Iannis Xenakis / Mathematical Formalization, Compositional Universes and Analytical Interpretation in Works by Milton Babbitt and Iannis Xenakis. Schaub, Stéphan, 18 December 2009.
This thesis explores the links between mathematical formalization, its implications in compositional practices, and its repercussions on analytical interpretation. It is based on the study of Milton Babbitt’s Semi-Simple Variations for piano (1956), of Iannis Xenakis’ Nomos Alpha for violoncello (1965-1966), and of a selection of theoretical texts published by the composers around these two works. The argumentation is divided into three parts. The first studies the theoretical texts with the aim of extracting from them the components that are formalized (or result from formalization), as well as the indices pointing towards their implications in the compositional practices. These are then put into sharper view in the second part, through the reconstruction of the traces left by the formalized components in the musical texts (“theoretical models”). This reconstruction is first conducted on the two main works before being extended to others. The theoretical models of two of them had until now remained unpublished: Three Compositions for Piano by Babbitt (1947-1948), and Akrata for 16 winds (1964-1965) by Xenakis. The third part submits Semi-Simple Variations and Nomos Alpha to two analyses: the first examines the theoretical models, directly linked to the formal universes; the second examines the musical surfaces and the way they articulate and/or contradict the elements observed in the models. While underlining the fundamental, though very different, role played by mathematical formalization in Babbitt’s and Xenakis’ compositional universes, the distance that separates the theoretical models from the musical surfaces leads one to consider the contribution of the former to the analytical enterprise as essentially heuristic. As traces of the composer’s thinking, the theoretical models may nevertheless suggest new approaches and deserve to be fully integrated into future research.
|
372 |
Measurable Selection Theorems for Partitions of Polish Spaces into Gδ Equivalence Classes. Simrin, Harry S., 05 1900.
Let X be a Polish space and Q a measurable partition of X into Gδ equivalence classes. In 1978, S. M. Srivastava proved the existence of a Borel cross section for Q. He asked whether more can be concluded in case each equivalence class is uncountable. This question is answered here in the affirmative. The main result is a proof of the existence of a Castaing representation for Q.
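For orientation, the two notions at stake admit short standard formulations; the following is a paraphrase of the usual definitions (writing [x] for the Q-class of x), not a quotation from the thesis.

```latex
% Standard definitions, paraphrased for orientation; not quoted from the thesis.
\[
  S \subseteq X \ \text{Borel}, \qquad |S \cap [x]| = 1 \ \text{for every } x \in X
  \qquad \text{(Borel cross section for } Q\text{)}
\]
\[
  f_n : X \to X \ \text{Borel}, \quad f_n(x) \in [x], \quad
  \overline{\{\, f_n(x) : n \in \mathbb{N} \,\}} = [x] \ \text{for every } x \in X
  \qquad \text{(Castaing representation for } Q\text{)}
\]
```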
|
373 |
Algebraic and Topological Properties of Unitary Groups of II_1 Factors. Dowerk, Philip, 21 April 2015.
The thesis is concerned with group-theoretical properties of unitary groups, mainly of II_1 factors. The author gives a new and elementary proof of a result on extreme amenability, defines the bounded normal generation property and the invariant automatic continuity property, and proves these for various unitary groups of functional-analytic type.
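For reference, the bounded normal generation property is usually stated as follows; this is the standard formulation as commonly given in the literature, not a quotation from the thesis, with Cl(g) denoting the conjugacy class of g.

```latex
% Bounded normal generation (BNG), standard formulation:
\[
  \forall\, g \in G \setminus \{1\} \ \ \exists\, n = n(g) \in \mathbb{N} :\qquad
  G \;=\; \bigcup_{k \le n} \bigl(\operatorname{Cl}(g) \cup \operatorname{Cl}(g^{-1})\bigr)^{k}
\]
% i.e. every element of G is a product of at most n(g) conjugates of g or of g^{-1}.
```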
|
374 |
Contributions for Handling Big Data Heterogeneity: Using Intuitionistic Fuzzy Set Theory and Similarity Measures for Classifying Heterogeneous Data. Ali, Najat, January 2019.
A huge amount of data is generated daily by digital technologies such as social media, web logs, traffic sensors, on-line transactions, tracking data, videos, and so on. This has led to the archiving and storage of larger and larger datasets, many of which are multi-modal, or contain different types of data, which contribute to the problem that is now known as “Big Data”. In the area of Big Data, volume, variety and velocity problems remain difficult to solve. The work presented in this thesis focuses on the variety aspect of Big Data. For example, data can come in various and mixed formats for the same feature (attribute) or different features, and can be identified mainly by one of the following data types: real-valued, crisp and linguistic values. The increasing variety and ambiguity of such data are particularly challenging to process and to build accurate machine learning models from. Therefore, data heterogeneity requires new methods of analysis and modelling techniques to enable useful information extraction and the modelling of achievable tasks. In this thesis, new approaches are proposed for handling heterogeneous Big Data. These include two techniques for filtering heterogeneous data objects: the Two-Dimensional Similarity Space (2DSS) for data described by numeric and categorical features, and the Three-Dimensional Similarity Space (3DSS) for real-valued, crisp and linguistic data. Both filtering techniques are used in this research to reduce the noise from the initial dataset and make the dataset more homogeneous. Furthermore, a new similarity measure based on intuitionistic fuzzy set theory is proposed. The proposed measure is used to handle the heterogeneity and ambiguity within crisp and linguistic data. In addition, new combined similarity models are proposed which allow for a comparison between heterogeneous data objects represented by a combination of crisp and linguistic values. Diverse examples are used to illustrate and discuss the efficiency of the proposed similarity models. The thesis also presents a modification of the k-Nearest Neighbour classifier, called k-Nearest Neighbour Weighted Average (k-NNWA), to classify heterogeneous datasets described by real-valued, crisp and linguistic data. Finally, the thesis also introduces a novel classification model, called FCCM (Filter Combined Classification Model), for heterogeneous data classification. The proposed model combines the advantages of the 3DSS and the k-NNWA classifier and outperforms the latter algorithm. All the proposed models and techniques have been applied to weather datasets and evaluated using accuracy, F-score and ROC area measures. The experiments revealed that the proposed filtering techniques are an efficient approach for removing noise from heterogeneous data and improving the performance of classification models. Moreover, the experiments showed that the proposed similarity measure for intuitionistic fuzzy data is capable of handling the fuzziness of heterogeneous data, and that intuitionistic fuzzy set theory offers some promise in solving some Big Data problems by handling the uncertainties and the heterogeneity of the data.
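To make the flavour of such a similarity measure concrete, here is a minimal Python sketch of one standard distance-based similarity between intuitionistic fuzzy sets, built from membership/non-membership pairs. It is an illustrative textbook measure, not the specific measure, the 2DSS/3DSS filters, or the k-NNWA classifier proposed in the thesis, and the encoding of crisp and linguistic values as (mu, nu) pairs in the example is an assumption.

```python
from typing import List, Tuple

# An intuitionistic fuzzy set (IFS) over n features is a list of pairs
# (mu_i, nu_i) with 0 <= mu_i + nu_i <= 1; the slack pi_i = 1 - mu_i - nu_i
# is the hesitation margin.
IFS = List[Tuple[float, float]]

def ifs_similarity(a: IFS, b: IFS) -> float:
    """Normalised-Hamming-distance similarity between two IFSs of equal length."""
    assert len(a) == len(b) and len(a) > 0
    dist = sum(abs(ma - mb) + abs(na - nb) for (ma, na), (mb, nb) in zip(a, b))
    return 1.0 - dist / (2 * len(a))

# Hypothetical weather records mixing value types: a crisp "raining: yes"
# becomes (1.0, 0.0); a linguistic "fairly warm" might be encoded as
# (0.7, 0.2), leaving hesitation 0.1.  The encodings are illustrative only.
record_a: IFS = [(1.0, 0.0), (0.7, 0.2), (0.4, 0.5)]
record_b: IFS = [(1.0, 0.0), (0.6, 0.3), (0.5, 0.4)]

if __name__ == "__main__":
    print(f"similarity = {ifs_similarity(record_a, record_b):.3f}")
```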
|
375 |
Foundations of Memory Capacity in Models of Neural Cognition. Chowdhury, Chandradeep, 01 December 2023.
A central problem in neuroscience is to understand how memories are formed as a result of the activities of neurons. Valiant’s neuroidal model attempted to address this question by modeling the brain as a random graph and memories as subgraphs within that graph. However, the question of memory capacity within that model has not been explored: how many memories can the brain hold? Valiant introduced the concept of interference between memories as the defining factor for capacity; excessive interference signals that the model has reached capacity. Since then, exploration of capacity has been limited, but recent investigations have delved into the capacity of the Assembly Calculus, a derivative of Valiant's neuroidal model. In this paper, we provide rigorous definitions for capacity and interference and present theoretical formulations for the memory capacity within a finite set, where subsets represent memories. We propose that these results can be adapted to suit both the neuroidal model and the Assembly Calculus. Furthermore, we substantiate our claims by providing simulations that validate the theoretical findings. Our study aims to contribute essential insights into the understanding of memory capacity in complex cognitive models, offering potential ideas for applications and extensions to contemporary models of cognition.
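As a toy illustration of the set-based view sketched above (memories as subsets of a finite set, capacity limited by interference), the following Python snippet stores random fixed-size subsets until two of them overlap too much. The sizes and the simple pairwise-overlap criterion are illustrative assumptions, not the rigorous definitions of capacity and interference developed in the thesis.

```python
import random

def store_until_interference(n_neurons: int, memory_size: int,
                             overlap_threshold: int, trials: int = 10,
                             seed: int = 0) -> float:
    """Average number of random memories (subsets) stored before a new memory
    overlaps an existing one in more than `overlap_threshold` elements.

    Illustrative only: the thesis gives rigorous definitions of capacity and
    interference; this toy uses a simple pairwise-overlap criterion.
    """
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        memories = []
        while True:
            new = set(rng.sample(range(n_neurons), memory_size))
            if any(len(new & m) > overlap_threshold for m in memories):
                break
            memories.append(new)
        counts.append(len(memories))
    return sum(counts) / len(counts)

if __name__ == "__main__":
    # Hypothetical numbers: 1,000 neurons, memories of 50 neurons each,
    # interference declared when two memories share more than 10 neurons.
    print(store_until_interference(1_000, 50, 10))
```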
|
376 |
How to Approach Superdeterminism. Lemmini, Nadil, January 2024.
Quantum mechanics stands unmatched in its experimental success. However, significant gaps remain in the quantum description of nature, notably the absence of a satisfactory integration of gravity and an unresolved measurement problem. This thesis investigates a potential approach to resolving these issues known as superdeterminism. A superdeterministic theory is one which violates an assumption called Statistical Independence. Although this approach has historically been readily dismissed as harmful to science, superdeterminism has recently gained traction as researchers address these criticisms and explore its implications more deeply. This thesis aims to provide a unified resource on superdeterminism, compiling progress, identifying current gaps, and suggesting future research directions. We establish criteria for evaluating superdeterministic theories and apply these to existing models. Our analysis focuses on two main approaches: the Donadi-Hossenfelder path integral approach and Palmer's Invariant Set Theory. The path integral approach appears promising, particularly if a suitable measure for the "quantumness" of states can be developed. Invariant Set Theory provides a unique, geometric framework but is still too early in its development to show clear potential. Our findings underscore the early stage of superdeterminism research and the need for further theoretical development and empirical testing. By providing a structured framework for future work, this thesis seeks to advance the understanding and application of superdeterminism in addressing the foundational issues in quantum mechanics.
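For readers unfamiliar with the term, the Statistical Independence assumption that a superdeterministic theory gives up is standardly written as a condition relating the hidden variables λ of a Bell-type experiment to the measurement settings a and b (a textbook formulation, not specific to this thesis):

```latex
\[
  \rho(\lambda \mid a, b) \;=\; \rho(\lambda)
\]
% A superdeterministic model instead allows \rho(\lambda \mid a, b) \neq \rho(\lambda),
% so the hidden variables are correlated with the detector settings and the
% usual derivation of Bell's inequality no longer goes through.
```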
|
377 |
Neural network ensembles. De Jongh, Albert, 04 1900.
Thesis (MSc)--Stellenbosch University, 2004. / ENGLISH ABSTRACT: It is possible to improve on the accuracy of a single neural network by using an ensemble of diverse and accurate networks. This thesis explores diversity in ensembles and looks at the underlying theory and mechanisms employed to generate and combine ensemble members. Bagging and boosting are studied in detail and I explain their success in terms of well-known theoretical instruments. An empirical evaluation of their performance is conducted and I compare them to a single classifier and to each other in terms of accuracy and diversity.
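To make the bagging mechanism concrete, here is a minimal sketch of bootstrap aggregation with majority voting. Decision trees stand in for the neural-network ensemble members purely for brevity, and the synthetic dataset is illustrative; this is not the experimental setup used in the thesis.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def bagging_fit_predict(X_train, y_train, X_test, n_members=25, seed=0):
    """Train an ensemble on bootstrap resamples and combine by majority vote."""
    rng = np.random.default_rng(seed)
    n = len(X_train)
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, n, size=n)  # bootstrap sample, drawn with replacement
        tree = DecisionTreeClassifier(random_state=int(rng.integers(1_000_000)))
        tree.fit(X_train[idx], y_train[idx])
        members.append(tree)
    votes = np.stack([m.predict(X_test) for m in members])   # shape (n_members, n_test)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

if __name__ == "__main__":
    X, y = make_classification(n_samples=600, n_features=20, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
    single = DecisionTreeClassifier(random_state=1).fit(X_tr, y_tr)
    print("single tree accuracy:   ", (single.predict(X_te) == y_te).mean())
    print("bagged ensemble accuracy:", (bagging_fit_predict(X_tr, y_tr, X_te) == y_te).mean())
```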
|
378 |
The identity, application and legacy of Paul Hindemith's theory of music. Desbruslais, Simon Stephen, January 2013.
This thesis investigates the relationship between Hindemith’s music theory and his evolving compositional practice. It focuses on the first volume of his Unterweisung im Tonsatz (1937), both evaluating the very identity of the treatise and analysing how it may be applied to free composition. Above all, this work highlights the increased use of quartal pitch collections found in Hindemith’s Unterweisung-based compositions. Archival documents from the universities of Yale, Berlin and Buffalo, and from the Frankfurt Hindemith Institute, augment this process and are used to revise our understanding of how Hindemith’s music theory originated and how it relates to his practice and teaching. The dissertation begins by exploring the theoretical and intellectual climate of the Rundfunkversuchsstelle at the Berlin Hochschule für Musik within a critical commentary on Hindemith’s music theory. It then develops a new theoretical perspective of quartal pitch space and atonal prolongation to provide an analytical toolkit. The list of compositions in the Unterweisung appendix, which Hindemith felt most successfully demonstrated his theory in practice, structures the next three chapters. The Sonata for Solo Viola op. 25/1, a pre-Unterweisung composition, is followed by the Ludus Tonalis, published soon after the treatise and investigated here for its explicit theoretical connections. The third analytical chapter focuses on the Das Marienleben cycle as a work written before the Unterweisung and subsequently revised with theoretical concerns in mind. The final two chapters investigate the prominent decline in popularity experienced by Hindemith, regarding both his theory and his compositions, from the 1950s. This is epitomised by a number of strongly worded polemics published in The Music Review, much of which, it may be argued, is inaccurate or unduly critical. The thesis ends by constructing a Hindemith legacy based on a selection of archival documents and scores, together with a selection of trends in composition and music theory.
|
379 |
An analysis of Priaulx Rainier’s Barbaric Dance Suite for piano. Kruger, Esthea, 03 1900.
Thesis (MMus (Music))--University of Stellenbosch, 2009. / Priaulx Rainier (1903-1986) was a South African-born composer whose highly original
compositional style attracted great attention during her lifetime. She spent most of her life in
England, but was inspired by the images and recollections of her youth in Africa. Despite the
critical acclaim she received, little research has been done about her, both in South Africa and
abroad. Additionally, existing sources are mostly not analytical, but rather provide an overview
of her life or general aspects of her style. Although some conclusions
have been drawn about her compositional style, they are not thoroughly substantiated by
concrete analytical evidence. Also, the focus is mostly on her prominent rhythmic use (often
linked by authors to the “African” element of her idiom), with an evident disregard of the
other aspects of style, most notably with regard to pitch coherence.
This research attempts to correct this unbalanced discourse by analysing one of her few solo
piano works, the Barbaric Dance Suite (composed in 1949), and pointing out significant pitch
relations, similarities and contrasts. The rationale for selecting this specific work originated
from Rainier’s own pronouncement that “The Suite is a key to all my later music, for in the
three DANCES, their structural embryo is, on a small scale, the basis for most of the later
works.” Although the scope of the research did not allow for a comparative analysis, it is
strongly believed that the conclusions reached in this study could also be applicable to many
of Rainier’s other works, especially of the early period.
The study consists of an introduction in which the Barbaric Dance Suite is contextualised,
followed by the main body of the thesis that consists of a detailed analysis of each of the three
movements. The foremost method of analysis used is set theory analysis, which could be
briefly described as a method whereby (particularly atonal) music is segmented and
categorised into pitch-class sets. As set theory focuses exclusively on the dimension of pitch,
traditional methods of analysis are employed to examine the other musical parameters. In the
conclusion, the analytical results are contextualised with regard to existing pronouncements
on Rainier’s oeuvre. The study also comments on the applicability of set theory as analytical
system in Rainier’s music. The many complex pitch relations that were discovered by the
intensive analysis of pitch content have given enough evidence to conclude that Rainier’s use
of sonorities has been unjustly neglected in the discourse of this work and perhaps also in her
musical style as a whole. It is hoped that further detailed analysis of her use of sonorities in
other works could lead authorities to revise the insistent pronouncements on her rhythmic use
in favour of a more balanced assessment of all aspects of her compositional style.
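As a pointer to the kind of categorisation pitch-class set analysis performs, the sketch below computes the interval-class vector of a pitch-class set and tests two sets for transpositional or inversional equivalence. It illustrates the general method only; the example sets are hypothetical and are not segments from the Barbaric Dance Suite.

```python
from itertools import combinations

def interval_class_vector(pcs):
    """Count interval classes 1-6 among all pairs of a pitch-class set."""
    pcs = sorted({p % 12 for p in pcs})
    vec = [0] * 6
    for a, b in combinations(pcs, 2):
        ic = (b - a) % 12
        vec[min(ic, 12 - ic) - 1] += 1
    return vec

def equivalent(a, b):
    """True if two pitch-class sets are related by transposition (Tn) or
    by inversion followed by transposition (TnI)."""
    sa, sb = {p % 12 for p in a}, {p % 12 for p in b}
    for t in range(12):
        if {(p + t) % 12 for p in sa} == sb or {(t - p) % 12 for p in sa} == sb:
            return True
    return False

if __name__ == "__main__":
    # Hypothetical segments, not taken from Rainier's score.
    print(interval_class_vector([0, 1, 6]))    # [1, 0, 0, 0, 1, 1]
    print(equivalent([0, 1, 6], [5, 6, 11]))   # True: T5 of {0, 1, 6}
```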
|
380 |
Fuzzy Set Theory Applied to Make Medical Prognoses for Cancer Patients. Zettervall, Hang, January 2014.
As we all know, classical set theory has a deep-rooted influence on traditional mathematics. According to two-valued logic, an element either belongs to a set or does not. In the former case, the element’s membership degree is assigned the value one, whereas in the latter case it takes the value zero. In other words, there is no notion of imprecision or fuzziness in two-valued logic. With the rapid development of science and technology, more and more scientists have gradually come to realize the vital importance of multi-valued logic. Thus, in 1965, Professor Lotfi A. Zadeh of the University of California, Berkeley put forward the concept of a fuzzy set. In less than 60 years, people have become more and more familiar with fuzzy set theory, and the theory of fuzzy sets has come to be favoured in many fields of application. The study aims to apply some classical and extended methods of fuzzy set theory to life-expectancy and treatment prognoses for cancer patients. The research is based on real-life problems encountered by physicians in clinical work. From the introductory elements of fuzzy set theory to the medical applications, a collection of detailed analyses of fuzzy set theory and its extensions is presented in the thesis. Concretely speaking, the Mamdani fuzzy control systems and the Sugeno controller have been applied to predict the survival length of gastric cancer patients. In order to spare already-examined gastric cancer patients the unnecessary suffering of a surgical operation, fuzzy c-means clustering analysis has been adopted to investigate the possibilities for operation versus non-operation. Furthermore, a point-set approximation approach has been adopted to estimate the possibility of operation versus non-operation for an arbitrary gastric cancer patient. In addition, in the domain of multi-expert decision-making, the probabilistic model, the model of 2-tuple linguistic representations and hesitant fuzzy linguistic term sets (HFLTS) have been utilized to select the most consensual treatment scheme(s) for two separate prostate cancer patients. The obtained results have supplied the physicians with reliable and helpful information. Therefore, the research work can be seen as a mathematical complement to the physicians’ queries.
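As background to the clustering step mentioned above, here is a minimal NumPy sketch of the standard fuzzy c-means updates (Bezdek's algorithm), which assigns each object a graded membership in every cluster. The two-cluster reading as operation versus non-operation and the toy data are illustrative assumptions, not the clinical model built in the thesis.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means: returns (centres, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to one
    for _ in range(iters):
        W = U ** m                                # fuzzified memberships
        centres = (W.T @ X) / W.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = dist ** (-2.0 / (m - 1.0))          # standard membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centres, U

if __name__ == "__main__":
    # Hypothetical two-feature records; purely synthetic numbers for illustration.
    X = np.array([[1.0, 2.0], [1.2, 1.8], [0.9, 2.2],
                  [6.0, 7.0], [6.3, 6.8], [5.8, 7.2]])
    centres, U = fuzzy_c_means(X, c=2)
    print("cluster centres:\n", centres)
    print("membership degrees:\n", U.round(2))
```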
|