141 |
Pop art tendencies in self-managed socialism : pop reactions and counter-cultural pop in Yugoslavia in 1960s and 1970s. Dzuverovic, Lina. January 2017 (has links)
This thesis explores forms of Pop Art on the territory of the former Yugoslavia in the 1960s and 1970s, seeking to identify its local variants. Yugoslavia, a single-party state built on the legacy of the anti-fascist Partisan struggle, principles of solidarity, egalitarianism, self-management and a strong sense of internationalism due to its founding role in the Non-Aligned Movement, was, at the same time, a country immersed in what has been termed ‘utopian consumerism’. The thesis examines how Yugoslav artists during this period dealt with the burgeoning consumer society and media boom, kitsch and the Westernization of Yugoslav culture, phenomena which were ideologically at odds with the country’s own socialist principles. Starting from an analysis of the role of the artist in the post-war Yugoslav system of self-management, the thesis proposes that Pop in Yugoslavia can be read as a critical site of articulation and negotiation of that role. Yugoslavia’s founding principles, formed as a legacy of the People’s Liberation Struggle (1941–1945), were based upon self-management and the introduction of social property, with art being a democratizing force with a central emancipatory role in the building of the new socialist state. But socialist modernism gradually relegated culture to a more illustrative role, as a form of ‘soft power’ for the Socialist Federal Republic of Yugoslavia. The thesis proposes a reading of artists’ diverse engagements with popular culture and materials as varied expressions of resistance to the severing of links with Yugoslavia’s founding principles. My original contribution to knowledge lies in the identification of two strands of Pop in the country: ‘Yugoslav Pop Reactions’ and ‘Yugoslav Countercultural Pop’, each of which turned to popular culture and cheap everyday materials as an alternative channel through which to respond to socialist modernism. My claim is that the two positions represent diametrically opposed responses to the disenchantment with socialist modernism and artists’ roles in society, both using the language of Pop Art but representing two different conceptual positions. The thesis is structured around three core questions. Firstly, it asks whether it is possible to retrospectively apply the category of Pop Art to artworks which never originally claimed this term. Secondly, it examines ways in which Pop tendencies altered the position of Yugoslav female artists, who, marginalised in a heavily male-dominated environment, looked to Pop as an enabling force, allowing new working methods and ‘giving licence’ to new types of practices. The third question is concerned with the relationship between power, politics and Pop Art in Yugoslavia, asking to what extent Yugoslav Pop was a form of political practice, and to what extent it was a local adaptation of international currents and themes. This thesis is associated with Tate’s multi-year research into ‘global pop’, which culminated in the exhibition ‘The World Goes Pop’ (September 2015 – January 2016, Tate Modern), through a Collaborative Doctoral Award (AHRC). This involved an advisory role in the exhibition research on the territory of the former Yugoslavia, identifying artists and artworks for potential inclusion in the exhibition. The methodology of the thesis was in part shaped by this context, beginning with close studies of artworks, their critical reception, and the study of their context: the sites of production and exhibition in the country at the time.
Whilst both local and international literature on Yugoslav art history and global Pop Art, as well as on Yugoslav material culture and the political context, has been important, the core research involved oral histories and visits to artists’ studios, museum collections, depots and archives in search of original artworks. The thesis draws on approximately twenty interviews with artists, curators, art historians and other art workers who were active in the 1960s and 1970s, combined with the above-mentioned scholarship.
|
142 |
Heavily doped bulk unipolar structures. Mostafa, Alaa El-Din Sabet. January 1993 (has links)
Transport properties of bulk unipolar (barrier) devices are investigated in the steady-state mode. This has entailed the study of the characteristics of heavily doped silicon, which forms important regions of the multilayer bulk unipolar devices. The devices covered are Camel diodes, P-plane barrier diodes and open-base bipolar transistors. Two operating modes are distinguished: the punch-through mode and the non-punch-through (bipolar) mode. A combination of thermionic and diffusion mechanisms is used in the current-voltage analysis. Minority-carrier transport at the polysilicon-monosilicon interface is also studied in polysilicon-emitter bulk unipolar diodes, with the emphasis placed on the influence of heavy doping; the aim is the development of a useful predictive tool for the study of these structures, in which transistor action can be obtained through the mechanism of barrier-height modulation via minority-carrier injection. The validity of the analysis is evaluated by comparison with available experimental results. A new form of multigrain-barrier bulk unipolar diode structure has been proposed and analyzed using a carrier-trapping model at the grain boundaries of the polysilicon. Heavy doping effects and parameters are included in the developed analysis. As it stands, the present model fulfils the purpose of giving an insight into the physical mechanism of charge-carrier transport with heavy doping at a fundamental level and of providing a tool for examining the behaviour of alternative device configurations. However, heavy doping effects are revealed as being of profound importance in the determination of bulk unipolar device characteristics.
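For orientation only, the combined thermionic-emission and diffusion treatment referred to above is conventionally summarised in the standard Crowell-Sze textbook form (quoted here as general background on barrier-limited transport, not as an expression taken from this thesis, whose analysis of bulk unipolar devices will differ in detail):

\[ J \;=\; \frac{q N_C\, v_R}{1 + v_R / v_D}\, \exp\!\left(-\frac{q\phi_B}{kT}\right) \left[\exp\!\left(\frac{qV}{kT}\right) - 1\right], \]

where N_C is the conduction-band effective density of states, \phi_B the barrier height, v_R the effective recombination (thermionic) velocity at the barrier maximum and v_D the effective diffusion velocity across the barrier region. The purely thermionic limit, with q N_C v_R playing the role of A*T^2, is recovered when v_D \gg v_R, and the diffusion-limited case when v_R \gg v_D.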
|
143 |
A study of novel computing methods for solving large electromagnetic hazards problems. Jones, Christopher Charles Rawlinson. January 2002 (has links)
The aim of this work is to explore means to improve the speed of the computational electromagnetics (CEM) processing in use for aircraft design and certification work by a factor of 1000 or so. The investigation addresses particularly the set of problems described as electromagnetic hazards, comprising lightning, EMC and the illumination of an aircraft by external radio sources or HIRF (high intensity radiated fields). These are very much aspects of the engineering of the aircraft, where the requirement for accuracy of simulations is of the order of 6 dB, as build and test repeatability cannot achieve better than this. Computer simulations of these interactions at the outset of this work were often taking 10 days and more on the largest parallel computers then available in the UK (Cray T3D, 40 GFLOPS nominal peak). Such run times made any form of optimisation impossibly lengthy. While the future offered the certain prospect of more powerful computers, the simulations had to become more comprehensive in their representation of materials, features and the geometry of the object; the representation of wires and cables in particular had to improve radically; and turn-around times for analysis had to be improved for design assessment as well as to make design optimisation by trade-off studies feasible. All of these could easily consume all the advantage that the new computers would give. The investigation has centred around techniques that might be applied via alteration to the most widely used and usable numerical methods in CEM applied to the electromagnetic hazards, and to techniques that might be applied to the manner of their use. In one case, the investigation has explored a particular possibility for minimising the duration of computation and extrapolating the resulting data to the longest time-scales required. Future improvements in the capabilities of radiating boundary conditions to mimic the effect of an infinite boundary at close range will further improve the benefits already established in this work, but this is not yet realisable. However, it has been established that a combination of techniques, with some processes devised through this work, can and does deliver the performance improvement sought. It has further been shown that issues such as object resonance, which could have incurred significant error and distrust of computational results, can be satisfactorily overcome within the required accuracy. Four papers have been published arising from this work. Some of these techniques are now in use in routine analyses contributing to BAE SYSTEMS programmes. Plans are in place to incorporate all of the successful techniques and processes.
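The abstract does not name the numerical methods in question, so the following sketch rests on an assumption: explicit time-domain schemes of the FDTD type are the most widely used CEM methods for lightning and HIRF coupling work, and a minimal one-dimensional update loop of that kind, with a first-order radiating (Mur) boundary standing in for the "infinite boundary at close range" mentioned above, is given purely as an illustration. The grid sizes, source and parameter names are hypothetical.

# Minimal 1-D FDTD sketch (illustrative only, not code from the thesis);
# free-space Yee updates with an additive Gaussian source and first-order
# Mur radiating boundaries at both ends of the grid.
import numpy as np

c0 = 3.0e8                      # speed of light (m/s)
nz, nt = 400, 1000              # number of grid cells and time steps (hypothetical)
dz = 0.01                       # cell size (m)
dt = dz / (2.0 * c0)            # time step inside the Courant limit
S = c0 * dt / dz                # Courant number (0.5 here)
mur = (S - 1.0) / (S + 1.0)     # first-order Mur coefficient

ez = np.zeros(nz)               # electric field samples
hy = np.zeros(nz - 1)           # magnetic field samples on the staggered grid
ez1_old, ez2_old = 0.0, 0.0     # boundary-adjacent values from the previous step

for n in range(nt):
    hy += S * (ez[1:] - ez[:-1])                 # update H from the curl of E (normalised units)
    ez[1:-1] += S * (hy[1:] - hy[:-1])           # update E at interior nodes from the curl of H
    ez[20] += np.exp(-((n - 60) / 20.0) ** 2)    # additive Gaussian excitation
    ez[0] = ez1_old + mur * (ez[1] - ez[0])      # Mur radiating boundary, left end
    ez[-1] = ez2_old + mur * (ez[-2] - ez[-1])   # Mur radiating boundary, right end
    ez1_old, ez2_old = ez[1], ez[-2]

print("peak |Ez| after %d steps: %.3f" % (nt, np.abs(ez).max()))

In practice it is the three-dimensional geometry, material detail, thin-wire and cable models and the resonances of the airframe, rather than the update equations themselves, that drive the run times and accuracy concerns described above.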
|
144 |
Quality of life, biomarkers, and involvement of ghrelin in women with breast cancer. Al-Khawaja, Nasreen. January 2015 (has links)
Breast cancer (BC) is the most common and most lethal cancer among women worldwide. More than a million and a half women are diagnosed every year, with more than 600,000 deaths worldwide. It is estimated that 1 in every 7 women will develop breast cancer in their lifetime. It is a major public health concern with a high economic cost as well. BC is a multidimensional construct, and several dimensions of this construct have never been examined before in the United Arab Emirates (UAE). This study investigated major facets of quality of life (QOL) among women with BC in the UAE and compared them with an age-matched group of healthy women without any neoplastic background; it examined changes in serum biomarkers of women with BC, to detect the impact of the disease on these biomarkers at the onset of the disease before treatment started and again 12 months later following treatment for the cancer; and it explored the role of the hormone ghrelin in BC and depression at the tissue and serum levels. In order to examine QOL in all its dimensions among women with BC, an epidemiological case-control study was conducted, recruiting a sample of 300 women: 155 women with BC and 145 age-matched healthy women without any neoplastic background as a control group. This was carried out by using a series of standardized psychometric tools in addition to conducting a psychiatric diagnostic interview. Moreover, blood biomarker results were reviewed retrospectively for cases and controls at the beginning and then 12 months following treatment for BC. In relation to the histopathological characteristics and treatment modalities for BC, all pathology, medical and oncology data for the 155 women with BC were retrieved from the computer system and analyzed retrospectively. Finally, in relation to the ghrelin hormone, all mammary morphological types, normal, benign and malignant, were examined with immunohistochemistry (IHC) for the expression of ghrelin and its functioning receptor (GHS-R1a). Serum of the same women, whose mammary tissue sections were examined by IHC, was tested for ghrelin level to find out its link to BC and depression. This was carried out by enzyme-linked immunosorbent assay (ELISA). The results demonstrated that women with BC had poorer QOL than the control group. They had a poor view of their body image and sexuality, and the rate of physical disability was high. They also tended to suppress negative emotions to a great extent. Anxiety symptoms were also high. Rates of major depressive disorder and post-traumatic stress disorder were lower among women with BC than among healthy controls. Several risk factors turned out to be linked to BC. These included age, night-shift work, hypertension, diabetes mellitus, oral contraceptive pills, hormone replacement therapy and not breastfeeding. In terms of significant traumatic life events, the Arabic version of the CESC English scale was shown to have high validity and reliability among women with BC in the UAE. The results also showed that the levels of several serum haematological and biochemical markers seemed to be abnormal among women with BC compared to healthy controls. These included elevated levels of platelets, basophils, liver enzymes, lactate dehydrogenase and tumour serum markers. On the other hand, there were low levels of serum magnesium, C-reactive protein and creatinine.
Analysis of the histopathological characteristics indicated the aggressive biological nature of the disease and the late stage at which women presented to medical services for treatment. Clinically, women with BC received all treatment modalities for BC, with a high rate of mastectomy and axillary clearance. Regarding the ghrelin hormone and its relation to BC, the results showed that malignant mammary tissues had an exclusive and differential immune-reactivity to ghrelin, whereas its receptor, GHS-R1a, was immune-reactive in all mammary tissue morphological types. In addition, more metastasis to the lymph nodes was significantly correlated with more immune-reactivity to the ghrelin receptor. The results for gene expression of pro-ghrelin, ghrelin and its receptors were inconclusive. It is concluded that breast cancer is the most common cancer among women in the UAE. It affects women at an earlier age than their counterparts in the West. More attention should be allocated to the QOL and the unmet psychosocial needs of women with BC. This in turn would improve compliance with treatment and prognosis as well. It is also recommended that awareness campaigns and early screening should be applied for early detection of the disease, to prevent late presentation to medical services and other complications.
|
145 |
The integration of explanation-based learning and fuzzy control in the context of software assurance as applied to modular avionics. Timperley, Matthew. January 2015 (has links)
A Modular Power Management System (MPMS) is an energy management system intended for highly modular applications, able to adapt intelligently to changing hardware. There is a dearth of literature on Integrated Modular Avionics (IMA), and what exists has previously not addressed the implications for software operating within this architecture, namely the adaptation of control laws to changing hardware. This work proposes some approaches to address this issue. Control laws may require adaptation to overcome hardware degradation or system upgrades. There is also a growing interest in the ability to change the hardware configurations of UASs (Unmanned Aerial Systems) between missions, to better fit the characteristics of each one. Hardware changes in the aviation industry come with an additional caveat: in order for a software system to be used in aviation it must be certified as part of a platform. This certification process has no clear guidelines for adaptive systems. Adapting to a changing platform, as well as addressing the necessary certification effort, motivated the development of the MPMS. The aim of the work is twofold: firstly, to modify existing control strategies for new hardware, which is achieved with generalisation and transfer learning; and secondly, to reduce the workload involved in maintaining a safety argument for an adaptive controller. Three areas of work are used to demonstrate the satisfaction of this aim. Explanation-Based Learning (EBL) is proposed for the derivation of new control laws. The EBL domain theory embodies general control strategies, which are specialised to form fuzzy rules. A method for translating explanation structures into fuzzy rules is presented. The generation of specific rules from a general control strategy is one way to adapt to controlling a modular platform. A fuzzy controller executes the rules derived by EBL. This maintains fast rule execution as well as the separation of strategy and application. The ability of EBL to generate rules which are useful when executed by a fuzzy controller is demonstrated by an experiment: a domain theory is given for controlling throttle output, which is used to generate fuzzy rules, and these rules have a positive impact on energy consumption in simulated flight. EBL is proposed for rule derivation because it focuses on generalisation. Generalisations can apply knowledge from one situation, or one piece of hardware, to another, which can be preferable to re-deriving similar control laws. Furthermore, EBL can be augmented to include analogical reasoning when reaching an impasse. An algorithm which integrates analogy into EBL has been developed as part of this work. The inclusion of analogical reasoning facilitates transfer learning, which furthers the flexibility of the MPMS in adapting to new hardware. The adaptive capability of the MPMS is demonstrated by application to multiple simulated platforms. EBL produces explanation structures, and augmenting these with a safety-specific domain theory can produce skeletal safety cases. A technique to achieve this has been developed, and example structures are generated for previously derived fuzzy rules. Generating safety cases from explanation structures can form the basis for an adaptive safety argument.
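As a rough illustration of only the second of these stages, the execution of derived rules by a fuzzy controller, the sketch below evaluates a tiny Mamdani-style rule base for throttle output. The inputs, membership functions and rules are hypothetical and are not taken from the MPMS, the EBL domain theory or the thesis.

# Minimal fuzzy-rule evaluation sketch (hypothetical rules, not the MPMS rule base).
# Two crisp inputs (speed error, battery level) are fuzzified, a few rules fire,
# and the throttle output is defuzzified as a weighted average of rule consequents.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical fuzzy sets over the two inputs
error_neg = lambda e: tri(e, -10.0, -5.0, 0.0)    # flying faster than demanded
error_pos = lambda e: tri(e, 0.0, 5.0, 10.0)      # flying slower than demanded
batt_low  = lambda b: tri(b, 0.0, 20.0, 50.0)
batt_high = lambda b: tri(b, 40.0, 80.0, 100.0)

# Each rule: (firing strength as min over antecedent memberships, throttle consequent in %)
rules = [
    (lambda e, b: min(error_pos(e), batt_high(b)), 80.0),  # slow, plenty of charge -> open throttle
    (lambda e, b: min(error_pos(e), batt_low(b)),  50.0),  # slow, low charge -> moderate throttle
    (lambda e, b: error_neg(e),                    10.0),  # too fast -> cut throttle
]

def throttle(error, battery):
    """Weighted-average defuzzification over singleton consequents."""
    fired = [(strength(error, battery), out) for strength, out in rules]
    total = sum(s for s, _ in fired)
    return sum(s * out for s, out in fired) / total if total > 0.0 else 0.0

print(throttle(error=4.0, battery=90.0))   # dominated by the first rule -> high throttle
print(throttle(error=4.0, battery=15.0))   # dominated by the second rule -> moderate throttle

In the approach described above, the distinctive step sits upstream of this: the EBL domain theory supplies the general control strategy, and specialised rules of this kind are what it would emit for a particular hardware configuration.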
|
146 |
Mapping utopian art : alternative political imaginaries in new media art (2008-2015). Balaskas, Vasileios (Bill). January 2017 (has links)
This thesis investigates the proliferation of alternative political imaginaries in the Web-based art produced during the global financial crisis of 2008 and its aftermath (2008-2015), with a particular focus on the influence of communist utopianism. The thesis begins by exploring the continuing relevance of utopianism to Western political thought, including the historical context within which the financial crisis of 2008 occurred. This context has been defined by the new political, social and cultural milieu produced by the development of Data Capitalism, the dominant economic paradigm of the last two decades. In parallel, the thesis identifies the “organic” connections between leftist utopian thought and networked technologies, in order to claim that the events of 2008 functioned as a catalyst for their reactivation and expansion. Following this analysis, the thesis focuses on how politically engaged artists have reacted to the global financial crisis through the use of the World Wide Web. More specifically, the thesis categorises a wide range of artworks, institutional and non-institutional initiatives, as well as theoretical texts that have either been written by artists or have inspired them. The result of this exercise is a mapping of post-crisis Web-based art, which is grounded in the technocultural tools employed by artists as well as in the main concepts and ideals that they have aimed at materialising through the use of such tools. Furthermore, the thesis examines the interests of Data Capitalists in art and the Internet, and the kinds of restrictions and obstacles that they have imposed on the political use of the Web in order to safeguard those interests. Finally, the thesis produces an overall evaluation of the previously analysed cultural products by taking into account both the objectives of their creators and the external and internal limitations that ultimately shape their character. Accordingly, the thesis locates the examined works within the ideological spectrum of Marxist and post-Marxist thought in order to formulate a series of proposals about the future of politically engaged Web-based art and the ideological potentialities of networked communication at large.
|
147 |
Tracing loss, touching absence. Rocha Watt, Dionea. January 2017 (has links)
This research considers an artist’s encounter with works of art that carry or evoke the affective traces of an experience of loss. Examining images, photographs and sculptural objects and installations that inscribe and in turn expose absence in presence, this research through writing as a practice simultaneously investigates and performs the work as a response to loss. The thesis proposes that the work of art evokes loss by materialising absence. The work of art, like the work of mourning, works by inscribing a trace of the affective experience – the absence of the presence of the other. It is through the affective materiality of the work of art that we come to sense loss; when confronted with, and wounded by, the inscription of absence and its powerful relation to time. Drawing on psychoanalytic theory, the study shows how loss can silence but also move us to create a new language when existing forms of representation fail to signify. Shifting between asignification and signification, the new poetic language carries an imprint of the body; it reconnects to affects to inscribe loss. In the languages of writing, photography and sculpture, I suggest, art attempts to give shape to what cannot be said, to what cannot be shown, to what resists representation. Through close readings of works by Felix Gonzalez-Torres and Louise Bourgeois, the thesis suggests that by resisting representation these artists create works in which textile materials indicate a fundamental encounter with a material sign that gives rise to affects. I analyse works in which fabric is infused with the trace of an absent other. The analysis of contemporary works rubs against the narratives of the origins of art in the ‘Corinthian Maid’ and in the history of prehistoric handprints on cave walls, both of which reveal the gesture of inscribing a presence that anticipates absence. The study draws on philosophy to consider that what is inscribed is not only the absence of a presence but existence; what is inscribed is the vestige or trace of a ‘passing through the world’. The research is generated by a transformative encounter with loss and with art that invites yet resists interpretation; an affective encounter through which what is other can touch, and what touches can be thought. Art, I suggest (after Deleuze), can move us to recover the creative potency of thought in order to inscribe the singularity of the encounter. To write through loss is to write what is impossible to represent and yet insists on being written.
|
148 |
Models of fault-tolerant quantum computation. Dawson, Christopher Malcolm. Unknown Date (has links)
This thesis is concerned with certain theoretical problems that arise naturally in the context of fault-tolerant quantum computation. Fault-tolerance can be defined as the art of building reliable devices from unreliable components, and is of particular importance for quantum computers that aim to precisely control the dynamics of extremely sensitive quantum systems. A model of quantum computation is a specification of the basic building blocks by which a quantum computation is implemented. The best known model is the quantum circuit model, where computations are implemented by means of unitary quantum gates that are applied to two-level quantum systems known as qubits. In a physical implementation of a quantum circuit, the gates and qubits will inevitably be affected by noise. Fault-tolerant quantum circuits are designed to be resilient against the effects of this noise, provided that it is not too strong. Fault-tolerance in the quantum circuit model is well developed thanks to the theory of quantum error-correcting codes. These codes allow for the correction of small numbers of errors introduced by a variety of noise processes. In a fault-tolerant quantum circuit, qubits are replaced with encoded qubits, and quantum gates with encoded gates that are immediately followed by special quantum circuits for error correction. Provided the rate at which errors occur is below a constant threshold value, the accumulation of errors can be checked so that the correct output of the computation can be determined. The threshold acts both as a measure of how good the design of a quantum circuit is, and as a target for experimenters aiming to implement quantum circuits. Much current research in quantum computation is aimed at designing quantum circuits that increase the noise threshold, hopefully to the point where it comes within the reach of experimenters. A fault-tolerant encoded quantum gate must limit the propagation of errors so that the code's corrective capabilities are not overwhelmed. It is not so easy to design encoded gates that satisfy this property, and to date only a handful of such gates are known. In a fault-tolerant quantum circuit, all quantum gates must be decomposed or compiled in terms of those that may be implemented fault-tolerantly. In the first part of this thesis, we present two results that may be applied to this problem of gate compilation. The first is a generic method based on the Solovay-Kitaev theorem that may be applied to all quantum gates, but is most effective for those that act on single qubits. We present the Solovay-Kitaev theorem in its simplest known form as an algorithm, together with novel constructions that can be used to implement it. Following this we give two specialized methods for two-qubit gate decomposition, based on the Cartan decomposition of the Lie group SU(4). The cluster-state model of computation is an alternative to the quantum circuit model, and makes use of quantum measurements and highly entangled cluster states to implement a quantum computation. The fault-tolerant techniques developed for quantum circuits are not immediately applicable in this model, so in order for it to be a realistic candidate for performing computations we prove that such techniques are possible. In the second part of the thesis we prove that constant fault-tolerance thresholds may be achieved in the cluster-state model, and in particular in an adaptation to an optical implementation.
Following this we design a complete error correction scheme for optical cluster-state computation, and numerically determine the threshold of this model in the face of the dominant noise models likely to affect such an implementation.
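For orientation, the two scaling results that frame this abstract are usually quoted in the following standard forms (textbook statements, not expressions specific to the thesis). The Solovay-Kitaev construction improves the accuracy of a single-qubit gate approximation recursively,

\[ \varepsilon_k \le c\, \varepsilon_{k-1}^{3/2}, \qquad \ell_k = 5^{k}\, \ell_0, \]

for a gate-set-dependent constant c, so that accuracy \varepsilon is reached with a sequence of roughly O(\log^{\ln 5 / \ln(3/2)}(1/\varepsilon)) \approx O(\log^{3.97}(1/\varepsilon)) library gates. The threshold behaviour of a concatenated distance-3 code is conventionally summarised as

\[ p_k \;\approx\; p_{\mathrm{th}} \left( \frac{p}{p_{\mathrm{th}}} \right)^{2^{k}}, \]

so the logical error rate p_k at concatenation level k falls doubly exponentially once the physical error rate p lies below the threshold p_{\mathrm{th}}.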
|
150 |
Cost effectiveness of mined land rehabilitation of the strip coal mines of Queensland. Golding, B. Unknown Date (has links)
No description available.
|