82 |
Estudo das covariâncias envolvidas no método k0 de análise por ativação neutrônica / Study of covariances involved in the k0 method of neutron activation analysis
CARDOSO, VANDERLEI 09 October 2014 (has links)
Made available in DSpace on 2014-10-09T12:34:27Z (GMT). No. of bitstreams: 0 / This work developed a comprehensive and accurate methodology for treating the uncertainties of the k0 method of Neutron Activation Analysis (NAA) by applying covariance analysis. All parameters involved in determining the concentration of a studied element were analyzed carefully, establishing the correlations between them. The possible correlations between the concentrations of different elements were also established, both within the same sample and across different samples. This procedure generated a large number of correlations, all of which were treated rigorously. The data for the analysis were obtained experimentally, through irradiations performed at irradiation position 24A, near the core of the IEA-R1 research reactor at IPEN-CNEN/SP. The parameters α and ƒ, which characterize the neutron field, were determined by applying several methods from the literature. A detailed statistical treatment was applied to each measurement, examining the various partial uncertainties and their correlations. To deepen the study, the targets 64Zn and 68Zn were chosen, and their nuclear parameters k0 and Q0, which showed discrepancies in the literature, were determined experimentally. For 64Zn these parameters were 5.63(8) × 10^-3 and 1.69(6), respectively; for 68Zn, 4.00(6) × 10^-4 and 2.34(4). These values were compared with the existing data in the literature. The Monte Carlo method was applied at several stages of the study to allow the accurate determination of some parameters needed for the complete analysis of the data. / Tese (Doutoramento) / IPEN/T / Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
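The covariance treatment described above can be illustrated with a generic first-order uncertainty-propagation sketch. This is not the author's code; the function, the input values, and the covariance matrix are all hypothetical, chosen only to show how correlated input uncertainties combine into the variance of a derived quantity.

```python
# First-order (linear) propagation of uncertainty with covariances:
# var(y) ~= J . Sigma . J^T, where J holds the partial derivatives of
# y = f(x1, ..., xn) and Sigma is the covariance matrix of the inputs.

def propagate_variance(partials, covariance):
    """Return the variance of y given partial derivatives and input covariances."""
    n = len(partials)
    return sum(
        partials[i] * covariance[i][j] * partials[j]
        for i in range(n)
        for j in range(n)
    )

# Hypothetical example: y = x1 * x2 with positively correlated inputs.
x1, x2 = 2.0, 3.0
partials = [x2, x1]                  # dy/dx1 = x2, dy/dx2 = x1
covariance = [[0.01, 0.002],         # var(x1), cov(x1, x2)
              [0.002, 0.04]]         # cov(x2, x1), var(x2)

var_y = propagate_variance(partials, covariance)
print(var_y)
```

Note how the off-diagonal covariance terms contribute to the result; neglecting them, as a naive quadrature sum does, would understate (or overstate, for negative correlations) the final uncertainty.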
|
83 |
Generating 8-Bit Sound Effects Using Interactive Evolution
Garpenhall, Tobias January 2022 (has links)
Interactive evolution is explored and applied to the automatic generation of 8-bit sound effects (SFX). Procedurally generating content can improve accessibility, cut development costs, and more. However, a natural problem with this approach is user fatigue. An 8-bit SFX generator is developed, tested, and then evaluated to understand its capabilities, its usability, and the effectiveness of several applicable solutions for reducing user fatigue. Results indicate that the software is intuitive to learn and use while providing decent variety in the generated content and a probable feeling of progression. The implemented solutions for abating user fatigue show promise toward making the software practically viable. However, some areas of the developed artifact still warrant further study.
|
84 |
Low-Level Static Analysis for Memory Usage and Control Flow Recovery
Bockenek, Joshua Alexander 07 March 2023 (has links)
Formal characterization of the memory used by a program is an important basis for security analyses, compositional verification, and identification of noninterference.
However, soundly proving memory usage requires operating on the assembly level due to the semantic gap between high-level languages and the code that processors actually execute.
Automated methods, such as model checking, would not be able to handle many interesting functions due to the undecidability of memory usage.
Fully-interactive methods do not scale well either.
Sound control flow recovery (CFR) is also important for binary decompilation, verification, patching, and security analysis.
It lifts raw unstructured data into a form that allows reasoning over behavior and semantics.
However, doing so requires interpreting the behavior of the program when indirect or dynamic control flow exists, creating a recursive dependency.
This dissertation tackles the first property with two contributions that perform proof generation combined with interactive theorem proving in a semi-automated manner:
an untrusted tool extracts as much information as it can from the functions under test and then generates all the necessary proofs to be completed in a theorem prover.
The first, Floyd-style approach still requires significant manual effort but provides good flexibility and ensures no paths are analyzed more than once.
In contrast, the second, Hoare-style approach sacrifices some flexibility and avoidance of repeated path evaluation in order to achieve much greater automation.
However, neither approach can handle the dynamic control flow caused by indirect branching.
The second property is handled by the second set of contributions of this dissertation.
These two contributions provide fully-automated methods of recovering control flow from binaries even in the presence of indirect branching.
When such dynamic control flow cannot be overapproximatively resolved, it is clearly noted in the resultant output.
In the first approach to control flow recovery, a structured memory representation allows for general analysis of control flow in the presence of indirection, gaining scalability by utilizing context-free function analysis.
It supports various aliasing conditions via the usage of nondeterminism, with multiple output states potentially being produced from a given input state.
The second approach adds function context and abstract interpretation-inspired modeling of the C++ exception handling (EH) application binary interface (ABI), allowing for the discovery of previously-unknown paths while maintaining or increasing automation. / Doctor of Philosophy / Modern computer programs are so complicated that individual humans cannot manually check all but the smallest programs to make sure they are correct and secure.
This is even worse if you want to reduce the trusted computing base (TCB), the stuff that you have to assume is working right in order to say a program will execute correctly.
The TCB includes your computer itself, but also whatever tools were used to take the programs written by programmers and transform them into a form suitable for running on a computer. Such tools are often called compilers.
One method of reducing the TCB is to examine the lowest-level representation of that program, the assembly or even machine code that is actually run by your computer.
This poses unique challenges, because operating on such a low level means you do not have a lot of the structure that a more abstract, higher-level representation provides.
Also, sometimes you want to formally state things about a program's behavior; that is, say things about what it does with a high degree of confidence based on mathematical principles.
You may also want to verify that one or more of those statements are true.
If you want to be detailed about that behavior, you may need to know all of the chunks, or regions, in random-access memory (RAM) that are used by that program.
RAM, henceforth referred to as just ``memory'', is your computer's first place of storage for the information used by running programs.
This is distinct from long-term storage devices like hard disk drives (HDDs) or solid-state drives (SSDs), which programs do not normally have direct access to.
Unfortunately, there is no one single approach that can automatically determine with absolute certainty for all cases the exact regions of memory that are read or written.
This is called undecidability, and means that you need to approximate those memory regions a lot of the time if you want to have a significant degree of automation.
An underapproximation, an approach that only gives you some of the regions, is not useful for formal statements as it might miss out on some behavior; it is unsound.
This means that you need an overapproximation, an approach that is guaranteed to give you at least the regions read or written.
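The under- versus overapproximation distinction can be sketched with hypothetical sets of memory regions. The addresses and the coarse set-containment check are illustrative only; a real analysis like the one in this dissertation reasons about overlapping byte ranges in symbolic assembly-level state.

```python
# Regions are modeled as (start, size) pairs.
# The true regions accessed by some hypothetical function:
true_regions = {(0x1000, 8), (0x2000, 4)}

# An underapproximation misses a region: unsound for proving
# that nothing else is touched.
under = {(0x1000, 8)}

# An overapproximation covers every true region (possibly more): sound.
over = {(0x1000, 8), (0x2000, 4), (0x3000, 16)}

def covers(approx, truth):
    """True if every truly-accessed region appears in the approximation."""
    return truth <= approx  # subset test on sets

print(covers(under, true_regions), covers(over, true_regions))  # False True
```

The price of soundness is precision: the overapproximation may claim regions the function never actually touches, but it will never silently omit one.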
Therefore, the first contribution of this dissertation is a preliminary approach to such an overapproximation.
This approach is based on the work of Robert L. Floyd, focusing on the direct control flow (where the steps of a program go) in an individual function (structured program component).
It still requires a lot of user effort, including having to manually specify the regions in memory that were possibly used and do a lot of work to prove that those regions are (overapproximatively) correct, so our tests were limited in scope.
The second contribution automated a lot of the manual work done for the first approach.
It is based on the work of Charles Antony Richard Hoare, who developed a verification approach focusing on the syntax (the textual form) of programs.
This contribution produces what we call formal memory usage certificates (FMUCs), which are formal statements that the regions of memory they describe are the only ones possibly affected by the functions under test.
These statements also come with proofs, which for our work are like scripts used to verify that the things the FMUCs assert about the corresponding functions can be shown to be true given the assumptions our FMUCs have.
Sometimes those proofs are incomplete, though, such as when there is a loop (repeated bit of code) in a function under test or one function calls (executes) another.
In those cases, a user has to finish the proof, in the first case by weakening (removing information from) the FMUC's statements about the loop and in the second by composing, or combining, the FMUCs of the two functions.
Additionally, this second approach cannot handle dynamic control flow.
Such control flow occurs when the low-level instructions a program uses to move to another place in that program do not have a pre-stored location to go to.
Instead, that location is supplied as the program is running.
This is opposed to direct control flow, where the place to go to is hard-coded into the program when it is compiled.
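The contrast between direct and dynamic (indirect) control flow can be sketched in a short Python example, where a dispatch table stands in for an indirect branch. The opcode names and handlers are invented for illustration; in assembly the same pattern appears as a jump or call through a register or memory location.

```python
def handle_add(a, b):
    return a + b

def handle_sub(a, b):
    return a - b

# Direct control flow: the call target is fixed in the program text.
result_direct = handle_add(5, 3)

# Indirect control flow: the target is looked up at run time, so a
# static analyzer must overapproximate the set of possible targets.
dispatch = {"add": handle_add, "sub": handle_sub}

def run(opcode, a, b):
    handler = dispatch[opcode]  # target depends on run-time data
    return handler(a, b)

result_indirect = run("sub", 5, 3)
print(result_direct, result_indirect)  # 8 2
```

A static analyzer reading only the text of `run` cannot tell which handler executes without also reasoning about the contents of `dispatch` and the value of `opcode`, which is exactly the recursive dependency described above.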
The tool also cannot deal with aliasing, which is when different state parts (value-holding components) of a program contain the same value and that value is used as the numeric address or identifier of a location in memory.
Specifically, it cannot deal with potential aliasing, when there is not enough information available to determine if the state parts alias or not.
Because of that, we had to add extra assumptions to the FMUCs that limited them to those cases where ambiguous memory-referencing state parts referred to separate memory locations.
Finally, it specifically requires assembly as input; you cannot directly supply a binary to it.
This is also true of the first contribution.
Because of this, we were able to test on more functions than before, but not a lot more.
Not being able to deal with dynamic control flow is a big problem, as almost all programs use it.
For example, when a function reaches its end, it has to figure out where to return to based on the current state of the program (in the previous contribution, this was done manually).
This means that control flow recovery (CFR) is very important for many applications, including decompilation (converting a program back into a higher-level form), patching (updating a program in place without modifying the original code and recompiling it), and low-level analysis or verification in general.
However, as you may have noticed from earlier in this paragraph, in order to deal with such dynamic control flow you need to figure out what the possible destinations are for the individual control flow transfers.
That can require knowing where you came from in the program, which means that analysis of dynamic control flow requires context (in this context, information previously obtained in the program).
Even worse, it is another undecidable problem that requires overapproximation.
To soundly recover control flow, we developed Hoare graphs (HGs), the third contribution of this dissertation.
HGs use memory models that take the form of forests, or collections of tree data structures.
A single tree represents a region in memory that may have multiple symbolic references, or abstract representations of a value.
The children of the tree represent regions used in the program that are enclosed within their parent tree elements.
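A minimal sketch of such a forest-style memory model follows, assuming invented region names and addresses; the actual Hoare-graph representation works over symbolic assembly-level state rather than concrete numbers.

```python
class Region:
    """A memory region; children are regions enclosed within it."""
    def __init__(self, name, start, size):
        self.name = name
        self.start = start
        self.size = size
        self.children = []

    def add_child(self, child):
        # A child must lie entirely within its parent region.
        assert child.start >= self.start
        assert child.start + child.size <= self.start + self.size
        self.children.append(child)
        return child

# One tree in the forest: a stack frame enclosing two local variables.
frame = Region("stack_frame", start=0x1000, size=32)
frame.add_child(Region("local_x", start=0x1000, size=8))
frame.add_child(Region("local_y", start=0x1008, size=8))

print([c.name for c in frame.children])
```

Each root of the forest is a separately-allocated region, and the nesting invariant checked in `add_child` is what lets enclosed regions be reasoned about relative to their parents.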
Now, instead of assuming that all ambiguous memory regions are separate, we can use them under various aliasing conditions.
We have also implemented support for some forms of dynamic control flow.
Those that are not supported are clearly marked in the resultant HG.
No user interaction is required even when loops are present thanks to a methodology that automatically reduces the amount of information present at a re-executed instruction until the information stabilizes.
Function composition is also automatic now thanks to a method that treats each function as its own context in a safe and automated way, reducing memory consumption of our tool and allowing larger programs to be examined.
In the process we did lose the ability to deal with recursion (functions that call themselves or call other functions that call back to the original), though.
Lastly, we provided the ability to directly load binaries into the tool, no external disassembly (converting machine code into human-readable instructions) needed.
This all allowed much greater testing than before, with applications to multiple programs and program libraries.
The fourth and final contribution of this dissertation iterates on the HG work by narrowing focus to the concept of exceptional control flow.
Specifically, it models the kind of exception handling used by C++ programs.
This is important as, if you want to explore a program's behavior, you need to know all the places it goes to.
If you use a tool that does not model exception handling, you may end up missing paths of execution caused by unwinding.
This is when an exception is thrown and propagates up through the program's current stack of function calls, potentially reaching programmer-supplied handling for that exception.
Despite this, commonplace tools for static, low-level program analysis do not model such unwinding.
The control flow graphs (CFGs) produced by our exception-aware tool are called exceptional interprocedural control flow graphs (EICFGs).
These provide information about the exceptions being thrown and what paths they take in the program when they are thrown.
Additional improvements are a better methodology for handling dynamic control flow as well as restored support for recursion.
All told, this allowed us to explore even more programs than ever before.
|
85 |
民國64年至72年主要製造業財務比率穩定性之探討 / A Study of the Stability of Financial Ratios of Major Manufacturing Industries, 1975 to 1983 (ROC Years 64 to 72)
劉逢良, LIU, FENG-LIANG Unknown Date (has links)
Chapter 1: Introduction. Questions the claim that "industry" and "business-cycle conditions" are the key factors affecting financial ratios.
Chapter 2: Delimitation of the research topic. (1) ROC years 64 to 72 (1975 to 1983) constitute one complete business cycle; (2) definition and scope of the major manufacturing industries; (3) selection and definition of the financial ratios; (4) operational definition of stability; (5) framework of this study.
Chapter 3: Review of related literature. (1) Domestic studies; (2) foreign studies.
Chapter 4: Research method. (1) Data source: secondary data; (2) hypotheses to be tested; (3) application and limitations of analysis of variance.
Chapter 5: Empirical study. Results of the analysis of variance.
Chapter 6: Analysis, interpretation, and application of the empirical results. (1) Analysis and interpretation; (2) managerial applications.
Chapter 7: Conclusions and suggestions.
|
86 |
"The conceit of this inconstant stay": Shakespeare's Philosophical Conquest of Time Through Personification
Roberson, Triche 05 August 2010 (has links)
Throughout the procreation sonnets and those numerous sonnets that promise immortality through verse for Shakespeare's beloved young man, the poet personifies time as an agent of relentlessly destructive change. Yet Shakespeare's approach to the personification of time, as well as his reactions to time, changes over the course of the sequence. He transforms his fear of and obsession with time as a destroyer typical of most sonnets to an attitude of mastery over the once ominous force. The act of contemplating time's power by personification provides the speaker with a deeper awareness of time, love, and mutability that allows him to form several new philosophies which resolve his fear. By the end of the sequence, the poet no longer fortifies himself and the beloved against time's devastation because his new outlook fosters an acceptance of time that opposes and thus negates his previous contention with this force.
|
87 |
Der Begriff des Fake / The Concept of Fake
Römer, Stefan 09 July 1998
The concept of "Fake" describes a mimetic imitation of another work of art which, in contrast to a forgery, itself points to its faked nature. A female artist reproduced photographs by Walker Evans and presented these photographs in a manner similar to the originals; the title, "Sherrie Levine After Walker Evans," identifies the work as an appropriation which reflects the contextually and conceptually changed conditions of the identical image. Accordingly, the fake aims, by means of an exact examination of the image, at an art-historical cognitive process: the reproduction is no longer morally condemned as a forgery; instead, the fake is regarded as a critique of the institution of art and its ideology of the original. The first chapter is devoted to the new artistic strategies of the early 1970s and, in a discourse-analytical reading, to the historical literature on forgery and the relation between original and forgery, in order to introduce the concept of "Fake" in distinction to them. In the second chapter, seven selected examples of fakes are examined for their conceptual formation. The final chapter describes the far-reaching consequences the fake has for image theory and art theory in relation to social developments.
|
88 |
Producing tea coolies?
Varma, Nitin 05 December 2013 (has links)
"Coolie" is a generic category for the "unskilled" manual labourer. The offering of services for hire had various pre-colonial lineages. In the nineteenth century there was an attempt to recast the term through discursive constructions and material practices of "mobilized-immobilized" labour. Coolie labour was often proclaimed as a deliberate compromise straddling the regimes of the past (slave labour) and the future (free labour). It was portrayed as a stage in a promised transition. The tea plantations of Assam, like many other tropical plantations in South Asia, were inaugurated and formalized during this period. They were initially worked by locals. In the late 1850s, the locals were replaced by labourers imported from outside the province, who have been unquestioningly designated "coolies" in the historical literature. Qualifying this framework of transition (from local to coolie labour) and introduction (of coolie labour), this study makes a case for the "production" of coolie labour in the history of the colonial-capitalist plantations in Assam. The intention is not to suggest an unfettered agency of colonial capitalism in defining and "producing" coolies; the emphasis falls instead on the attendant contingencies, negotiations, contestations and crises. The study intervenes in narratives of an abrupt appearance of the archetypical coolie of the tea gardens (i.e., imported and indentured) and situates this archetype's emergence, sustenance and shifts in the context of material and discursive processes.
|
89 |
Inequality, education and the social sciences
Kinville, Michael Robert 17 January 2017 (has links)
The conceptual link between education and society, forged in the 19th century, is often taken for granted. This seemingly outdated connection, however, guided reforms in secondary education in India and Germany throughout the second half of the 20th century. This study attempts to understand the lag between the underlying ideas and the reforms they framed by synthesizing a viable theory for imagining the connection between education and a complex society. Foundational approaches to society and education are brought into dialogue with post-colonial and critical theories. Universalistic assumptions are problematized, and an open-ended solution for theorizing new connections is presented. National educational reforms in India and Germany subsequent to their critical junctures of 1947/1945 are exhaustively and chronologically compared in order to conceptualize a generic character of historical-educational reproduction for each country and to facilitate a process of mutual learning. Finally, a solution to the problems associated with educational reproduction is presented: education as a public good need not simply react to social problems; instead, it can be reconfigured so as to drive social change.
|
90 |
Representações da violência do pós-64 na ficção literária de Ivan Ângelo: a escrita engajada / Representations of post-64 violence in the literary fiction of Ivan Ângelo: engaged writing
Azevedo, Francesca Batista de January 2015 (has links)
It is possible to relate literary discourse to social dimensions by means of the representations of violence present in the fictional works A Festa (The Celebration, 1976) and A casa de vidro (The Tower of Glass, 1979), written by Ivan Ângelo during the post-64 dictatorship. By sifting through history and applying concepts from Literary Criticism and Comparative Literature to the field of the Sociology of Literature, it is possible to comprehend, with sensitivity and rigour, episodes and elements of Brazilian social life, such as the role of the writer in times of repression of freedom of expression. This reflexive exercise, risky and recent in sociology, enriches both literary and sociological studies, so that the imaginary acquires the status of a social object; this introduces into the sociological analysis of texts (verbal and visual) and their circumstances of production one of the most social and human characteristics: narrativity. Creative writing is a source of access to both aesthetic and political positions, and through these interfaces memory, forgetting and the construction of the idea of the past are interwoven. The practice of reading along these lines therefore resignifies the meaning of the present and takes one more step towards literature as an undeniable social fact and as an act, at once individual and collective, that persists across time and national culture.
|