41

Visconti e Scimeca visitam Verga: a atualização da narrativa literária e as raízes das adaptações cinematográficas de I Malavoglia / Visconti and Scimeca visit Verga: the updating of a literary narrative and the roots of the cinematographic adaptations of I Malavoglia

Roda, Regiane Rafaela [UNESP] 23 February 2017 (has links)
The novel I Malavoglia (1881), by Giovanni Verga, has been adapted twice for film: first by the Milanese Luchino Visconti in 1948, as La terra trema: episodio del mare, an important work of Italian Neorealism, and again by the Sicilian Pasquale Scimeca in 2010, as Malavoglia. For both adaptations, the moment of production was decisive for the rewriting of the nineteenth-century realist novel and established a deep and significant relationship between history and society for understanding these works as autonomous artistic achievements. Drawing on Hutcheon's concept of adaptation (2013) and Stam's studies (2008, 2013), this work examines an adaptation process built on updating, i.e., transposing the literary narrative to a new historical context (the years following World War II and the first decade of the Third Millennium), in order to analyze how the films dialogue with the adapted text and extend their own range of meaning by bringing the narrative up to date, anchored in the novel yet projecting beyond it to generate new meanings. The analysis of this temporal displacement, into which questions relevant to each film's own moment are inserted so that the target audience can recognize elements of its historical present and reality, leads to the conclusion that updating created an intimate connection between the moment of production of the cinematographic works and the transfer of elements of reality into the fictional plot. This connection profoundly altered the interpretive paths and determined the critical and ideological bases of the rereadings, the purpose of the textual transformations, and the strategies for reconstructing meaning, producing films that are portraits of an epoch and a society and that compose, alongside the novel, a human document. / Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq): 140717/2013-7
42

[en] TEST-DRIVEN MAINTENANCE: AN APPROACH FOR THE MAINTENANCE OF LEGACY SYSTEMS / [pt] TEST-DRIVEN MAINTENANCE: UMA ABORDAGEM PARA MANUTENÇÃO DE SISTEMAS LEGADOS

OTÁVIO ARAUJO LEITÃO ROSA 29 September 2011 (has links)
[en] Test-Driven Development is a software development technique based on quick cycles that alternate between writing tests and implementing a solution that makes those tests pass. Test-Driven Development has produced excellent results in various aspects of building new software systems. Increased maintainability, improved design, reduced defect density, better documentation and increased code test coverage are reported as advantages that contribute to reducing the cost of development and, consequently, to increasing return on investment. All these benefits have helped Test-Driven Development become an increasingly relevant practice in software development. When evaluating Test-Driven Development from the perspective of maintaining legacy systems, however, we face a clear mismatch when trying to adopt the technique. Test-Driven Development is based on the premise that tests should be written before the code, but when working with legacy code we already have thousands of lines written and running. Considering this context, we discuss in this dissertation a technique, which we call Test-Driven Maintenance, that results from adapting Test-Driven Development to the needs of maintaining legacy systems. We describe the adaptation process that led us to this new technique and evaluate which characteristics of the original technique carry over to the adapted one. To obtain realistic evaluation results, we performed an empirical study while introducing the technique in a maintenance team working on a legacy system in constant evolution and use by an enterprise in Rio de Janeiro.
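As a rough illustration of the maintenance cycle described in this abstract, the sketch below (in Python, with a made-up legacy_discount routine and expected values; the dissertation's own system and language are not given here) shows the first step of Test-Driven Maintenance: pinning down the current behavior of legacy code with characterization tests before any change is made.

    import unittest

    # Hypothetical legacy routine that must keep working while being maintained.
    def legacy_discount(total, customer_type):
        # Tangled legacy logic we do not yet dare to refactor.
        if customer_type == "vip":
            return total - total * 0.15
        if total > 100:
            return total - 10
        return total

    class CharacterizationTests(unittest.TestCase):
        """Step 1 of the cycle: pin down the behavior the system has today."""

        def test_vip_gets_percentage_discount(self):
            self.assertAlmostEqual(legacy_discount(200, "vip"), 170.0)

        def test_large_order_gets_flat_discount(self):
            self.assertAlmostEqual(legacy_discount(150, "regular"), 140.0)

        def test_small_order_unchanged(self):
            self.assertAlmostEqual(legacy_discount(50, "regular"), 50.0)

    if __name__ == "__main__":
        # Step 2 would be to write a failing test for the requested change,
        # then modify legacy_discount until the whole suite passes again.
        unittest.main()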
43

Computation of a Damping Matrix for Finite Element Model Updating

Pilkey, Deborah F. 26 April 1998 (has links)
The characterization of damping is important in making accurate predictions of both the true response and the frequency response of any device or structure dominated by energy dissipation. Modeling damping matrices and verifying them experimentally is challenging because damping cannot be determined via static tests, as mass and stiffness can. Furthermore, damping is more difficult to determine from dynamic measurements than natural frequency. However, damping is extremely important in formulating predictive models of structures. In addition, damping matrix identification may be useful in diagnostics or health monitoring of structures. The objective of this work is to find a robust, practical procedure to identify damping matrices. All aspects of the damping identification procedure are investigated. The procedures for damping identification presented herein are based on prior knowledge of the finite element or analytical mass matrices and measured eigendata; alternatively, one procedure is based on knowledge of the mass and stiffness matrices together with the eigendata. With this in mind, an exploration into model reduction and updating is needed to make the problem more complete for practical applications. Additionally, high-performance computing is used as a tool to deal with large problems; High Performance Fortran is exploited for this purpose. Finally, several examples, including one experimental example, are used to illustrate the use of these new damping matrix identification algorithms and to explore their robustness. / Ph. D.
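For background on what such an identification produces (a standard modal-damping relation, not necessarily the dissertation's own formulation): with the analytical mass matrix M, mass-normalized mode shapes \Phi, and measured natural frequencies \omega_i and damping ratios \zeta_i, a damping matrix consistent with the measured eigendata is

    \[
    \Phi^{\mathsf T} M \Phi = I, \qquad
    \Phi^{\mathsf T} K \Phi = \operatorname{diag}(\omega_i^2), \qquad
    C = M\,\Phi\,\operatorname{diag}(2\,\zeta_i\,\omega_i)\,\Phi^{\mathsf T} M .
    \]

The identification task then reduces to estimating the modal parameters from dynamic measurements; the dissertation additionally explores model reduction and updating to make the procedure practical when not all degrees of freedom are measured.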
44

Algorithm for Spectral Matching of Earthquake Ground Motions using Wavelets and Broyden Updating

Adekristi, Armen 21 May 2013 (has links)
This study focuses on creating a spectral matching algorithm that modifies real strong ground motions in the time domain by adding wavelet adjustments to the acceleration time series. The spectral matching procedure is at its core a nonlinear problem, so a nonlinear solving method was employed in the proposed algorithm. The Broyden updating method was selected as the nonlinear solver because it does not require a differentiation analysis. Broyden updating also uses the spectral misfit and the vector of wavelet magnitudes to approximate the Jacobian matrix, which is expected to give an efficient calculation. A parametric study was conducted numerically to obtain a set of gain factors that reduce the computational time and minimize the spectral misfit. The study was conducted using ten different ground motions, taken from FEMA P-695 (FEMA, 2009), which represent far-field, near-field pulse, and near-field no-pulse earthquake ground motions. A study of compatible wavelet functions was carried out to determine the appropriate wavelet function for the proposed method. The study included baseline drift, frequency and time resolution, and the cross-correlation between wavelet adjustments during the spectral matching procedure. Based on this study, the corrected tapered cosine wavelet was selected for use in the proposed algorithm. The proposed algorithm has been tested and compared with other methods that are commonly used in spectral matching: the RSPMatch method and the frequency domain method. The comparison parameters were the computational time, the average misfit, the maximum misfit and error, the PGA, PGV, PGD, the Arias Intensity, and the frequency content for both acceleration and displacement time histories. The results showed that the proposed method is able to match the target spectrum while preserving the energy development and the frequency content of the original time histories. / Master of Science
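For reference, the standard Broyden rank-one update the abstract refers to (the thesis's gain factors and wavelet bookkeeping are not reproduced here) approximates the Jacobian from successive changes in the wavelet-magnitude vector x and the spectral-misfit vector f:

    \[
    J_{k+1} \;=\; J_k \;+\; \frac{\bigl(\Delta f_k - J_k\,\Delta x_k\bigr)\,\Delta x_k^{\mathsf T}}{\Delta x_k^{\mathsf T}\,\Delta x_k},
    \qquad \Delta x_k = x_{k+1}-x_k, \quad \Delta f_k = f(x_{k+1})-f(x_k),
    \]

so no differentiation of the response spectrum with respect to the wavelet magnitudes is required; the next adjustment is obtained by solving \(J_{k+1}\,\Delta x_{k+1} = -f(x_{k+1})\).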
45

Parameter Estimation Using Sensor Fusion And Model Updating

Francoforte, Kevin 01 January 2007 (has links)
Engineers and infrastructure owners have to manage an aging civil infrastructure in the US. Engineers have the opportunity to analyze structures using finite element models (FEM) and often base their engineering decisions on the outcome of the results. Ultimately, the success of these decisions is directly related to the accuracy of the finite element model in representing the real-life structure. Improper assumptions in the model, such as member properties or connections, can lead to inaccurate results. A major source of modeling error in many finite element models of existing structures is improper representation of the boundary conditions. This study aims to integrate experimental and analytical concepts by means of parameter estimation, whereby the boundary condition parameters of the structure in question are determined. FEM updating is a commonly used method to determine the "as-is" condition of an existing structure. Experimental testing of the structure using static and/or dynamic measurements can be utilized to update the unknown parameters. Optimization programs are used to update the unknown parameters by minimizing the error between the analytical and experimental measurements. Through parameter estimation, unknown parameters of the structure such as stiffness, mass or support conditions can be estimated, or more appropriately "updated," so that the updated model provides a better representation of the actual conditions of the system. In this study, a densely instrumented laboratory test beam was used to carry out both analytical and experimental analyses of multiple boundary condition setups. The test beam was instrumented with an array of displacement transducers, tiltmeters and accelerometers. Linear vertical springs represented the unknown boundary stiffness parameters in the numerical model of the beam. Nine different load cases were performed, and static measurements were used to update the spring stiffnesses, while dynamic measurements and additional load cases were used to verify these updated parameters. Two different optimization programs were used to update the unknown parameters, and the results were compared. One optimization tool was developed by the author: Spreadsheet Parameter Estimation (SPE), which utilizes the Solver function found in the widely available Microsoft Excel software. The other, the comprehensive MATLAB-based PARameter Identification System (PARIS) software, was developed at Tufts University. Optimization results from the two programs are presented and discussed for different boundary condition setups in this thesis. For this purpose, finite element models were updated using the static data, and these models were then checked against dynamic measurements for model validation. Model parameter updating provides excellent insight into the behavior of different boundary conditions and their effect on the overall structural behavior of the system. Updated FEMs using estimated parameters from both optimization programs generally show promising results when compared to the experimental data sets. Although the use of SPE is simple and generally straightforward, it has apparent limitations when dealing with complex, nonlinear support conditions. Due to the inherent error associated with experimental measurements and FEM modeling assumptions, PARIS serves as a better-suited tool for parameter estimation.
Results from SPE can be used for quick analysis of structures and can serve as initial inputs for the more in-depth PARIS models. A number of different sensor types and spatial resolutions were also investigated to determine the minimum instrumentation needed for an acceptable model representation in terms of correlation between model and experimental data.
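As an illustration of the static parameter-updating loop described above (a simplified stand-in, not the thesis's beam model or the SPE/PARIS software; the rigid-bar geometry, loads, sensor positions and stiffness values below are hypothetical), unknown boundary spring stiffnesses can be recovered by minimizing the misfit between measured and predicted displacements:

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical stand-in for an instrumented test specimen: a rigid bar of
    # length L resting on two vertical springs (the unknown boundary
    # stiffnesses), loaded by a point force P at position a.
    L, P, a = 3.0, 10_000.0, 1.0          # m, N, m
    sensors = np.array([0.5, 1.5, 2.5])   # displacement transducer positions (m)
    k_true = np.array([2.0e6, 3.5e6])     # "true" spring stiffnesses (N/m)

    def predicted_displacements(k, x):
        """Static model: support reactions from equilibrium, rigid-bar interpolation."""
        k1, k2 = k
        r1 = P * (1.0 - a / L)            # reaction at left spring
        r2 = P * a / L                    # reaction at right spring
        d1, d2 = r1 / k1, r2 / k2         # support settlements
        return d1 + (d2 - d1) * x / L     # displacement along the bar

    # "Measured" data: simulated here with a little noise.
    rng = np.random.default_rng(0)
    measured = predicted_displacements(k_true, sensors) * (1 + 0.01 * rng.standard_normal(3))

    def residuals(k):
        # Error between analytical prediction and experimental measurement,
        # which the optimizer drives toward zero by updating k.
        return predicted_displacements(k, sensors) - measured

    k0 = np.array([1.0e6, 1.0e6])         # initial guess for the unknown parameters
    result = least_squares(residuals, k0, bounds=(1.0e4, 1.0e8))
    print("updated stiffnesses:", result.x)

The same residual-minimization structure underlies both spreadsheet-Solver and MATLAB-based implementations; only the model evaluation and the optimizer differ.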
46

Sleeping Beauty and De Nunc Updating

Kim, Namjoong 01 May 2010 (has links)
About a decade ago, Adam Elga introduced philosophers to an intriguing puzzle. In it, Sleeping Beauty, a perfectly rational agent, undergoes an experiment in which she becomes ignorant of what time it is. This situation is puzzling for two reasons: First, because there are two equally plausible views about how she will change her degree of belief given her situation and, second, because the traditional rules for updating degrees of belief don't seem to apply to this case. In this dissertation, my goals are to settle the debate concerning this puzzle and to offer a new rule for updating some types of degrees of belief. Regarding the puzzle, I will defend a view called "the Lesser view," a view largely favorable to the Thirders' position in the traditional debate on the puzzle. Regarding the general rule for updating, I will present and defend a rule called "Shifted Jeffrey Conditionalization." My discussions of the above view and rule will complement each other: On the one hand, I defend the Lesser view by making use of Shifted Jeffrey Conditionalization. On the other hand, I test Shifted Jeffrey Conditionalization by applying it to various credal transitions in the Sleeping Beauty problem and revise that rule in accordance with the results of the test application. In the end, I will present and defend an updating rule called "General Shifted Jeffrey Conditionalization," which I suspect is the general rule for updating one's degrees of belief in so-called tensed propositions.
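For context, ordinary Jeffrey conditionalization, which the dissertation's "Shifted" rules modify to handle de nunc (tensed) belief, updates credences over a partition {E_i} whose probabilities have shifted; the shifted versions themselves are not reproduced here:

    \[
    P_{\text{new}}(A) \;=\; \sum_i P_{\text{old}}(A \mid E_i)\, P_{\text{new}}(E_i).
    \]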
47

Assessing Updating of Affective Content as a Potential Endophenotypic Predictor of Depressive Symptoms

Jordan, Duncan Gage 08 December 2017 (has links)
Executive functioning (EF) deficits may be associated with depressed states, although limited research has examined components of EF as endophenotypes of depression. This study assessed whether affective updating predicted depressive symptoms in a sample pre-selected for varying levels of depression, using the affective n-back task. In this task, participants determine whether the valence of a stimulus matches the valence of the stimulus presented two stimuli prior. Results suggested that affective updating ability did not significantly predict depressive symptoms over time, although higher accuracy in updating negative information was associated with more depressive symptoms approximately twelve weeks later. Moreover, accuracy in updating positive and negative information did not differ between groups. However, a trend emerged for depressed participants to be more accurate in updating negative information in the face of interfering positive information, compared to updating positive information with interfering negative information. The latter results are considered within the reward devaluation framework.
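As a minimal sketch of how responses in the 2-back task described above can be scored (the stimuli and responses below are invented; this is not the study's materials or analysis code):

    # A trial is a "match" when its valence equals the valence two trials earlier.
    valences = ["neg", "pos", "neg", "neg", "pos", "pos", "pos"]   # trial valences
    responses = [None, None, True, False, False, True, True]       # participant says "match"?

    correct = scored = 0
    for i in range(2, len(valences)):
        is_match = valences[i] == valences[i - 2]
        scored += 1
        if responses[i] == is_match:
            correct += 1

    print(f"2-back accuracy: {correct / scored:.2f}")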
48

Spatial Updating and Set Size: Evidence for Long-Term Memory Reconstruction

Hodgson, Eric P. 19 July 2005 (has links)
No description available.
49

The business end of objects: Monitoring object orientation

Mello, Catherine 16 July 2009 (has links)
No description available.
50

Model Updating Using Neural Networks

Atalla, Mauro J. 01 April 1996 (has links)
Accurate models are necessary in critical applications. Key parameters in dynamic systems often change during their life cycle due to repair and replacement of parts or environmental changes. This dissertation presents a new approach to updating system models that accounts for these changes. The approach uses frequency domain data and a neural network to produce estimates of the parameters being updated, yielding a model representative of the measured data. Current iterative methods developed to solve the model updating problem rely on minimization techniques to find the set of model parameters that yields the best match between experimental and analytical responses. Because the minimization procedure requires a fair amount of computation time, the existing techniques are infeasible for use as part of an adaptive control scheme that corrects the model parameters as the system changes. They also require either mode shape expansion or model reduction before they can be applied, introducing errors into the procedure. Furthermore, none of the existing techniques has been applied to nonlinear systems. The neural network estimates the parameters being updated quickly and accurately without the need to measure all degrees of freedom of the system. This avoids the use of mode shape expansion or model reduction techniques and allows for its implementation as part of an adaptive control scheme. The proposed technique is also capable of updating weakly nonlinear systems. Numerical simulations and experimental results show that the proposed method has good accuracy and generalization properties, and it is therefore a suitable alternative for the solution of the model updating problem for this class of systems. / Ph. D.
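As a sketch of the general idea (a hypothetical single-degree-of-freedom illustration, not the dissertation's models or data), a small network can be trained on simulated frequency-response magnitudes so that a measured response yields an estimate of the updated parameter directly, with no iterative minimization at estimation time:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # 1-DOF oscillator with known mass and damping; stiffness k is the parameter
    # being updated.  The network maps sampled |FRF| values to k.
    m, c = 1.0, 2.0                          # kg, N*s/m
    freqs = np.linspace(1.0, 30.0, 40)       # rad/s, sampled frequency lines

    def frf_magnitude(k):
        """|H(jw)| of the single-DOF oscillator for stiffness k."""
        w = freqs
        return 1.0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)

    # Training data: simulated FRFs over the expected stiffness range.
    k_train = np.linspace(50.0, 500.0, 400)
    X_train = np.array([frf_magnitude(k) for k in k_train])
    net = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 32), solver="lbfgs",
                     max_iter=5000, random_state=0),
    )
    net.fit(X_train, k_train)

    # "Measured" FRF from the changed system, with a little noise.
    k_actual = 230.0
    noise = 0.02 * np.random.default_rng(1).standard_normal(freqs.size)
    measured = frf_magnitude(k_actual) * (1 + noise)
    print("estimated stiffness:", float(net.predict(measured.reshape(1, -1))[0]))

Once trained, the estimate is a single forward pass, which is what makes the approach attractive inside an adaptive control loop.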
