51

No ar: Jornal Nacional: das audiovisualidades às atualizações no audiovisual brasileiro / On the air: Jornal Nacional: from audiovisualities to actualizations in Brazilian audiovisual media

Souza, Karla Caroline Nery de 10 September 2010 (has links)
This study starts from the premise that the Jornal Nacional is a television program of high media value, given both how long it has been on the air and the marks it has left on Brazilian television journalism. As a consequence, it is actualized in other audiovisual times and spaces: in films, on the internet, in videos, and in other TV programs and stand-alone segments. This study therefore seeks to identify and analyze the actualizations of the Jornal Nacional across audiovisual media, drawing on the technical, discursive, and cultural dimensions of audiovisualities and on the traces the program leaves in other videos. Methodologically, an extensive theoretical review was first carried out to characterize the object of study and to identify the virtualities and principal marks of the Jornal Nacional at the intersection of the audiovisual and journalism. The methods and procedures employed were the rhizome, cartography, and deconstruction. The main results show that the Jornal Nacional is actualized in other audiovisual works through the techniques, discourses, and culture proper to the audiovisual field, and that it does so in characteristic modes such as direct quotation, indirect quotation, and appropriation. Dismantling and reassembling these actualizations also reveals evidence of a language that runs through the audiovisual, one that is as little studied as it is little known; this exercise of observation, composition, and decomposition of a given practice demands knowledge of culture, language, and technique, all necessary for interpreting the videos. Finally, it was in reflecting on and recomposing these paths that the connections that configure and reveal the object were found.
52

Teoria da relatividade restrita: uma introdução histórico-epistemológica e conceitual voltada ao ensino médio / Special relativity: a historical-epistemological and conceptual introduction for secondary school

Fuchs, Eduardo Ismael January 2016 (has links)
This work narrates a teaching experience: the application of a module addressing a topic of Modern and Contemporary Physics, the Special Theory of Relativity, at the secondary-school level. The proposal was applied in a private school in the municipality of Arroio do Meio, RS, Brazil, to a third-year class of the regular secondary level, under the theoretical framework of the cognitive theory of Jean William Fritz Piaget (1896-1980) and the epistemological framework of Thomas Samuel Kuhn (1922-1996). The work describes the lesson planning, the implementation of the proposal, and the results of its classroom application as an extracurricular course. How the module was designed and the depth it was able to reach are discussed throughout the text, which also provides a review of the literature on the relevance of including Modern and Contemporary Physics in the secondary-school curriculum. The results indicate that topics of Modern and Contemporary Physics can be taught in regular education, that students enjoyed and showed willingness to learn current subjects, and that the effort to introduce small curricular updates is worthwhile and should be encouraged as one possible route to improving the quality of teaching in basic education. Finally, an educational product, in the form of a support, guidance, and motivation text for physics teachers, is presented.
53

Problém vlastních čísel symetrických řídkých matic v souvislosti s výpočty elektronových stavů / Special eigenvalue problems for symmetric sparse matrices related to electronic structure calculations

Novák, Matyáš January 2012 (has links)
Ab-initio methods for calculating electronic structure are an important field of materials physics. The aim of this thesis, within a project developing a new method for calculating electronic states in non-periodic structures based on density functional theory, pseudopotentials, and the finite element method, is to convert the Kohn-Sham equations into a form suitable for discretization, to suggest an appropriate method for solving the generalized eigenproblem resulting from this discretization, and to implement an eigenvalue solver (or modify an existing one). The thesis describes a procedure for converting the many-particle Schrödinger equation into a generalized rank-k-update eigenvalue problem and discusses various methods for its solution. The eigensolver BLZPACK, which uses the block Lanczos method, was modified and integrated into the SfePy framework (a tool for finite element calculations), and the resulting code was successfully tested.
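The thesis solves its generalized eigenproblem with the block Lanczos method via BLZPACK. As a much simpler illustration of iterative symmetric eigensolvers, here is a pure-Python power-iteration sketch (invented for this listing, not the thesis's code) that estimates the dominant eigenvalue via the Rayleigh quotient:

```python
import math

def mat_vec(A, v):
    # Multiply a dense symmetric matrix (list of rows) by a vector.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def power_iteration(A, iters=200):
    # Estimate the dominant eigenpair of a symmetric matrix: repeatedly
    # apply A, renormalize, and read off the Rayleigh quotient.
    n = len(A)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = mat_vec(A, v)
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        lam = sum(x * y for x, y in zip(v, mat_vec(A, v)))  # Rayleigh quotient
    return lam, v

# Symmetric 2x2 example with eigenvalues 3 and 1.
A = [[2.0, 1.0], [1.0, 2.0]]
lam, v = power_iteration(A)
```

Lanczos methods refine this idea by building an orthogonal Krylov basis from the same matrix-vector products, which is what makes them attractive for the large sparse matrices finite element discretizations produce.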
54

A High-Level Overview of How the New Accounting Standard Update on Revenue Recognition Impacts the United States Healthcare System

Johnson, Leslie 01 May 2018 (has links)
In May 2014, the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB) issued a long-awaited joint updated standard on revenue recognition, ASU 2014-09 – Revenue from Contracts with Customers. While almost all entities will be affected to some extent by the new standard, particularly by the changes in required disclosures, this research examines the impact the new standard will specifically have on the healthcare industry. By highlighting areas of significant challenge, it aims to build a better understanding of the impact healthcare service entities will experience as they transition to the new standard.
55

Zooapelativum jako bázové slovo lexikalizovaných přirovnání. / Zooappellatives as base words of lexicalized similes.

STACHOVÁ, Milada January 2019 (has links)
This diploma thesis deals with lexicalized similes whose base words are animal names (zooappellatives). The theoretical part describes phraseology and idiomatics, drawing not only on Čermák's conception but also on the definition and classification of the basic units of this linguistic discipline: set phrases and idioms. The practical part focuses mainly on animal appellatives (bull, duck, cat, goat, cow, horse, sheep, dog, pig, and hen). The linguistic analysis, based on the Czech National Corpus, examines the frequency of the given phrasemes, their diversity, and the extent of their actualization.
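The corpus frequency analysis described in the abstract can be illustrated with a toy sketch; the mini-corpus, the simile inventory, and the counts below are invented English stand-ins for demonstration and bear no relation to the actual Czech National Corpus data:

```python
from collections import Counter

# Toy corpus standing in for corpus query results (illustrative only).
corpus = (
    "He is as stubborn as a bull. She works like a horse. "
    "They fight like cat and dog. He is as stubborn as a bull."
)

# Lexicalized similes to count, keyed by their animal base word.
similes = {
    "bull": "stubborn as a bull",
    "horse": "works like a horse",
    "cat": "like cat and dog",
}

# Count how often each simile occurs in the (lowercased) corpus.
text = corpus.lower()
freq = Counter({animal: text.count(phrase) for animal, phrase in similes.items()})
```

In practice the thesis's analysis would run such queries against corpus concordances rather than raw substring matches, since Czech inflection requires lemmatized search.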
56

Machine Learning Methods for Annual Influenza Vaccine Update

Tang, Lin 26 April 2013 (has links)
Influenza is a public health problem that causes serious illness and death all over the world. Vaccination has been shown to be the most effective means of preventing infection. The primary components of an influenza vaccine are weakened virus strains: vaccination triggers the immune system to develop antibodies against strains whose viral surface glycoprotein, hemagglutinin (HA), is similar to that of the vaccine strains. The influenza vaccine must, however, be updated annually, since the antigenic structure of HA is constantly mutating. The hemagglutination inhibition (HI) assay is a laboratory procedure frequently applied to evaluate the antigenic relationships among influenza viruses; it enables the World Health Organization (WHO) to recommend appropriate updates to the strains most likely to protect against circulating influenza strains. However, the HI assay is labour-intensive and time-consuming, as it requires several controls for standardization. We use two machine-learning methods, an Artificial Neural Network (ANN) and logistic regression, together with a mixed-integer optimization model, to predict antigenic variants. The ANN generalizes the input data to patterns inherent in the data and then uses these patterns to make predictions. The logistic regression model identifies and selects the amino acid positions that contribute most significantly to antigenic difference; its output is used to predict antigenic variants based on the predicted probability. The mixed-integer optimization model is formulated to find hyperplanes that enable binary classification. The performance of our models is evaluated by cross-validation.
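As a rough illustration of the logistic-regression approach the abstract describes, the self-contained sketch below trains a plain gradient-descent logistic model on invented binary features (1 = amino-acid difference at a hypothetical HA position for a pair of strains) and labels each pair as an antigenic variant or not; it is not the thesis's model, data, or feature set:

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    # Per-sample gradient descent for logistic regression (no regularization).
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))  # predicted probability of "variant"
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    # Threshold the predicted probability at 0.5.
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Invented data: each row flags differences at three hypothetical HA
# positions; label 1 means the pair is an antigenic variant.
X = [[1, 1, 0], [1, 0, 1], [0, 0, 1], [0, 0, 0], [1, 1, 1], [0, 1, 0]]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(X, y)
preds = [predict(w, b, xi) for xi in X]
```

The learned weights play the role the abstract assigns to position selection: positions whose weights dominate are the ones contributing most to predicted antigenic difference.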
57

Alternative Measures for the Analysis of Online Algorithms

Dorrigiv, Reza 26 February 2010 (has links)
In this thesis we introduce and evaluate several new models for the analysis of online algorithms. In an online problem, the algorithm does not know the entire input from the beginning; the input is revealed in a sequence of steps. At each step the algorithm must make its decisions based on the past and without any knowledge of the future. Many important real-life problems such as paging and routing are intrinsically online, and thus the design and analysis of online algorithms is one of the main research areas in theoretical computer science. Competitive analysis is the standard measure for the analysis of online algorithms. It has been applied to many online problems in diverse areas ranging from robot navigation, to network routing, to scheduling, to online graph coloring. While in several instances competitive analysis gives satisfactory results, for certain problems it results in unrealistically pessimistic ratios and/or fails to distinguish between algorithms that have vastly differing performance under any practical characterization. Addressing these shortcomings has been the subject of intense research by many of the best minds in the field. In this thesis, building upon recent advances of others, we introduce some new models for the analysis of online algorithms, namely Bijective Analysis, Average Analysis, Parameterized Analysis, and Relative Interval Analysis. We show that they lead to good results when applied to paging and list update algorithms. Paging and list update are two well-known online problems. Paging is one of the main examples of the poor behavior of competitive analysis. We show that LRU is the unique optimal online paging algorithm according to Average Analysis on sequences with locality of reference. Recall that in practice input sequences for paging have high locality of reference; it has long been established empirically that LRU is the best paging algorithm.
Yet, Average Analysis is the first model that gives a strict separation of LRU from all other online paging algorithms, thus solving a long-standing open problem. We prove a similar result for the optimality of MTF for list update on sequences with locality of reference. A technique for the analysis of online algorithms has to be effective to be useful in day-to-day analysis of algorithms. While Bijective and Average Analysis succeed at providing fine separation, their application can at times be cumbersome. Thus we apply a parameterized or adaptive analysis framework to online algorithms. We show that this framework is effective, can be applied more easily to a larger family of problems, and leads to finer analysis than the competitive ratio. The conceptual innovation of parameterizing the performance of an algorithm by something other than the input size was first introduced over three decades ago [124, 125]. By now it has been extensively studied and understood in the context of adaptive analysis (for problems in P) and parameterized algorithms (for NP-hard problems), yet to our knowledge this thesis is the first systematic application of this technique to the study of online algorithms. Interestingly, competitive analysis can be recast as a particular form of parameterized analysis in which the performance of OPT is the parameter. In general, for each problem we can choose the parameter/measure that best reflects the difficulty of the input. We show that in many instances the performance of OPT on a sequence is a coarse approximation of the difficulty or complexity of a given input sequence. Using a finer, more natural measure we can separate paging and list update algorithms which were otherwise indistinguishable under the classical model. This creates a performance hierarchy of algorithms which better reflects the intuitive relative strengths between them.
Lastly, we show that, surprisingly, certain randomized algorithms that are superior to MTF in the classical model are not so in the parameterized case, which matches experimental results. We test list update algorithms in the context of a data compression problem known to have locality of reference. Our experiments show that MTF outperforms the other list update algorithms in practice after the Burrows-Wheeler transform (BWT), which is consistent with the intuition that BWT increases locality of reference.
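The list update problem and the MTF (move-to-front) algorithm discussed above can be sketched in a few lines; the request sequence below is an invented example with locality of reference:

```python
def mtf_access_cost(requests, initial):
    # Move-To-Front: serve each request at a cost equal to the item's
    # 1-based position in the list, then move that item to the front.
    lst = list(initial)
    total = 0
    for item in requests:
        total += lst.index(item) + 1  # access cost = current position
        lst.remove(item)
        lst.insert(0, item)           # move the accessed item to the front
    return total

# A request sequence with locality of reference favors MTF.
cost = mtf_access_cost(["c", "c", "c", "a", "a"], ["a", "b", "c"])
```

On this sequence MTF pays 3 + 1 + 1 + 2 + 1 = 8, whereas a static list kept in the order a, b, c would pay 11; the gap grows with the locality of the sequence, which is the intuition behind the thesis's locality-sensitive analyses.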
59

Performance Enhancement of the Erasure-Coded Storage Systems in Cloud Using the ECL-based Technique

Zhu, Jia-Zheng 16 November 2012 (has links)
Though erasure codes are widely adopted in highly fault-tolerant storage systems, they suffer from a serious small-write problem. Many algorithms have been proposed to improve small-write performance in RAID systems without considering network bandwidth usage; however, network bandwidth is expensive in cloud systems. In this thesis, we propose an ECL-based (E-MBR codes, Caching, and Logging-based) technique to improve small-write performance without using extra network bandwidth. In addition, the ECL-based technique reduces delayed-parity-update and data-recovery latency compared with the competing algorithm.
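The small-write problem the abstract refers to stems from parity maintenance. With simple XOR parity (a RAID-5-style illustration unrelated to the thesis's E-MBR codes), updating one data block forces a read-modify-write of the parity as well, costing extra I/Os and, in a distributed setting, extra network traffic:

```python
def xor(a, b):
    # Byte-wise XOR of two equal-length blocks.
    return bytes(x ^ y for x, y in zip(a, b))

# A stripe with two data blocks and one XOR parity block.
d0 = bytes([1, 2, 3, 4])
d1 = bytes([9, 8, 7, 6])
parity = xor(d0, d1)

# Small write: update d0 only. The read-modify-write path must first read
# the old data and old parity before writing the new versions, so a
# one-block logical write turns into multiple physical I/Os.
new_d0 = bytes([5, 5, 5, 5])
new_parity = xor(xor(parity, d0), new_d0)  # parity ^ old_data ^ new_data

# Fault tolerance still holds: if d1's disk is lost, rebuild it from parity.
recovered_d1 = xor(new_parity, new_d0)
```

Caching and logging techniques such as the one proposed here attack this cost by deferring or batching the parity updates rather than paying the read-modify-write on every small write.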
60

A Risk-sensitive Approach For Airline Network Revenue Management Problems

Cetiner, Demet 01 September 2007 (has links) (PDF)
In this thesis, the airline network revenue management problem is considered for the case with no cancellations or overbooking. The literature contains several approximate probabilistic and deterministic mathematical models developed to maximize expected revenue at the end of the reservation period. The aim of this study is to develop models that also consider the risks involved in the proposed booking control policies. Two linear programming models are proposed that incorporate the variance of the revenue; their objective is to effectively balance the trade-off between the expectation and the variance of the revenue. The performance of the proposed models is compared to that of previous models in a numerical study: the seat allocations resulting from the mathematical models are used in a simulation model working with several booking control policies, the probability distributions of the revenues are investigated, and the revenues are compared in terms of expectation, standard deviation, coefficient of variation, and probability of poor performance. It is observed that the proposed models decrease the variability of the revenue and thereby the probability of poor performance. Moreover, the expected revenues obtained by implementing the solutions of the proposed models with nested booking control policies turn out to be higher than those of the other probabilistic models, as long as the degree of variance incorporation stays within a certain interval. Compared with the deterministic models, the proposed models provide decision makers with alternative, preferable policies in terms of both expectation and variability measures.
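The expectation-variance trade-off at the heart of the abstract can be illustrated with a deliberately tiny, hypothetical single-leg example (two fares, three equally likely demand scenarios, none of it taken from the thesis): enumerate protection levels for the high fare and score each by mean revenue minus a risk weight times variance.

```python
# Hypothetical single-leg, two-fare example (not the thesis's actual LP).
CAPACITY = 10
FARES = {"high": 300, "low": 100}
# Equally likely demand scenarios: (high-fare demand, low-fare demand).
SCENARIOS = [(2, 12), (5, 8), (8, 4)]

def revenue(protect, d_high, d_low):
    # Protect `protect` seats for the high fare; low fare books first
    # into the unprotected seats, high fare fills what remains.
    low_sold = min(d_low, CAPACITY - protect)
    high_sold = min(d_high, CAPACITY - low_sold)
    return FARES["high"] * high_sold + FARES["low"] * low_sold

def score(protect, risk_weight):
    # Mean-variance objective: E[revenue] - risk_weight * Var[revenue].
    revs = [revenue(protect, dh, dl) for dh, dl in SCENARIOS]
    mean = sum(revs) / len(revs)
    var = sum((r - mean) ** 2 for r in revs) / len(revs)
    return mean - risk_weight * var

best_risk_neutral = max(range(CAPACITY + 1), key=lambda p: score(p, 0.0))
best_risk_averse = max(range(CAPACITY + 1), key=lambda p: score(p, 0.01))
```

Even in this toy setting the risk-averse policy differs from the risk-neutral one: accepting a slightly lower expected revenue buys a noticeably less variable revenue distribution, which is exactly the trade-off the proposed linear programming models are designed to balance.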
