51 |
Polidez e inclusão: o ser e o parecer no discurso de professores sobre a inclusão da pessoa com deficiência na escola. / POLITENESS AND INCLUSION: "being" and "seeming" in the discourse of teachers on the inclusion of people with disabilities in school. Santos, Jorge Henrique Vieira 18 June 2012 (has links)
Linguistic politeness can be understood as the result of the human need to maintain the balance of interpersonal relationships. Speakers employ politeness strategies in their verbal interactions in order to keep them free of potential conflicts. Such strategies can be found in the discourse of teachers on the inclusion of people with disabilities in the school environment. It is believed that these teachers make use of politeness not only to maintain the harmony of interpersonal relationships, but mainly to project, preserve and confirm images of themselves and of the group to which they belong, in accordance with what is socially coded as politically correct. The objective is to investigate the linguistic politeness present in the discourse of these teachers, discussing the reasons underlying its use and its implications for the effective school inclusion of people with disabilities, based on the following questions: a) What politeness strategies can be identified in teachers' discourse on the inclusion of people with disabilities in regular classrooms? b) What reasons lead teachers to employ politeness strategies in their discourse? c) What effects can this polite discourse produce on the process of including people with disabilities in the school environment? The research was based on concepts provided by pragmatic theories of politeness, starting from the model proposed by Brown and Levinson (1987 [1978]) as reformulated and refined by Kerbrat-Orecchioni (2004, 2006), together with other reflections provided mainly by the studies of Rodriguez (2010), Goffman (2008) and Bravo (2000). These concepts were articulated with the issue of disability, which is presented and discussed on the basis of contributions from several authors, among them Pessotti (1984), Pereira (2006), Diniz (2010), Mazzotta (2005), Matos (2007) and Souza (2009). The corpus of this research was formed from data generated in a focused discussion on the inclusion of people with disabilities in school, conducted in a Focus Group of elementary school teachers from the state school system of Sergipe. Although the Focus Group was the main source, the data were also triangulated with a diagnostic questionnaire, individual interviews and participant observation. The data analysis shows that the politeness found in teachers' discourse on the inclusion of people with disabilities in the school environment serves not only to disguise the stigma that weighs on the image of people with disabilities, but especially to project, preserve and confirm the teachers' images of themselves and of the group to which they belong. These images result from projections of society's ideological values onto the individual, imposed on him or her in the form of a system of appearances. Thus, the teachers' words and actions form part of a performance that individuals carry out in order to inscribe themselves in a group and avoid being classified as strange or as deviating from what is considered appropriate or proper. This game of appearances masks attitudinal barriers and procedures detrimental to the effective inclusion of people with disabilities in school and in society. / A polidez linguística pode ser entendida como fruto da necessidade do homem de manter o equilíbrio de suas relações interpessoais. Os falantes empregam estratégias de polidez em suas interações verbais, com o propósito de mantê-las livres de possíveis conflitos. Tais estratégias
podem ser verificadas no discurso de professores sobre a inclusão de pessoas com deficiência no ambiente escolar. Acredita-se que esses professores façam uso da polidez não só para manter a harmonia das relações interpessoais, mas, principalmente, para projetar, preservar e confirmar imagens de si e do grupo ao qual pertencem, em conformidade com o que é codificado socialmente como politicamente correto. Objetiva-se investigar a polidez linguística presente no discurso desses professores, discutindo as razões que subjazem ao seu uso e suas implicações para o efetivo processo de inclusão escolar da pessoa com deficiência, a partir dos seguintes questionamentos: a) Quais estratégias de polidez podem ser verificadas no discurso dos professores sobre a inclusão da pessoa com deficiência em salas regulares? b) Que motivos levam os professores a empregarem estratégias de polidez em seu discurso? c)
Que efeitos esse discurso polido pode produzir para o processo de inclusão da pessoa com deficiência no ambiente escolar? A pesquisa fundamentou-se nos conceitos fornecidos pelas teorias pragmáticas, no que concerne à polidez, a partir do modelo proposto por Brown e Levinson (1987 [1978]), segundo as reformulações e os aperfeiçoamentos feitos por Kerbrat-
Orecchioni (2004; 2006), associados a outras reflexões fornecidas, sobretudo, pelos estudos de Rodriguez (2010), Goffman (2008) e Bravo (2000). Articularam-se tais conceitos à questão da deficiência, que é apresentada e discutida a partir das contribuições de diversos autores, entre os quais Pessotti (1984), Pereira (2006), Diniz (2010), Mazzotta (2005), Matos (2007) e Souza (2009). Constituiu-se o corpus desta pesquisa a partir dos dados gerados de uma discussão focalizada sobre o tema inclusão da pessoa com deficiência na escola, desencadeada em um Grupo Focal de professores do Ensino Fundamental da rede estadual de ensino de Sergipe. Embora o Grupo Focal tenha sido a principal fonte, houve ainda a triangulação dos dados gerados com questionário diagnóstico, entrevistas individuais e observação participante. A análise dos dados realizada demonstra que a polidez que se verifica no discurso dos professores sobre a inclusão da pessoa com deficiência no ambiente escolar serve à dissimulação do estigma que pesa sobre a imagem da pessoa com deficiência e à projeção, preservação e confirmação de imagens de si, dos professores e do grupo a que pertencem. Tais imagens decorrem de projeções dos valores da ideologia da sociedade sobre o indivíduo, que se impõe a este sob a forma de um sistema de aparências. Assim, atos e palavras dos professores integram uma representação que os indivíduos cumprem, a fim de se inscreverem
num grupo e de não se deixarem classificar como estranhos ou desviantes do que é considerado apropriado ou adequado. Esse jogo de aparências mascara barreiras atitudinais e procedimentos negativos à inclusão efetiva da pessoa com deficiência na escola e na sociedade.
|
52 |
On Learning k-Parities and the Complexity of k-Vector-SUM Gadekar, Ameet January 2016 (has links) (PDF)
In this work, we study two problems: the first, learning sparse parities, is one of the central problems in learning theory, and the second, k-Vector-SUM, is an extension of the notorious k-SUM problem. We first consider the problem of learning k-parities in the on-line mistake-bound model: given a hidden vector x ∈ {0,1}^n with |x| = k and a sequence of "questions" a_1, a_2, ... ∈ {0,1}^n, where the algorithm must reply to each question with ⟨a_i, x⟩ (mod 2), what is the best trade-off between the number of mistakes made by the algorithm and its time complexity? We improve the previous best result of Buhrman et al. by an exp(k) factor in the time complexity. Next, we consider the problem of learning k-parities in the presence of classification noise of rate η ∈ (0, 1/2). A polynomial-time algorithm for this problem (when η > 0 and k = ω(1)) is a longstanding challenge in learning theory. Grigorescu et al. showed an algorithm running in time (n choose k/2)^{1 + 4η^2 + o(1)}. Note that this algorithm inherently requires time (n choose k/2) even when the noise rate η is polynomially small. We observe that for a sufficiently small noise rate, it is possible to break the (n choose k/2) barrier. In particular, if for some function f(n) = ω(log n) and α ∈ [1/2, 1) we have k = n/f(n) and η = o(f(n)^{−α} / log n), then there is an algorithm for the problem with running time poly(n) · (n choose k)^{1−α} · e^{−k/4.01}. Moving on to the k-Vector-SUM problem: given n vectors ⟨v_1, v_2, ..., v_n⟩ over the vector space F_q^d, a target vector t and an integer k > 1, determine whether there exist k vectors whose sum equals t, where the sum is taken over the field F_q. We show a parameterized reduction from the k-Clique problem to the k-Vector-SUM problem, thus showing the hardness of k-Vector-SUM. In terms of parameterized complexity theory, our reduction shows that the k-Vector-SUM problem is hard for the class W[1], although Downey and Fellows have shown the W[1]-hardness of k-Vector-SUM using other techniques. In our next attempt, we try to show connections between k-Vector-SUM and k-LPN. First we prove that a variant of the k-Vector-SUM problem, called k-Noisy-SUM, is at least as hard as the k-LPN problem. This implies that any improvement for k-Noisy-SUM would result in improved algorithms for k-LPN. In our next result, we show a reverse reduction from k-Vector-SUM to k-LPN with high noise rate. Providing lower bounds for the k-LPN problem is an open challenge, and many algorithms in cryptography have been developed assuming its hardness. Our reduction shows that k-LPN with noise rate η = 1/2 − (1/2) · n^k · 2^{−n(k−1)/k} cannot be solved in time n^{o(k)} assuming the Exponential Time Hypothesis, thus providing a lower bound for a weaker version of k-LPN.
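To make the k-Vector-SUM statement above concrete, the following minimal sketch (not taken from the thesis; the function name and the toy instance are illustrative assumptions) implements the trivial exhaustive-search baseline whose exponent the hardness results concern:

```python
from itertools import combinations

def k_vector_sum_naive(vectors, target, k, q):
    """Exhaustively check whether some k of the given vectors over F_q^d
    sum, coordinate-wise modulo q, to the target vector.
    This is the trivial O(C(n, k) * k * d) baseline for k-Vector-SUM."""
    d = len(target)
    goal = [c % q for c in target]
    for combo in combinations(vectors, k):
        if [sum(v[i] for v in combo) % q for i in range(d)] == goal:
            return True
    return False

# Toy instance over F_2^3: do two of the vectors sum to (1, 1, 0)?
vecs = [(1, 0, 0), (0, 1, 0), (1, 1, 1), (0, 0, 1)]
print(k_vector_sum_naive(vecs, (1, 1, 0), k=2, q=2))  # True: (1,0,0) + (0,1,0)
```

In standard parameterized-complexity terms, the W[1]-hardness mentioned above means that, unless FPT = W[1], no algorithm can replace this kind of search by one running in time f(k)·poly(n) for any computable function f.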
|
53 |
Diagnóstico de influência local no modelo de calibração ultraestrutural com réplicas / Local influence diagnostics in ultrastructural calibration model with replicas Bruno Pinheiro de Andrade 09 December 2016 (has links)
Este trabalho tem como proposta apresentar a metodologia de diagnóstico de influência local de Cook (1986) conjuntamente com a metodologia de seleção da perturbação adequada proposta por Zhu et al. (2007) no modelo de calibração ultraestrutural com réplicas. A metodologia de Cook (1986) será utilizada para investigar a robustez e a sensibilidade do modelo, onde os esquemas de perturbação adotados foram ponderação de casos e na variável resposta. Perturbar o modelo e/ou os dados de forma arbitrária pode conduzir a interpretações sobre a análise de diagnóstico e a conclusões equivocadas. Portanto, este trabalho irá avaliar as perturbações propostas segundo a metodologia de Zhu et al. (2007) e caso as perturbações não sejam adequadas, iremos propor uma nova forma de fazer as perturbações. Foi utilizado como aplicação a análise de um conjunto de dados com réplicas balanceadas e foram avaliadas quais patamares e laboratórios exercem um efeito desproporcional nas inferências feitas sob o modelo. / This work aims to present the local influence diagnostic methodology of Cook (1986) together with the methodology for selecting an appropriate perturbation proposed by Zhu et al. (2007) in the ultrastructural calibration model with replicas. The methodology of Cook (1986) will be used to investigate the robustness and sensitivity of the model, where the adopted perturbation schemes are case weighting and perturbation of the response variable. Perturbing the model and/or the data in an arbitrary way can lead to misinterpretations of the diagnostic analysis and to wrong conclusions. Therefore, this study will evaluate the proposed perturbations according to the methodology of Zhu et al. (2007) and, if the perturbations are not suitable, we will propose a new way of perturbing the model or the data. As an application, a data set with balanced replicas was analysed to evaluate which levels and laboratories exert a disproportionate effect on the inferences made under the model.
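For context, a brief summary of the two frameworks combined in this work (the formulas below follow the usual textbook presentation, not text from the dissertation): Cook's (1986) approach introduces a perturbation vector ω into the likelihood, compares the perturbed and unperturbed fits through the likelihood displacement, and measures how quickly that displacement grows in a direction ℓ via the normal curvature,

$$ LD(\omega) = 2\left[\,L(\hat\theta) - L(\hat\theta_\omega)\,\right], \qquad C_\ell = 2\,\bigl|\,\ell^{\top}\Delta^{\top}\ddot{L}^{-1}\Delta\,\ell\,\bigr|, $$

where \ddot{L} is the Hessian of the log-likelihood at \hat\theta and \Delta = \partial^{2}L(\theta\mid\omega)/\partial\theta\,\partial\omega^{\top} is evaluated at (\hat\theta, \omega_0); the eigenvector associated with the largest curvature points to the most influential observations. Zhu et al. (2007) supply the complementary criterion used here: a perturbation scheme is regarded as appropriate when its Fisher information matrix with respect to ω, evaluated at the null perturbation ω_0, is proportional to the identity, and an inappropriate scheme can be rescaled so that this condition holds.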
|
54 |
Identifiability in Knowledge Space Theory: a survey of recent results Doignon, Jean-Paul 28 May 2013 (has links)
Knowledge Space Theory (KST) links in several ways to Formal Concept Analysis (FCA). Recently, the probabilistic and statistical aspects of KST have been further developed by several authors. We review part of the recent results, and describe some of the open problems. The question of whether the outcomes can be useful in FCA remains to be investigated.
|
55 |
Electoral reform: why care? Opinion formation and vote choice in six referendums on electoral reform Reimink, Elwin 26 May 2015 (has links)
This PhD thesis explores the question of how citizens react when they are confronted with complex institutional questions related to politics. Specifically, we look at how citizens vote when they are asked for their opinion in a referendum on amending the electoral system of their country. Traditionally, electoral systems have been considered the political playing ground of political elites. It is hence interesting to see what happens when the ‘power of decision’ shifts to citizens, who are supposed to have little interest in, or knowledge about, electoral systems. We observe that citizens partially mimic political elites in their behaviour by following partisan considerations: citizens judge electoral reforms by the consequences for their favoured parties. Moreover, citizens tend to incorporate values when judging electoral reforms: a particular effect is caused by the left-right distinction, with left-wing voters being more attracted to more proportional systems. Finally, we observe that how citizens react to electoral systems is affected by their baseline knowledge of politics. More knowledgeable citizens tend to judge on substantive grounds, while less knowledgeable citizens tend to judge on miscellaneous grounds. We conclude by arguing that citizens can and do form substantive opinions on complex subjects like institutional reforms, but that some baseline knowledge is nonetheless required in order to participate substantially in the democratic decision-making process. / Doctorat en Sciences politiques et sociales / info:eu-repo/semantics/nonPublished
|
56 |
Climate Change Effects on Rainfall Intensity-Duration-Frequency (IDF) Curves for the Town of Willoughby (HUC-12) Watershed Using Various Climate Models Mainali, Samir 18 July 2023 (has links)
No description available.
|
57 |
The effects of an extended prompt versus a typical prompt on the length and quality of first draft essays written by secondary students with mild disabilities Hessler, Theresa L. 24 August 2005 (has links)
No description available.
|
58 |
Cvičení v přírodě jako prostředek vyrovnávání svalových dysbalancí / Exercise in nature as a means of correcting muscle imbalances Buršíková, Kateřina January 2013 (has links)
Title: Outdoor exercising as an instrument for reducing muscle imbalance. Summary: The main topic of this thesis is muscle imbalance in pre-school children. The theoretical part deals with the importance of exercise for pre-school children and examines postural functions, together with the respiratory system, using the Matthias test. It then maps the physical activities that can help to eliminate muscle imbalances. The practical part analyses the children's locomotor system and lung vital capacity. It further evaluates muscle imbalances and the children's respiratory system in relation to outdoor games and activities, and considers the influence of outdoor exercising compared with indoor activities in nursery schools, as well as the possible influence of outdoor activities outside the nursery school on the optimal posture of pre-school children. The main goal of this thesis is to find out whether movement activities and outdoor exercising influence posture positively and prevent muscle imbalance in pre-school children. Key words: muscle imbalance, pre-school, correct and faulty posture, Matthias test, outdoor exercising, indoor activities in nursery school.
|
59 |
Levando o direito a sério: uma exploração hermenêutica do protagonismo judicial no processo jurisdicional brasileiro Motta, Francisco José Borges 10 July 2009 (has links)
As reflexões alinhadas no presente trabalho poderiam ser resumidas na seguinte pergunta: o que a teoria do Direito de Ronald Dworkin, filtrada pela Crítica Hermenêutica do Direito de Lenio Streck, tem a dizer sobre o processo jurisdicional (civil) que se pratica no Brasil? Dworkin desenvolveu a noção, por todos conhecida, de que há, nos quadros de um Direito democraticamente produzido, uma "única resposta correta" (the one right answer) para cada um dos casos que o interpelam. Movia-lhe, desde o início, o propósito de identificar os direitos (principalmente, os individuais) que as pessoas realmente têm num ambiente democrático, e o entendimento de que o "tribunal" deveria torná-los, o quanto possível, acessíveis aos seus titulares. Agora, dezenas de anos depois, e no Brasil, vem Lenio Luiz Streck e afirma serem não só possíveis, mas também necessárias as tais "respostas corretas" em Direito. Esta pesquisa propõe-se a compreender melhor estas mensagens (tanto a de Dworkin como a de Streck) e a, com elas, conduzir uma reflexão sobre o processo jurisdicional brasileiro, que deverá ser redefinido a partir da necessidade de proporcionar a produção das tais "respostas corretas" em Direito. Trabalhar com "respostas corretas em Direito" implica reconhecer o acentuado grau de autonomia por este atingido, desde a assunção de um perfil não-autoritário (neoconstitucionalismo). Implica, portanto, entre outras coisas, reconhecer que o Direito é (bem) mais do que aquilo que os juízes dizem que ele é. As boas respostas são do Direito, compreendido como integridade, e não do juiz, individualmente considerado. De modo que uma compreensão hermenêutica do processo civil brasileiro, comprometida com estas noções todas, deverá dar conta de quebrar o "dogma" do protagonismo judicial (movimento de expansão dos poderes e liberdades do juiz na condução e solução das causas que chegam ao "tribunal"). Levar o Direito a sério, pois, é dissolver, no paradigma hermenêutico, a subjetividade do julgador em meio à intersubjetividade que é própria de um Estado Democrático. No âmbito do processo, levar o Direito a sério determina o compartilhamento decisório entre os sujeitos processuais, que deverão argumentar em favor de direitos, e em prol da construção da teoria que melhor justifique, principiologicamente, o Direito como um todo. O mínimo que se exige para que esse ideal seja atingido é a garantia de que o procedimento seja desenvolvido em efetivo contraditório, de modo que os argumentos das partes sejam decisivos para a construção da decisão judicial (o que se verificará, substantivamente, desde a exigência de uma fundamentação "completa" do provimento jurisdicional, que abranja não só os argumentos vencedores, mas também as razões pelas quais foram rejeitados os argumentos em sentido contrário). Por fim, o ato sentencial, para que reflita uma "resposta correta", deverá espelhar um entendimento compartilhado não só entre os sujeitos processuais, mas também com os juízes do passado (história jurídico institucional exitosa). / The reflections aligned in this work could be summarized in the following question: what does Ronald Dworkin's theory of law, filtered by Lenio Streck's Hermeneutical Critique of Law, have to say about the (civil) court process practiced in Brazil?
Dworkin has developed the concept, known by all, that there is, within the framework of a democratically produced Law, a "single correct answer" (the one right answer) for each of the cases that reach the forum. He was moved, from the beginning, by the purpose of identifying the rights (especially the individual ones) that people really have in a democratic environment, and by the understanding that the "court" should make these rights, as much as possible, accessible to their holders. Now, dozens of years later, in Brazil, comes Lenio Luiz Streck, who says that such "right answers" are not only possible, but also necessary in Law. This research aims to better understand these messages (both Dworkin's and Streck's) and, with them, to lead a discussion about the Brazilian judicial process, which should be redefined in light of the need to enable the production of such "right answers" in Law. Working with "right answers in Law" means recognizing the strong degree of autonomy that Law has achieved since the assumption of a non-authoritarian profile (neoconstitutionalism). It therefore implies, amongst other things, recognizing that Law is (far) more than what judges say it is. Good answers are the ones given by the Law, understood as integrity, not by the judge, individually considered. Hence a hermeneutic understanding of Brazilian procedural law, committed to all these concepts, should be able to break the "dogma" of judicial protagonism (the movement that expands the powers and freedoms of the judge in the conduction and settlement of the cases that come to the "court"). Therefore, in the hermeneutic paradigm, to take Law seriously is to dissolve the subjectivity of the judge in the intersubjectivity that characterizes the Democratic State. Within the process, taking Law seriously implies sharing the decision among the procedural actors, who should argue in favor of rights and for the construction of the theory that best justifies, by principles, Law as a whole. The minimum required for this ideal to be achieved is the guarantee that the procedure is developed under the effective adversarial principle (contraditório), so that the parties' arguments are decisive for the construction of the court's decision (which should be verified, substantively, through the requirement of a "complete" reasoning of the judicial decision, covering not only the winning arguments but also the reasons why the arguments in the opposite direction were rejected). Finally, the final decision, in order to reflect a "correct answer", should mirror a shared understanding not only amongst the procedural actors, but also between them and the judges of the past (a successful juridical and institutional history).
|
60 |
Automatic generation of proof terms in dependently typed programming languages Slama, Franck January 2018 (has links)
Dependent type theories are a kind of mathematical foundation investigated both for the formalisation of mathematics and for reasoning about programs. They are implemented as the kernel of many proof assistants and programming languages with proofs (Coq, Agda, Idris, Dedukti, Matita, etc.). Dependent types make it possible to encode elegantly and constructively the universal and existential quantifications of higher-order logics and are therefore well suited to writing logical propositions and proofs. However, their usage is not limited to the area of pure logic. Indeed, some recent work has shown that they can also be powerful for driving the construction of programs. Using more precise types not only helps to gain confidence about the program being built; it can also guide its construction, giving rise to a new style of programming called Type-Driven Development. However, one difficulty with reasoning and programming with dependent types is that proof obligations arise naturally once programs become even moderately sized. For example, implementing an adder for binary numbers indexed over their natural-number equivalents naturally leads to proof obligations for equalities of expressions over natural numbers. The need for these equality proofs comes, in intensional type theories (like CIC and ML), from the fact that in a non-empty context the propositional equality allows us to prove as equal (with the induction principles) terms that are not judgementally equal, which implies that the typechecker can't always obtain equality proofs by reduction. As far as possible, we would like to solve such proof obligations automatically, and we absolutely need to do so if we want dependent types to be used more broadly, and perhaps one day to become the standard in functional programming. In this thesis, we show one way to automate these proofs by reflection in the dependently typed programming language Idris. However, the method that we follow is independent of the language being used, and this work could be reproduced in any dependently typed language. We present an original type-safe reflection mechanism, where reflected terms are indexed by the original Idris expression that they represent, and show how it allows us to easily construct and manipulate proofs. We build a hierarchy of correct-by-construction tactics for proving equivalences in semi-groups, monoids, commutative monoids, groups, commutative groups, semi-rings and rings. We also show how each tactic reuses those from simpler structures, thus avoiding duplication of code and proofs. Finally, and as a conclusion, we discuss the trust we can have in such machine-checked proofs.
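As a generic illustration of the proof obligations mentioned above (a standard example, not necessarily the one arising in the thesis): with natural-number addition defined by recursion on its first argument (0 + m = m and S n + m = S (n + m)), the equality

$$ \forall\, n\, m : \mathbb{N}.\qquad n + \mathrm{S}\,m \;=\; \mathrm{S}\,(n + m) $$

holds propositionally (it is provable by induction on n) but is not a definitional equality, so the typechecker cannot discharge it by reduction alone. Indexing a binary adder by the natural numbers its arguments denote produces goals of exactly this shape, which is what the reflection-based, correct-by-construction tactics developed in the thesis aim to solve automatically.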
|