241

An LDPC error control strategy for low earth orbit satellite communication link applications

Olivier, Francois Jacobus
Thesis (MScEng (Electrical and Electronic Engineering))--University of Stellenbosch, 2009. / ENGLISH ABSTRACT: Low earth orbit (LEO) satellite communication presents a unique environment which inherently differs from most other communication channels. Due to the varying orbital patterns of LEO satellites, the link exhibits varying link margins. Limited communication time windows must be optimised to maximise the volumetric data throughput. Large coding gains can be obtained by implementing forward error correction codes. This thesis presents a means of optimising the data throughput of LEO satellite communication through the implementation of a mission-specific error control strategy. Low-density parity-check (LDPC) codes are versatile and give good error performance at many different code rates and block lengths. With power limitations on the space segment and remote ground stations, hardware utilisation efficiency must be optimised to reduce power consumption. In response to this requirement, this thesis evaluates various algorithms for LDPC decoders. An iterative LDPC decoder, implementing an approximation algorithm, is presented as a low-complexity solution with good error performance. The proposed solution provides a very good balance between required hardware complexity and coding performance. It was found that many parameters of the decoders and codes can be altered to allow the implementation of these codes in systems with varying memory and processing capabilities.
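The abstract does not name the approximation algorithm used in the decoder; a standard low-complexity approximation for iterative LDPC decoding is min-sum, in which each check node replaces the sum-product update with a sign product and a minimum magnitude. A minimal, purely illustrative sketch in Python (the function name and example inputs are ours, not the thesis'):

    import numpy as np

    def min_sum_check_update(llrs):
        # One check-node update under the min-sum approximation:
        # each outgoing message takes the product of the signs and the
        # minimum magnitude of all *other* incoming log-likelihood ratios.
        signs = np.sign(llrs)
        mags = np.abs(llrs)
        total_sign = np.prod(signs)
        out = np.empty_like(llrs, dtype=float)
        for i in range(len(llrs)):
            out[i] = total_sign * signs[i] * np.delete(mags, i).min()
        return out

    # Example: messages from three variable nodes into one check node.
    print(min_sum_check_update(np.array([2.0, -0.5, 1.5])))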
242

Implementation of a protocol and channel coding strategy for use in ground-satellite applications

Wiid, Riaan
Thesis (MScEng)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: A collaboration between the Katholieke Universiteit van Leuven (KUL) and Stellenbosch University (SU) resulted in the development of a satellite-based platform for use in agricultural sensing applications. This will primarily serve as a test platform for a digitally beam-steerable antenna array (SAA) that was developed by KUL. SU developed all flight- and ground-station-based hardware and software, enabling ground-to-flight communications and interfacing with the KUL SAA. Although most components had already been completed at the start of this MScEng project, final systems integration was still unfinished, and the modules necessary for communication were also outstanding. This project implemented an automatic repeat request (ARQ) strategy for reliable file transfer across the wireless link. Channel coding has also been implemented on a field programmable gate array (FPGA). This layer includes an advanced forward error correction (FEC) scheme, namely a low-density parity-check (LDPC) code, which outperforms traditional FEC techniques. A flexible architecture for channel coding has been designed that allows speed and complexity trade-offs on the FPGA. All components have been successfully implemented, tested and integrated. Simulations of LDPC on the FPGA have been shown to provide excellent error-correcting performance. The prototype has been completed and was recently demonstrated successfully at KUL, during which data was reliably transferred between the satellite platform and a ground station.
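The abstract does not specify the ARQ variant; as a rough illustration of the underlying idea (retransmit each frame until it is acknowledged), here is a stop-and-wait sketch in Python, with a hypothetical lossy channel and invented parameters:

    import random

    def unreliable_send(frame, loss_prob=0.3):
        # Hypothetical lossy link: returns an ACK only if the frame survives.
        return None if random.random() < loss_prob else ("ACK", frame[0])

    def stop_and_wait(frames, max_retries=10):
        # Send each numbered frame, retransmitting until its ACK arrives.
        for seq, payload in enumerate(frames):
            for attempt in range(max_retries):
                ack = unreliable_send((seq, payload))
                if ack == ("ACK", seq):
                    break
            else:
                raise RuntimeError(f"frame {seq} lost after {max_retries} tries")

    stop_and_wait([b"chunk0", b"chunk1", b"chunk2"])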
243

外匯市場的技術分析與央行干預 / Technical trading rules in the exchange rate markets and central bank intervention

吳至剛
In this paper we construct a large universe of simple trading rules and apply White's Reality Check to mitigate data-snooping bias, then test the profitability of technical trading rules. Over the period 1980 to 2008, technical analysis proves useful to investors in the Japanese yen/US dollar and British pound/US dollar exchange markets, in both the full sample and each subsample period. The channel break-out method outperforms the other methods, while the moving average, the method most commonly used in exchange markets, performs worse than the others. Furthermore, following Szacmary and Mathur (1997), we extend the set of trading rules to all the rules used in the Reality Check and lengthen the analysis period as far as possible, in order to inspect the relationship between the returns of technical trading rules and central bank intervention. The results suggest no evident relationship between the return series of the trading rules and central bank intervention, which is not consistent with the view of the previous study we follow.
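For concreteness, the two rule families compared in the paper can be sketched as follows; the window lengths are arbitrary examples, not the parameters from the study:

    import numpy as np

    def moving_average_signal(prices, short=5, long=20):
        # Long (+1) when the short moving average is above the long one.
        s = np.convolve(prices, np.ones(short) / short, mode="valid")
        l = np.convolve(prices, np.ones(long) / long, mode="valid")
        return np.where(s[-len(l):] > l, 1, -1)

    def channel_breakout_signal(prices, window=20):
        # Long on a close above the prior `window`-day high,
        # short on a close below the prior `window`-day low,
        # otherwise hold the previous position.
        sig = np.zeros(len(prices), dtype=int)
        for t in range(window, len(prices)):
            hi, lo = prices[t-window:t].max(), prices[t-window:t].min()
            sig[t] = 1 if prices[t] > hi else (-1 if prices[t] < lo else sig[t-1])
        return sig

    # Toy random-walk price series for demonstration.
    prices = np.cumsum(np.random.default_rng(1).normal(size=100)) + 100
    print(moving_average_signal(prices)[-5:])
    print(channel_breakout_signal(prices)[-5:])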
244

Système de listes de vérification interactives du niveau de conformité des maquettes avec les recommandations des fabricants de plateformes mobiles / A system of interactive checklists for the conformity of mockups with the recommendations of mobile platform manufacturers

Ghezzal, Nadir
All of this work was carried out using free software. / Demand for mobile platform applications has exploded in recent years. Every company now wants to offer, on these new devices, the same services it has traditionally offered on the internet. These companies, however, often have little or no knowledge of how to develop and deploy such services, and the usual solution is to outsource the work to another company. In this context it is often difficult to ensure, on the one hand, that the subcontractor respects the specifications and, on the other, that the delivered work complies with the recommendations issued by the mobile platform manufacturers. To address the second problem, we created a system of interactive checklists for mobile platforms. The system evaluates how well the interface components of an application developed for a mobile platform conform to the recommendations of that platform's manufacturer. The chosen approach lets the evaluator concentrate on particular interface elements, overcoming the limits of classical checklists, which are often too long and impractical, and it also makes it easy to compare the characteristics of mobile platforms with one another. To build the system, we consulted numerous documents on mobile platforms in order to understand their specificities. After reviewing several platforms, two were selected, iOS and BlackBerry (the company where the research took place already had applications for both). We then analysed the technical documentation provided by each manufacturer to extract the important points. So that the data would be comparable, we created a common nomenclature grouping graphical interface components into broad families (e.g. bars, data input, ...) while also taking into account the type of interaction with the device (e.g. touch screen). We then designed a solution for evaluating an application's level of conformity, based on checklists so as to systematise the evaluation process. To overcome the excessive length of such lists, a set of filters lets the user concentrate on the subset of interface components to be evaluated. The implementation was realised entirely with free technologies and open standards, which guarantees the portability of the developed solution and avoids licence fees. The system uses web standards and rests on an architecture made up of the GNU/Linux operating system, the Apache web server, the MySQL database and the PHP scripting language; this software has proven itself for businesses and individuals alike, and the very active communities around it ensure good support and great stability. Once the development environment was chosen, the implementation phase proceeded in several steps. First, the information was structured according to the nomenclature above in order to create the database schema. Then all the pages were coded in the HTML markup language, with PHP scripts on the server side and JavaScript on the client side. Finally, the information is displayed using cascading style sheets (CSS), a web technology that separates content from presentation. We chose an iterative development model that involved the main users from the earliest stages of the system's design and throughout the project, to ensure that the implemented features met expectations; the modular architecture also allows the system to be adapted easily to future needs. To show how the system is used, we review the steps required to carry out an evaluation. Finally, we conducted a study with four users to evaluate the system's usability and gather their level of satisfaction.
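A minimal sketch of the filtering idea described above, with checklist items tagged by the common nomenclature (platform, component family, interaction type) and narrowed by the evaluator's filters; the field names and items are invented for illustration, and the actual system stores this data in MySQL and filters it in PHP rather than Python:

    # Hypothetical checklist items tagged with the common nomenclature.
    CHECKLIST = [
        {"platform": "iOS", "family": "bars", "interaction": "touch",
         "item": "Tab bar height follows the manufacturer guideline."},
        {"platform": "iOS", "family": "data input", "interaction": "touch",
         "item": "Text fields expose the appropriate keyboard type."},
        {"platform": "BlackBerry", "family": "bars", "interaction": "keypad",
         "item": "Menu is reachable from the dedicated menu key."},
    ]

    def filter_items(platform=None, family=None, interaction=None):
        # Keep only the items matching every filter the evaluator set.
        active = {k: v for k, v in dict(platform=platform, family=family,
                                        interaction=interaction).items() if v}
        return [c["item"] for c in CHECKLIST
                if all(c[k] == v for k, v in active.items())]

    print(filter_items(platform="iOS", family="bars"))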
245

Dotazník interpersonální diagnózy jako diagnostický nástroj personální psychologie / Interpersonal Check List as a Diagnostic Tool of Industrial and Organizational Psychology

Hanuš, David, January 2011
The aim of this paper is to verify the applicability of the Interpersonal Check List for assessing managerial potential, namely leadership potential. The evolution and findings of leadership research are summarized in the theoretical part of the article, and the possible effects of personality traits on leader effectiveness are theoretically explained. There is evidence that these traits affect the perception of leaders. Empirical tests show that employees in leadership positions score higher in several dimensions of the Interpersonal Check List, particularly in the managerial-autocratic, competitive-exploitive and blunt-aggressive dimensions. The discussion of these findings leads to the conclusion that the mentioned dimensions can be considered indicators of leadership potential, and that the Interpersonal Check List is a valid instrument for assessing leadership potential.
246

Where are you? : A qualitative investigation of self-service technology in the hotel industry

Fredriksson, Sara; Schmidt, Anna, January 2019
Throughout the last years, the service encounter has gone through drastic changes due to rapid technological developments. The research area of service marketing has therefore put a stronger focus on technology-infused service encounters. Marketers have moreover started to implement self-service technology (SST) within their service encounters in order to enhance their service delivery. This phenomenon is also visible within the hospitality industry, where hotels have increasingly adopted SST encounters. Hotels' implementation of SST kiosks has raised the guests' participation level within the service encounter. This thesis therefore aims to gather insights on the emerging SST customer roles by investigating guests' attitudes towards the SST encounter, leading to the identification of distinctive SST customer roles within hotels' self-check-ins.

The choice of research topic was driven by the fact that existing research about SST implementation in the service encounter lacks an investigation of the customers' perspective. A research gap was thereby identified concerning the customers' enhanced participation as service co-producers. Previous research has focused on investigating customers' technology acceptance rather than their own identification as co-producers. This thesis therefore emphasises hotel guests' attitudes towards SST as a specific aspect of the technology acceptance process, and focuses its investigation on self-check-ins within hotels that do not incorporate a human interaction point for their guests. To investigate the SST customer roles more thoroughly, the thesis also considers the viewpoint of the hotel. Qualitative interviews with nineteen hotel guests and one hotel representative were therefore conducted.

The findings revealed that SST customer roles cannot be predetermined within a hotel self-check-in. Instead, hotel guests define their own SST customer roles by reflecting on the purpose of their stay, their expectations of the check-in encounter, any special needs that arise, and the perceived benefits of the self-check-in. This determines their individual 'role file card', which maps onto the proposed SST customer role archetypes: technology enthusiast, beneficiary, traditionalist and contradictor. It was further identified that the information provided prior to the guests' stay influences their reflection on their SST customer roles. Moreover, with their increased responsibility over the check-in process, the hotel guests identified themselves as co-producers, and the findings indicate that guests want the price of the hotel stay to reflect their increased involvement in the service delivery.

From a theoretical perspective, this thesis bridges the identified research gap by investigating hotel guests' attitudes towards SST in order to classify the different customer roles that arise within a hotel's self-check-in. The findings also carry practical implications for hotel managers regarding the need to provide guests with more information about the SST encounter; doing so will enable hotel managers to increase guest satisfaction with the SST encounter, as guests' expectations will then be coherent with the hotel's offerings.
247

Interpretable machine learning for additive manufacturing

Raquel De Souza Borges Ferreira, 10 June 2019
This dissertation addresses two significant issues in the effective application of machine learning algorithms and models for the physical and engineering sciences. The first is the broad challenge of automated modeling of data across different processes in a physical system. The second is the dilemma of obtaining insightful interpretations of the relationships between the inputs and outcome of a system as inferred from complex, black box machine learning models.

Automated Geometric Shape Deviation Modeling for Additive Manufacturing Systems

Additive manufacturing (AM) systems possess an intrinsic capability for one-of-a-kind manufacturing of a vast variety of shapes across a wide spectrum of processes. One major issue in AM systems is geometric accuracy control for the inevitable shape deviations that arise in AM processes. Current effective approaches for shape deviation control in AM involve the specification of statistical or machine learning deviation models for additively manufactured products. However, this task is challenging due to the constraints on the number of test shapes that can be manufactured in practice, and limitations on the user effort that can be devoted to learning deviation models across different shape classes and processes in an AM system. We develop an automated, Bayesian neural network methodology for comprehensive shape deviation modeling in an AM system. A fundamental innovation in this machine learning method is our new and connectable neural network structures that facilitate the transfer of prior knowledge and models on deviations across different shape classes and AM processes. Several case studies on in-plane and out-of-plane deviations, regular and free-form shapes, and different settings of lurking variables serve to validate the power and broad scope of our methodology, and its potential to advance high-quality manufacturing in an AM system.

Interpretable Machine Learning

Machine learning algorithms and models constitute the dominant set of predictive methods for a wide range of complex, real-world processes. However, interpreting what such methods effectively infer from data is difficult in general, because their typically black-box nature offers little ability to directly yield insights on the underlying relationships between inputs and the outcome of a process. We develop methodologies based on new predictive comparison estimands that effectively enable one to "mine" machine learning models, in the sense of (a) interpreting their inferred associations between inputs and/or functional forms of inputs with the outcome, (b) identifying the inputs that they effectively consider relevant, and (c) interpreting the inferred conditional and two-way associations of the inputs with the outcome. We establish Fisher consistent estimators, and their corresponding standard errors, for our new estimands under a condition on the inputs' distributions. The significance of our predictive comparison methodology is demonstrated with a wide range of simulation and case studies that involve Bayesian additive regression trees, neural networks, and support vector machines. Our extended study of interpretable machine learning for AM systems demonstrates how our method can contribute to smarter advanced manufacturing systems, especially as current machine learning methods for AM are lacking in their ability to yield meaningful engineering knowledge on AM processes.
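The dissertation's estimands are more refined than this, but the basic shape of a predictive comparison, the average change in a fitted model's prediction as one input is perturbed, can be sketched as follows, with a toy black box standing in for a real fitted model:

    import numpy as np

    def predictive_comparison(predict, X, j, delta=1.0):
        # Average change in the model's prediction when input j increases
        # by `delta`, holding the other inputs at their observed values;
        # a crude, assumed form of the estimand, for illustration only.
        X_shift = X.copy()
        X_shift[:, j] += delta
        return np.mean(predict(X_shift) - predict(X)) / delta

    # Toy black box: any fitted model's predict function works here.
    black_box = lambda X: np.sin(X[:, 0]) + 2.0 * X[:, 1]
    X = np.random.default_rng(0).normal(size=(500, 2))
    print(predictive_comparison(black_box, X, j=1))   # ~2.0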
248

BSPONP2P: modelo para exploração da computação colaborativa em aplicações BSP para ambientes grades P2P / BSPonP2P: a model for exploiting collaborative computing in BSP applications for P2P grid environments

Veith, Alexandre da Silva, 29 August 2014
Technologies in the areas of distributed systems and parallel computing are constantly advancing. Electronic equipment has become affordable, and companies increasingly bet on cheap solutions so that everyone has access. The resulting problem is waste: this equipment sits idle for much of its time. In this context, this dissertation presents the BSPonP2P model to minimise that problem by putting the idle capacity to useful work. BSPonP2P adopts a P2P Desktop Grid approach, using the machines to execute useful computation concurrently with their owners' use. The model creates an environment combining approaches based on the structured and unstructured models of the P2P architecture, implemented to streamline the management of communication and information within the network. Another distinctive feature of the proposed model is its use of the Bulk Synchronous Parallel (BSP) programming model, which creates an environment for process execution, validating dependencies and improving communication between processes. From evaluations of metrics such as memory, computation, communication and machine data, an index called PM is created. This index is evaluated periodically to decide on migrations according to the environment variable α, which is tied directly to the superstep barriers. In evaluations of process distributions in a heterogeneous test environment, the BSPonP2P model proved effective: compared with a plain execution of the application, running under BSPonP2P increased execution time by less than 4%, and the execution of 26 processes with 2000 supersteps and α = 16 yielded a 6% gain from 24 process migrations. The scientific contribution is thus the use of P2P Grid networks for BSP applications, evaluating the environment with metrics such as memory, computation, communication and machine data, together with services such as migration and checkpointing that give the model good performance.
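As a rough sketch of the mechanism described above, combining resource metrics into a PM index and testing for migration at every α-th superstep barrier; the weights, metrics and threshold below are invented, since the abstract does not give the actual formula:

    def pm_index(mem_free, cpu_speed, net_bw, weights=(0.3, 0.4, 0.3)):
        # Hypothetical PM index: a weighted score of a node's memory,
        # computation and communication capacity, each already
        # normalised to [0, 1].
        w_mem, w_cpu, w_net = weights
        return w_mem * mem_free + w_cpu * cpu_speed + w_net * net_bw

    def maybe_migrate(superstep, alpha, procs, nodes, threshold=0.15):
        # At every alpha-th superstep barrier, move a process when some
        # other node's PM index beats its current node's by `threshold`.
        if superstep % alpha != 0:
            return []
        moves = []
        for p, here in procs.items():
            best = max(nodes, key=lambda n: pm_index(*nodes[n]))
            if pm_index(*nodes[best]) - pm_index(*nodes[here]) > threshold:
                moves.append((p, here, best))
        return moves

    nodes = {"n1": (0.9, 0.8, 0.7), "n2": (0.2, 0.3, 0.4)}
    print(maybe_migrate(16, 16, {"p0": "n2"}, nodes))   # [('p0', 'n2', 'n1')]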
249

Professores em construção: um estudo sobre a docência inicial numa faculdade confessional do sul do Brasil / Teachers under construction: a study of beginning university teaching at a confessional college in southern Brazil

Silva Junior, Pedro Paulo da, 03 March 2016
This research aimed to understand the process by which beginning professors are constructed in higher education teaching. Its locus was Faculdade Murialdo, an isolated confessional higher education institution located in Caxias do Sul, in the state of Rio Grande do Sul. To meet this investigative aim, a qualitative study was conducted, listening through semi-structured interviews to seven lecturers from different fields with up to three years of experience in higher education teaching, a choice that helped define the scope of the study. The principles of Content Analysis were used to interpret the data. The main theoretical contributions were based on the studies of Huberman, Marcelo Garcia, Freire, Anastasiou, Pimenta, Cunha, Mayor Ruiz, Gaeta and Masetto, Isaia, Nóvoa, Schön, Tardif, Zabalza and Zeballos. Some questions were central: What motivates professionals from different fields to take up higher education teaching? How do these lecturers construct the knowledge the profession requires? What challenges do they face in the practice of teaching and learning? What value do they place on knowledge from the field of university pedagogy? What expectations do they have of institutional support for their work? Broadening the study, we also sought to understand the main experiences they lived through in the early years of the teaching career and what these meant for their everyday practice, and how institutional training and development strategies can be mobilised to strengthen teaching. The data showed the "reality shock" that beginning lecturers undergo, confirming previous research. They recognise that their knowledge is empirical, inspired by their former teachers and by their own trajectory as students. They perceive generational changes, however, which demand an effort to understand their students, their cultures and their conditions of study. They realise they need knowledge that will help them succeed in their work, and they favour the exchange of experience and collaborative training as important alternatives. Nevertheless, needs that go beyond an instrumental, pragmatic view of the knowledge they request are rarely mentioned; perhaps, given their professional origins, they are not yet aware of the complexity of teaching as a space of formation that goes beyond specific subject knowledge. Institutional training spaces, and policies to that effect, are demanded by those beginning to teach in higher education. This commitment has not yet been made explicit in any strong form, which suggests the need to invest in ongoing training situated within the institution itself, alongside incentives for lato and stricto sensu postgraduate study. The fragility of training and the incipient institutional support in the academic environment delay and affect the construction of professionalism, which rests more on individual responsibility than on the policies that sustain higher education in the country.
250

A justificação das decisões judiciais e o estado de direito / The justification of judicial decisions and the rule of law

Bastos, Rodrigo Reis Ribeiro, 01 June 2012
The primary objective of this work is to understand how effectiveness and concreteness can be given to Article 93, IX of the Federal Constitution, whose observance is necessary to guarantee the democratic rule of law. For this to be done, one must discover which criteria should be used to identify an adequate justification of decisions. The first chapter seeks to establish the concept of knowledge and the boundary between knowledge and foolishness. It establishes that the principal mark of knowledge is the rational justification of an assertion, and that rationality is defined as the construction of a logical proof in the form of inferences. This logical proof must meet three requirements: (a) the axiomatic foundations must be explicit; (b) the inference rules used must be clearly defined in advance; and (c) the semantic content (meaning) of the terms used in the premises of the proof must be fixed, clarified and made explicit. Once it is established that knowledge and judicial decisions have an equal need for justification, it is concluded that the standards of rational justification used to distinguish knowledge from foolishness are also those needed to give concreteness to the constitutional imperative that decisions must be reasoned. The next step, in chapter two, is to establish which inference rules are commonly used in legal reasoning. With the inference rules established, the various grounds most commonly used in decisions are set out. Then, in chapter four, a model is proposed that makes explicit the criteria imposed by law for fixing the best explanation contained in the minor premise and the deductions contained in the major premise. In addition, the semantic content (meaning) of the terms to be used in these premises is made explicit, for use in constructing the algorithms that will operate the system. The next step, using what has been researched and presented here, will be the formulation of the design of the auxiliary system and the creation of a working prototype that can be put to the test.
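The three requirements, explicit axioms, predeclared inference rules and fixed term meanings, can be illustrated with a toy forward-chaining derivation; the premises and rule base are invented and are not the dissertation's system:

    # Explicit premises (axioms) and a single declared inference rule
    # (modus ponens): from "p" and "p -> q", conclude "q".
    facts = {"contract_signed", "party_capable"}
    rules = [({"contract_signed", "party_capable"}, "contract_valid"),
             ({"contract_valid"}, "obligation_enforceable")]

    changed = True
    while changed:                      # forward chaining to a fixed point
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(sorted(facts))
    # ['contract_signed', 'contract_valid', 'obligation_enforceable',
    #  'party_capable']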
