121 |
Método estruturado para aplicação das técnicas de aumento da capacidade de produção de recursos gargalo em células de manufatura / Structured method for the application of techniques to increase bottleneck resource production capacity in manufacturing cells
José Carlos Martins Junior, 31 August 2009 (has links)
The purpose of this work is to structure a sequence for applying a series of techniques, drawn from the literature, for increasing the production capacity of a bottleneck piece of equipment. The sequence is intended to be put into practice to raise the production capacity of manufacturing lines in lean manufacturing environments. The work was developed through an analysis of the techniques found for increasing equipment production capacity (applied research) and the structuring of a sequential procedure for applying them (action research). The sequential procedure is followed until the required improvement in production capacity is reached. The procedure was applied to a cellular manufacturing line in an auto parts company. The subjects that contributed most to the foundations of this work were the Theory of Constraints and Lean Production, in particular the five focusing steps of the Theory of Constraints. The need to increase production capacity was identified both when capacity is insufficient to meet customer demand and when a cost-reduction strategy makes it impossible to meet that demand. Future work can explore how often each of the techniques presented is applied to each type of equipment, where the types relate to industry, complexity, size, cost, and so on. New techniques may arise and be appended to the sequence presented. This structured way of elevating the constraint, in this case the manufacturing bottleneck, can be used in different factory environments by the professionals responsible for such improvement actions.
|
122 |
Formação inicial de trabalhadores e elevação da escolaridade: políticas públicas de qualificação profissional em discussão (1963-2011) / Initial formation of workers and schooling elevation: public policies of vocational qualification in discussion (1963-2011)
Juliana Macedo Rocha, 17 June 2011 (has links)
The initial formation of workers, part of the larger field of vocational education and understood as a modality of education established by the Brazilian Law of Directives and Bases of National Education, is exempt from any prior schooling requirement and dispenses with general education, making it the only possibility of vocational training for those who have not completed elementary school. The problem addressed by this thesis is therefore the relationship between vocational training and schooling elevation, which is analyzed through four federal public policies of vocational qualification implemented in Brazil over the last fifty years: the Intensive Program for Workforce Preparation (PIPMO, 1963-1982), the National Plan of Vocational Qualification (PLANFOR, 1996-2002), the National Qualification Plan (PNQ, 2003-), and the National Program for the Integration of Vocational Education with Basic Education in the Modality of Youth and Adult Education, in Initial and Continued Formation with Elementary School (PROEJA FIC, 2006-). The evaluation of these public policies, carried out through analysis of legislation and bibliographic sources, focuses primarily on their implementation processes and is guided by the categories of qualification and vocational training, drawing on the contributions of the French sociologist Pierre Naville. The thesis results in an integrative synthesis that, among other elements, proposes a classification of public policies of vocational qualification according to two criteria: the relationship between vocational training and schooling elevation (the thesis problem) and the policy implementation process.
|
123 |
Automatic digital surface model generation using graphics processing unit
Van der Merwe, Dirk Jacobus, 05 June 2012 (has links)
M. Ing. / Digital Surface Models (DSMs) are widely used in the earth sciences for research, visualizations, construction, etc. Generating a DSM for a specific area has traditionally required specialized equipment and personnel, making it a costly and time-consuming exercise. Image processing has become a viable technique for generating terrain models, since improvements in hardware now provide adequate processing power for such a task. DSMs can be generated from stereo imagery, usually obtained from a remote sensing platform. The core component of a DSM generating system is the image matching algorithm. Even though a variety of algorithms exist that can generate DSMs, the matching is a computationally complex calculation and tends to take considerable time to complete. To achieve faster DSM generation, an alternative processing platform was investigated. The Graphics Processing Unit (GPU) is usually used in the gaming industry to manipulate display data and render it to a computer screen. Its architecture is designed to manipulate large amounts of floating point data. The scientific community has begun using the processing power of the GPU for technical computing, hence the term General Purpose computing on a Graphics Processing Unit (GPGPU). The GPU was investigated as an alternative platform for the image matching procedure, since its processing capability is much higher than that of the CPU, but only for a suitably conditioned set of input data. A matching algorithm derived from the GC3 algorithm was implemented on both a CPU platform and a GPU platform to investigate the viability of a GPU alternative. The algorithm uses a Normalized Cross Correlation similarity measurement and the acquisition geometry contained in the sensor model to obtain conjugate point matches in the two source images.
The results of the investigation indicated an improvement of up to 70% in the processing time required to generate a DSM. The improvement varied from 70% down to some cases where the GPU took longer than the CPU to generate the DSM. The accuracy of the automatic DSM generating system could not be clearly determined, since only poor quality reference data was available. It is shown, however, that the DSMs generated on both the CPU and GPU platforms relate to the reference data and correlate with each other. The discrepancies between the CPU and GPU results are low enough to show that GPU processing is beneficial, with negligible drawbacks in terms of accuracy. The GPU will definitely provide superior processing capability for DSM generation over a CPU implementation if the matching algorithm is specifically designed to cater for the benefits and limitations of the GPU.
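The Normalized Cross Correlation similarity measurement mentioned in this abstract can be sketched as follows. This is a minimal illustration of the standard NCC measure between two image patches, not the author's GC3-derived implementation; the function and variable names are assumptions:

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized Cross Correlation of two equally sized image patches.

    Returns a value in [-1, 1]: 1 for identical patterns, -1 for
    inverted ones, and (by convention here) 0 for a flat patch,
    where the correlation is undefined.
    """
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:  # one patch has no contrast
        return 0.0
    return float((a * b).sum() / denom)
```

Because the measure is invariant to brightness offsets and contrast scaling, it is a common choice for matching conjugate points across two images taken under different illumination.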
|
124 |
The Influence of Hypnotically-Induced Elevation of Mood on Learned Helplessness Deficits
Tassey, John Richard, 08 1900 (has links)
This study evaluated the efficacy of hypnotically-induced mood elevation techniques for individuals previously exposed to an experimental learned helplessness condition. The treatment conditions in this investigation included a mood elevation group with hypnotic induction as well as a mood elevation group without the benefit of hypnotic induction. As experimental controls, a hypnotic relaxation group and an attention-only treatment group were used. Measures of treatment success included the administration of the Depression Adjective Checklist, backward digit span, and five-letter anagrams. In a series of factorial analysis of variance procedures, no significant interaction was noted, although the main effect for the presence of hypnotic induction was significant for the Depression Adjective Checklist. Post hoc analysis of gender differences demonstrated no significant performance discrepancy between the sexes. Limitations of the study were explored and avenues of further research discussed.
|
125 |
How Cinderella Became a Queen: Theorizing Radical Status Change
Delmestri, Giuseppe; Greenwood, Royston, January 2016 (has links) (PDF)
Using a case study of the Italian spirit grappa, we examine status recategorization: the vertical extension and reclassification of an entire market category. Grappa was historically a low-status product, but in the 1970s one regional distiller took steps that led to a radical break from its traditional image, so that in just over a decade high-quality grappa became an exemplar of cultured Italian lifestyle and held a market position in the same class as cognac and whisky. We use this context to articulate "theorization by allusion", which occurs through three mechanisms: category detachment, distancing a social object from its existing category; category emulation, presenting that object so that it hints at the practices of a high-status category; and category sublimation, shifting from local, field-specific references to broader, societal-level frames. This novel theorization is particularly appropriate for explaining change from low to high status because it avoids resistance to and contestation of such change (by customers, media, and other sources) as a result of status imperatives, which may be especially strong in mature fields. Unlike prior studies that have examined the status of organizations within a category, ours foregrounds shifts in the status and social meaning of a market category itself. (authors' abstract)
|
126 |
The effect of data reduction on LiDAR-based DEMs
Immelman, Jaco, 02 November 2012 (links)
M.Sc. / Light Detection and Ranging (LiDAR) provides highly accurate datasets with high data densities in a very short time-span. However, the large data volumes associated with LiDAR often require some form of data reduction to increase the data handling efficiency of these datasets, which in turn can affect the quality of the resulting Digital Elevation Models (DEMs). Critically, when DEM processing times are reduced, the resultant DEM should still represent the terrain adequately. This study investigated three data reduction techniques, (1) random point reduction, (2) grid resolution reduction, and (3) combined data reduction, in order to assess their effects on the accuracy as well as the data handling efficiency of the derived DEMs. A series of point densities of 1%, 10%, 25%, 50% and 75% were interpolated along a range of horizontal grid resolutions (1, 2, 3, 4, 5, 10 and 30 m). Results show that, irrespective of terrain complexity, datasets can be randomly thinned to 25% of the original points with minimal effect on the remaining dataset. However, when these datasets are interpolated, they can only be reduced to 50% of the original points before showing large deviations from the original DEM. Reducing the grid resolution of DEMs showed that the resolution could be lowered to 4 metres before significant deviations appeared. When combining point density reduction with grid resolution reduction, results indicate that DEMs can be derived from 75% of the data points at a grid resolution of 3 metres without sacrificing more than 15% of the accuracy of the original DEM. Ultimately, data reduction should result in accurate DEMs while reducing processing time. Considering the effect on accuracy as well as on processing times, resolution reduction proved the most effective data reduction technique. When the grid resolution was reduced to 4 metres, data handling efficiency improved by 94% while only 10% of the data accuracy was sacrificed. Furthermore, this study investigated data reduction across a variety of terrain complexities and found that the reduction thresholds established here apply to both complex and non-complex terrain.
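The random point reduction examined in this abstract amounts to uniformly thinning the point cloud to a target fraction. A minimal sketch of that operation (the function name and array layout are assumptions, not the study's actual tooling):

```python
import numpy as np

def thin_points(points, fraction, seed=0):
    """Randomly keep `fraction` of the rows of an (N, 3) LiDAR
    point array (x, y, z), without replacement.

    A fixed seed makes the thinning reproducible, which matters
    when comparing DEMs interpolated from the reduced datasets.
    """
    rng = np.random.default_rng(seed)
    n_keep = int(round(len(points) * fraction))
    idx = rng.choice(len(points), size=n_keep, replace=False)
    return points[np.sort(idx)]  # preserve original ordering
```

The thinned cloud would then be interpolated to each candidate grid resolution and the resulting DEM compared against the full-density reference surface.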
|
127 |
A mixed methods study investigating re-presentation, symptom attribution and psychological health in primary percutaneous coronary intervention patients
Iles-Smith, Heather, January 2012 (links)
Introduction: Following ST-elevation myocardial infarction (STEMI) and treatment with Primary Percutaneous Coronary Intervention (PPCI), some patients re-present with potential ischaemic heart disease (IHD) symptoms. Symptoms may be related to cardiac ischaemia, reduced psychological health or a comorbid condition, which share similar symptoms and may lead patients to seek help via acute services. The purpose of the study was to investigate the proportion of PPCI patients who re-presented to acute services with potential IHD symptoms within 6 months of STEMI, and to explore associated factors. Methods: An explanatory mixed methods study was conducted. Quantitative data were collected at baseline and 6 months from consecutive patients attending two centres in Manchester. Variables were carefully considered based on a conceptual model for re-presentation. These included potential IHD symptom and psychological health assessments using self-report measures: the Seattle Angina Questionnaire (SAQ) and the Hospital Anxiety and Depression Scale (HADS). Physiological health was measured at baseline using the Global Registry of Acute Coronary Events (GRACE) score and the Charlson Comorbidity Index (CCI). At 6 months, re-presentation data were collected using patient records, a telephone interview and a self-report diary card. The experiences of some who re-presented (purposeful sampling) were explored through semi-structured interviews conducted at least 6 months following PPCI. Framework analysis was adopted to analyse the data. Results: 202 PPCI patients returned baseline questionnaires [mean age 59.7 years (SD 13.9), 75.7% male]; 38 (18.8%; 95% CI 14.0% to 24.8%) participants re-presented with potential IHD symptoms at 6 months; 16 (42.1%) re-presented due to a cardiac event and 22 (57.9%) did not receive a diagnosis.
At both baseline and 6 months, mean HADS anxiety scores were higher for the re-presentation group than for the non-re-presentation group (baseline 9.5 vs 7.1, p=0.006; 6 months 9.4 vs 6.0, p<0.001). Angina symptoms were stable and infrequent at both time points for both groups. Multivariate regression modelling with the predictors HADS anxiety, SAQ angina stability, SAQ angina frequency, GRACE and CCI identified HADS anxiety as a predictor of re-presentation, with an adjusted odds ratio of 1.12 (95% CI 1.03 to 1.22, p=0.008). The qualitative interviews with re-presenters included 25 participants (14 men, aged 27-79 years). Four themes were identified: fear of experiencing a further heart attack; uncertainty and inability to determine the cause of symptoms; insufficient opportunity to validate self-construction of illness; and difficulty adapting to life after a heart attack. Conclusion: Elevated levels of anxiety at baseline were predictive of re-presentation with potential IHD symptoms at 6 months. Factors such as shock at experiencing a heart attack, hypervigilance of symptoms and difficulty with symptom attribution appeared to play a role in raised anxiety levels for the re-presentation group. The findings suggest that changes are needed to cardiac rehabilitation and post-STEMI follow-up, to address educational needs and psychological issues as well as changes in STEMI treatment.
|
128 |
Le sapin pectiné (Abies alba Mill., PINACEAE) en contexte méditerranéen : développement architectural et plasticité phénotypique / Mediterranean silver fir (Abies alba Mill., PINACEAE): architectural development and phenotypic plasticity
Taugourdeau, Olivier, 29 November 2011 (links)
The aim of this work is to assess the phenotypic plasticity of silver fir using an architectural approach. The work is set in the broader context of understanding the development of perennial plants in relation to their environment, and of the impact of climate change in the Mediterranean region. To this end, studies were carried out ex situ under controlled conditions (shading and watering treatments) and in situ on Mont Ventoux (shading and elevation gradients). They consisted of analysing the variability of various architectural traits, generally measured at the annual shoot scale, in relation to plant architecture and environment. These studies made it possible to quantify the development of silver fir up to the sustained expression of sexuality and to quantify its plastic response to light, water availability and climate. Finally, this work deepened the concept of architectural plasticity and its implications.
|
129 |
Melhorias qualitativas na modelagem de levantamentos batimétricos em reservatórios por meio da ferramenta computacional "CAV-NH" / Qualitative improvements in the modeling of bathymetric surveys in reservoirs through the computational tool "CAV-NH"
Artur José Soares Matos, 23 August 2012 (links)
Freshwater is a finite resource essential for sustaining life on earth. For proper water resource management in a watershed, it is important to know the Elevation-Area-Volume (EAV) curve of its reservoirs. Energy companies and government agencies base decisions on this curve, considering the multiple uses that reservoirs serve today. Because of reservoir sedimentation, the curve must be updated frequently on the basis of up-to-date bathymetric surveys. However, greater spacing between survey transects causes failures in the terrain model and correspondingly larger errors in the EAV curve. This study aims to improve the procedures used for modeling reservoir bathymetric data by introducing a method called Insertion of Mesh Points (IMP), which generates a mesh of points between the surveyed transects, correcting the edge effects produced by TIN interpolation. To optimize the process and reduce the time needed to obtain results, the computational tool CAV-NH is also presented; developed in the Python language, it uses the ArcGIS 9.3 library to build the terrain model and produce the EAV curve as the final output. For evaluation and validation of the method, a detailed bathymetry of the Lobo reservoir (SP) was carried out with 10-metre intervals between survey lines, which was then compared with line spacings of up to 600 metres. The method was also applied to the Bariri and Ibitinga reservoirs to verify its effectiveness. For all 1830 cases analyzed there was a significant improvement in accuracy; for 200-metre line spacing at the 703-metre elevation, the error in the computed volume was reduced by more than 50%. The analyses indicate that the IMP method provides a terrain model more consistent with the actual bottom relief, yielding greater accuracy and allowing field data collection to be reduced without compromising the reservoir volume calculation. CAV-NH was used in all simulations and proved easy to use, bringing greater agility to the whole process. It is concluded that the IMP method, together with the computational tool CAV-NH, is an important contribution to defining a quality Elevation-Area-Volume curve, which is of great importance for decision support systems and for good water resource management.
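The Elevation-Area-Volume relationship at the heart of this abstract can be sketched for a gridded bed-elevation model. This is a minimal illustration under assumed names and a simple uniform-grid representation; the thesis' CAV-NH tool itself works on a TIN built with the ArcGIS 9.3 library:

```python
import numpy as np

def eav_curve(bed_elev, cell_area, levels):
    """Elevation-Area-Volume table from a gridded bed-elevation model.

    bed_elev  : 2-D array of bed elevations (same vertical datum as levels)
    cell_area : horizontal area of one grid cell (e.g. in m^2)
    levels    : iterable of candidate water-surface elevations

    Returns a list of (elevation, flooded_area, stored_volume) rows.
    """
    rows = []
    for z in levels:
        depth = np.clip(z - bed_elev, 0.0, None)  # water depth per cell
        wet = depth > 0                            # cells below the surface
        area = wet.sum() * cell_area               # flooded surface area
        volume = depth.sum() * cell_area           # stored water volume
        rows.append((z, float(area), float(volume)))
    return rows
```

Each row of the table is one point of the EAV curve; tabulating many levels and interpolating between them reproduces the continuous relationship used in reservoir operation.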
|
130 |
Servomechanické řízení pohybu fotovoltaických panelů / Servomechanical movement control of photovoltaic panels
Uher, Ondřej, January 2011
The thesis deals with an introduction to the production of electricity using photovoltaic panels and the relative position between the panels and the Sun. The aim of the project was to design equipment that keeps the solar panels oriented towards the Sun during the day so as to achieve the maximum energy gain from sunlight. The pointing mechanism is governed by the date and time.
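A date-and-time-driven tracker of the kind described in this abstract typically computes the Sun's position from standard astronomical approximations. The sketch below uses the common declination/hour-angle formula for solar elevation; it is a generic illustration, not the thesis' actual control design, and the function name and argument conventions are assumptions:

```python
import math

def solar_elevation(day_of_year, solar_hour, latitude_deg):
    """Approximate solar elevation angle in degrees.

    day_of_year  : 1..365
    solar_hour   : local solar time in hours (12.0 = solar noon)
    latitude_deg : site latitude, positive north

    Uses the Cooper declination approximation and the standard
    hour-angle relation (15 degrees of hour angle per hour).
    """
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(latitude_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_el))
```

A servo controller would evaluate this (together with the corresponding azimuth) at regular intervals and drive the panel mount to the computed angles.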
|