  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Remote application support in a multilevel environment

Cooper, Robert C. 03 1900 (has links)
Approved for public release, distribution is unlimited / The use of specialized single-level networks in current military operations is inadequate to meet the information-sharing needs envisioned by the Global Information Grid (GIG). Multilevel security (MLS) is a key Information Assurance enabler for the GIG vision. The Monterey Security Architecture (MYSEA), a distributed MLS network, eliminates the need for separate equipment to connect to networks at different classification levels and allows users to view data at different sensitivity levels simultaneously. MYSEA also allows commercial software and hardware to be used on clients. To address the threat of residual data on a client after a user session changes security state, MYSEA clients are required to be "stateless", i.e., they have no non-volatile writable memory. Hence the MYSEA server must provide clients with the ability to execute server-resident client-side applications that access data at different security levels over the MLS Local Area Network (LAN). The MYSEA server currently does not support such a capability. This thesis addresses that limitation: a new trusted process family is introduced to provide a pseudo-socket interface through which a single-level remote application can access the MLS LAN interface. Detailed design specifications were created to facilitate implementation of the remote application support. / Lieutenant, United States Navy
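The pseudo-socket idea above can be pictured as a trusted intermediary relaying traffic between a server-resident application and the network on its behalf. Purely as an illustrative sketch — not the MYSEA design itself; the single-relay structure and names here are invented — the following shows a proxy thread shuttling bytes between an application-side socket and a network-side socket:

```python
import socket
import threading

def trusted_relay(app_side: socket.socket, net_side: socket.socket) -> None:
    """Forward bytes from the application-facing endpoint to the
    network-facing one. A real multilevel design would check the session's
    security level before forwarding; this sketch passes everything."""
    while True:
        data = app_side.recv(4096)
        if not data:          # application closed its end
            net_side.close()
            return
        net_side.sendall(data)

# Two socket pairs stand in for "application <-> trusted proxy <-> MLS LAN"
app_end, proxy_app = socket.socketpair()
proxy_net, net_end = socket.socketpair()
t = threading.Thread(target=trusted_relay, args=(proxy_app, proxy_net), daemon=True)
t.start()

app_end.sendall(b"request at session security level")
app_end.close()
received = net_end.recv(4096)
```

The application only ever sees its local endpoint; which network interface the far side binds to is the trusted process's decision.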
52

Modelos numéricos aplicados à análise viscoelástica linear e à otimização topológica probabilística de estruturas bidimensionais: uma abordagem pelo Método dos Elementos de Contorno / Numerical models applied to the analysis of linear viscoelasticity and probabilistic topology optimization of two-dimensional structures: a Boundary Element Method approach

Oliveira, Hugo Luiz 31 March 2017 (has links)
The present work deals with the formulation and implementation of numerical models based on the Boundary Element Method (BEM). Inspired by engineering problems, a multidisciplinary approach is proposed as a means of more realistic numerical representation. Many materials in current engineering use have a time-dependent response. In this thesis, time-dependent phenomena are treated through linear viscoelastic mechanics associated with rheological models. The formulation of Maxwell's constitutive model is derived for use with the BEM, and the resulting equations are verified on reference problems. The results show that the formulation can be used to represent composite structures, even in cases involving a junction between viscoelastic and non-viscoelastic materials. Additionally, the formulations remain stable in the presence of domain and edge cracks. It is found that the classical dual BEM formulation can be used to simulate cracks with time-dependent behaviour. This result serves as the basis for further investigations in the fracture mechanics of viscoelastic materials. Next, it is shown how the BEM can be combined with probabilistic concepts to make predictions of long-term behaviour. These predictions include the uncertainties inherent in engineering processes, involving material, loading, and geometry parameters. Using the concept of probability of failure, the results show that uncertainties in the estimates of the acting loads have the greatest impact on the expected long-term performance. This finding supports studies aimed at improving structural design processes.
Another aspect of interest in this thesis is the search for optimized shapes through topology optimization. An alternative topology optimization algorithm is proposed, based on the coupling between the Level Set Method (LSM) and the BEM. The difference between the algorithm proposed here and others in the literature is the way the velocity field is obtained: the normal velocity fields are computed by means of shape sensitivity. This change makes the algorithm well suited to the BEM, since the information necessary for computing the sensitivities resides exclusively on the boundary. It is found that the algorithm requires a particular extension of the velocities into the domain in order to remain stable. Restricted to two-dimensional cases, the algorithm reproduces the known benchmark cases reported in the literature. The last aspect addressed in this thesis concerns the way geometric uncertainties can influence the determination of optimized structures. Using the BEM, a probabilistic criterion is proposed that supports design choices by taking geometric sensitivity into account. The results show that deterministic criteria do not always lead to the most appropriate choices from an engineering point of view. In summary, this work contributes to the expansion and diffusion of BEM applications in structural engineering problems.
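The optimization algorithm above evolves the structural boundary implicitly, with a normal velocity field obtained from shape sensitivity. As a hedged illustration of the level-set machinery only — not the author's BEM-driven implementation; the grid, velocity, and geometry here are invented — the following advects a 2-D level-set function one explicit step of phi_t + V_n |grad phi| = 0 with a first-order Godunov upwind scheme:

```python
import numpy as np

def level_set_step(phi: np.ndarray, v_n: np.ndarray, dt: float, h: float) -> np.ndarray:
    """One explicit update of phi_t + V_n * |grad phi| = 0 on a uniform grid
    (spacing h), upwinded by the sign of the normal velocity V_n.
    Boundaries wrap (np.roll); fine while the front stays interior."""
    dxm = (phi - np.roll(phi, 1, axis=0)) / h    # backward difference in x
    dxp = (np.roll(phi, -1, axis=0) - phi) / h   # forward difference in x
    dym = (phi - np.roll(phi, 1, axis=1)) / h    # backward difference in y
    dyp = (np.roll(phi, -1, axis=1) - phi) / h   # forward difference in y

    # Godunov upwind gradient magnitude, split by the sign of V_n
    grad_plus = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2 +
                        np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
    grad_minus = np.sqrt(np.minimum(dxm, 0)**2 + np.maximum(dxp, 0)**2 +
                         np.minimum(dym, 0)**2 + np.maximum(dyp, 0)**2)
    grad = np.where(v_n > 0, grad_plus, grad_minus)
    return phi - dt * v_n * grad

# Example: a circle of radius 0.3 shrinking under uniform normal speed -1
n, h = 64, 1.0 / 63
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
phi = np.sqrt((x - 0.5)**2 + (y - 0.5)**2) - 0.3   # signed distance to the circle
phi_new = level_set_step(phi, v_n=np.full_like(phi, -1.0), dt=0.5 * h, h=h)
```

In a BEM coupling of the kind described, V_n would come from the shape-sensitivity evaluation on the boundary, extended into the narrow band around it, rather than being prescribed as here.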
53

Contribuição à classificação de pequenas bacias hidrográficas em função da área de drenagem / Contribution to the classification of small watersheds as a function of drainage area

SANTANA, Laila Rover 25 May 2018 (has links)
The adoption of an area threshold to define large, medium, or small river basins should take into account the various processes involved in basin behaviour. In small basins, the rainfall-runoff conversion phenomena can be described using simpler techniques. In order to classify small river basins according to their drainage areas, the main objective of this study was to identify and classify the small basins of the Brazilian Legal Amazon using a simple linear model (MLS, from the Portuguese modelo linear simples). The model is applied to rainfall and streamflow data from selected test basins in order to verify the linearity between these variables. The MLS used in this study is based on a linear, time-invariant system that establishes a cause-and-effect relationship between rainfall and streamflow data. The performance of the model was evaluated through the RMS (root mean square error), and from the results small basins were classified as a function of drainage area. The Otto Pfafstetter method is applied to identify at which coding level only small basins are found.
The results indicated that in basins with drainage areas of 620 km² or less, the fit between the observed and simulated flow duration curves was better, with RMS values below 3 m³/s. In basins larger than 620 km², the RMS results exceeded 4 m³/s and the flow duration curves did not fit well, demonstrating that the MLS fails when applied to the hydrological data of these basins. Thus, the small basins of the Legal Amazon were classified as those with drainage areas of 620 km² or less. The Otto coding method was applied down to level 9, where 51,319 Otto basins were delimited, all classified as small basins.
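A linear, time-invariant rainfall-runoff model of the kind described treats streamflow as the convolution of the rainfall series with a fixed response function, and the RMS criterion then scores the simulation against observations. A minimal sketch with a made-up response function and data — not the study's calibration:

```python
import numpy as np

def simulate_flow(rain: np.ndarray, response: np.ndarray) -> np.ndarray:
    """Linear time-invariant rainfall-runoff model: discrete convolution of
    the rainfall series with a unit response function, truncated to the
    length of the input series."""
    return np.convolve(rain, response)[: len(rain)]

def rms_error(obs: np.ndarray, sim: np.ndarray) -> float:
    """Root mean square error between observed and simulated flows."""
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

# Hypothetical data: two rainfall pulses and a 3-step response function
rain = np.array([0.0, 10.0, 0.0, 5.0, 0.0, 0.0])
response = np.array([0.5, 0.3, 0.2])   # sums to 1: mass-conserving
sim = simulate_flow(rain, response)
obs = sim + np.array([0.0, 0.1, -0.1, 0.0, 0.1, 0.0])  # pretend observations
err = rms_error(obs, sim)
```

Linearity of the basin is exactly what such a model assumes, which is why the study finds it breaks down above a drainage-area threshold.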
54

Co-located analysis of ice clouds detected from space and their impact on longwave energy transfer

Nankervis, Christopher James January 2013 (has links)
A lack of quality data on high clouds has led to inadequate representations within global weather and climate models. Recent advances in spaceborne measurements of the Earth's atmosphere have provided complementary information on the interior of these clouds. This study demonstrates how an array of spaceborne measurements can be used and combined, through close co-located comparisons in space and time, to form a more complete representation of high-cloud processes and properties. High clouds are found in the upper atmosphere, where sub-zero temperatures frequently result in the formation of cloud particles composed of ice. Weather and climate models characterise the bulk properties of these ice particles to describe the current state of the cloudy-sky atmosphere. By directly comparing measurements with simulations undertaken at the same place and time, this study demonstrates how improvements can be made to the representation of cloud properties. The results will assist in the design of future cloud missions, help improve weather predictions, and lower the uncertainty in the cloud feedback response to increasing atmospheric temperature. Most clouds are difficult to monitor with more than one instrument, owing to continuous changes in large-scale and sub-cloud-scale circulation features, in microphysical properties and processes, and in characteristic chemical signatures. This study undertakes co-located comparisons of high-cloud data with a cloud ice dataset from the Microwave Limb Sounder (MLS) instrument onboard the Aura satellite, which forms part of the A-train constellation. Data from the MLS science team include vertical profiles of temperature, ice water content (IWC), and the mixing ratios of several trace gases, at vertical resolutions of 3 to 6 km.
Initial investigations explore the link between cloud-top properties and the longwave radiation budget, developing methods for estimating cloud-top heights using longwave radiative fluxes and IWC profiles. Synergistic trios of direct and indirect high-cloud measurements were used to validate MLS detections by direct comparison with two other A-train instruments: the NASA Moderate-resolution Imaging Spectroradiometer (MODIS) and the Clouds and the Earth's Radiant Energy System (CERES), both onboard the Aqua satellite. This validation focuses the later studies on two high-cloud scene types that are well detected by the MLS: deep convective plumes that form from moist ascent, and their adjacent outflows, which extend outwards several hundred kilometres. The second part of the thesis identifies and characterises these two high-cloud scenes in the tropics. Direct observational data are used to refine calculations of the climate sensitivity to upper-tropospheric humidity and high cloud in different conditions, and several discernible features of convective outflows are identified using a large sample of MLS data. The key finding, facilitated by the use of co-location, is that deep convective plumes exert a large longwave warming effect on the local climate of 52 ± 28 W m−2, while their adjacent outflows present a more modest warming of 33 ± 20 W m−2.
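Co-location of the kind used throughout this study amounts to pairing observations from two instruments that fall within a chosen distance and time window. A hedged sketch of that matching step — the window sizes and footprints below are invented, and real A-train matchups use along-track geometry rather than brute force:

```python
import math
from datetime import datetime, timedelta

def colocate(obs_a, obs_b, max_km=100.0, max_minutes=15.0):
    """Pair observations in obs_a with observations in obs_b that fall within
    a great-circle distance and a time window. Brute force: fine for small
    samples. Each observation is a (lat, lon, time) tuple."""
    def haversine_km(lat1, lon1, lat2, lon2):
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    pairs = []
    for i, (la, lo, ta) in enumerate(obs_a):
        for j, (lb, lj, tb) in enumerate(obs_b):
            close_in_space = haversine_km(la, lo, lb, lj) <= max_km
            close_in_time = abs((ta - tb).total_seconds()) <= max_minutes * 60
            if close_in_space and close_in_time:
                pairs.append((i, j))
    return pairs

# Hypothetical footprints: one MLS-like profile and two imager pixels
t0 = datetime(2008, 1, 1, 12, 0)
mls = [(0.0, 100.0, t0)]
modis = [(0.2, 100.1, t0 + timedelta(minutes=5)),   # ~25 km away: accepted
         (5.0, 100.0, t0 + timedelta(minutes=5))]   # ~550 km away: rejected
matches = colocate(mls, modis)
```

Tightening `max_km` and `max_minutes` trades sample size against the risk of comparing different cloud scenes, which is the central tension in any co-location study.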
56

Förstudie till implementering av ISO 14001:2015 : Examensarbete 2020 / Framework for implementing ISO 14001:2015 : Bachelor Thesis 2020

Holmström, Fredrik, Hoffsten, Jakob January 2020 (has links)
Purpose – The purpose of this study is to examine and identify critical factors in implementing ISO 14001, as well as solutions to overcome them. The results are used to create a framework for small businesses to use when implementing the environmental management system ISO 14001:2015. Method – To reach the purpose of the study, a literature review was conducted in order to establish a theoretical framework, whose purpose was to identify critical factors when implementing ISO 14001:2015. A case study was also conducted at a manufacturing company in order to collect empirical data; the methods used were interviews, surveys, and observations. The empirical data and the theoretical framework were then compared and analysed to answer the study's purpose and research questions. Findings – In the theoretical framework and the case study, five critical factors were identified: knowledge and competence; involvement of employees; resources; identification of environmental goals and the company's environmental impact; and corporate culture. The commitment of senior management was also identified as a critical factor, but it was strongly related to knowledge and competence and was therefore included in that factor. Solutions to each critical factor were identified, and it was concluded that the critical factors are strongly interrelated and their solutions similar: training and education of employees was a key solution to overcoming them.
Implications – The study can be used as a framework and provide help for small businesses that want to implement ISO 14001:2015. It has identified critical factors in the implementation and how to overcome them, regardless of the industry in which a company operates, so it can be used by different types of companies. Limitations – Because of the restrictions imposed on society by the COVID-19 pandemic, the number of observations and interviews that could be conducted was limited, which negatively affected the credibility of the study. More observations and interviews would have given a more credible result.
57

EFFICIENT CONFIDENCE SETS FOR DISEASE GENE LOCATIONS

Sinha, Ritwik 19 March 2007 (has links)
No description available.
58

Automatic segmentation and reconstruction of traffic accident scenarios from mobile laser scanning data

Vock, Dominik 08 May 2014 (has links) (PDF)
Virtual reconstruction of historic sites, planning of restorations and attachments of new building parts, and forest inventory are a few examples of fields that benefit from 3D surveying data. Compared with the original 2D photo-based documentation and manual distance measurements, the 3D information obtained from multi-camera and laser scanning systems noticeably improves surveying times and the amount of generated 3D information, and it allows detailed post-processing and better visualization of all relevant spatial information. Yet extracting the required information from the raw scan data and generating usable visual output still require time-consuming, complex, user-driven processing in the commercially available 3D software tools. In this context, automatic object recognition from 3D point cloud and depth data has been discussed in many works. The developed tools and methods, however, usually focus only on a certain kind of object or on the detection of learned invariant surface shapes. Although the resulting methods are applicable to certain data segmentation tasks, they are not necessarily suitable for arbitrary tasks, owing to the varying requirements of different fields of research. This thesis presents a more broadly applicable solution for automatic scene reconstruction from 3D point clouds, targeting street scenarios, specifically for the task of traffic accident scene analysis and documentation. The data, obtained by sampling the scene with a mobile scanning system, are evaluated, segmented, and finally used to generate detailed 3D information of the scanned environment. To this end, the work adapts and validates various existing approaches to laser scan segmentation for accident-relevant scene information, including road surfaces and markings, vehicles, walls, trees, and other salient objects.
The approaches are evaluated regarding their suitability and limitations for the given tasks, as well as the possibility of combining them with other procedures. The resulting knowledge is used to develop new algorithms and procedures that allow a satisfying segmentation and reconstruction of the scene, corresponding to the available sampling densities and precisions. Besides the segmentation of the point cloud data, this thesis presents different visualization and reconstruction methods to broaden the possible applications of the developed system for data export and use in third-party software tools.
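A recurring building block in such pipelines is separating the road surface from objects standing on it. As a hedged stand-in for the more elaborate segmentation methods the thesis evaluates — the scene, thresholds, and RANSAC variant below are invented for illustration — a basic RANSAC plane fit can split a synthetic street scan into ground and non-ground points:

```python
import numpy as np

def ransac_ground_plane(points: np.ndarray, n_iter=200, threshold=0.05, seed=0):
    """Fit a dominant plane to an (N, 3) point cloud by RANSAC and return a
    boolean inlier mask (points within `threshold` of the best plane)."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:              # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - p0) @ normal)   # point-to-plane distances
        mask = dist < threshold
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

# Synthetic scene: a flat road at z ~ 0 plus a box-shaped "vehicle" above it
rng = np.random.default_rng(1)
road = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
                        rng.normal(0, 0.01, 500)])
vehicle = np.column_stack([rng.uniform(4, 6, 100), rng.uniform(4, 6, 100),
                           rng.uniform(0.5, 1.5, 100)])
cloud = np.vstack([road, vehicle])
ground = ransac_ground_plane(cloud)
```

Points left outside the ground mask can then be clustered and classified, which is where the object-specific methods discussed in the thesis take over.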
59

Les nouvelles techniques de billetterie pour augmenter les revenus des clubs professionnels de football en France / Increase Matchday revenues for French football club

Perri, Pascal 06 July 2017 (has links)
Professional football has become a major entertainment industry for TV networks and for the companies that run clubs, and TV rights represent at least 50% of French clubs' income. Meanwhile, most clubs have neglected Matchday revenues and the ancillary income of the ticket office, which constitute an important source of growth. A large majority of French clubs do not own their stadiums, and many play in (too) large stadiums with overcapacity relative to average attendance; this is why average ticket prices in French Ligue 1 are below the European average compared with the other major championships in Europe. In this respect, we suggest long-term emphyteutic leases between public owners and football companies in order to transfer both ownership and the ability to refit arenas and stadiums. In addition, French football companies have not yet fully used the variable and dynamic pricing techniques applied in other sectors such as transport, hotels, and leisure parks. They should also deploy CRM (Customer Relationship Management) resources to improve customer knowledge, segment the offer, and address each of the stadium's audiences — fans, season-ticket holders, walk-in customers, and families — with the aim of hitting, as closely as possible, the willingness to pay of each category. In a business founded on uncertain results but certain production costs, digital resources help retain the different categories of fans, improve customer traceability, sustain cross-selling, and increase the average basket, as they have in comparable sectors such as air transport, leisure parks, hotels, and resorts. We have tested such policies for season-ticket holders in the French third division, and we make proposals, backed by concrete experiments, to strengthen Matchday revenues in the French professional football industry.

Automatic segmentation and reconstruction of traffic accident scenarios from mobile laser scanning data

Vock, Dominik 18 December 2013 (has links)
Virtual reconstruction of historic sites, planning of restorations and building extensions, and forest inventory are a few examples of fields that benefit from 3D surveying data. Compared with the original 2D photo-based documentation and manual distance measurements, the 3D information obtained from multi-camera and laser scanning systems brings a noticeable improvement in surveying time and in the amount of generated 3D information, allowing detailed post-processing and better visualization of all relevant spatial information. Yet extracting the required information from the raw scan data and generating usable visual output still requires time-consuming, complex manual processing with the commercially available 3D software tools. In this context, automatic object recognition from 3D point clouds and depth data has been discussed in many works. The resulting tools and methods, however, usually focus on one kind of object or on the detection of learned invariant surface shapes. Although such methods are applicable to certain data-segmentation tasks, they are not necessarily suitable for arbitrary tasks, given the varying requirements of different fields of research. This thesis presents a more broadly applicable solution for automatic scene reconstruction from 3D point clouds, targeting street scenarios and, specifically, the analysis and documentation of traffic accident scenes. The data, obtained by sampling the scene with a mobile scanning system, is evaluated, segmented, and finally used to generate detailed 3D information of the scanned environment. To this end, the work adapts and validates various existing approaches to laser scan segmentation for accident-relevant scene content, including road surfaces and markings, vehicles, walls, trees and other salient objects.
The approaches are evaluated regarding their suitability and limitations for the given tasks, as well as their potential for combination with other procedures. The resulting insights drive the development of new algorithms and procedures that allow a satisfactory segmentation and reconstruction of the scene at the available sampling densities and precisions. Beyond the segmentation of the point cloud data, the thesis presents different visualization and reconstruction methods to broaden the possible applications of the developed system, including data export to and use in third-party software tools.
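A common first step in the kind of road-surface segmentation described above is fitting a dominant ground plane to the point cloud. The sketch below shows one standard technique for this, RANSAC plane fitting; it is a minimal illustration of the general approach, not the thesis's algorithm, and the iteration count and inlier threshold are assumed values.

```python
import numpy as np

def ransac_ground_plane(points, n_iters=200, threshold=0.05, seed=0):
    """Fit the dominant plane (e.g. a road surface) in an N x 3 point
    cloud with RANSAC. Returns (unit normal, offset d, inlier mask),
    where inliers satisfy |p . normal + d| < threshold."""
    rng = np.random.default_rng(seed)
    best_mask, best_model = None, None
    for _ in range(n_iters):
        # Hypothesize a plane from three random points.
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample, skip
        normal /= norm
        d = -normal @ p1
        # Score the hypothesis by counting points near the plane.
        mask = np.abs(points @ normal + d) < threshold
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (normal, d)
    return best_model[0], best_model[1], best_mask
```

Once the ground inliers are removed, the remaining points cluster naturally into the salient objects listed above (vehicles, walls, trees), which is where object-specific segmentation methods take over.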
