51

Improving The Sub-cortical GM Segmentation Using Evolutionary Hierarchical Region Merging

Ciftcioglu, Mustafa Ulas 01 June 2011 (has links) (PDF)
Segmentation of sub-cortical Gray Matter (GM) structures in magnetic resonance brain images is crucial in clinical practice and research for many purposes, such as early diagnosis of neurological diseases, guidance of surgical operations and longitudinal volumetric studies. Unfortunately, algorithms that segment the brain into three tissues usually suffer from poor performance in the sub-cortical region. In order to improve the detection of sub-cortical GM structures, an evolutionary hierarchical region merging approach, abbreviated as EHRM, is proposed in this study. EHRM uses intensity-based region merging in which merging is allowed to proceed among disconnected regions. Texture information is also incorporated into the scheme to prevent region merging between tissues with similar intensity but different texture properties. The proposed algorithm is tested on real and simulated datasets, and its performance is compared with a popular segmentation algorithm that is also intensity driven: the FAST algorithm [1] in the widely used FSL suite. EHRM is shown to significantly improve the detection of sub-cortical GM structures, with average improvements of 10%, 36% and 22% for the caudate, putamen and thalamus, respectively. The accuracy of volumetric estimations also increases for GM and White Matter (WM). The performance of EHRM is robust in the presence of bias field, and EHRM operates in O(N) complexity. Furthermore, the proposed algorithm is simple, because it does not incorporate spatial priors such as an atlas image, or intensity priors. With these features, EHRM may become a favorable alternative to existing brain segmentation tools.
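The abstract only outlines the merging criterion. As a rough illustration of intensity-driven region merging that is allowed to join spatially disconnected regions, here is a minimal sketch; the function name, the greedy pairwise search and the `max_diff` stopping threshold are assumptions of mine, and the texture term and the O(N) hierarchical bookkeeping of the actual EHRM algorithm are omitted.

```python
import numpy as np

def merge_regions_by_intensity(labels, image, max_diff=10.0):
    """Greedy intensity-driven region merging (illustrative sketch only).

    Regions are compared purely by mean intensity, so merging may join
    spatially disconnected regions, as in the EHRM idea; texture checks
    and the evolutionary/hierarchical machinery are omitted.
    """
    labels = labels.copy()
    ids = np.unique(labels)
    means = {int(i): float(image[labels == i].mean()) for i in ids}
    sizes = {int(i): int((labels == i).sum()) for i in ids}

    while len(means) > 1:
        # find the pair of regions with the most similar mean intensity
        keys = sorted(means)
        best, best_diff = None, np.inf
        for idx, a in enumerate(keys):
            for b in keys[idx + 1:]:
                d = abs(means[a] - means[b])
                if d < best_diff:
                    best, best_diff = (a, b), d
        if best_diff > max_diff:
            break  # remaining regions are too dissimilar to merge
        a, b = best
        labels[labels == b] = a              # relabel voxels of b as a
        total = sizes[a] + sizes[b]          # update the running mean of a
        means[a] = (means[a] * sizes[a] + means[b] * sizes[b]) / total
        sizes[a] = total
        del means[b], sizes[b]
    return labels
```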
52

Scalable frameworks and algorithms for cluster ensembles and clustering data streams

Hore, Prodip 01 June 2007 (has links)
Clustering algorithms are an important tool for data mining and data analysis. They fall under the category of unsupervised learning algorithms, which group patterns without an external teacher or labels using some kind of similarity metric. Clustering algorithms are generally iterative and computationally intensive; for data sets larger than memory they incur disk accesses in every iteration, making them unacceptably slow. Data can instead be processed in memory-sized chunks to provide a scalable framework, and multiple processors may process chunks in parallel. The clustering solutions from the chunks together form an ensemble and can be merged to provide a global solution, so merging multiple clustering solutions, an ensemble, is central to such a scalable framework. Combining multiple clustering solutions, or partitions, is also important for obtaining a robust clustering solution, merging distributed clustering solutions, and providing a knowledge-reuse and privacy-preserving data mining framework. Here we address combining multiple clustering solutions in a scalable framework. We also propose algorithms for incrementally clustering large or very large data sets, including an algorithm that can cluster large data sets in a single pass; this algorithm is further extended to handle clustering of infinite data streams. Such incremental/online algorithms can be used for real-time processing, as they do not revisit data and can process data streams under the constraints of limited buffer size and computational time. Thus, different frameworks/algorithms are proposed to address scalability issues in different settings. To our knowledge we are the first to introduce algorithms for merging cluster ensembles that are scalable, in terms of time and space complexity, to large real-world data sets. We are also the first to introduce single-pass and streaming variants of the fuzzy c-means algorithm. We have evaluated the proposed frameworks/algorithms on both artificial and large real-world data sets, and a comparison with other relevant algorithms is discussed. These comparisons show the scalability and effectiveness of the partitions created by the new algorithms.
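As a sketch of the chunk-then-merge idea described above (not the author's fuzzy c-means variants), the following assumes plain weighted k-means and summarizes each memory-sized chunk by its weighted centroids before merging the ensemble; all function names and the choice of hard k-means are mine.

```python
import numpy as np

def weighted_kmeans(X, k, weights=None, iters=50, seed=0):
    """Plain weighted k-means (stand-in for the thesis's fuzzy c-means variants)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    w = np.ones(len(X)) if weights is None else np.asarray(weights, dtype=float)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = dist.argmin(axis=1)
        for j in range(k):
            members = assign == j
            if members.any():
                centers[j] = np.average(X[members], axis=0, weights=w[members])
    return centers, assign

def single_pass_cluster(chunks, k):
    """Cluster a large data set in one pass over memory-sized chunks.

    Each chunk is summarized by its weighted centroids; the global model is
    obtained by clustering the union of these summaries (an ensemble merge).
    """
    summary_pts, summary_w = [], []
    for chunk in chunks:                       # single pass over the data
        centers, assign = weighted_kmeans(chunk, k)
        counts = np.bincount(assign, minlength=k).astype(float)
        keep = counts > 0
        summary_pts.append(centers[keep])
        summary_w.append(counts[keep])
    merged = np.vstack(summary_pts)
    weights = np.concatenate(summary_w)
    global_centers, _ = weighted_kmeans(merged, k, weights=weights)
    return global_centers
```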
53

Αναδιάρθρωση τραπεζικών δικτύων μετά από εξαγορές ή συγχωνεύσεις με χρήση μοντέλων μαθηματικού προγραμματισμού / Restructuring of banking networks after acquisitions or mergers using mathematical programming models

Σακελλαρίου, Παναγιώτα 07 October 2014 (has links)
Ο τραπεζικός τομέας υφίσταται σημαντικές αλλαγές τα τελευταία χρόνια σε διεθνές επίπεδο, ως απόρροια των αλλαγών στο γενικότερο οικονομικό γίγνεσθαι. Ειδικότερα στη χώρα μας είναι έκδηλη τα τελευταία χρόνια η ανάγκη για αναπροσαρμογή του τρόπου λειτουργίας των πιστωτικών ιδρυμάτων, ώστε να προσαρμοστούν στις μεταβαλλόμενες απαιτήσεις της πελατειακής τους βάσης. Στόχος της παρούσας εργασίας είναι να προσεγγιστεί το ζήτημα της αναδιάρθρωσης των τραπεζικών δικτύων, όπως αυτά έχουν προκύψει μετά από εξαγορές ή συγχωνεύσεις, με απώτερο στόχο την βέλτιστη εξυπηρέτηση των πελατών με το μικρότερο δυνατό κόστος λειτουργίας. Για να επιτευχθεί η αναδιάρθρωση προτείνονται δύο μαθηματικά μοντέλα χωροθέτησης εγκαταστάσεων, που εξετάζουν το πρόβλημα με βάση δύο διαφορετικούς άξονες: το ένα στοχεύοντας στην ελαχιστοποίηση του κόστους και το δεύτερο στην μεγιστοποίηση της καλυπτόμενης ζήτησης με δεδομένο προϋπολογισμό. Για την επίλυση των μαθηματικών μοντέλων αξιοποιήθηκαν δεδομένα μεγάλου τραπεζικού ομίλου που δραστηριοποιείται στην Ελλάδα, επικεντρώνοντας στο υπάρχον εγκατεστημένο δίκτυο εντός του Δήμου Πατρέων. Από την εμπειρική εφαρμογή διαπιστώθηκε σαφής πλεονασμός του υπάρχοντος δικτύου σε μονάδες και προσωπικό, ενώ παράλληλα τα μοντέλα καταλήγουν σε διαφορετικές λύσεις χωροθέτησης για κάθε εκδοχή του προβλήματος. Με όποιο τρόπο όμως και να προσεγγιστεί το πρόβλημα της αναδιάρθρωσης, φαίνεται ότι τα αποτελέσματα αναφορικά με το πλήθος των μονάδων και του απασχολούμενου προσωπικού συγκλίνουν, εξασφαλίζοντας ταυτόχρονα σημαντική μείωση του κόστους λειτουργίας. / The banking sector has been undergoing significant changes in recent years at an international level, as a consequence of changes in the general economic environment. Especially in our country, the need to adjust the operation of credit institutions so that they adapt to the changing demands of their customer base has become obvious. The aim of this work is to approach the issue of restructuring banking networks, as they have arisen after mergers or acquisitions, with the goal of optimal customer service at the lowest possible operating cost. To achieve the restructuring, we propose two mathematical facility-location models that consider the problem along two different axes: the first aims at minimizing cost, and the second at maximizing the demand covered within a given budget. To solve the mathematical models we used data from a large banking group operating in Greece, focusing on its existing network within the Municipality of Patras. The empirical application showed a clear redundancy of the present network in both units and personnel, while the models yield different location solutions for each version of the problem. Whichever way the restructuring problem is approached, the results regarding the number of units and of employed personnel converge, while ensuring a significant reduction in operating costs.
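The abstract does not reproduce the two facility-location formulations. For orientation, a standard cost-minimizing model and a standard budget-constrained maximal-covering model of the kind described read as follows; the notation is mine, not the thesis's.

```latex
% x_j = 1 if a branch is kept open at candidate site j; y_{ij} = 1 if demand
% zone i is assigned to site j; d_i = demand of zone i; c_{ij} = service cost;
% f_j = operating cost of site j; a_{ij} = 1 if site j covers zone i; B = budget.

% (1) Cost minimization:
\min \sum_j f_j x_j + \sum_i \sum_j d_i c_{ij} y_{ij}
\quad \text{s.t.} \quad
\sum_j y_{ij} = 1 \;\; \forall i, \qquad
y_{ij} \le x_j \;\; \forall i,j, \qquad
x_j, y_{ij} \in \{0,1\}.

% (2) Maximal demand coverage under a budget:
\max \sum_i d_i z_i
\quad \text{s.t.} \quad
z_i \le \sum_j a_{ij} x_j \;\; \forall i, \qquad
\sum_j f_j x_j \le B, \qquad
x_j, z_i \in \{0,1\}.
```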
54

The transformation of the South African higher education sector through mergers - the case study of the Durban University of Technology

Chetty, Gopalkrishna January 2010 (has links)
Submitted in fulfilment of the requirements for the degree of Doctor of Technology: Human Resource Management, Durban University of Technology, 2010. / Since the advent of democracy in 1994, South African society has been undergoing a rapid transformation. By the time the second democratic elections had come around, the focus had turned to transforming and restructuring the higher education sector, and mergers became an instrument in the hands of the Government to restructure and configure the higher education landscape. Mergers are not new and have been used by many countries to transform or restructure their higher education sectors. However, comparison with higher education mergers in other countries points to this being the most ambitious change programme undertaken in recent times, especially given the large scale of mergers and incorporations planned for the South African higher education sector. The Durban University of Technology merger, which took place in 2002, preceded the main wave of Government-decreed mergers in 2004 and 2005. This thesis contributes to the knowledge of higher education mergers by investigating higher education mergers in South Africa from both a micro and a macro perspective. At the micro level, the study is a case study of the Durban University of Technology (DUT), the first merged higher education institution in the country; at the macro level, it examines the impact that mergers have had on the transformation of the higher education sector, undertaken through a perception survey of staff at merged institutions. The emphasis is on providing a detailed analysis of the DUT merger in terms of its problems, pitfalls and peculiarities. It covers an examination of the processes, procedures, practices and trials and tribulations when two or more higher education institutions merge. This study is guided by two central research questions. The first is: what can be learned about higher education mergers as examined through the Durban University of Technology (DUT) case study? The second is whether higher education mergers in South Africa have been successful or otherwise; in particular, this question seeks to establish whether the broader goals and objectives of higher education transformation have been, or are being, achieved through mergers in the sector. The literature review emphasized a conceptual understanding of higher education mergers, merger processes and steps. The review also touches on the limitations and gaps in the higher education literature when examined from a South African perspective. In particular, a detailed study of South African merger policy development was undertaken. The DUT merger experience allows one to draw a distinction between what are termed the technical merger issues and the soft issues. Technical aspects of a merger are simply those tasks that have to be completed to effect the physical merger of two or more institutions; in a sense, it is the soft issues which shape the merger process. These soft issues largely relate to people, the consultation with stakeholders, the negotiations and the cultural dimensions. Despite the assumption that voluntary mergers are generally easier to negotiate and execute because of the perceived greater involvement of stakeholders, the DUT case provides no evidence to support this position.
If anything, the DUT merger, although voluntary, was at best difficult and beset with people and human-relations problems. The findings from the case study point to the following: Government-initiated (forced) mergers are less likely to fail because of a greater national agenda; an all-embracing new style of leadership is needed when institutions undergo great changes such as a merger; where trade unions exist, they will become powerful forces if people-management issues are neglected or dealt with poorly during the merger, so a genuinely co-operative and consultative process is the way to go; to overcome past cultural differences, a new cultural identity must be established early in the merger by sharing the vision and mission of the new entity; and mergers can be understood and managed in three distinct phases, namely the pre-merger phase, the integration phase and the consolidation phase. Equally, there are a number of merger steps. By reviewing the processes, procedures and practices of the DUT merger, a model has been developed to understand how mergers take place. In respect of the perception survey of merged institutions, respondents saw some benefits accrue to academic goals, and there were some efficiency gains. More students came into the higher education system. Not many respondents were complimentary about the state of physical resources, including teaching and learning facilities. Some even felt that teaching and learning were set back during the merger period, and this is backed up by key statistics such as pass rates and dropout rates. Given the difficulty with the softer issues in mergers, much of the blame seems to have been directed at management and leadership. Nearly 50% of respondents felt that mergers did not result in high-quality management, while a slightly lower percentage of 42% thought that mergers did not help establish high-quality Councils. Many thought that their governance structures and systems were also weakened. Mergers are complex and invariably affect the entire institution. They require careful planning and preparation, inclusivity and a developmental approach, as well as effective leaders to manage change of such magnitude. These are the ingredients for ensuring successful higher education mergers.
55

Assisting in the reuse of existing materials to build adaptive hypermedia

Zemirline, Nadjet 12 July 2011 (has links) (PDF)
Nowadays, there is a growing demand for personalization, and the "one-size-fits-all" approach for hypermedia systems is no longer applicable. Adaptive hypermedia (AH) systems adapt their behavior to the needs of individual users. However, due to the complexity of their authoring process and the different skills required from authors, only a few such systems have been proposed. In recent years, numerous efforts have been made to assist authors in creating their own AH; however, as explained in this thesis, some problems remain. In this thesis, we tackle two particular problems. The first concerns the integration of authors' materials (information and user profile) into the models of existing systems, thus allowing authors to directly reuse existing reasoning and execute it on their materials. We propose a semi-automatic merging/specialization process to integrate an author's model into a model of an existing system. Our objectives are twofold: to support the definition of mappings between elements of an existing system's model and elements of the author's model, and to help create consistent and relevant models that integrate the two models while taking the mappings between them into account. The second problem concerns the adaptation specification, which is famously the hardest part of the authoring process of adaptive web-based systems. We propose the EAP framework with three main contributions: a set of elementary adaptation patterns for adaptive navigation, a typology organizing the proposed elementary adaptation patterns, and a semi-automatic process to generate adaptation strategies based on the use and combination of patterns. Our objective is to allow adaptation strategies to be defined easily at a high level by combining simple ones. Furthermore, we have studied the expressivity of some existing solutions for specifying adaptation versus the EAP framework, discussing, based on this study, the pros and cons of various decisions in terms of the ideal way of defining an adaptation language. We propose a unified vision of adaptation and adaptation languages, based on the analysis of these solutions and our framework, as well as a study of adaptation expressivity and of the interoperability between them, resulting in an adaptation typology. The unified vision and adaptation typology are not limited to the solutions analysed and can be used to compare and extend other approaches in the future. Besides these theoretical qualitative studies, this thesis also describes implementations and experimental evaluations of our contributions in an e-learning application.
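The elementary adaptation patterns themselves are not listed in the abstract. As a purely hypothetical sketch of how patterns for adaptive navigation might be represented and combined into a strategy, consider the following; the pattern names, the user-model shape and the condition/effect split are all my assumptions, not the EAP framework's definitions.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Set

UserModel = Dict[str, Set[str]]   # e.g. {"known": {...}, "visited": {...}}

@dataclass
class AdaptationPattern:
    """One elementary adaptation pattern: a condition plus a navigation effect."""
    name: str
    applies: Callable[[UserModel, str], bool]   # (user model, link target) -> bool
    effect: str                                  # e.g. "hide", "annotate"

# Two hypothetical elementary patterns for adaptive navigation.
hide_if_not_ready = AdaptationPattern(
    "hide-if-concept-not-known",
    lambda user, target: target not in user["known"],
    "hide",
)
annotate_if_visited = AdaptationPattern(
    "annotate-if-already-visited",
    lambda user, target: target in user["visited"],
    "annotate",
)

def apply_strategy(patterns: List[AdaptationPattern], user: UserModel,
                   links: List[str]) -> Dict[str, List[str]]:
    """Combine elementary patterns into an adaptation strategy over a link list."""
    decisions: Dict[str, List[str]] = {link: [] for link in links}
    for link in links:
        for pattern in patterns:
            if pattern.applies(user, link):
                decisions[link].append(pattern.effect)
    return decisions

# Usage sketch:
# user = {"known": {"html"}, "visited": {"intro"}}
# apply_strategy([hide_if_not_ready, annotate_if_visited], user, ["intro", "css"])
```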
56

Introdução de dados auxiliares na classificação de imagens digitais de sensoriamento remoto aplicando conceitos da teoria da evidência

Lersch, Rodrigo Pereira January 2008 (has links)
Nesta tese investiga-se uma nova abordagem visando implementar os conceitos propostos na Teoria da Evidência para fins de classificação de imagens digitais em Sensoriamento Remoto. Propõe-se aqui a utilização de variáveis auxiliares, estruturadas na forma de Planos de Informação (P.I.s) como em um SIG, para gerar dados de confiança e de plausibilidade. São então aplicados limiares aos dados de confiança e de plausibilidade, com a finalidade de detectar erros de inclusão e de omissão, respectivamente, na imagem temática. Propõe-se nesta tese que estes dois limiares sejam estimados em função das acurácias do usuário e do produtor. A metodologia proposta nesta tese foi testada em uma área teste, coberta pela classe Mata Nativa com Araucária. O experimento mostrou que a metodologia aqui proposta atinge seus objetivos. / In this thesis we investigate a new approach to applying concepts from the Theory of Evidence to Remote Sensing digital image classification. In the proposed approach, auxiliary variables are structured as layers in a GIS-like format to produce layers of belief and plausibility. Thresholds are applied to the layers of belief and plausibility to detect errors of commission and omission, respectively, on the thematic image. The thresholds are estimated as functions of the user's and producer's accuracy. Preliminary tests were performed over an area covered by natural forest with Araucaria, showing some promising results.
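For readers unfamiliar with the evidential terms used above, a minimal sketch of how per-pixel belief and plausibility follow from a mass assignment on the frame {target, other}, and how thresholding them flags commission and omission errors, might look as follows; the threshold values shown are placeholders, whereas the thesis estimates them from the user's and producer's accuracies.

```python
import numpy as np

def belief_plausibility(m_target, m_other, m_uncertain):
    """Per-pixel belief and plausibility for the class of interest.

    Masses over the frame {target, other}: m({target}), m({other}) and
    m({target, other}); at each pixel they should sum to 1.
    Bel(target) = m({target});  Pl(target) = m({target}) + m({target, other}).
    """
    belief = m_target
    plausibility = m_target + m_uncertain
    return belief, plausibility

def flag_errors(mapped_as_target, belief, plausibility,
                belief_threshold=0.6, plausibility_threshold=0.4):
    """Flag likely classification errors on the thematic image (sketch only).

    Pixels mapped to the target class with low belief are candidate commission
    errors; pixels not mapped to the target class but with high plausibility
    are candidate omission errors.
    """
    commission = mapped_as_target & (belief < belief_threshold)
    omission = (~mapped_as_target) & (plausibility > plausibility_threshold)
    return commission, omission
```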
58

Automatické spojování mračen bodů / Automatic Point Clouds Merging

Hörner, Jiří January 2018 (has links)
Multi-robot systems are an established research area with a growing number of applications. Efficient coordination in such systems usually requires knowledge of robot positions and the global map. This work presents a novel map-merging algorithm for merging 3D point cloud maps in multi-robot systems, which produces the global map and estimates robot positions. The algorithm is based on feature-matching transformation estimation with a novel descriptor matching scheme and works solely on point cloud maps without any additional auxiliary information. The algorithm can work with different SLAM approaches and sensor types and it is applicable in heterogeneous multi-robot systems. The map-merging algorithm has been evaluated on real-world datasets captured by both aerial and ground-based robots with a variety of stereo rig cameras and active RGB-D cameras. It has been evaluated in both indoor and outdoor environments. The proposed algorithm was implemented as a ROS package and it is currently distributed in the ROS distribution. To the best of my knowledge, it is the first ROS package for map-merging of 3D maps.
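The package's descriptor matching scheme is not described in detail here. As an illustration of only the final step — estimating a rigid transform from already-matched 3D keypoints and applying it to merge two point cloud maps — a self-contained numpy sketch (function names are mine) could look like this:

```python
import numpy as np

def rigid_transform_from_matches(src, dst):
    """Least-squares rigid transform (R, t) aligning matched points src -> dst.

    src, dst: (N, 3) arrays of corresponding keypoints from the two maps.
    Classic Kabsch/SVD solution; the descriptor matching and outlier
    rejection performed by the actual map-merging package are omitted.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def merge_maps(cloud_a, cloud_b, keypoints_a, keypoints_b):
    """Express cloud_a in cloud_b's frame and concatenate the two maps."""
    R, t = rigid_transform_from_matches(keypoints_a, keypoints_b)
    return np.vstack([cloud_a @ R.T + t, cloud_b])
```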
59

Merging meshes using dynamic regular triangulation / Combinação de malhas utilizando triangulações regulares dinâmicas

Silva, Luis Fernando Maia Santos January 2010 (has links)
Malhas simpliciais são utilizadas em várias áreas da Computação Gráfica e Engenharia, como por exemplo, em visualização, simulação, prototipação, além de outras aplicações. Este tipo de malha é, geralmente, utilizado como aproximação discreta de espaços contínuos, onde oferece representações flexíveis e eficientes. Muito esforço é gasto visando gerar malhas de boa qualidade, porém, em alguns casos as malhas acabam sendo modificadas. Entretanto, este tipo de operação é geralmente custosa e inflexível, o que pode resultar na geração de malhas bem diferentes das originais. A habilidade de manipular cenas dinâmicas revela-se um dos problemas mais desafiadores da computação gráfica. Este trabalho propõe um método alternativo para atualizar malhas simpliciais que vai além de mudanças geométricas e topológicas. Tal método explora uma das propriedades das Triangulações de Delaunay com Pesos, que permite usá-las para definir implicitamente as relações de conectividade de uma malha. Ao contrário de manter as informações de conectividade explicitamente, a atual abordagem simplesmente armazena uma coleção de pesos associados a cada vértice. Além disso, criamos um algoritmo para calcular uma Triangulação de Delaunay com Pesos a partir de uma dada triangulação. O algoritmo consiste em uma busca em largura que atribui pesos aos vértices, e uma estratégia de subdivisão para assegurar que a triangulação reconstruída será correspondente à original. Este método apresenta diversas aplicações e, em particular, permite a criação de um sistema simples de realizar combinação entre triangulações, que será ilustrada com exemplos em 2D e 3D. / Simplicial meshes are used in many fields of Computer Graphics and Engineering, for instance, in visualization, simulation, prototyping, among other applications. This kind of mesh is often used as a discrete approximation of continuous spaces, where it offers flexible and efficient representations. Considerable effort is spent in generating good-quality meshes, but in some applications the meshes can be modified over time. However, this kind of operation is often very expensive and inflexible, sometimes leading to results very different from the original meshes. The ability to handle dynamic scenes reveals itself as one of the most challenging problems in computer graphics. This work proposes an alternative technique for updating simplicial meshes that undergo geometric and topological changes. It explores the property that a Weighted Delaunay Triangulation (WDT) can be used to implicitly define the connectivity of a mesh. Instead of explicitly maintaining connectivity information, this approach simply keeps a collection of weights associated with each vertex. It consists of an algorithm to compute a WDT from any given triangulation, which relies on a breadth-first traversal to assign weights to vertices, and a subdivision strategy to ensure that the reconstructed triangulation conforms with the original one. This technique has many applications and, in particular, it allows for a very simple method of merging triangulations, which is illustrated with both 2D and 3D examples.
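The weighted-Delaunay machinery itself is beyond a short snippet. As a deliberately naive 2D illustration of the merge-by-retriangulation idea only (it does not preserve the original connectivity, which is precisely what the thesis's per-vertex weights achieve), one could write:

```python
import numpy as np
from scipy.spatial import Delaunay

def merge_triangulations(points_a, points_b):
    """Naive merge of two 2D meshes by retriangulating the combined vertices.

    Unlike the weighted Delaunay approach described above, a plain Delaunay
    triangulation of the union will generally not reproduce the connectivity
    of either input mesh; this only sketches the overall merge step.
    """
    merged = np.vstack([points_a, points_b])
    merged = np.unique(np.round(merged, decimals=9), axis=0)  # drop duplicates
    tri = Delaunay(merged)
    return merged, tri.simplices   # vertex array and (M, 3) triangle indices
```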
