  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Using Ensemble Machine Learning Methods in Estimating Software Development Effort

Kanneganti, Alekhya, January 2020
Background: Software development effort estimation is the process of estimating the effort required to develop a software project within a minimal budget. Estimating effort includes interpretation of the required manpower, resources, time and schedule. Project managers are responsible for estimating the required effort. A model that can predict software development effort efficiently comes in handy and acts as a decision support system for project managers, enhancing the precision of effort estimates. The context of this study is therefore to increase efficiency in estimating software development effort. Objective: The main objective of this thesis is to identify an effective ensemble method, build and implement it, and use it to estimate software development effort. In addition, parameter tuning is applied to improve the performance of the model. Finally, we compare the results of the developed model with existing models. Method: We adopted two research methods. First, a literature review was conducted to gain knowledge of existing studies, machine learning techniques, datasets and ensemble methods previously used in estimating software development effort. Then a controlled experiment was conducted to build an ensemble model and evaluate its performance, to determine whether the developed model performs better than existing models. Results: After conducting the literature review and collecting evidence, we decided to build and implement a stacked generalization ensemble method, using individual machine learning techniques: Support Vector Regressor (SVR), K-Nearest Neighbors Regressor (KNN), Decision Tree Regressor (DTR), Linear Regressor (LR), Multi-Layer Perceptron Regressor (MLP), Random Forest Regressor (RFR), Gradient Boosting Regressor (GBR), AdaBoost Regressor (ABR) and XGBoost Regressor (XGB).
We also implemented Randomized Parameter Optimization and the SelectKBest function for feature selection. The COCOMO81, MAXWELL, ALBRECHT and DESHARNAIS datasets were used. Results of the experiment show that the developed ensemble model performs best on three of the four datasets. Conclusion: After evaluating and analyzing the results, we conclude that the developed model works well with datasets that have continuous, numeric values, and that the developed ensemble model outperforms existing models on the COCOMO81, MAXWELL and ALBRECHT datasets.
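The stacked generalization the abstract describes can be sketched with scikit-learn's StackingRegressor. The snippet below is a minimal illustration, not the thesis's actual pipeline: the data is synthetic, hyperparameters are defaults, and only a subset of the listed base learners is included (the thesis additionally tuned hyperparameters with randomized search).

```python
# Minimal sketch of stacked generalization for effort estimation with
# scikit-learn. The data is synthetic; the thesis used the COCOMO81,
# MAXWELL, ALBRECHT and DESHARNAIS datasets.
import numpy as np
from sklearn.ensemble import (StackingRegressor, RandomForestRegressor,
                              GradientBoostingRegressor)
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.random((100, 5))                         # project features
y = X.sum(axis=1) + rng.normal(0, 0.1, 100)      # synthetic "effort"

base_learners = [
    ("svr", SVR()),
    ("knn", KNeighborsRegressor()),
    ("dtr", DecisionTreeRegressor(random_state=0)),
    ("rfr", RandomForestRegressor(n_estimators=50, random_state=0)),
    ("gbr", GradientBoostingRegressor(random_state=0)),
]
# A meta-learner combines the base learners' out-of-fold predictions.
stack = StackingRegressor(estimators=base_learners,
                          final_estimator=LinearRegression())
stack.fit(X, y)
preds = stack.predict(X)
```

The meta-learner sees cross-validated predictions from each base model, which is what distinguishes stacking from simple averaging ensembles.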

Effort Modeling and Programmer Participation in Open Source Software Projects

Koch, Stefan, January 2005
This paper analyses and develops models for programmer participation and effort estimation in open source software projects. This has not yet been a centre of research, although such results would be of high importance for assessing the efficiency of this development model and for various decision-makers. In this paper, a case study is used to generate hypotheses regarding the manpower function and effort modeling; a large data set retrieved from a project repository is then used to test these hypotheses. The main results are that Norden-Rayleigh-based approaches need to be complemented to account for the addition of new features during the life cycle if they are to be usable in this context, and that effort models based on programmer participation yield significantly lower effort figures than those based on output metrics such as lines of code. (author's abstract) / Series: Working Papers on Information Systems, Information Business and Operations
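The Norden-Rayleigh manpower model the abstract refers to has a simple closed form; a minimal sketch, with purely illustrative parameter values:

```python
# Sketch of the Norden-Rayleigh manpower model discussed above.
# K (total effort) and a (shape) are purely illustrative values.
import math

def rayleigh_manpower(t, K=100.0, a=0.02):
    """Instantaneous staffing m(t) = 2*K*a*t*exp(-a*t**2)."""
    return 2.0 * K * a * t * math.exp(-a * t * t)

def cumulative_effort(t, K=100.0, a=0.02):
    """Effort spent up to time t: E(t) = K*(1 - exp(-a*t**2))."""
    return K * (1.0 - math.exp(-a * t * t))

# Staffing rises, peaks at t = 1/sqrt(2*a) and then decays; the paper
# argues this curve must be complemented when new features keep being
# added over the project's life cycle.
peak_t = 1.0 / math.sqrt(2 * 0.02)
```

The fixed-scope assumption behind the decaying tail is exactly what the paper finds inadequate for open source projects with continuous feature addition.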

Integrating dependencies into the technology portfolio: a feed-forward case study for near-earth asteroids

Taylor, Christianna Elizabeth, 15 November 2011
Technology portfolios are essential to the evolution of large complex systems. In an effort to harness the power of new technologies, technology portfolios are used to predict the value of integrating them into a project. Optimizing the technology portfolio creates a large, complex design space; however, many processes assume that their technology elements have no dependency on each other, because dependencies are not well defined. This independence assumption simplifies the process, but means these environments lose decision power and fidelity. This thesis therefore proposes a way to explain the variation in portfolio recommendations as a function of adding dependencies. Dependencies were defined in accordance with their development-effort figures of merit and possible relationships. The thesis then designed a method to integrate two dependency classes into the technology portfolio framework to show the effect of incorporating dependencies. Results indicated that constraint dependencies either reduced the portfolio or left it unchanged, while value dependencies changed the portfolio optimization completely, requiring the user to compare two different optimization results. Both classes provided higher fidelity through the information they added. Furthermore, the upcoming NASA Near-Earth Asteroid Campaign was studied as a case study. This campaign, announced by President Obama in April 2010, plans to send humans to an asteroid by 2025. It involves multiple missions, capabilities and technologies that must be demonstrated to enable deep-space human exploration. This thesis capitalized on that intention to show how adopting technology in earlier missions can act as a feed-forward method to demonstrate technology for future missions.
The thesis presented the baseline technology portfolio, integrated dependencies into the process, compared the findings to the baseline case, and ultimately showed how adding higher fidelity to the process changes the user's decisions. Findings concerning the Near-Earth Asteroid Campaign, the use of dependencies to add fidelity, and implications for future work are discussed.
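The effect of a constraint dependency on portfolio selection can be shown with a toy brute-force optimizer. The technologies, values, costs and the `requires` relation below are all invented for illustration; the thesis's actual framework is far richer.

```python
# Toy illustration of a constraint dependency in technology portfolio
# selection. Technologies, values, costs and the 'requires' relation
# are invented for this sketch.
from itertools import combinations

techs = {"A": (5, 3), "B": (8, 4), "C": (6, 2)}   # name: (value, cost)
requires = {"B": "A"}                              # B depends on A
budget = 7

def feasible(portfolio):
    cost_ok = sum(techs[t][1] for t in portfolio) <= budget
    deps_ok = all(req in portfolio
                  for t, req in requires.items() if t in portfolio)
    return cost_ok and deps_ok

def best_portfolio():
    best, best_value = set(), 0
    for r in range(len(techs) + 1):
        for combo in combinations(techs, r):
            value = sum(techs[t][0] for t in combo)
            if feasible(combo) and value > best_value:
                best, best_value = set(combo), value
    return best, best_value

# Without the dependency, {B, C} (value 14, cost 6) would win; the
# constraint forces {A, B} instead, shrinking the feasible space.
p, v = best_portfolio()
print(sorted(p), v)  # ['A', 'B'] 13
```

This mirrors the abstract's finding that constraint dependencies can only shrink or preserve the selected portfolio, never expand it.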

Addressing software-managed cache development effort in GPGPUs

Lashgar, Ahmad, 29 August 2017
GPU computing promises very high performance per watt for highly parallelizable workloads. Various programming models have been developed to exploit the computational power of GPGPUs. Low-level programming models provide full control over GPU resources and allow programmers to achieve the peak performance of the chip. In contrast, high-level programming models hide GPU-specific programming details and allow programmers to mainly express parallelism; the compiler then parses the parallelization annotations and translates them to a low-level programming model. This saves tremendous development effort and improves productivity, though often at the cost of performance. In this dissertation, we investigate the limitations of high-level programming models in achieving performance near that of low-level models. Specifically, we study the performance and productivity gap between the high-level OpenACC and low-level CUDA programming models, and aim to reduce the performance gap while maintaining the productivity advantages. We start by developing our in-house OpenACC compiler. Our compiler, called IPMACC, translates OpenACC for C to CUDA and uses the system compiler to generate GPU binaries. We develop various micro-benchmarks to understand GPU structure and implement a more efficient OpenACC compiler. Using IPMACC, we evaluate the performance and productivity gap across a wide set of OpenACC and CUDA kernels. From our findings, we conclude that one of the major reasons behind the large performance gap between OpenACC and CUDA is CUDA's flexibility in exploiting the GPU software-managed cache. Identifying this key benefit of low-level CUDA, we follow three effective paths to utilizing the software-managed cache as CUDA does, but at a lower development effort (e.g. using OpenACC instead). In the first path, we explore the possibility of employing existing OpenACC directives to utilize the software-managed cache.
Specifically, the cache directive is provided in the OpenACC API standard to allow the use of the software-managed cache in GPUs. We introduce an efficient implementation of the OpenACC cache directive that performs very close to CUDA. However, we show that the use of the cache directive is limited, and that the directive may not offer the full functionality associated with the software-managed cache as it exists in CUDA. In the second path, we build on our observations of the cache directive's limitations and propose a new OpenACC directive, called the fcw directive, to address its shortcomings while maintaining OpenACC's productivity advantages. We show that the fcw directive overcomes the cache directive's limitations and significantly narrows the performance gap between CUDA and OpenACC. In the third path, we propose a fully automated hardware/software approach, called TELEPORT, for software-managed cache programming. On the software side, TELEPORT statically analyzes CUDA kernels and identifies opportunities to utilize the software-managed cache. The required information is passed to the GPU via API calls. Based on this information, on the hardware side, TELEPORT prefetches the data to the software-managed cache at runtime. We show that TELEPORT can improve performance by 32% on average, while lowering the development effort by 2.5X compared to hand-written CUDA equivalents.

Functional Similarity Impact On The Relation Between Functional Size And Software Development Effort

Ozcan Top, Ozden, 01 September 2008
In this study, we identified one reason for the low correlation between functional size and development effort: overlooking the similarity of functions when mapping functional size to development effort. We developed a methodology (SiRFuS), based on the idea of internal reuse of similar functions, to provide a higher correlation between functional size and development effort. The method identifies similar functions based on the method of Santillo and Abran. Similarity percentages among the functional processes and similarity reflective functional sizes are computed to obtain adjusted functional sizes. The similarity reflective functional sizes are named Discrete Similarity Reflective Functional Size and Continuous Similarity Reflective Functional Size, based on the characteristics of the adjusted functional sizes. The SiRFuS method consists of three stages: measurement of the software product with the COSMIC Functional Size Measurement (FSM) method; identification of functional similarities based on the measurement results; and calculation of the similarity reflective functional sizes. To facilitate the detection of similar functions and the calculation of similarity percentages and similarity reflective functional sizes, a software tool was developed based on the SiRFuS method. Two case studies were performed to identify improvement opportunities and evaluate the applicability of the method and the tool.
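As a rough illustration of the idea only (these are not the actual SiRFuS formulas), similarity between two functional processes could be taken as the overlap of their COSMIC data movements, with the adjusted size discounting that overlap:

```python
# Illustrative only: NOT the exact SiRFuS formulas. Similarity between
# two functional processes is taken as the Jaccard overlap of their
# COSMIC data movements; the adjusted size discounts each process by
# its best match among processes already sized (internal reuse).
def similarity(p, q):
    """Fraction of shared data movements between two processes."""
    p, q = set(p), set(q)
    return len(p & q) / len(p | q) if p | q else 0.0

def adjusted_size(processes):
    sized, total = [], 0.0
    for p in processes:
        max_sim = max((similarity(p, q) for q in sized), default=0.0)
        total += len(p) * (1.0 - max_sim)   # discount the reused part
        sized.append(p)
    return total

# Two identical processes and one distinct one: the duplicate adds 0,
# so the adjusted size is 3 + 0 + 2 rather than the raw 3 + 3 + 2.
procs = [{"Entry", "Read", "eXit"},
         {"Entry", "Read", "eXit"},
         {"Write", "eXit2"}]
print(adjusted_size(procs))  # 5.0
```

The point is only the mechanism: a size measure that credits similar functions less than independent ones should track effort more closely when teams reuse internally.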

FUNDAMENTOS PARA FORMULAÇÃO DE ESTRATÉGIAS COMPETITIVAS EM ORGANIZAÇÕES DE DESENVOLVIMENTO DE SOFTWARE / Foundations for formulating competitive strategies in software development organizations

Fabrin, Alex Onacli Moreira, 08 December 2005
One of the most important steps in planning software is budgeting, usually considered a complex task because it is difficult to be precise. To ensure revenue, a margin of error is generally added to the overall price, which can harm the organization's competitiveness in the market. Based on this scenario, this work presents a study of competitiveness among software development organizations that use the object-oriented approach. To identify the fundamentals for formulating competitive strategies for these organizations, the exploratory research method was adopted. The organizations' activities, processes and procedures were taken into consideration in forming prices for their products and services in this industry, so that methodologies can be formulated that help avoid deviations and errors. To assist organizations in formulating such strategies, the influence of the five competitive forces on the software development industry was also considered, following the analysis of industries proposed by Michael Porter. The characteristics of software quality are presented according to the ISO/IEC 9126-1 standard and, as a basis for continuous quality improvement in the software development industry, the Capability Maturity Model Integration and MPS.BR were used. The Use Case Points technique was considered for measuring the effort necessary for software development, presenting some factors that influence its suitability for different organizations.
It is hoped that this work may contribute towards bringing together Management and Computer Science, by providing a text that may assist management professionals in their activities in software development organizations, and guide managers of such organizations who do not have a degree in Management.

Mellanchefer som utvecklar : om förutsättningar för hållbart utvecklingsarbete inom vård och omsorg / Middle managers who implement changes : prerequisites for sustainable development efforts within healthcare and caring

Larsson, Kjerstin, January 2008
The aim of the dissertation is to describe, analyse and discuss the organisational and inter-organisational conditions that are important in enabling middle managers to drive sustainable development in their activity areas. The intention is to study the interplay between organisational and inter-organisational conditions coupled to the change competence of the middle managers. The research has been carried out within the framework of an extensive national R&D project within healthcare and caring. The aim of the project was to work with a sustainable development effort founded on broad participation between employees and management. The study is a qualitative study with an interactive approach, which implies an ambition to carry out research with those affected within the framework of a joint knowledge gathering. The interactive research attempts to be of practical use, but the aim is primarily to contribute to theoretical development. The data collection methods used are workplace visits, interviews, the critical-incident method, reflective memos (within the framework of two education programmes), analytical dialog and reflective seminars, informal conversations and feedback meetings. A number of common activities have been organised within the framework of the national project, which have also functioned as data collection and common analysis, such as national conferences, steering group meetings etc.
The results of the dissertation show that middle managers within the healthcare and caring sector can have a strategic role in the development of this activity area. The intention of my research has not been to generalise the results in the traditional sense, but to reveal mechanisms and prerequisites for the middle managers in organising development efforts. The organisational conditions – in the form of economy, responsibility, authority, professionalisation and company culture – strongly affect the middle managers' prerequisites for driving a development effort. My research demonstrates interesting opportunities for middle managers in achieving a vertical and horizontal integration of the development effort. There are examples of how middle managers have been able to influence and utilise these prerequisites to create room for action in the organisation for driving a development effort. They have availed themselves of goal orientation, created room for learning and reflection, and created an organisational culture that supports development. The inter-organisational prerequisites – in the form of an external network, research support and collaboration of parties – have been important for the middle managers in their development efforts.

Squelettes algorithmiques pour la programmation et l'exécution efficaces de codes parallèles / Algorithmic skeletons for efficient programming and execution of parallel codes

Legaux, Joeffrey, 13 December 2013
Parallel architectures have now reached every computing device, but software developers generally lack the skills to program them through explicit models such as MPI or the Pthreads. There is a need for more abstract models such as algorithmic skeletons, which are a structured approach. They can be viewed as higher-order functions that represent the behaviour of common parallel algorithms, which the programmer combines to generate parallel programs. Programmers want to obtain better performance through the use of parallelism, but the development time required is also an important factor. Algorithmic skeletons provide interesting results in both respects. The Orléans Skeleton Library (OSL) provides a set of algorithmic skeletons for data parallelism within the bulk synchronous parallel model for the C++ language. It uses advanced metaprogramming techniques to obtain good performance. We improved OSL to obtain better performance from its generated programs and extended its expressivity. We analyzed the ratio between program performance and the development effort needed within OSL and other parallel programming models. The comparison between parallel programs written in OSL and their equivalents in low-level parallel models shows better productivity for high-level models: they are easy for programmers to use while providing decent performance.
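The skeleton idea, higher-order functions encapsulating recurring parallel patterns that the programmer composes, can be sketched in Python. OSL itself is a C++ library; the `farm` and `pipe` names below are illustrative, not OSL's API.

```python
# Algorithmic skeletons as higher-order functions: a Python analogue,
# purely illustrative (OSL is C++; these names are not OSL's API).
from concurrent.futures import ThreadPoolExecutor
from functools import reduce

def farm(f, data, workers=4):
    """Apply f to every element in parallel (a data-parallel 'map')."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(f, data))

def pipe(*stages):
    """Compose skeletons left to right into a pipeline."""
    def run(data):
        for stage in stages:
            data = stage(data)
        return data
    return run

# The programmer composes skeletons instead of writing thread code.
square_all = lambda xs: farm(lambda x: x * x, xs)
total = lambda xs: reduce(lambda a, b: a + b, xs, 0)
program = pipe(square_all, total)
print(program([1, 2, 3, 4]))  # 30
```

The payoff the thesis measures is exactly this trade: the composition above contains no explicit synchronization, at the cost of ceding low-level control to the skeleton implementation.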

Modélisation et implémentation de parallélisme implicite pour les simulations scientifiques basées sur des maillages / Model and implementation of implicit parallelism for mesh-based scientific simulations

Coullon, Hélène, 29 September 2014
Parallel scientific computing is an expanding domain of computer science that increases the speed of long calculations and offers a way to deal with larger or more accurate problems. It allows scientific computation to go further, obtaining more precise results and studying bigger physical domains than before. In the particular case of scientific numerical simulation, solving partial differential equations (PDEs) is an especially heavy calculation and a perfect candidate for parallel computation. On one hand, it is increasingly easy to get access to very powerful parallel machines and clusters; on the other hand, parallel programming is hard to democratize, and most scientists are not able to use these machines. As a result, high-level programming models, frameworks, libraries, languages etc. have been proposed to hide the technical details of parallel programming. However, in this "implicit parallelism" field, it is difficult to find the right abstraction level for scientists while keeping the programming effort low. This thesis first proposes a model for writing implicit parallelism solutions for numerical simulations such as mesh-based PDE computations. This model is called "Structured Implicit Parallelism for scientific SIMulations" (SIPSim), and proposes an approach at the crossroads of several kinds of abstraction, trying to keep the advantages of each. A first implementation of this model is proposed as a C++ library called SkelGIS, for two-dimensional Cartesian meshes. A second implementation, an extension of SkelGIS, provides an implicit parallelism solution for network simulations (which deal with simulations of multiple physical phenomena), and is studied in detail.
A performance analysis of both implementations is given on real, complex application cases, and demonstrates that the SIPSim model can be implemented efficiently.
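The kind of implicit parallelism SkelGIS targets can be suggested by a sequential sketch: the user writes only the per-cell kernel, while iteration (and, in a real implementation, data distribution and halo exchange) stays hidden behind one call. The `apply_stencil` name is hypothetical, not SkelGIS's actual API.

```python
# Sequential sketch of a stencil skeleton over a 2D Cartesian mesh.
# 'apply_stencil' is a hypothetical name, not SkelGIS's API: the user
# supplies only the per-cell kernel; a real parallel implementation
# would hide data distribution and communication behind the same call.
def apply_stencil(grid, kernel):
    n, m = len(grid), len(grid[0])
    out = [row[:] for row in grid]          # border cells kept as-is
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            out[i][j] = kernel(grid, i, j)
    return out

def jacobi(grid, i, j):
    """Four-point average, the classic heat-equation kernel."""
    return 0.25 * (grid[i-1][j] + grid[i+1][j]
                   + grid[i][j-1] + grid[i][j+1])

grid = [[1.0, 1.0, 1.0],
        [1.0, 0.0, 1.0],
        [1.0, 1.0, 1.0]]
print(apply_stencil(grid, jacobi)[1][1])  # 1.0
```

Because the kernel never indexes outside its neighbourhood, the library is free to split the mesh across processes and exchange only boundary rows, which is the essence of the implicit-parallelism contract.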
