11.
PTPv1 and PTPv2 Translation in FTI Systems / Lefevre, D., Cranley, N., Holmeide, Ø. (2013)
ITC/USA 2013 Conference Proceedings / The Forty-Ninth Annual International Telemetering Conference and Technical Exhibition / October 21-24, 2013 / Bally's Hotel & Convention Center, Las Vegas, NV / A Flight Test Instrumentation (FTI) system may consist of equipment that supports either PTPv1 (IEEE 1588 Std 2002) or PTPv2 (IEEE 1588 Std 2008). The challenge in such a time-distributed system is the poor compatibility between the two PTP protocol versions. This paper describes how to combine the two PTP versions in the same network with minimal or no manual configuration.
12.
Merge-Trees: Visualizing the Integration of Commits into Linux / Wilde, Evan (11 September 2018)
Version control systems are an asset to software development, enabling
developers to keep snapshots of the code as they work.
Stored in the version control system is the entire history of the
software project, rich in information about who is contributing to the
project, when contributions are made, and to what part of the project
they are being made.
Presented in the right way, this information can be invaluable in
helping software developers continue developing the project,
and in helping maintainers understand how changes to the current
version can be applied to older versions.
Maintainers are unable to effectively use the information stored
within a software repository to assist with the maintenance of older
versions of that software in highly collaborative projects.
The Linux kernel repository is an example of such a project.
This thesis focuses on improving visualizations of the Linux kernel
repository, developing new visualizations that help answer questions
about how commits are integrated into the project.
Older versions of the kernel are used in a variety of systems where it
is impractical to update to the current version of the kernel.
Some of these applications include spacecraft controllers, the
core of mobile phones, the operating system driving internet routers,
and Internet-of-Things (IoT) device firmware.
As vulnerabilities are discovered in the kernel, they are patched in
the current version.
To ensure that older versions are also protected against the
vulnerabilities, the patches applied to the current version of the
kernel must be applied back to the older version.
To do this, maintainers must be able to understand how the patch that
fixed the vulnerability was integrated into the kernel so that they
may apply it to the old version as well.
This thesis makes four contributions:
(1) a new tree-based model, the \mt{}, that abstracts the commits in the repository,
(2) three visualizations that use this model,
(3) a tool called \tool{} that uses these visualizations,
(4) a user study
that evaluates whether the tool is effective in helping users answer
questions about how commits are integrated into the Linux
repository.
The first contribution includes the new tree-based model, the
algorithm that constructs the trees from the repository,
and the evaluation of the results of the algorithm.
The second contribution demonstrates some of the potential
visualizations of the repository that are made possible by the model,
and how these visualizations can be used depending on the structure of
the tree.
The third contribution is an application that applies the
visualizations to the Linux kernel repository.
The tool was able to help the participants of the study with
understanding how commits were integrated into the Linux kernel
repository.
Additionally, the participants were able to summarize information
about merges,
including who made the most contributions,
which files were altered the most,
more quickly and accurately than with Gitk and the command-line tools. / Graduate
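The grouping idea behind a merge-tree can be sketched on a toy in-memory commit graph. The function below is an illustrative reconstruction, not the \mt{} algorithm from the thesis itself; the graph encoding (a dict mapping commit ids to parent lists, first parent first) and all names are assumptions:

```python
def reachable(parents, start):
    """Return the set of commit ids reachable from `start` via parent edges."""
    seen, stack = set(), [start]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(parents[c])
    return seen

def merge_tree(parents, head):
    """Group each commit under the mainline merge that integrated it.

    `parents` maps commit id -> list of parent ids (first parent first);
    `head` is the tip of the mainline branch (e.g. master)."""
    tree = {}
    cur = head
    while parents[cur]:                     # walk the first-parent chain
        ps = parents[cur]
        if len(ps) > 1:                     # a merge commit
            # Commits brought in by this merge: reachable from the side
            # branch but not from the previous mainline tip.
            tree[cur] = sorted(reachable(parents, ps[1])
                               - reachable(parents, ps[0]))
        cur = ps[0]
    return tree
```

For example, with a root commit A, a side-branch commit B, and a merge M of B into the mainline, `merge_tree({'A': [], 'B': ['A'], 'M': ['A', 'B']}, 'M')` groups B under M, abstracting away the branch topology the way the thesis's model abstracts commits under their integrating merge.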
13.
Leveraging Defects' Life-Cycle for Labeling Defective Classes / Vandehei, Bailey R. (01 December 2019)
Data from software repositories are a very useful asset for building different kinds of models and recommender systems aimed at supporting software developers. Specifically, the identification of likely defect-prone files (i.e., classes in object-oriented systems) helps in prioritizing, testing, and analysis activities. This work focuses on automated methods for labeling a class in a version as defective or not. The most used methods for automated class labeling belong to the SZZ family and fail in various circumstances. Thus, recent studies suggest the use of the affected version (AV) as provided by developers and available in issue trackers such as JIRA. However, in many circumstances, the AV might not be usable because it is unavailable or inconsistent. The aim of this study is twofold: 1) to measure the AV availability and consistency in open-source projects, and 2) to propose, evaluate, and compare to SZZ a new method for labeling defective classes, based on the idea that defects have a stable life-cycle in terms of the proportion of versions needed to discover the defect and to fix the defect. Results related to 212 open-source projects from the Apache ecosystem, featuring a total of about 125,000 defects, show that the AV cannot be used in the majority (51%) of defects. Therefore, it is important to investigate automated methods for labeling defective classes. Results related to 76 open-source projects from the Apache ecosystem, featuring a total of about 6,250,000 classes that are affected by 60,000 defects and spread over 4,000 versions and 760,000 commits, show that the proposed method for labeling defective classes is, on average among projects and defects, more accurate in terms of Precision, Kappa, F1, and MCC than all previously proposed SZZ methods. Moreover, the improvement in accuracy from combining SZZ with defects' life-cycle information is statistically significant but practically irrelevant. Overall and on average, labeling via defects' life-cycle is more accurate than any SZZ method.
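The life-cycle intuition can be sketched numerically. The function below is our illustrative reconstruction under an assumed linear model, not the method actually evaluated in the thesis; the names and the `discovery_ratio` parameter are assumptions. If the gap between introduction and discovery is a stable fraction of the whole introduction-to-fix life-cycle, the introducing version can be back-estimated from the observed reporting and fixing versions:

```python
def estimate_affected_versions(opened_idx, fixed_idx, discovery_ratio):
    """Back-estimate the versions a defect affects.

    opened_idx / fixed_idx: indices (in release order) of the versions in
    which the defect was reported and fixed. discovery_ratio: assumed
    stable fraction of the life-cycle spent before discovery, i.e.
    (opened - introduced) / (fixed - introduced)."""
    assert 0.0 <= discovery_ratio < 1.0 and opened_idx <= fixed_idx
    # From (opened - intro) = ratio * (fixed - intro), the life-cycle
    # length is (fixed - opened) / (1 - ratio).
    lifecycle = (fixed_idx - opened_idx) / (1.0 - discovery_ratio)
    intro_idx = max(0, round(fixed_idx - lifecycle))
    # Affected versions run from the estimated introduction up to,
    # but not including, the fixing version.
    return list(range(intro_idx, fixed_idx))
```

With discovery assumed to fall halfway through the life-cycle (`discovery_ratio=0.5`), a defect reported in version 4 and fixed in version 6 would be estimated to affect versions 2 through 5.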
14.
Automatic Restoration and Management of Computational Notebooks / Venkatesan, Satish (03 March 2022)
Computational notebook platforms are very commonly used by programmers and data scientists. However, due to the interactive development environment of notebooks, developers struggle to maintain effective code organization, which has an adverse effect on their productivity. In this thesis, we research and develop techniques to help solve the code-organization issues that developers face, in an effort to improve productivity. Notebooks are often executed out of order, which adversely affects their portability. To determine cell execution orders in computational notebooks, we develop a technique that determines the execution order for a given cell and, if need be, attempts to rearrange the cells to match the intended execution order. With such a tool, users would not need to determine the execution orders manually themselves. In a user study with 9 participants, our approach on average saves users about 95% of the time required to determine execution orders manually. We also developed a technique to support insertion of cells in rows, in addition to the standard column insertion, to better represent multiple contexts. In a user study with 9 participants, this technique was on average rated 8.44 on a scale of one to ten for representing multiple contexts, as opposed to 4.77 for the standard view. / Master of Science / In the field of data science, computational notebooks are a very commonly used tool. They allow users to create programs to perform computations and to display graphs, tables, and other visualizations to supplement their analysis. Computational notebooks have some limitations in the development environment that can make it difficult for users to organize their code. This can make it very difficult to read through and analyze the code to find or fix any errors, which in turn can have a very negative effect on developer productivity.
In this thesis, we research methods to improve the development environment and increase developer productivity. We achieve this by offering tools to the user that can help organize and clean up their code, making it easier to comprehend the code and make any necessary changes.
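As a rough sketch of how an intended execution order might be inferred (the thesis's actual technique is not detailed in this abstract; this def/use approximation and every name in it are illustrative assumptions), one can topologically order cells by matching the variable names a cell reads against the names earlier-placed cells assign:

```python
import ast

def infer_cell_order(cells):
    """Infer an execution order for a list of Python cell sources via a
    simple def/use analysis: a cell is runnable once every name it reads
    that is assigned by some other cell has already been assigned."""
    defs, uses = [], []
    for src in cells:
        names = [n for n in ast.walk(ast.parse(src)) if isinstance(n, ast.Name)]
        defs.append({n.id for n in names if isinstance(n.ctx, ast.Store)})
        uses.append({n.id for n in names if isinstance(n.ctx, ast.Load)})
    all_defs = set().union(*defs)
    order, defined = [], set()
    while len(order) < len(cells):
        # A cell is ready when the names it reads (excluding its own
        # assignments) that any cell defines are already defined.
        ready = [i for i in range(len(cells)) if i not in order
                 and (uses[i] & (all_defs - defs[i])) <= defined]
        if not ready:
            break  # cyclic or unresolved dependencies: give up
        i = ready[0]
        order.append(i)
        defined |= defs[i]
    return order
```

For instance, the cells `["y = x + 1", "x = 1", "z = x + y"]` come back in the order second, first, third, matching the order a user would have to run them in. Real notebook semantics (mutation, re-execution, imports, magics) are considerably richer than this sketch.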
15.
Quantification of force applied during external cephalic version / CUHK electronic theses & dissertations collection (January 2005)
External cephalic version (ECV) involves turning a fetus in utero by manipulation through the maternal abdomen and the uterine wall. / Many clinicians and patients, however, still decline ECV in favour of Caesarean section. This could be due to a lack of experience with ECV, and fear of complications or pain during the version. / Summary. The force applied during ECV can be measured and analysed using a customized pair of gloves incorporating piezo-resistive pressure sensors and suitable analytical software. The degree of force required for a successful version is highly variable. Failure of version is not usually due to insufficient force. Uterine tone is the most important factor affecting the degree of force applied during a version attempt. The degree of force applied is associated with the changes in fetal cerebral blood flow after ECV, and with the amount of pain perceived by the patients. (Abstract shortened by UMI.) / The lack of information in this area is primarily due to the lack of a suitable device that would allow measurement of the applied force without interfering with the ECV. A suitable device would therefore have to be sufficiently robust to be worn on the hands, durable enough to be used repeatedly, and would have to incorporate multiple individual sensors, each capable of making dynamic and mutually independent measurements during the version procedure. / There is no report in the literature on quantification of the force applied during ECV. It is also unknown whether the degree of force applied is related to the version outcome. In particular, it is unclear whether a failed attempt is related to insufficient force, or whether an increase in force may help to achieve version after a failure. Furthermore, it is not known whether any patient factors may influence how much force is applied through the operator's hands.
Although the chance of successful version can be predicted by some clinical factors, whether these factors also affect the degree of applied force is not known. / This thesis reports on the design and development of a suitable measuring device fulfilling the requirements described above. In addition, it tests a number of hypotheses relating the degree of force applied during ECV to clinical feto-maternal parameters and outcomes, in a study cohort of 92 patients. / Leung Tak Yeung. / "April 2005." / Source: Dissertation Abstracts International, Volume: 67-07, Section: B, page: 3717. / Thesis (M.D.)--Chinese University of Hong Kong, 2005. / Includes bibliographical references (p. 155-174).
16.
Lättlästa Lagerlöf - En komparativ analys av Gösta Berlings saga i två versioner / Lindberg, Victor (January 2019)
The purpose of this thesis is to examine the differences and similarities between the original version and an easy-to-read version of Selma Lagerlöf's novel Gösta Berlings saga. The method used is close reading of the two versions, analysing descriptions of the environment, descriptions and psychological depth of the protagonist, missing chapters, and changes to the plot. The results show that 21 chapters, along with large parts of some of the remaining chapters, are missing in the easy-to-read version because they are not necessary to the main story line. An alteration has been made in order to make the story line chronological. Lagerlöf's use of invocation and personification of nature is missing in the easy-to-read version. Fewer details are also given about the environments in the novel, leaving the reader to fill in the gaps on their own. The descriptions of the protagonist are few and overall quite similar, though some details are left out of the easy-to-read version and some clarifications have been made. There is a bigger difference in how thoughts and feelings are described in the two texts, making Gösta a round and dynamic character in the original version and somewhat flatter and more static in the easy-to-read version. These differences affect how a classroom discussion might be conducted, since the teacher must adjust the conversation so that it is relevant for all participants.
17.
A Comparison of Least-Squares Finite Element Models with the Conventional Finite Element Models of Problems in Heat Transfer and Fluid Mechanics / Rajarova, Nellie (May 2009)
In this thesis, least-squares based finite element models (LSFEM) for the Poisson equation and the Navier-Stokes equations are presented. The least-squares method is simple, general, and reliable. Least-squares formulations offer several computational and theoretical advantages: the resulting coefficient matrix is symmetric and positive-definite, and the choice of approximating space is not subject to any compatibility condition.
For the mixed least-squares finite element model, the Poisson equation is cast as a set of first-order equations, with the gradient of the primary variable introduced as auxiliary variables. Equal-order C0 continuous approximation functions are used for the primary and auxiliary variables. The least-squares principle was also applied directly to develop another model, which requires C1 continuous approximation functions for the primary variable. Each developed model is compared with the conventional model to verify its performance.
A penalty-based least-squares formulation was implemented to develop a finite element model for the Navier-Stokes equations. The continuity equation is treated as a constraint on the velocity field, and the constraint is enforced using the penalty method. Velocity gradients are introduced as auxiliary variables to obtain the equivalent first-order system. Both the primary and auxiliary variables are interpolated using equal-order C0 continuous, p-version approximation functions. Numerical examples are presented to demonstrate the convergence characteristics and accuracy of the method.
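Schematically, and in our own generic notation rather than the thesis's exact discrete form, a penalty least-squares model minimizes the summed residuals of the first-order system, with the penalty term appending the continuity constraint:

```latex
% First-order system: velocity gradients L = \nabla u are auxiliary variables.
% R_i are the residuals of the first-order equations; \gamma is the penalty
% parameter enforcing the incompressibility constraint on the velocity field.
\mathcal{J}(\mathbf{u}, \mathbf{L}, p)
  \;=\; \tfrac{1}{2} \sum_i \| R_i(\mathbf{u}, \mathbf{L}, p) \|_{0}^{2}
  \;+\; \tfrac{\gamma}{2}\, \| \nabla \cdot \mathbf{u} \|_{0}^{2}
```

Minimizing this functional over the equal-order approximation spaces yields the symmetric positive-definite system referred to above; larger γ enforces the constraint more strongly at the cost of conditioning.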
18.
Der Schutz des Musikurhebers bei Coverversionen / Riekert, Stephan (January 2003, PDF)
Technische Univ., Diss.--Dresden, 2002.
19.
De l'énigme au paradigme : La psychanalyse n'est pas homophobe / From Enigma to Paradigm: Psychoanalysis Is Not Homophobic / Rodríguez Diéguez, María Paz (20 January 2015)
This research responds to the critiques that certain authors of queer theory have addressed to Lacanian theory, namely the accusation of homophobia. We propose a new approach to human sexuality that is no longer founded on the symbolic order of the Oedipus complex. At the end of his teaching, Lacan recognized that "there is no sexual relation"; with this aphorism, he opened the door to a new paradigm oriented by the real: jouissance that aims at the impossible of the sexual relation. This substitutive jouissance, as Jacques-Alain Miller named it, does not distinguish between neurosis and perversion. This new reading, which goes beyond the structural clinic, springs from the Borromean knot, that is, from what holds together the symbolic, real, and imaginary registers of desire and jouissance. We aim to move beyond the Oedipus complex by way of Lacan's very last teaching. To do so, we revisited Freud's well-known case of the "young homosexual woman", starting from new elements of her history published in her biography, Sidonie Csillag, Homosexuelle chez Freud, lesbienne dans le siècle. Thanks to this new Borromean conception, we regard homosexuality from another angle. Our goal is to find the convergences between this new Borromean paradigm of psychoanalysis and queer theory.
20.
Gestion des versions pour la construction incrémentale et partagée de bases de connaissances / Version Management for the Incremental and Shared Construction of Knowledge Bases / Tayar, Nina (21 September 1995)
In many scientific and technical domains, the growing quantity and complexity of the knowledge being handled make classical supports such as paper inadequate. Computing can provide an answer by offering a framework to model, manage, and exploit this knowledge: we then speak of knowledge bases. In most domains, knowledge is not static and immediately available; on the contrary, it evolves constantly as discoveries are made. A knowledge base is therefore built incrementally: one starts by building a small base, which is enriched little by little as new knowledge is acquired. This construction is shared, since several people must be able to work on the base simultaneously in order to build it. This thesis addresses the problem of version management for knowledge bases. Its contributions are twofold. First, we designed and implemented a version management system for knowledge bases. By definition, a version reflects a state of the base's evolution, that is, of the set of structures and values of the knowledge it contains at a given moment. Versions thus help control the history of changes made while building the base, and they allow rolling back to an earlier state. Finally, versions make it possible to formulate different research hypotheses and to manage their parallel evolution over time. We also contributed to the design of the environment for incremental and shared construction of knowledge bases, and showed how our version system integrates into this environment.
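The version model described above can be sketched minimally; this is an illustrative reconstruction, not the system actually built in the thesis, and the class and attribute names are assumptions. Each version snapshots the base's state and keeps a link to its parent, so history can be inspected, earlier states restored, and competing hypotheses evolved on parallel branches:

```python
class Version:
    """One state of a knowledge base under construction."""

    def __init__(self, knowledge, parent=None):
        self.knowledge = dict(knowledge)  # structures/values at this state
        self.parent = parent              # previous state; None for the root

    def derive(self, **changes):
        """Record new or revised knowledge as a child version. Deriving
        several children from one version models parallel hypotheses."""
        updated = dict(self.knowledge)
        updated.update(changes)
        return Version(updated, parent=self)

    def history(self):
        """Walk back to the root: supports inspection and rollback."""
        version, chain = self, []
        while version is not None:
            chain.append(version)
            version = version.parent
        return chain
```

Two hypotheses derived from the same base then evolve independently without disturbing each other, while `history()` recovers the chain of states behind either branch.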