  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Mathematical Formula Recognition and Automatic Detection and Translation of Algorithmic Components into Stochastic Petri Nets in Scientific Documents

Kostalia, Elisavet Elli January 2021 (has links)
No description available.
82

A JavaScript Backend for the Miking Compiler

Rågstad, William January 2022 (has links)
This thesis presents the design and implementation of an extension of the self-hosted Miking compiler that enables the generation of JavaScript code for different runtime environments and web browsers. Miking is a framework for developing domain-specific and general-purpose programming languages through sound composition of language fragments, which can be used to create efficient compilers. Miking Core encapsulates a meta-language called Miking Lang, which is lowered to a fundamental functional Miking Expression language; lowering essentially means rewriting more complex semantic constructs in terms of simpler ones. The presented backend translates Miking expressions into semantically equivalent JavaScript programs that are efficient, readable, sound, and correct. The report also discusses and evaluates development challenges, implementation methods, and techniques, along with a review of the final compiler backend. Likewise, details on JavaScript-specific optimizations and pattern-matching compilation are presented, including how tail recursion is handled with the trampoline technique to enable complex Miking programs to be compiled into flexible and efficient JavaScript.
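The trampolined tail recursion mentioned in this abstract can be illustrated with a minimal sketch. This is not the thesis's generated code (its backend emits JavaScript); it is a hypothetical Python rendering of the general technique, with invented names: a tail call is returned as a zero-argument thunk, and a driver loop keeps calling thunks until a plain value appears, so deep recursion never grows the call stack.

```python
def trampoline(f, *args):
    """Drive a trampolined function: keep calling thunks until a value appears."""
    result = f(*args)
    while callable(result):
        result = result()
    return result

def countdown(n, acc=0):
    # Instead of making the tail call directly, return it wrapped in a
    # zero-argument lambda (a thunk); the stack therefore stays flat.
    if n == 0:
        return acc
    return lambda: countdown(n - 1, acc + n)

assert trampoline(countdown, 100000) == 5000050000  # sum 1..100000, no stack overflow
```

In a compiler backend such as the one described, it is the code generator, not the programmer, that rewrites tail-recursive functions into this shape.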
83

Euphemisms and Dysphemisms in English-Language News Publications on the Nagorno-Karabakh Conflict : master's thesis

Габибли, Г. Б., Gabibli, G. B. January 2021 (has links)
The thesis identifies euphemization and dysphemization strategies in coverage of the Nagorno-Karabakh conflict by the English-language media. It is shown that both strategies are equally related to quoting the participants of the Second Karabakh War (Azerbaijan and Armenia). An attempt was made to identify the "media behavior" of the conflicting parties, who do not want to take responsibility for the escalation of the conflict but are ready to defend their rights to the disputed territory.
84

A quality assurance reference model for object-orientation

Thornton, Deborah 06 1900 (has links)
The focus of the dissertation is on software quality assurance for object-oriented information systems development. A Quality Assurance Reference Model is proposed with aspects dealing with technical and managerial issues. A revised Spiral life cycle model is adopted as well as the Object Modelling Technique. The Quality Assurance Reference Model associates quality factors at various levels, quality criteria and metrics into a matrix framework that may be used to achieve quality assurance for all cycles of the Spiral Model. / Computing / M. Sc. (Information Systems)
85

A Verification Framework for Component Based Modeling and Simulation : “Putting the pieces together”

Mahmood, Imran January 2013 (has links)
The discipline of component-based modeling and simulation offers promising gains, including reductions in development cost, time, and system complexity. This paradigm is profitable because it promotes the use and reuse of modular components and is well suited to the effective development of complex simulations. It is, however, confronted by a series of research challenges when it comes to actually practicing this methodology. One such important issue is composability verification. In modeling and simulation (M&S), composability is the capability to select and assemble components in various combinations to satisfy specific user requirements. To ensure the correctness of a composed model, it is therefore verified with respect to its requirements specifications. There are different approaches and existing component modeling frameworks that support composability; however, in our observation most of these frameworks possess no or only weak built-in support for composability verification. One such framework is the Base Object Model (BOM), which fundamentally offers satisfactory potential for effective model composability and reuse. However, it falls short of the required semantics, necessary modeling characteristics, and built-in evaluation techniques that are essential for modeling complex system behavior and for reasoning about the validity of the composability at different levels. In this thesis a comprehensive verification framework is proposed to contend with some important issues in composability verification, and a verification process is suggested to verify the composability of different kinds of system models, such as reactive, real-time, and probabilistic systems. Under the assumption that all these systems are concurrent in nature, with different composed components interacting with each other simultaneously, the requirements for extensive techniques for structural and behavioral analysis become increasingly challenging.
The proposed verification framework provides methods, techniques, and tool support for verifying composability at its different levels. These levels are defined as the foundations of consistent model composability. Each level is discussed in detail, and an approach is presented to verify composability at that level. In particular we focus on the dynamic-semantic composability level, owing to its significance in overall composability correctness and to the level of difficulty it poses in the process. To verify composability at this level we investigate the application of three different approaches, namely (i) Petri net based algebraic analysis, (ii) Colored Petri Net (CPN) based state-space analysis, and (iii) Communicating Sequential Processes based model checking. All three approaches attack the problem of verifying dynamic-semantic composability in different ways, but they share the same aim: to confirm the correctness of a composed model with respect to its requirement specifications. Besides the operative integration of these approaches in our framework, we also contribute improvements to each approach for effective applicability in composability verification, such as applying algorithms to automate Petri net algebraic computations, introducing a state-space reduction technique in CPN based state-space analysis, and introducing function libraries that perform verification tasks and offer the modeler ease of use during composability verification. We also provide detailed examples of using each approach with different models to explain the verification process and its functionality. Lastly, we compare these approaches and suggest guidelines for choosing the right one based on the nature of the model and the available information.
With the right choice of approach, and following the guidelines of our component-based M&S life cycle, a modeler can easily construct and verify BOM-based composed models with respect to their requirement specifications. / Overseas Scholarship for PhD in Selected Studies, Phase II Batch I, Higher Education Commission of Pakistan. QC 20130224
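As a rough illustration of the state-space analysis this abstract names, the sketch below enumerates the reachable markings of a toy place/transition net by breadth-first search. The net, the place names, and the functions are invented for this example; the thesis's CPN-based analysis works on far richer colored nets with dedicated tooling and reduction techniques.

```python
from collections import deque

# Toy place/transition net, invented for illustration: each transition maps
# input places (tokens consumed) to output places (tokens produced).
TRANSITIONS = {
    "start":  ({"idle": 1}, {"busy": 1}),
    "finish": ({"busy": 1}, {"done": 1}),
}

def enabled(marking, pre):
    # A transition may fire when every input place holds enough tokens.
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    # Firing consumes the input tokens and produces the output tokens.
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return {p: n for p, n in m.items() if n > 0}

def reachable(initial):
    """Breadth-first exploration of the net's reachable markings."""
    key = lambda m: tuple(sorted(m.items()))
    seen = {key(initial)}
    queue = deque([initial])
    markings = []
    while queue:
        m = queue.popleft()
        markings.append(m)
        for pre, post in TRANSITIONS.values():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                if key(m2) not in seen:
                    seen.add(key(m2))
                    queue.append(m2)
    return markings

states = reachable({"idle": 1})  # three markings: idle, busy, done
```

State-space explosion is exactly why the thesis introduces a reduction technique: real composed models generate vastly more markings than this three-state toy.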
86

Den politiska läroboken : Bilden av USA och Sovjetunionen i norska, svenska och finländska läroböcker under Kalla kriget / Political textbooks : The depiction of the USA and the Soviet Union in Norwegian, Swedish, and Finnish schoolbooks during the Cold War

Holmén, Janne Sven-Åke January 2006 (has links)
During the Cold War, Norway was a member of NATO, Sweden was neutral but depended on Western support in the event of a crisis, while Finland's foreign policy priority was to win and retain the Soviet Union's confidence. The purpose of the thesis is to study whether the three small states' different foreign policy choices had consequences for the ways in which the Soviet Union and the USA were depicted in school textbooks for history, geography, and social sciences in the period 1930 to 2004. To this end, a theory derived from small states' strategies to maintain their independence was applied to textbook production. The study demonstrates that there was a link between small state foreign policy and textbooks' accounts of the USA and Soviet Union. Swedish and Norwegian textbooks portray international conflicts from a legalistic perspective, taking the part of small states exposed to superpower aggression such as Vietnam and Afghanistan. In Finnish textbooks, however, an interest in defending small states' rights yielded to the need to demonstrate their goodwill towards the Soviet Union, which was described in far less critical terms than in Swedish and Norwegian textbooks. In time, in the name of neutrality, depictions of the USA also became increasingly uncritical. All three Nordic states had government authorities charged with inspecting and approving school textbooks. Foreign policy's chief influence on textbooks was not effected by direct oversight, however; instead, it was established indirectly by means of the social climate, which determined what was considered politically correct in the three countries, and it was to this that the textbooks' authors adapted their work. Textbooks are often said to be conservative and slow to change, but the thesis shows that in parts they were politically sensitive, rapidly adapting to changes in what society held to be politically correct.
87

Practical and Foundational Aspects of Secure Computation

Ranellucci, Samuel 02 1900 (has links)
There are problems that seem impossible to solve without a trusted third party. How can two millionaires learn who is the richer when neither is willing to tell the other how rich he is? How can satellite collisions be prevented when the trajectories are secret? How can researchers establish correlations between diseases and medication while respecting patient confidentiality? How can an organization ensure that the government does not abuse the knowledge that it possesses, even though such an organization would be unable to control that information? Secure computation, a branch of cryptography, is a field that studies how to build protocols for realizing such tasks without the use of a trusted third party. There are certain goals that such protocols should achieve. The first concern is privacy: players should learn no more information than what a trusted third party would give them. The second main goal is correctness: players should receive only what a trusted third party would give them. The protocols should also be efficient. Another important property is robustness: the protocols should not abort even if a small set of players cheats. We show that, under the assumption of a simultaneous broadcast channel, robustness can be traded for correctness and for privacy against certain adversary structures. Secure computation has four basic building blocks: oblivious transfer, secret sharing, commitment schemes, and garbled circuits. Protocols can be built from these building blocks alone or, alternatively, constructed from specific computational assumptions. Protocols constructed solely from these primitives are flexible and are not as vulnerable to technological or algorithmic improvements. Many protocols are nevertheless based on computational assumptions, so it is important to ask whether efficiency requires such assumptions. We show that this is not the case by building efficient protocols from these primitives; it is the conclusion of this thesis that building protocols from black-box primitives can also lead to efficient protocols.
This thesis is a collection of four articles written in collaboration with other researchers. They constitute the mature part of my research and are my main contributions during that period. In the first work presented in this thesis we study the commitment capacity of noisy channels. We first show a tight lower bound that implies that, in contrast to oblivious transfer, there exists no constant-rate protocol for bit commitments. We then demonstrate that by restricting the way the commitments can be opened, we can achieve better efficiency and, in particular cases, a constant rate. This is done by exploiting the notion of cover-free families. In the second article, we show that for certain problems there exists a trade-off between robustness, correctness, and privacy. This is done using verifiable secret sharing, zero-knowledge proofs, the concept of ghosts, and a technique we call "balls and bins". In our third contribution, we show that many protocols in the literature based on specific computational assumptions can be instantiated from a primitive known as Verifiable Oblivious Transfer, via the concept of Generalized Oblivious Transfer. The protocol uses secret sharing as its foundation. In the last included publication, we construct a constant-round protocol for secure two-party computation that is very efficient and uses only black-box primitives. The remarkable efficiency of the protocol is achieved by replacing the core of a standard protocol with a faulty but very efficient primitive; the fault is then dealt with by a non-trivial use of privacy amplification.
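Of the four building blocks listed in this abstract, secret sharing is the simplest to sketch. The following is a minimal, hypothetical n-out-of-n XOR scheme in Python, not a protocol from the thesis (which relies on verifiable secret sharing and other, richer primitives): each of the first n-1 shares is uniformly random, so any strict subset of the shares reveals nothing about the secret.

```python
import secrets

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def share(secret, n=3):
    """n-out-of-n XOR sharing: every share is needed to reconstruct,
    and any n-1 shares together are uniformly random."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = xor_bytes(last, s)  # last share = secret XOR all random shares
    return shares + [last]

def reconstruct(shares):
    out = shares[0]
    for s in shares[1:]:
        out = xor_bytes(out, s)  # XORing all shares cancels the randomness
    return out

msg = b"attack at dawn"
assert reconstruct(share(msg, n=5)) == msg
```

Unlike threshold schemes (e.g. Shamir's), this toy scheme tolerates no lost shares, which is one reason practical protocols use stronger variants.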
89

Grammatical correctness and communicative ability : a performance analysis of the written and spoken English of Swedish learners

Bergström, Inger January 1987 (has links)
Written and oral material produced by a group of low-achieving learners of English from the 2-year lines of the Swedish upper secondary school was analysed from the perspective of grammatical correctness and communicative ability. The grammatical analysis focussed on the verb phrase, and the tests included both free production in speech and writing and elicitation tests. Communicative ability was assessed "objectively", by identifying such parameters as fluency, copiousness, and span, and "holistically", by using non-expert evaluators. The scores thus obtained were correlated: grammatical correctness with communicative ability both in speech and in writing, and writing proficiency with speech proficiency with respect to both grammatical correctness and communicative ability. Our findings are that there is a positive correlation between grammatical correctness and communicative ability. A remarkable finding is that the percentage of correct verb phrases correlates very weakly with communicative ability in written data; in oral data, the correlation is in fact slightly negative. The learner's competence in grammar is reflected in both his written and oral performance. On the other hand, there is no correlation between communicative ability in writing and communicative ability in speech. The study shows that a working command of a set of syntactic rules is essential for communication. Errors are, however, an integral part of the learning process, and most errors are accounted for by the learner's use of compensatory strategies. Among these low-achievers, communicative ability in conversation is distinct from writing ability. / digitalisering@umu
