151

Concern-based specification and runtime verification of declarative process models

Montague, S. January 2012 (has links)
An organisation has a number of business processes that, when carried out, achieve its business goals. A business process defines a specific ordering of activities and can be captured in a process model, which is constructed using a modelling language. In practice, business processes can be complex, consisting of dozens of activities with intricate ordering dependencies. In this thesis, we claim that such complexity can be handled by the principle of separation of concerns. We introduce a concern-based framework called MIC (Modelling Interactions using Concerns). In the MIC framework, a business process is modelled in a declarative process model as a set of interrelated concerns. Computational logic is used to represent and reason about the concerns and the relations among them. It is argued that the declarative process models constructed with the MIC framework can be understood, maintained and reused.
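To make the declarative style concrete, here is a minimal sketch in Python of a constraint-based ("concern"-based) process model of the kind the abstract describes. It is an illustration only, not the MIC framework itself: the constraint templates loosely follow declarative languages such as Declare, and the activity names are hypothetical.

```python
# Each "concern" is a constraint over event traces; a trace satisfies the
# model if it satisfies every concern, rather than one monolithic flow chart.

def response(a, b):
    """Concern: every occurrence of activity a is eventually followed by b."""
    def check(trace):
        return all(b in trace[i + 1:] for i, e in enumerate(trace) if e == a)
    return check

def precedence(a, b):
    """Concern: activity b may only occur after a has occurred."""
    def check(trace):
        seen_a = False
        for e in trace:
            if e == a:
                seen_a = True
            elif e == b and not seen_a:
                return False
        return True
    return check

# Two interrelated concerns for a hypothetical order-handling process.
concerns = {
    "billing":  response("ship_goods", "send_invoice"),
    "approval": precedence("approve_order", "ship_goods"),
}

trace = ["approve_order", "ship_goods", "send_invoice"]
print({name: check(trace) for name, check in concerns.items()})  # both True
```

Because each concern is an independent, named constraint, it can be understood, maintained and reused on its own, which is the point the abstract argues for.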
152

Exploring the fundamental differences between compiler optimisations for energy and for performance

Pallister, James January 2016 (has links)
The efficiency of an application can have a huge effect on how long a device will run on its batteries. With no significant increases in battery capacity, yet applications requiring longer battery life, it falls to software to be more efficient in its use of energy. This is particularly salient in deeply embedded systems, which may have to run on batteries for years. The high-level nature of writing a computer program detaches the programmer from the underlying hardware, allowing programmers to quickly write code that will be functional on a diverse set of devices. However, it also complicates writing efficient software, since the code may go through many transformations before it is executed, each of which can increase energy consumption without this being easily observable. Therefore, methods of increasing software energy efficiency are needed. Compiler optimisations provide an ideal way to achieve this, offering the ability to automatically transform software into a more efficient form. Typically, compilers have focused on making an application fast: hundreds of optimisations exist to decrease runtime, and often these are effective at reducing energy too. However, few optimisations exist to specifically decrease energy consumption. This thesis explores the differences between automated ways of reducing energy consumption and execution time at the compiler level. By exploring an extensive selection of existing compiler optimisations, through statistical techniques and genetic algorithms, it is discovered that current optimisations reduce energy consumption only because they reduce execution time; while the optimisations affect the power dissipation of the device, this effect is incidental. The lack of optimisations that affect power dissipation implies that a class of compiler optimisations is missing from conventional compilers. This thesis develops two new optimisations belonging to this class. To create these optimisations, low-level hardware-specific energy characteristics are identified, focusing on embedded systems. The characteristics are rigorously modelled, allowing the compiler to make optimisation decisions at a higher level of abstraction. One of the optimisations saves up to 26% of the energy consumption, achieving this by focusing on reducing average power rather than execution time. The combination of the energy optimisation with the existing optimisations for time is explored and found not to interact significantly with other optimisations, proving to be linearly composable. This again suggests that the energy optimisation belongs to a different class of compiler optimisation. To achieve further energy efficiency gains, a thorough vertical integration process needs to be introduced. This requires identifying (possibly hardware-specific) energy characteristics, modelling them at a level accessible to the compiler, and then making optimisation decisions based on these models. This level of integration is essential to bridge the levels of abstraction that exist between hardware and software, and allows compilers to succeed at reducing applications' energy consumption and consequently increasing battery life.
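As a rough illustration of the search technique the abstract mentions, the sketch below runs a simple genetic algorithm over a set of compiler flags. It is an assumption-laden toy, not the thesis's experimental setup: a real study would compile a benchmark with each flag set and read an external power monitor, whereas here energy() is a synthetic stand-in so the search loop itself is runnable.

```python
import random

# Hypothetical flag space; real experiments would use a compiler's full set.
FLAGS = ["-funroll-loops", "-ftree-vectorize", "-finline-functions",
         "-fomit-frame-pointer", "-fschedule-insns2"]

def energy(individual):
    # Stand-in for "compile with these flags, run benchmark, measure joules".
    rng = random.Random(hash(individual))          # deterministic per flag set
    return 100.0 - 5.0 * sum(individual) + rng.uniform(-3.0, 3.0)

def mutate(ind):
    # Flip one flag on or off.
    i = random.randrange(len(ind))
    return ind[:i] + (1 - ind[i],) + ind[i + 1:]

# Each individual is a bit vector saying which flags are enabled.
population = [tuple(random.randint(0, 1) for _ in FLAGS) for _ in range(20)]
for generation in range(30):
    population.sort(key=energy)                    # lower energy is fitter
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

best = min(population, key=energy)
print("best flag set:", [f for f, on in zip(FLAGS, best) if on])
```

The thesis's finding can be read in these terms: when fitness is measured in joules, the search mostly rediscovers flag sets that minimise runtime, because energy is average power multiplied by time and existing flags barely move the power term.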
153

Unpacking collaboration in pair programming in industrial settings

Plonka, Laura January 2012 (has links)
No description available.
154

Design of reliable networks with flows

Rappos, Efstratios January 2004 (has links)
No description available.
155

Computer systems for interactive design of three-dimensional shapes

Armit, Andrew Philip January 1970 (has links)
No description available.
156

J2ME provisioning architecture

Glass, Holger January 2006 (has links)
No description available.
157

Configuration management process maturity : definition and maturation of configuration management for aerospace and defence industries

Ali, Usman January 2014 (has links)
This research focuses on effective implementation and continuous improvement methodologies for Configuration Management practices within the aerospace and defence industries. The research was conducted mainly to develop a Configuration Management Maturity Model based on the Critical Success Factors and Barriers to Configuration Management implementation. The motives behind this research were the lack of understanding of, and the problems in, implementing high-grade Configuration Management systems, as highlighted by other researchers. The research was conducted in three phases through interviews and questionnaire surveys with experienced Configuration Management professionals working in the aerospace and defence industries. The first part of this research identifies, prioritizes, and categorizes the Critical Success Factors for Configuration Management and devises a Configuration Management Activity Model to help practitioners in the effective implementation and continuous improvement of the process. The second part identifies and prioritizes the obstacles to effective implementation of Configuration Management practices, categorizes these obstacles into more manageable groups of factors, and analyses the effects of multiple factors on the identification and rating of these barriers. Both studies were conducted through mixed-methods research, with in-depth interviews followed by questionnaire surveys. The governance aspect of the process is also examined in detail in the second part, through interviews, to draw conclusions on process governance in various setups. The third part of this research concerns the development of a Configuration Management Maturity Model. It is important to note that other maturity models on the topic are generic in nature and emphasize 'what' to implement rather than 'how' to implement it, a gap of uncertainty that motivated us to devise a suitable framework. The Configuration Management Maturity Model is an assessment tool which not only provides benchmark information but also helps to identify the strengths and weaknesses of the process. This maturity framework is unique in its presentation and, unlike previous maturity models, is based on current Configuration Management practices, Critical Success Factors, and Barriers to Configuration Management implementation. This maturity model will help organizations to assess their current level of maturity, identify rational targets for improvement, and provide action plans for enhancing their configuration management process capability. Like the previous two studies, this part of the research was conducted through semi-structured interviews followed by questionnaire surveys.
158

Automated unit testing of evolving software

Shamshiri, Sina January 2016 (has links)
As software programs evolve, developers need to ensure that new changes do not affect the originally intended functionality of the program. To increase their confidence, developers commonly write unit tests along with the program and execute them after a change is made. However, manually writing these unit tests is difficult and time-consuming, and as their number increases, so does the cost of executing and maintaining them. Automated test generation techniques have been proposed in the literature to assist developers in writing these tests. However, it remains an open question how well these tools can help with fault finding in practice, and maintaining the automatically generated tests may require extra effort compared to human-written ones. This thesis evaluates the effectiveness of a number of existing automatic unit test generation techniques at detecting real faults, and explores how these techniques can be improved. In particular, we present a novel multi-objective search-based approach for generating tests that reveal changes across two versions of a program. We then investigate whether these tests can be used in such a way that no maintenance effort is necessary. Our results show that, overall, state-of-the-art test generation tools can indeed be effective at detecting real faults: collectively, the tools revealed more than half of the bugs we studied. We also show that our proposed alternative technique, which is better suited to the problem of revealing changes, can detect more faults, and does so more frequently. However, we also find that for the majority of object-oriented programs, even a random search can achieve good results. Finally, we show that such change-revealing tests can be generated on demand in practice, without requiring them to be maintained over time.
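The following is a minimal sketch of the core idea behind change-revealing test generation as the abstract frames it, using plain random search rather than the thesis's multi-objective approach (the abstract itself notes that random search often does well). The two price functions are hypothetical stand-ins for the old and new versions of a program.

```python
import random

def price_v1(quantity):            # hypothetical "old" version
    return quantity * 10

def price_v2(quantity):            # hypothetical "new" version with a change
    discount = 5 if quantity > 100 else 0
    return quantity * 10 - discount

def reveal_changes(old, new, trials=1000):
    """Randomly sample inputs; keep any input on which the versions disagree."""
    revealing = []
    for _ in range(trials):
        x = random.randint(0, 200)
        if old(x) != new(x):       # behavioural difference observed
            revealing.append((x, old(x), new(x)))
    return revealing

tests = reveal_changes(price_v1, price_v2)
print(f"{len(tests)} change-revealing inputs, e.g. {tests[:3]}")
```

Each revealing input, together with the observed outputs, can be turned into an assertion-bearing unit test on demand, which is why such tests need not be maintained over time: they are regenerated against the current pair of versions whenever a change is made.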
159

Investigating the factors affecting on-line shopping adoption in Saudi Arabia

Alsharif, Faisal Fahhad January 2013 (has links)
The Internet has played, and continues to play, a vital role in people's lives. One of the most important Internet-related technologies is electronic commerce (e-commerce). Despite the tremendous growth in the number of Internet users in Saudi Arabia, the adoption of on-line shopping by individuals is still in its early stages. Two main reasons motivated the researcher to investigate the factors that affect the use and adoption of on-line shopping by individuals in Saudi Arabia: a belief in the importance of on-line shopping for the economy in general and for people's lives in particular, and the limited number of studies conducted in this area. This research adopts a quantitative methodology to answer the research question: what factors enhance the likelihood of adoption of on-line shopping in Saudi Arabia? It draws on the Unified Theory of Acceptance and Use of Technology (UTAUT). The constructs used in this model are performance expectancy, effort expectancy, social influence, facilitating conditions, attitude toward using technology, computer self-efficacy, computer anxiety and behavioural intention. The additional constructs proposed for this study are perceived credibility (security, trust, privacy and risk), cultural background (religion and language) and prevention factors (availability of legislation, delivery services, postal addresses and quality of Internet services). The findings show a statistically significant effect of the factors noted above on behavioural intention to use the technology; all but four of the research hypotheses were supported, and no effect was found for the moderating factors of age, gender, education and experience. The findings also demonstrated the importance of additional factors affecting on-line shopping adoption, such as saving time, price, ease of use, faster shopping and delivery services.
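To illustrate how UTAUT-style hypotheses of this kind are typically tested, here is a hedged sketch using ordinary least squares regression on synthetic Likert-scale data. The construct names follow the abstract, but the data, effect sizes and analysis choices are stand-ins, not the study's actual dataset or method.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300  # hypothetical number of survey respondents

# Synthetic 7-point Likert scores standing in for survey responses.
df = pd.DataFrame({
    c: rng.integers(1, 8, n)
    for c in ["performance_expectancy", "effort_expectancy",
              "social_influence", "perceived_credibility"]
})
# Synthetic intention scores with a built-in effect for two constructs.
df["behaviour_intention"] = (
    0.5 * df["performance_expectancy"]
    + 0.3 * df["perceived_credibility"]
    + rng.normal(0, 1, n)
)

model = smf.ols(
    "behaviour_intention ~ performance_expectancy + effort_expectancy"
    " + social_influence + perceived_credibility",
    data=df,
).fit()
print(model.summary())  # each coefficient's p-value tests one hypothesis
```

A supported hypothesis corresponds to a significant coefficient on the matching construct; "all but four hypotheses supported" in the abstract would read, in this framing, as most coefficients reaching significance.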
160

Achieving fair exchange and customer anonymity for online products in electronic commerce

Alqahtani, Fahad Ali January 2014 (has links)
In recent years, e-commerce has gained much importance. Traditional commerce (in which the customer physically goes to the merchant's shop, purchases goods and/or services and makes a payment) is slowly being replaced by e-commerce, and more people prefer to do their shopping online. One of the main reasons for this attraction is the convenience that e-commerce provides: customers can choose from many different merchants from the comfort of their homes or while travelling, avoiding the hassle and stress of traditional shopping. However, e-commerce faces many challenges. One key challenge is trust, as transactions take place across territories and are governed by various legal and regulatory regimes. Various protocols and underlying e-commerce technologies help in the provision of this trust. One way to establish trust is to ensure fair exchange. There is also the question of traceability of transactions and customers' need for privacy. This is provided by anonymity: making sure that transactions are untraceable and that customers' personal information is kept secret. The aim of this research is therefore to propose a protocol that provides fair exchange and anonymity to the transacting parties by making use of a Trusted Third Party. The research also aims to ensure payment security and to use a single payment token to enhance the efficiency of the protocol. The proposed protocol consists of pre-negotiation, negotiation, withdrawal, purchase and arbitration phases. The analysis of the protocol shows that, throughout all phases of the e-commerce transaction, it provides fair exchange and complete anonymity to the transacting parties. Anonymity protects the privacy of customers' data and ensures that all Personally Identifiable Information of the transacting parties is kept hidden to avoid misuse. The proposed protocol is model checked to show that the fair-exchange property is satisfied. It is implemented in Java to show that it is ready to use: not just a theoretical idea but something that can be applied in real-world scenarios. The security features of the protocol are ensured by using appropriate cryptographic algorithms and protocols to provide confidentiality and integrity. This research explores areas that have not been covered by other researchers, on the premise that there is still much scope for improvement in the current research. It identifies these opportunities and 'research gaps' and focuses on overcoming them. Current e-commerce protocols do not cover all the desirable characteristics, and it is important to address these characteristics as they are vital for the growth of e-commerce technologies. The novelty of the protocol lies in the fact that it provides anonymity as well as fair exchange using a Trusted Third Party that is entirely trustworthy, unlike certain protocols where the trusted third party is only semi-trusted. The proposed protocol makes use of symmetric-key cryptography wherever possible to ensure that it is efficient and lightweight, and the number of messages is significantly reduced; this overcomes a drawback of various other protocols, which are cumbersome due to the number of messages they require. Anonymity is based on Chaum's blind signature method.
It has been found that the use of other methods, such as pseudo-identifiers, results in inefficiency due to the bottlenecks these identifiers create. The protocol also ensures that anonymity can never be compromised, unlike certain protocols in which an eavesdropper can discover the customer's identity because the customer is required to disclose his/her public key during transactions. Further, the protocol provides immunity against message replay attacks. Finally, the protocol always assumes that one or more parties may be dishonest, unlike certain protocols that assume only one party can be dishonest at any point; this ensures that all scenarios are taken into consideration and that two parties cannot conspire against another, compromising the fairness of the protocol. Detailed analysis, implementation, verification and evaluation of the protocol are carried out to show that it has been carefully designed and that the key goals of fair exchange and anonymity are met. All scenarios are considered to show that the protocol satisfies all criteria. The research therefore expects that the protocol could be implemented in real-life scenarios and sees great potential for it in the e-commerce field.
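Since the abstract names Chaum's blind signature method as the basis for anonymity, here is a toy sketch of how RSA blinding lets a signer (for example, the Trusted Third Party) sign a payment token without ever seeing its contents. The tiny textbook key, raw integer message and absence of padding or hashing are simplifications for illustration, not how the thesis's protocol is parameterised.

```python
# Requires Python 3.8+ for pow(x, -1, n) modular inverses.
from math import gcd
import random

# Toy RSA key pair for the signer: n = p*q, e public, d private.
p, q = 1009, 1013
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))        # private exponent

m = 42                                   # message, e.g. a payment token id

# Customer blinds m with a random factor r, so the signer never sees m.
while True:
    r = random.randrange(2, n)
    if gcd(r, n) == 1:
        break
blinded = (m * pow(r, e, n)) % n

# Signer signs the blinded value: (m * r^e)^d = m^d * r (mod n).
blind_sig = pow(blinded, d, n)

# Customer unblinds by removing r, leaving an ordinary RSA signature on m.
signature = (blind_sig * pow(r, -1, n)) % n

assert pow(signature, e, n) == m         # anyone can verify against m
print("unblinded signature verifies:", pow(signature, e, n) == m)
```

The anonymity property follows from the blinding factor: the signer sees only `blinded`, which is statistically unlinkable to the token `m` it later sees being spent, so the token cannot be traced back to the customer who had it signed.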
