71

An object-based algebraic specification environment

Ma, Sula January 2001
No description available.
72

Making plain/s space : the literary geographies of Cather, Kroetsch, and Heat-Moon

Kristensen, Allan Juhl January 2009
This thesis examines literary texts as place-making conduits in the case of the North American region commonly referred to as the Great Plains or the prairies. From a discursive and historicist perspective, it demonstrates how Willa Cather's novel O Pioneers!, Robert Kroetsch's poems 'Stone Hammer Poem' and 'Seed Catalogue', and William Least Heat-Moon's PrairyErth constitute key twentieth-century literary geographies that demarcate a shift in the way the land on the ground has been overlaid with spatial tropes and narrative structures. The direction of the shift, it is argued here, is from a narrow regionalism that conceives of place as enclosed, rooted, and essentialist towards a 'middle ground' in which cultural and natural forces come into contact, conflict, as well as collaboration in a complex dialogic negotiation of power, presence, survival, and belonging (Richard White). In setting out a critical framework, the first chapter identifies a culturally dominant meta-narrative of fall and recovery as a powerful ideological influence on how the Plains has been represented and understood. It is shown how, in mapping the region along linear wilderness-garden and desert-landscape trajectories, artists and critics alike have tended to represent the place in binary and essentialist terms as landscape (not wilderness), rooted (not routed), and authentic (not hybrid). Countering this discourse is an archaeological mode of inquiry that decentres linear narratives of progressive recovery/fall by unearthing local particulars. What emerges instead are palimpsest and rhizomatic deep maps that trace intercultural, transnational, and global movements operating beneath, across, and above the levels of region and nation and hence challenge narrow definitions of either. Thus, in addition to a formal dispersion of textual and geographical space, the deep maps also count the cost of empire and nation building and address socio-political issues of ethnicity, ecology, inhabitation, and economics. Through close readings of Cather, Kroetsch, and Heat-Moon's literary geographies, I proceed to situate them in relation to this matrix of prairie place-making and elaborate on how they variously contribute to, dispute, and seek to displace it.
73

Clustering strategies for object databases

Meads, Ann L. January 1997
When object databases arrived on the scene some ten years ago, they provided database capabilities for previously neglected, complex applications, such as CAD, but were burdened with one inherent teething problem: poor performance. Physical database design is one tool that can provide performance improvements, and it is the general area of concern for this thesis. Clustering is one fruitful design technique which can provide improvements in performance. However, clustering in object databases has not been explored in depth and so has not been truly exploited. Further, clustering, although a physical concern, can be determined from the logical model. The object model is richer than previous models, notably the relational model, and so it is anticipated that the opportunities with respect to clustering are greater. This thesis provides a thorough analysis of object clustering strategies with a view to highlighting any links between the object logical and physical model and improving performance. This is achieved by considering all possible types of object logical model construct and the implementation of those constructs in terms of theoretical clustering strategies to produce actual clustering arrangements. This analysis results in a greater understanding of object clustering strategies, aiding designers in the development process and providing some valuable rules of thumb to support the design process.
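The link the abstract draws between logical-model constructs and physical placement can be illustrated with a small sketch. The Python fragment below is not the thesis's implementation; the object names and page size are invented. It shows how one clustering strategy, placing an aggregate object and its components together via a depth-first traversal, turns a logical relationship into a physical arrangement.

```python
# A minimal sketch (assumed names and sizes, not the thesis's design) of
# deriving a clustering arrangement from a logical-model construct: objects
# linked by aggregation are laid out depth-first so a parent and its
# components land on the same page where possible.

PAGE_SIZE = 4096  # bytes per page; illustrative value

class Obj:
    def __init__(self, oid, size, components=None):
        self.oid = oid
        self.size = size                    # serialized size in bytes
        self.components = components or []  # aggregation links

def cluster_depth_first(root):
    """Yield pages, each a list of object ids placed together."""
    page, used = [], 0
    stack = [root]
    while stack:
        obj = stack.pop()
        if used + obj.size > PAGE_SIZE and page:
            yield page
            page, used = [], 0
        page.append(obj.oid)
        used += obj.size
        # visit components right after their parent (depth-first order)
        stack.extend(reversed(obj.components))
    if page:
        yield page

# Example: a CAD assembly whose parts are usually fetched with the assembly.
wheel = Obj("wheel", 1500)
frame = Obj("frame", 2000)
bike = Obj("bike", 1200, [frame, wheel])
print(list(cluster_depth_first(bike)))  # [['bike', 'frame'], ['wheel']]
```

The point of the sketch is that the clustering decision is read off the logical model (the aggregation links), which is exactly the logical-to-physical connection the thesis investigates.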
74

User interface design support for the development of knowledge-based systems

Bright, C. K. January 1989
The present scarcity of operational knowledge-based systems (KBS) has been attributed, in part, to inadequate consideration of user interface design during development. From a human factors perspective, the problem has stemmed from an overall lack of user-centred design principles. Consequently, the integration of human factors principles and techniques is seen as a necessary and important precursor to ensuring the implementation of KBS which are useful to, and usable by, the end-users for whom they are intended. Focussing upon KBS work taking place within commercial and industrial environments, this research set out to assess both the extent to which human factors support was presently being utilised within development, and the future path for human factors integration. The assessment consisted of interviews conducted with a number of commercial and industrial organisations involved in KBS development, and a set of three detailed case studies of individual KBS projects. Two of the studies were carried out within a collaborative Alvey project, involving the Interdisciplinary Higher Degrees Scheme (IHD) at the University of Aston in Birmingham, BIS Applied Systems Ltd (BIS), and the British Steel Corporation. This project, which had provided the initial basis and funding for the research, was concerned with the application of KBS to the design of commercial data processing (DP) systems. The third study stemmed from involvement in a KBS project being carried out by the Technology Division of the Trustee Savings Bank Group plc. The preliminary research highlighted poor human factors integration. In particular, there was a lack of early consideration of end-user requirements definition and user-centred evaluation. Instead, concentration was given to the construction of the knowledge base and prototype evaluation with the expert(s). In response to this identified problem, a set of methods was developed that was aimed at encouraging developers to consider user interface requirements early on in a project. These methods were then applied in the two further projects, and their uptake within the overall development process was monitored. Experience from the two studies demonstrated that early consideration of user interface requirements was both feasible and instructive for guiding future development work. In particular, it was shown that a user interface prototype could be used as a basis for capturing requirements at the functional (task) level, and at the interface dialogue level. Extrapolating from this experience, a KBS life-cycle model is proposed which incorporates user interface design (and within that, user evaluation) as a largely parallel, rather than subsequent, activity to knowledge base construction. Further to this, there is a discussion of several key elements which can be seen as inhibiting the integration of human factors within KBS development. These elements stem from characteristics of present KBS development practice; from constraints within the commercial and industrial development environments; and from the state of existing human factors support.
75

Improving time-to-market through globally distributed software development

Taweel, Adel January 2001
Reductions in development time and cost are important factors for software engineering. Such reductions not only enable companies to sell products earlier, faster and cheaper but, in some cases, are necessary for the survival of an organisation. While many ways have been suggested to reduce development time and costs, the complex nature of software development remains a challenge. This research investigates the opportunities to reduce time-to-market through "around the clock" software development. It focuses on a particular work pattern, SeSE (sequential collaborative software engineering), in which a task is passed at the end of a working day from one developer to another across time zones. The research identifies the related contextual factors and associated overheads and presents a model of the relationships between development time and these contextual factors and overheads. As part of the evaluation of this work, the thesis presents an exploratory empirical study carried out to validate the feasibility of this work pattern and to obtain values for some of the overheads. In addition, it presents an implementation of the model that demonstrates one application of the model and facilitates its verification. The thesis concludes that software development across time zones is a feasible work pattern and can potentially be used to reduce development time. However, several related issues must be addressed before this work pattern can be widely adopted and become normal practice.
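The trade-off the model captures can be sketched with an illustrative calculation. The simple linear form and all numbers below are assumptions for illustration only, not the thesis's calibrated model: each extra site adds working hours to the calendar day, but each daily handover between sites costs overhead.

```python
# An illustrative back-of-envelope model (assumed numbers, not the thesis's
# calibrated values) of sequential "around the clock" development: a task is
# handed from one site to the next at the end of each working day, so sites
# in different time zones together cover more working hours per calendar day,
# at the cost of a handover overhead per exchange.

def elapsed_days(task_hours, sites=2, hours_per_day=8.0, handover_hours=1.0):
    """Calendar days to finish a task passed between `sites` time zones."""
    effective_per_day = sites * hours_per_day - sites * handover_hours
    return task_hours / effective_per_day

task = 160.0  # a 160-hour task: four weeks at a single site
print(f"1 site : {elapsed_days(task, sites=1, handover_hours=0):.1f} days")
print(f"2 sites: {elapsed_days(task, sites=2):.1f} days")
print(f"3 sites: {elapsed_days(task, sites=3):.1f} days")
```

Even in this crude form, the handover overhead makes the speedup sub-linear in the number of sites, which is why the thesis attends so closely to quantifying the overheads.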
76

Computer simulations of doping, device structures and fabrication processes in semiconductors

Gug, Randeep Kumar January 1999
No description available.
77

Molecular binder : a multimodal virtual environment to assist learning

Garcia Ruiz, Miguel Angel January 2003
No description available.
78

Estimating the effort in the early stages of software development

Levy, Zeeva January 1990
Estimates of the costs involved in the development of a software product, and of the likely risk, are two of the main components associated with the evaluation of software projects and their approval for development. They are essential before development starts, since the investment early in software development determines the overall cost of the system. When making these estimates, however, the unknown obscures the known and high uncertainty is embedded in the process. This is the essence of the estimator's dilemma and the concern of this thesis. This thesis offers an Effort Estimation Model (EEM), a support system to assist the process of project evaluation early in the development, when the project is about to start. The estimates are based on preliminary data and on the judgement of the estimators. They are developed for the early stages of software building, in which the requirements are defined and the gross design of the software product is specified. At this early stage only coarse estimates of the total development effort are feasible; these coarse estimates are updated as uncertainty is reduced. The basic element common to all frameworks for software building is the activity. Thus the EEM uses a knowledge-base which includes a decomposition of the software development process to the activity level. Components which contribute to the effort associated with the activities implemented early in the development process are identified; they are the size metrics used by the EEM. The data incorporated in the knowledge-base for each activity, and the rules for the assessment of the complexity and risk perceived in the development, allow the estimation process to take place. They form the infrastructure for a 'process model' for effort estimating. The processes of estimating the effort and of developing the software are linked. Assumptions taken throughout the process are recorded; they assist in understanding deviations between estimates and actual effort and enable the incorporation of a feedback mechanism into the process of software development. These estimates support the decision process associated with the overall management of software development; they facilitate management involvement and are thus considered critical success factors for the management of software projects.
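The activity-level decomposition at the heart of the EEM can be sketched as follows. All activity names, size metrics, and multipliers in this Python fragment are invented for illustration and are not drawn from the EEM's knowledge-base; the point is only the shape of the calculation: per-activity base effort, scaled by judged complexity and risk, summed into a coarse total.

```python
# A hedged sketch of the activity-level idea behind the EEM (activity names,
# metrics, and multipliers below are assumptions, not the thesis's data):
# total effort is accumulated per early-stage activity, scaled by the
# estimator's judged complexity and risk.

ACTIVITIES = {
    # activity: (size metric in units, base person-hours per unit)
    "requirements definition": (40, 2.0),   # e.g. 40 requirements
    "gross design":            (12, 6.0),   # e.g. 12 design components
}

def estimate_effort(complexity=1.0, risk=1.0):
    """Coarse total effort in person-hours, to be revised as uncertainty falls."""
    total = 0.0
    for activity, (units, hours_per_unit) in ACTIVITIES.items():
        adjusted = units * hours_per_unit * complexity * risk
        print(f"{activity:25s}: {adjusted:6.1f} h")
        total += adjusted
    return total

# Recording the assumptions alongside the estimate lets later deviations
# between estimate and actual effort be explained, as the EEM advocates.
total = estimate_effort(complexity=1.2, risk=1.1)
print(f"{'total':25s}: {total:6.1f} h")
```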
79

Parallel and distributed simulation of timed Petri nets

Cui, Qing Ming January 1997
No description available.
80

Cosynthesis of embedded systems using coloured interpreted Petri nets

Sananikone, Dang S. January 1996
The rising complexity in systems design, and the shift in the hardware/software functionality boundary, has spurred research into the development of EDA (Electronic Design Automation) tools at the systems level. Codesign is a methodology that proposes an integrated approach to systems design, unifying both hardware and software approaches. Cosynthesis is a major field of research within codesign; cosynthesis takes a behavioural description and generates a hardware/software partition which satisfies system constraints. Current research is concerned with the automatic partitioning of systems. COSYN was developed to address the cosynthesis of embedded systems. A CIPN (Coloured Interpreted Petri Net) is used to model multiple processes and interprocess communication. The partitioning algorithm adopts a fine-grained approach to system partitioning (it considers moving nodes at the basic-block level): blocks are selected according to their potential speedup and extra hardware requirements, using hardware and software execution-time estimators. The interdependence between interprocess communication primitives is exploited to achieve a better hardware/software partition. Results for an input example, pdi, are given which illustrate the benefits of the approach presented in this thesis.
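The selection criterion described in this abstract lends itself to a simple greedy sketch. The Python fragment below is illustrative only: block names, time estimates, and the area budget are invented, and COSYN's exploitation of interprocess-communication interdependence is not modelled here. It shows blocks being moved to hardware in order of speedup gained per unit of extra hardware, within an area budget.

```python
# A simplified greedy sketch of fine-grained hardware/software partitioning
# (assumed numbers; not COSYN's actual algorithm): basic blocks are moved to
# hardware by speedup per unit of extra hardware until the budget runs out.

# block: (software time, hardware time, hardware area) -- estimator outputs
blocks = {
    "bb1": (100.0, 10.0, 30),
    "bb2": ( 40.0,  8.0, 10),
    "bb3": ( 60.0, 50.0, 25),
    "bb4": ( 20.0,  2.0, 40),
}
AREA_BUDGET = 60

def partition(blocks, budget):
    hw, area = [], 0
    # rank blocks by speedup gained per unit of extra hardware required
    ranked = sorted(blocks,
                    key=lambda b: (blocks[b][0] - blocks[b][1]) / blocks[b][2],
                    reverse=True)
    for b in ranked:
        if area + blocks[b][2] <= budget:
            hw.append(b)
            area += blocks[b][2]
    sw = [b for b in blocks if b not in hw]
    return hw, sw

hw, sw = partition(blocks, AREA_BUDGET)
print("hardware:", hw)  # blocks worth the extra area for their speedup
print("software:", sw)  # everything else stays in software
```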
