71

Accurate geometry reconstruction of vascular structures using implicit splines

Hong, Qingqi January 2012 (has links)
3-D visualization of blood vessels from standard medical datasets (e.g. CT or MRI) plays an important role in many clinical situations, including the diagnosis of vessel stenosis, virtual angioscopy, vascular surgery planning and computer-aided vascular surgery. However, unlike other human organs, the vasculature is a very complex network of vessels, which makes its 3-D visualization a very challenging task. Conventional techniques of medical volume data visualization are in general not well suited to the above-mentioned tasks. This problem can be solved by reconstructing vascular geometry. Although various methods have been proposed for reconstructing vascular structures, most of these approaches are model-based and are usually too idealized to correctly represent the actual variation presented by the cross-sections of a vascular structure. In addition, the underlying shape is usually expressed as polygonal meshes or in parametric forms, which is very inconvenient for handling the ramification of branches. As a result, the reconstructed geometries are not suitable for computer-aided diagnosis and computer-guided minimally invasive vascular surgery. In this research, we develop a set of techniques for the geometry reconstruction of vasculatures, including segmentation, modelling, reconstruction, exploration and rendering of vascular structures. The reconstructed geometry not only greatly enhances the visual quality of 3-D vascular structures, but also provides an actual geometric representation of vasculatures, which offers various benefits. The key findings of this research are as follows: 1. A localized hybrid level-set segmentation method has been developed to extract the vascular structures from 3-D medical datasets. 2. 
A skeleton-based implicit modelling technique has been proposed and applied to the reconstruction of vasculatures, which can achieve an accurate geometric reconstruction of the vascular structures as implicit surfaces in an analytical form. 3. An accelerating technique using modern GPU (Graphics Processing Unit) is devised and applied to rendering the implicitly represented vasculatures. 4. The implicitly modelled vasculature is investigated for the application of virtual angioscopy.
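The core idea of skeleton-based implicit modelling can be illustrated with a minimal sketch: a scalar field is defined as a sum of kernels centred on samples of the vessel centreline, and the vessel surface is a level set of that field, giving an analytical (implicit) representation in which branches blend naturally. The Gaussian kernel, the radius parameter, and the iso-value below are illustrative assumptions, not the thesis's actual formulation.

```python
import numpy as np

def scalar_field(p, skeleton, radius=1.0):
    """Sum of Gaussian kernels centred on skeleton samples.

    The vessel surface is the level set scalar_field(p) == iso:
    points with a higher field value lie inside the tube.
    """
    d2 = np.sum((skeleton - p) ** 2, axis=1)   # squared distances to each skeleton sample
    return np.sum(np.exp(-d2 / radius ** 2))

# A straight "vessel" skeleton along the x-axis, sampled densely.
skeleton = np.column_stack([np.linspace(0.0, 5.0, 200),
                            np.zeros(200),
                            np.zeros(200)])

iso = 1.0
on_axis  = scalar_field(np.array([2.5, 0.0, 0.0]), skeleton)  # on the centreline
far_away = scalar_field(np.array([2.5, 4.0, 0.0]), skeleton)  # well outside the tube
print(on_axis > iso, far_away < iso)
```

Because the field is a sum, two skeleton branches meeting at a bifurcation automatically produce a smoothly blended junction, which is the property that makes implicit forms attractive for vascular ramification.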
72

Par: An approach to architecture-independent parallel programming.

Coffin, Michael Howard January 1990 (has links)
This dissertation addresses the problem of writing portable programs for parallel computers, including shared memory, distributed, and non-uniform memory access architectures. The basis of our approach is to separate the expression of the algorithm from the machine-dependent details that are necessary to achieve good performance. The method begins with a statement of the algorithm in a classic, explicitly parallel, manner. This basic program is then annotated to specify architecture-dependent details such as scheduling and mapping. These ideas have been cast in terms of a programming language, Par, which provides flexible facilities for a range of programming styles, from shared memory to message passing. Par is used to specify both the algorithm and the implementation of the annotations.
73

The application of computer-supported collaborative work and knowledge-based technology to the view modeling and integration problems: A multi-user view integration system (MUVIS).

Hayne, Stephen Charles. January 1990 (has links)
This dissertation describes the architecture, development and implementation of a network application called MUVIS. MUVIS supports the design of distributed object-oriented databases by groups of potential users. MUVIS is a graphical system implemented on Microsoft Windows for personal computers attached to local area networks. It allows designers to share conceptual design objects in real-time and resolve naming conflicts through the electronic medium. It assists these database designers in representing their views and integrating these views into a global conceptual schema. The view integration component is decoupled from the view modeling component. The underlying data model, the Semantic Data Model (SDM), is extended to include distribution information and transaction specification. The visual interface to the SDM is an Extended Entity Relationship model, yet objects in the SDM are classes (as opposed to entities and relationships) and this fact reduces the complexity of the integration. An experiment involving groups of size three and four and individuals modeling a complex case validated the view modeling system. The groups were more efficient and produced higher quality designs than did individuals.
74

A generic user interface for hierarchical knowledge-based simulation and control systems.

Chow, Alex Chung-Hen. January 1990 (has links)
Successful design of the user interface for an interactive system must be sensitive to activities supported by the software and the time users spend in these activities. This dissertation presents a systematic approach to user interface design based on this idea. A control flow model for the interaction between the human and the application is developed and serves as the framework for the proposed approach. Sub-models of the control flow model correspond to actual programming modules and determine the logical system design sequence. The System Entity Structure language is employed to span hierarchical spaces. User interface design is based on the space dimensions of the System Entity Structure pruning process. A Virtual Tree Environment is developed on the TI Explorer to provide a programming facility for implementing user-interfaces. Following the proposed methodology, a system designer can give users an efficient user interface to applications that can be hierarchically described by the System Entity Structure language. The concept can also be implemented on other window systems such as Microsoft Windows and X-windows.
75

Computer-based decision-making: The impact of personal computers and distributed databases on managers' performance.

Gleeson, William Joseph January 1990 (has links)
This study was a field experiment to test the influence of two different computer conditions on decision-making in an organizational setting. The experiment was carried out in ten mid-sized corporations. The 117 subjects contained about equal numbers of managers and non-managers. The computer conditions tested were (1) a personal computer used with a distributed data base and (2) the traditional mainframe with a central data base supplying information by printouts. The experimental problem was identical in both conditions, as was the data in the data base. An additional part of the experiment was to vary the amount of information provided: half the subjects in both test conditions had only half the information available to the other subjects. The results indicated that personal computers were more efficient, enabling subjects to reach decisions faster, but PCs did not produce better outcomes overall. Managers performed best using the traditional printouts. Their performance declined in effectiveness (but not in efficiency) when using a PC. With non-managers the reverse applied. Non-managers performed best when using a PC if they were computer literate. In fact, computer-literate non-managers with PCs performed better than managers in either test condition, whether the managers were computer literate or not. The level of information available is important: more information leads to better decisions. The implications of the results for management are that (1) more training in the use of computers will produce better outcomes in decision-making; (2) PCs can improve productivity by achieving better effectiveness through better decision outcomes, and do so more efficiently by taking less time. Non-managers using PCs can make managerial decisions as well as managers can if the non-managers have computer literacy training. This tends to support the view that managers can be de-skilled by the arrival of PCs in the workplace.
76

The impact of a computer network on group decision-making: An experimental investigation.

Raghavan, Veeravalli January 1990 (has links)
This dissertation investigates the performance of a problem solving group whose members are constrained to communicate only through a computer network, a situation that has become commonplace with advances in computer technology. The model of group decision making adopted specifies that the group outcome is a function of six independent variables. They are group size, the incentives offered to group members, the decision rule used to combine member inputs, the distribution of information among members, the complexity of the task, and mode of communication (face-to-face or computer network). The dependent variables are group decision quality and members' perceptions of the group process. A laboratory experiment was conducted to study the effect of the computer network on the dependent variables. The group task required the generation and evaluation of alternative production plans for a candy company that has five divisions, each of which was managed by a group member. The five managers worked together as members of the corporate committee to decide on the allocation of a limited amount of Milk and Cocoa to the divisions. After this allocation was accepted, each manager decided on his/her divisional production plan subject to a local constraint. Three treatments were investigated: face-to-face (manual) groups, computer networked groups, and computer networked groups where the group members selected a Chairman to make the allocation decision. Information about the common resources and the overall profit function was public while the division information was private, but could be shared over the network. The decision rule was unanimity and each member of the group received a cash payoff based on successful completion of the task and the amount of company profit. Manual groups achieved the highest profit followed by computer groups, and computer groups with chairman, in that order. 
Manual groups also achieved the highest distribution efficiency and divisional efficiency, suggesting that the average member attained a better understanding of the problem in the manual condition. It appears that the computer network degraded group decision performance, and electing a chairman exacerbated the trend. The satisfaction of members was lower in the computer conditions than in the manual condition. Contrary to some past research that used roughly similar tasks, participation by group members was not felt to be more even in the computer conditions than in the manual condition. Also, manual groups perceived their group process and solutions to be more effective than the computer groups did. These results would seem to indicate that if high decision quality is the primary purpose of a group meeting, then face-to-face interaction among the group members may be necessary, even though creating such conditions may be expensive.
77

Design and implementation of a negotiation support system.

Herniter, Bruce Corey. January 1991 (has links)
A Negotiation Support System (NSS) is a system consisting of hardware, software, people, and procedures that assists the individual negotiator, mediator, or researcher, and provides a solution or facilitates the process of negotiation. NSS have previously been designed around modeling and simulation, expert systems, and other techniques. However, negotiation can be considered a group process involving two or more teams with communication routed through the group leaders. Electronic Meeting Systems (EMS) provide a model for the use of computers during group processes, as well as a research model and a framework for the development of software tools. The framework specifies that a computer-based meeting system consists of the group, the task, the context, and the EMS itself. MEDIANSS and its successor, GroupSystems/Mediation (GS/M), were tested in two contract negotiations totaling almost 90 hours over 23 sessions. The negotiators generally approved of the GS/M tools and used them: the negotiators spent 19% to 24% of their time using computers. Both negotiations concluded successfully. The major accomplishment of GS/M was its assistance in the secretarial function of the talks: the drafts of the contracts were finalized faster than typically expected.
78

The Jade File System.

Rao, Herman Chung-Hwa. January 1991 (has links)
File systems have long been the most important and most widely used form of shared permanent storage. File systems in traditional time-sharing systems such as Unix support a coherent sharing model for multiple users. Distributed file systems implement this sharing model in local area networks. However, most distributed file systems fail to scale from local area networks to an internet. This thesis recognizes four characteristics of scalability: size, wide area, autonomy, and heterogeneity. Owing to size and wide area, techniques such as broadcasting, central control, and central resources, which are widely adopted by local area network file systems, are not adequate for an internet file system. An internet file system must also support the notion of autonomy, because an internet is made up of a collection of independent organizations. Finally, heterogeneity is inherent in an internet file system, not only because of its size, but also because of the autonomy of the organizations in an internet. This thesis introduces the Jade File System, which provides a uniform way to name and access files in the internet environment. Jade is a logical system that integrates a heterogeneous collection of existing file systems, where heterogeneous means that the underlying file systems support different file access protocols. Because of autonomy, Jade is designed under the restriction that the underlying file systems may not be modified. In order to avoid the complexity of maintaining an internet-wide, global name space, Jade permits each user to define a private name space. In Jade's design, we pay careful attention to avoiding unnecessary network messages between clients and file servers in order to achieve acceptable performance. Jade's name space supports two novel features: it allows multiple file systems to be mounted under one directory, and it permits one logical name space to mount other logical name spaces. 
A prototype of Jade has been implemented to examine and validate its design. The prototype consists of interfaces to the Unix File System, the Sun Network File System, and the File Transfer Protocol.
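The two name-space features described above can be sketched as a per-user mount table resolved by longest-prefix match, where a mount target is either a protocol-tagged backend or another logical name space. The class, the tuple format, and the hostnames below are hypothetical illustrations; Jade's actual interfaces to UFS, NFS, and FTP differ.

```python
class NameSpace:
    """A per-user logical name space mapping path prefixes to backends."""

    def __init__(self):
        self.mounts = {}  # logical prefix -> backend tuple or nested NameSpace

    def mount(self, prefix, target):
        self.mounts[prefix] = target

    def resolve(self, path):
        # Longest-prefix match, so multiple file systems can coexist
        # under one logical directory.
        for prefix in sorted(self.mounts, key=len, reverse=True):
            if path.startswith(prefix):
                target = self.mounts[prefix]
                rest = path[len(prefix):] or "/"
                if isinstance(target, NameSpace):  # a mounted logical name space
                    return target.resolve(rest)
                return target, rest
        raise FileNotFoundError(path)

private = NameSpace()
private.mount("/papers", ("nfs", "fileserver.cs:/export/papers"))

shared = NameSpace()
shared.mount("/ftp", ("ftp", "archive.org"))
private.mount("/world", shared)  # one logical name space mounted in another

print(private.resolve("/papers/thesis.tex"))
print(private.resolve("/world/ftp/pub/readme"))
```

Resolution never touches the underlying file systems until a backend is chosen, which is consistent with the design goal of leaving the underlying file systems unmodified.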
79

Testing eigenvalue software.

Henderson, Lehman Edwin, Jr. January 1991 (has links)
This dissertation describes a significant advance in automated testing of eigenvalue software. Several programs are described that assist the researcher in verifying that a new program is stable. Using backward error techniques popularized by Wilkinson, a maximizer or "hill climber" systematically searches for instabilities in the program being tested. This work builds on software first reported by Miller and removes the earlier restriction that iterative methods could not be tested. Testing eigenvalue solver programs with sets of small random input data can often find instabilities, but the described hill-climbing technique is more efficient. Using only ten sets of starting points, the maximizer will often find the instability, if it exists, in only a few tries.
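The hill-climbing idea can be sketched as follows: treat the solver's scaled residual as an objective over the space of input matrices, and greedily perturb the input to drive that objective upward. This is a simplified re-creation of the Miller-style approach, not the thesis software; the residual measure, step size, and use of NumPy's LAPACK-backed `eig` as the solver under test are all assumptions.

```python
import numpy as np

def backward_error(A):
    """Largest scaled eigenpair residual: small for a stable solver."""
    lam, V = np.linalg.eig(A)
    residuals = [np.linalg.norm(A @ V[:, i] - lam[i] * V[:, i])
                 for i in range(len(lam))]
    return max(residuals) / np.linalg.norm(A)

def hill_climb(n=4, tries=200, step=0.1, seed=0):
    """Greedy search for an input matrix that maximizes the error measure."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, n))
    worst = backward_error(A)
    for _ in range(tries):
        B = A + step * rng.standard_normal((n, n))  # random perturbation
        e = backward_error(B)
        if e > worst:                               # keep only uphill moves
            A, worst = B, e
    return worst

# LAPACK's eigensolver is backward stable, so even after maximization
# the residual stays near machine epsilon.
print(hill_climb())
```

For a stable solver the climb plateaus at the rounding-error floor; an unstable solver would let the maximizer drive the residual orders of magnitude higher, exposing the instability far faster than undirected random testing.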
80

Soviet advanced technology: The case of high-performance computing.

Wolcott, Peter. January 1993 (has links)
This study uses Soviet high-performance computing (HPC) as a vehicle to study technological innovation, organizational transformation, and the R&D of advanced technologies in centralized-directive economies in the past and during periods of transition. Case studies are used to identify the factors most strongly influencing the evolution of high-performance systems and the facilities within which they were developed. Although closely tied to the military, the HPC sector was not able to overcome basic systemic and technological difficulties. HPC illustrates the limits of centralized-directive economic management's ability to coordinate and prioritize development and production of highly complex, rapidly evolving technologies. Projects were delayed by complex bureaucratic structures, the monopolistic nature of the supporting infrastructure, and resistance of production factories. Progress of individual projects was dependent on the degree to which they drove supporting industries, used immature technologies, had an industrial vs. academic orientation, and were developed in conjunction with production facilities. The benefits of the reforms--direct contacts between organizations, increased local control of finances and research, greater flexibility in the management of R&D, and improved opportunities for international contacts--have been overshadowed by economic decline and fundamental weaknesses in the supporting infrastructure. R&D facilities have been transformed into a collection of loosely-coupled semi-autonomous organizational units, increasing short-term viability, but threatening their ability to carry out large-scale, long-term, integrated development. Links between R&D and production facilities have been disrupted. The upstream infrastructure remains ill-suited for providing the technologies necessary for HPC development. 
Preconditions to long-term viability are restoration of the integrity of the development-production cycle and reduction of the HPC sector's dependency on domestic industries. Taking advantage of mass-produced Western technologies will require changes in philosophies of development and architectural approaches. The concept of a unified sector-wide technological paradigm is not well suited for explaining the diversity of architectural approaches and specific development trajectories. A paradigm consisting of layers of "micro-paradigms" better captures the patterns of continuity and change within projects and features shared between projects. This study suggests that the nature of the revenue stream and the opportunities for alternative organizational forms have a significant influence on organizational structure.
