511

Let the computer teach itself : an examination of teaching machines among other methods of teaching in computer science

Wolfe, James S. January 1988 (has links)
Better methods for teaching computer science are needed. While no clear winner emerges in studies on the effectiveness of teaching media, the use of several media can be mutually reinforcing. Methods available to computer science teachers include lecturing, giving projector demonstrations, conducting hands-on sessions, monitoring independent study, and making a teaching machine available for group or individual use. A main strength of the teaching machine is that it furnishes immediate feedback as students answer its quizzes. Teaching machines can assist in the major components of instruction: presenting information, guiding students, providing practice, and assessing student learning. They may take the form of tutorials, drills, simulations, or games. Good tutorials include directions, presentation frames, quizzes, assessment of student responses (with hints following mistakes), and the facility to branch to different parts of the machine. A teaching machine's usefulness may be validated by comparing pre-test and post-test scores, by tracking student performance, and by field observation. Machines may be used alone or as a supplement, with a whole class or as a means for bright students to forge ahead and slow students to catch up. / Department of Computer Science
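The tutorial structure the abstract describes (presentation frames, quizzes, hints after mistakes, branching) can be sketched as a small data structure. This is a hypothetical illustration in Python; the `Frame` fields and `run_quiz` driver are invented for this sketch and do not come from the thesis.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Frame:
    """One presentation frame of a hypothetical teaching machine."""
    text: str                               # information presented to the student
    answer: str                             # expected quiz response
    hint: str = ""                          # shown immediately after a mistake
    next_on_correct: Optional[int] = None   # branch target (frame index)
    next_on_wrong: Optional[int] = None

def run_quiz(frames: List[Frame], responses: List[str]) -> int:
    """Step through frames with canned responses, giving immediate feedback."""
    score, i = 0, 0
    for response in responses:
        frame = frames[i]
        if response == frame.answer:
            score += 1
            i = frame.next_on_correct
        else:
            print(frame.hint)               # immediate feedback after a mistake
            i = frame.next_on_wrong
        if i is None:                       # no branch target: session ends
            break
    return score
```

Branching lets a wrong answer route the student back to a remedial frame while a correct answer moves ahead, which is the "bright students forge ahead, slow students catch up" behavior described above.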
512

Simulation of data link layer of communication protocols using Ada

Bale, Siddalingaswamy January 1989 (has links)
Communication and networking are vast areas that are developing rapidly. It is very important to know in detail how a system works, how it can be enhanced to meet today's requirements, and how to implement appropriate techniques to obtain desired results. To explore these questions, simulation techniques were adopted. Simulation plays an important role in developing new designs and in modifying existing systems to make them more feasible and efficient. Protocols are the backbone of communication systems and play a major role in data transfer. The review of literature provides an explanation of the ISO/OSI model, the layered approach, and the concepts of simulation. The simulation was carried out to explain the concepts of error detection, error correction, and flow control in the Data Link layer. Ada was used to develop the software because of its concurrent processing facilities, called tasks. State-of-the-art utilities such as the VAX/VMS Screen Management System, File Definition Language, and Debugger were also used in program development. / Department of Computer Science
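As an illustration of the Data Link layer's error-detection idea, here is a minimal framing sketch in Python (the thesis itself used Ada); the one-byte additive checksum is a deliberately simple stand-in for the CRC a real link-layer protocol would use.

```python
def checksum(data: bytes) -> int:
    """One-byte additive checksum; a stand-in for a real CRC."""
    return sum(data) % 256

def make_frame(payload: bytes) -> bytes:
    """Sender side: append the checksum byte to the payload."""
    return payload + bytes([checksum(payload)])

def check_frame(frame: bytes) -> bool:
    """Receiver side: recompute the checksum and compare."""
    payload, received = frame[:-1], frame[-1]
    return checksum(payload) == received
```

A receiver that finds `check_frame` false would discard the frame and withhold its acknowledgment, which is what triggers retransmission in the flow-control schemes the simulation covers.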
513

A trace-driven simulation study of cache memories

Xiong, Bo January 1989 (has links)
The purpose of this study is to explore the relationship between the hit ratio of a cache memory and its design parameters. Cache memories are widely used in the design of computer system architectures to match relatively slow memories against fast CPUs. Caches hold the active segments of a program which are currently in use. Since instructions and data in cache memories can be referenced much faster than the time required to access main memory, cache memories permit the execution rate of the machine to be substantially increased. In order to function effectively, cache memories must be carefully designed and implemented. In this study, a trace-driven simulation of direct-mapped, associative-mapped, and set-associative-mapped cache memories is made. In the simulation, the cache fetch algorithm, placement policy, cache size, and various other parameters related to cache design, along with their resulting effects on system performance, are investigated. The cache memories are simulated using the C language, and the simulation results are analyzed for the design and implementation of cache memories. / Department of Physics and Astronomy
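The core of a trace-driven simulator for the direct-mapped case fits in a few lines; this Python sketch (the thesis's simulator was written in C, and these parameter names are illustrative) shows how each address in the trace is split into a block, an index, and a tag to decide hit or miss.

```python
def simulate_direct_mapped(trace, num_lines, block_size):
    """Return the hit ratio for an address trace on a direct-mapped cache."""
    tags = [None] * num_lines               # one stored tag per cache line
    hits = 0
    for addr in trace:
        block = addr // block_size          # which memory block the address is in
        index = block % num_lines           # which cache line it maps to
        tag = block // num_lines            # distinguishes blocks sharing that line
        if tags[index] == tag:
            hits += 1                       # hit: the line holds this block
        else:
            tags[index] = tag               # miss: fetch and fill the line
    return hits / len(trace)
```

On a sequential trace, larger block sizes raise the hit ratio because each miss fetches several addresses that will be referenced next, which is exactly the kind of parameter sensitivity the study measures.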
514

Statistical ray-tracing analysis of the linear Fresnel mirror solar concentrator

Ying, Xiaomin January 1993 (has links)
The Monte Carlo-type statistical ray-tracing method was used to investigate the performance of the line-focusing Fresnel mirror solar concentrator. An optical model of the line-focusing Fresnel mirror concentrator using the statistical ray-tracing approach was developed. In this model, many rays of sunlight from the solar disk were selected at random and traced through the concentrator. The optical model permits calculation of the local and geometric concentration ratios; the latter requires an energy-loss analysis. Small sun-tracking errors of the diurnal or transverse type were included in the model. Based on the optical model and the Monte Carlo-type statistical ray-tracing method, a computer program implementing the model and its computations was written in Pascal. To facilitate performance comparisons, a baseline concentrator design was adopted. To study the effects of imperfect tracking, performance data were generated for small tracking errors up to approximately two and one-half degrees. The selected mirror configuration permitted comparisons between the statistical approach and previous applications of the "extreme ray" analysis for an imperfectly tracking mirror concentrator. Simulation results demonstrated that the concentration characteristics are highly sensitive to the tracking error. The geometric concentration ratio decreases dramatically as the tracking error increases, in agreement with the "extreme ray" analysis. Results of some typical numerical calculations are presented graphically and discussed. / Department of Physics and Astronomy
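The statistical ray-tracing idea — sample many random sun rays, trace each through the geometry, and count how many land on the target — can be sketched as follows. This uses Python rather than the thesis's Pascal, and the simplified point-source geometry (one ray origin, angular spread from the solar disk plus a fixed tracking error) is an assumption made for illustration only.

```python
import math
import random

def intercept_fraction(focal_length, target_half_width, tracking_error_deg,
                       solar_half_angle_deg=0.267, n_rays=100_000, seed=1):
    """Monte Carlo estimate of the fraction of rays hitting the target.

    Each ray's angle combines a fixed tracking error with a random offset
    drawn uniformly from the solar disk's angular half-width (~0.267 deg).
    Illustrative geometry only, not the thesis's full concentrator model."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        offset = rng.uniform(-solar_half_angle_deg, solar_half_angle_deg)
        angle = math.radians(tracking_error_deg + offset)
        x = focal_length * math.tan(angle)   # lateral landing point in focal plane
        if abs(x) <= target_half_width:
            hits += 1
    return hits / n_rays
```

Even this toy model reproduces the qualitative finding: a tracking error of a couple of degrees shifts the intensity profile off a narrow target and the intercepted fraction collapses.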
515

An extensible Java system for graph editing and algorithm animation

Nall, Aaron J. January 1998 (has links)
The G-Net research group at Ball State University previously developed a graph editor, written in Java, with limited algorithm support. This editor was modified until the code had the instability of a legacy system. It was decided that, rather than continue working with the old system, a new version would be created. The enhancements planned for this new version were more efficient data structures, easy addition of new algorithms, and animated algorithm output. Additionally, the new version was to be written in compliance with the latest Java standards. This paper describes the structure of this new program, Jedit3.1. An overview of the structure of the program and detailed descriptions of the material that future programmers will need to understand in order to add new algorithms are included. Appropriate descriptions are also included for files that future programmers should understand but not necessarily modify. / Department of Computer Science
516

Analysis of tracking error effects for the Fresnel mirror solar concentrator

Zhan, Yong January 1989 (has links)
The solar concentration performance of a tracking, flat-base, line-focusing Fresnel mirror was investigated in this study. The Fresnel mirror consists of flat mirror strips situated on a base and oriented at appropriate angles to focus incident light to a desired line. Simple optical ray tracing and energy conservation were used to develop a mathematical model of the concentrator, assuming small or zero diurnal tracking errors. The model analyzed the concentrator design and provided detailed expressions for the geometric evaluation of the concentrated sunlight rays in the focal plane above the mirror. The local concentration ratio and the geometric concentration ratio were introduced to describe the intensity profile in the focal plane and the average concentration of sunlight on a target absorber. Included in the model were losses of incident sunlight due to imperfect reflection, nonreflecting portions of the base, and blockage by adjacent mirror strips when imperfect tracking occurs. Based on the analytical model and using the Ada high-level language, a computer program was written to simulate the concentrator. To facilitate performance comparisons, a baseline concentrator design was adopted. To study the effects of imperfect tracking, performance data were generated for small tracking errors up to approximately two and one-half degrees. The effects of design variations were studied by varying the concentrator focal length, strip width, and base width. Simulation results demonstrated that the concentration characteristics were highly sensitive to tracking error. Intensity profile shifts relative to the target caused the highest losses in intercepted sunlight. Design decisions were found to dramatically affect the concentration characteristics. For the baseline concentrator under perfect tracking conditions, an optimum focal length was identified.
Except for the disadvantage of added cost, decreased strip widths were found to be a way to increase both the maximum and average concentration ratio for the absorber. Using smaller strip widths might, however, critically affect the energy intercepted by the target under imperfect tracking. Increasing the concentrator base width increased the total amount of power in the focal plane, with a higher maximum concentration ratio and additional tailing of the intensity profile. / Department of Physics and Astronomy
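The core geometric relation behind "oriented at appropriate angles" is that a flat strip centered at lateral position x must be tilted at half the angle its reflected ray makes with the vertical, so that vertically incident light reaches the focal line. A one-line sketch of this standard flat-base Fresnel mirror geometry (Python here, not the thesis's Ada program):

```python
import math

def strip_tilt_deg(x, focal_length):
    """Tilt angle (degrees) for a flat mirror strip centered at lateral
    position x, so that vertically incident light reflects to the focal
    line at height `focal_length` above the base. The reflected ray must
    turn through arctan(x / f), so the strip tilts by half that angle."""
    return math.degrees(0.5 * math.atan2(x, focal_length))
```

A strip on the axis lies flat; strips farther from the axis tilt progressively more, which is why narrow strips approximate the ideal continuous curve better and raise the concentration ratio, at the added cost the abstract notes.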
517

An examination of the application of design metrics to the development of testing strategies in large-scale SDL models

West, James F. January 2000 (has links)
There exist a number of well-known and validated design metrics, and the fault prediction available through these metrics has been well documented for systems developed in languages such as C and Ada. However, the mapping and application of these metrics to SDL systems has not been thoroughly explored. The aim of this project is to test the applicability of these metrics in classifying components for testing purposes in a large-scale SDL system. A new model has been developed for this purpose. This research was conducted using a number of SDL systems, most notably actual production models provided by Motorola Corporation. / Department of Computer Science
518

Using Ada tasks (concurrent processing) to simulate a business system

Zahidin, Ahmad Zamri January 1988 (has links)
Concurrent processing has always been a traditional problem in developing operating systems. Today, concurrent algorithms occur in many application areas, such as science and engineering, artificial intelligence, business database systems, and many more. The presence of concurrent processing facilities allows the natural expression of these algorithms as concurrent programs, a very distinct advantage if the underlying computer offers parallelism. On the other hand, the lack of concurrent processing facilities forces these algorithms to be written as sequential programs, destroying the structure of the algorithms and making them hard to understand and analyze. The first major programming language to offer high-level concurrent processing facilities is Ada, a complex, general-purpose programming language that provides an excellent concurrent programming facility, called the task, based on the rendezvous concept. In this study, concurrent processing is exercised by simulating a business system using the Ada language and its facilities. A warehouse (the business system) with a number of employees purchases microwave ovens from various vendors and distributes them to several retailers. Activities in the system are simulated by assigning each employee to a specific task, with all tasks running simultaneously. The programs written for this business system produce the transactions and financial statements of a typical business day, and they also examine the behavior of activities that occur simultaneously. The end results show that concurrency and Ada work efficiently and effectively. / Department of Computer Science
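The one-task-per-employee structure translates naturally into threads. Here is a minimal Python analogue of that design (Python threads and a queue standing in for Ada tasks and rendezvous; the employee roles and quantities are invented for illustration):

```python
import queue
import threading

def run_warehouse(num_ovens=20, num_retailers=2):
    """Each employee is a thread, mirroring the one-task-per-employee design."""
    shelf = queue.Queue()        # stock shared between purchaser and distributors
    delivered = []
    lock = threading.Lock()

    def purchaser():
        for unit in range(num_ovens):
            shelf.put(unit)              # buy an oven from a vendor
        for _ in range(num_retailers):
            shelf.put(None)              # sentinel: no more stock is coming

    def distributor():
        while True:
            unit = shelf.get()
            if unit is None:             # sentinel received: stop this employee
                break
            with lock:
                delivered.append(unit)   # deliver the oven to a retailer

    threads = [threading.Thread(target=purchaser)] + \
              [threading.Thread(target=distributor) for _ in range(num_retailers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return delivered
```

The blocking `shelf.get()` plays the role of an Ada rendezvous: a distributor waits until the purchaser has stock to hand over, so the employees genuinely run, and synchronize, concurrently.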
519

Jedit : a graph editor in Java

Vemuri, Suresh P. January 1996 (has links)
There are increasing demands for graphics-oriented software packages for graph theory teaching and research. G-Net, a departmental research project headed by Dr. Jay Bagga, is an effort to create a software package that comprises a graph database, a graph editor, and a collection of graph algorithms. With the widespread use of the World Wide Web, the graph editor should also be accessible through the Web. Taking these issues into consideration, Jedit (Java graph editor) was developed. This thesis concentrates on the design and implementation of a graph editor and a graph object (Jgraph). Jgraph is developed in Java so that graph algorithms can be implemented on it in the future. One of the unique features of this graph editor is that it can be used as a stand-alone application or as an applet on the Web. In addition, it also provides a friendly user interface. / Department of Computer Science
520

Implementing parallel sorting algorithms

Elabed, Jamal January 1989 (has links)
Recently, much work has been devoted to developing algorithms for parallel processors. Parallel algorithms have received a great deal of attention because of advances in computer hardware technology. These parallel processors and algorithms have been used to improve computational speed, especially in the areas of sorting, evaluation of polynomials and arithmetic expressions, and matrix and graph problems. Sorting is an important operation in business and computer engineering applications. The literature contains many sorting algorithms, both sequential and parallel, which have been developed and used in practical applications: bubble sort, quick sort, insertion sort, enumeration sort, bucket sort, and odd-even transposition sort. Ada, an excellent new programming language that offers high-level concurrent processing facilities called tasks, is used in this thesis to introduce, implement, compare, and evaluate some of the parallel sorting algorithms. This thesis also shows that parallel sorting algorithms reduce the time required to perform the sorting. / Department of Computer Science
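Of the algorithms listed, odd-even transposition sort is the most naturally parallel: within each phase, every compare-exchange touches a disjoint pair of elements, so all of them can run at once. A sequential Python rendering of the algorithm (the thesis implemented such algorithms with Ada tasks):

```python
def odd_even_transposition_sort(a):
    """Odd-even transposition sort. In the parallel version, each
    compare-exchange within a phase runs on its own processor; n phases
    always suffice to sort n elements."""
    a = list(a)
    n = len(a)
    for phase in range(n):
        start = phase % 2                 # even phase: pairs (0,1),(2,3),...
        for i in range(start, n - 1, 2):  # odd phase: pairs (1,2),(3,4),...
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a
```

With n processors doing one compare-exchange each per phase, the n phases give O(n) parallel time versus the O(n log n) or O(n^2) of the sequential algorithms in the list, which is the time reduction the thesis demonstrates.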
