231

Student Retention and Completion Rates in a Postsecondary Online Distance Learning Environment

Ingle, Faye K. 01 January 2004 (has links)
Responding to demands from legislatures, financial aid providers, accreditation organizations, and other entities, colleges and universities are strongly committed to retaining students and ensuring their consistent progress toward degree attainment. Student retention is a strong indicator of institutional performance. It mirrors the extent to which students are successfully integrated into the institutional culture, reflects students' level of satisfaction with their continuing educational experience, and signals the likelihood of student graduation. While the theoretical literature that attempts to describe the variables that drive improvements in online student retention is abundant, published empirical research designed to verify those theoretical constructs is scarce. Two distinct but complementary processes were implemented to gain a more complete understanding of online student retention. First, overall rates for online student retention, course completion, and student success were estimated using student counts obtained from a random sample of postsecondary online distance learning programs. These rate estimates, presented as weighted averages, provide a point of reference for comparing and improving student retention, course completion, and student success in other programs. Second, this dissertation presents a meta-analysis of published and unpublished empirical research, performed between 1994 and 2004, that quantified the relationship of a number of independent variables to online student retention. In addition, online survey responses of research, instructional, and administrative online distance learning practitioners are juxtaposed with these results for emphasis or contrast. The results of this dissertation suggest that online distance learning programs should strongly integrate specific attributes and activities into their courses to improve student satisfaction, learning, and retention, which will require a strong faculty commitment to critically assess teaching practices and implement instructional improvements. The benefits should include an enhanced student commitment to persist.
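The abstract does not spell out the weighting scheme; as an illustration of the kind of weighted-average rate reported, the following Python sketch pools hypothetical program counts so that each program's retention rate is weighted by its enrollment.

```python
# Hypothetical illustration: enrollment-weighted average retention rate
# across a sample of online programs. Program names and counts are invented.
programs = [
    {"name": "Program A", "enrolled": 120, "retained": 96},
    {"name": "Program B", "enrolled": 450, "retained": 310},
    {"name": "Program C", "enrolled": 80,  "retained": 70},
]

total_enrolled = sum(p["enrolled"] for p in programs)
total_retained = sum(p["retained"] for p in programs)

# Weighting each program's rate by its enrollment is equivalent to
# pooling the counts before dividing.
weighted_retention = total_retained / total_enrolled
print(f"Weighted average retention: {weighted_retention:.1%}")
```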
232

Intelligent Collision Warning System Based on Fuzzy Logic and Neural Network Technologies

Jacob, Paravila O. 01 January 1997 (has links)
The recent technological changes in computer and industrial control systems have steadily extended the capability to handle a broad range of complex systems. The emergence and development of computer technology and intelligent systems during the past few decades have created a highly promising direction in the field of artificial intelligence. It is increasingly difficult to describe any real system as the level of complexity continues to increase, and a combination of systems and techniques is necessary to solve many complex problems. This new direction involves the use of fuzzy logic and artificial neural network theory to enhance the ability of intelligent systems to learn from experience and to adapt to changes in an environment of uncertainty and imprecision. The Intelligent Automotive Collision Warning System was developed as a rule-based system by integrating a fuzzy logic controller with artificial neural network software. The system constantly monitors the speed of the vehicle and the distance to any object in front of the vehicle using an ultrasonic ranging module, and it applies fuzzy logic theory and artificial neural network software to warn the operator to maintain a safe operating distance. Descriptive statistics were used for collecting and organizing the data, and inferential statistics were used to test the hypotheses based on the collected data. NeuFuz4 software was used to train the neural network and to optimize the fuzzy rule base. The fuzzy logic technology provided a means of converting a linguistic control strategy into a strategy for operating the warning system. The input/output relationship was defined by fuzzy membership functions, which enabled the numerical inputs to be expressed as fuzzy variables using linguistic terms. A new fuzzy logic operator was also developed to optimize the fuzzy input/output relationship.
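The dissertation's NeuFuz4-trained rule base is not reproduced here; the following Python sketch only illustrates the general idea of fuzzifying crisp distance and speed readings with triangular membership functions and combining two invented rules into a warning level.

```python
# Illustrative only: a tiny fuzzy-rule evaluation for a collision warning,
# built from triangular membership functions. Ranges, rules, and the
# defuzzification step are invented; this is not the NeuFuz4 rule base.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def warning_level(distance_m, speed_kmh):
    # Fuzzify the crisp sensor readings into linguistic terms.
    close = tri(distance_m, 0, 5, 15)
    far = tri(distance_m, 10, 25, 40)
    slow = tri(speed_kmh, 0, 20, 60)
    fast = tri(speed_kmh, 40, 90, 140)

    # Two example rules, combined with min for AND:
    #   IF distance is close AND speed is fast THEN warning is HIGH
    #   IF distance is far   AND speed is slow THEN warning is LOW
    high = min(close, fast)
    low = min(far, slow)

    # Crude weighted-average defuzzification of the two rule outputs.
    if high + low == 0:
        return 0.0
    return (high * 1.0 + low * 0.1) / (high + low)

print(warning_level(distance_m=8, speed_kmh=90))  # close + fast -> high warning
```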
233

Color Calibration of Computer Display Devices

Jacobs, Douglas L. 01 January 1999 (has links)
Future growth of Internet commerce offering color-dependent products will require some means of calibrating CRT viewing devices. CRT monitors vary greatly in color quality because of manual settings, internal adjustments, and viewing conditions. This research developed a passive, physical, comparative calibration instrument for determining surrounding viewing conditions and adjusting displayed images with the correct appearance compensation, so that product images are displayed with accurate colors. The development and experimentation were conducted in three stages. The first stage developed a correlation between the 8-bit colors displayed on the CRT and correlated color temperature. This information was used with data derived with the passive instrument in the second stage to build ICC color profiles containing the correct viewing compensation and appearance modeling. The third stage tested how well the displayed images matched pictorial product images and other images.
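The exact correlation procedure used in the first stage is not described in the abstract; one standard way to relate an 8-bit RGB value to a correlated color temperature is to linearize sRGB, convert to CIE XYZ chromaticity, and apply McCamy's approximation, as in the Python sketch below (a generic method, not necessarily the one used in the dissertation).

```python
# Sketch: estimate correlated color temperature (CCT) from an 8-bit sRGB value
# via the standard sRGB-to-XYZ (D65) matrix and McCamy's approximation.

def srgb_to_linear(c8):
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def cct_from_rgb(r8, g8, b8):
    r, g, b = (srgb_to_linear(v) for v in (r8, g8, b8))
    # sRGB (D65) to CIE XYZ
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    x = X / (X + Y + Z)
    y = Y / (X + Y + Z)
    # McCamy (1992) approximation of CCT from chromaticity (x, y)
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(f"{cct_from_rgb(255, 255, 255):.0f} K")  # near 6500 K for sRGB white
```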
234

An XML Based Authorization Framework for Web-based Applications

Jacobs, David 01 January 2001 (has links)
The World Wide Web is increasingly being used to deliver services. The file-based authorization schemes originally designed into web servers are woefully inadequate for enforcing the security policies needed by these services. This has led to a chaotic situation in which each application is forced to develop its own security framework for enforcing the policies it requires. In turn, this has led to more numerous security vulnerabilities and greater maintenance headaches. This dissertation lays out an authorization framework that enforces a wide range of security policies crucial to many web-based business applications. The solution is described in three steps. First, it specifies the stakeholders in an authorization system, the roles they play, and the crucial authorization policies that web applications commonly require. Second, it maps out the design of the XML-based authorization language (AZML), showing how it provides for maintenance to be divided among prescribed roles and for the expression of the required policies. Lastly, it demonstrates through a scenario the use of the XML authorization language for enforcing policies in a web-based application. It also explores how maintenance should be handled, what would be required to scale the authorization service, and how to more tightly couple the authorization service to the web server.
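The AZML syntax itself is not reproduced in this abstract; the Python sketch below uses invented element and attribute names simply to illustrate evaluating an XML authorization policy against a role and an action, with a default-deny fallback.

```python
# Hypothetical illustration of evaluating an XML authorization policy.
# The element and attribute names below are invented; they are not the
# actual AZML syntax defined in the dissertation.
import xml.etree.ElementTree as ET

POLICY_XML = """
<policy resource="/orders">
  <rule action="read"  allow="clerk,manager"/>
  <rule action="write" allow="manager"/>
</policy>
"""

def is_authorized(policy_xml, role, action):
    policy = ET.fromstring(policy_xml)
    for rule in policy.findall("rule"):
        if rule.get("action") == action:
            return role in rule.get("allow", "").split(",")
    return False  # default deny when no rule matches

print(is_authorized(POLICY_XML, role="clerk", action="write"))    # False
print(is_authorized(POLICY_XML, role="manager", action="write"))  # True
```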
235

Effects of Computer-Based Instruction on Student Learning of Psychophysiological Detection of Deception Test Question Formulation

Janniro, Michael J. 01 January 1993 (has links)
This study was undertaken in response to the Department of Defense Polygraph Institute's need to identify efficient and economical alternative methods of delivering instruction to resident students and field examiners. The purpose of this study was to investigate the effects of computer-based instruction (CBI) on student learning of psychophysiological detection of deception test question formulation. A posttest-only control group design and a one-tailed t test for independent samples were used. The comparison involved an experimental group receiving test question formulation instruction using CBI and a control group receiving the traditional classroom lecture. Other researchers have found that CBI, compared to traditional instruction, raises performance scores and reduces learning and instructional time. CBI usually produces positive effects on students and holds high motivational value. Other findings also show that as technology and the design of human-computer interfaces progress, the effectiveness of computer-based learning improves. Participating in this study were students (n=29) attending the fall semester basic forensic psychophysiology course. Students were randomly assigned to an experimental or control group. Students in the experimental group (n=14) learned test question formulation using CBI, and students in the control group (n=15) learned test question formulation from classroom instruction. Both groups were administered a posttest to determine whether there was a significant difference in learning. After receiving the treatment, students in the CBI group achieved significantly higher posttest scores (p < 0.01) than students in the traditional classroom group. Also, students using CBI to learn test question formulation mastered the material in less than half the time of their colleagues in the classroom group. It was concluded that the CBI method of instruction was more effective in promoting learning of test question formulation than classroom instruction. The evidence strongly suggests that the CBI program on test question formulation can supplement or replace classroom instruction and save instructional time. It was recommended that the CBI program be implemented in the curriculum and distributed to students selected to attend the basic resident course in forensic psychophysiology and to field examiners for continuing education. It was also recommended that additional CBI programs be developed to provide more classroom hours for in-depth learning of advanced theories in forensic psychophysiology.
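As an illustration of the analysis design, the following Python sketch runs a one-tailed independent-samples t test on invented posttest scores for groups of 14 and 15; these are not the study's data.

```python
# Sketch of the analysis design described above: a one-tailed independent-
# samples t test comparing posttest scores. The score lists are invented.
from scipy import stats

cbi_scores = [88, 92, 85, 90, 95, 87, 91, 89, 93, 86, 90, 94, 88, 92]      # n = 14
classroom  = [80, 78, 84, 82, 79, 85, 81, 77, 83, 80, 82, 79, 84, 81, 78]  # n = 15

# alternative="greater" tests whether the CBI group mean exceeds the
# classroom group mean (requires SciPy >= 1.6).
t_stat, p_value = stats.ttest_ind(cbi_scores, classroom, alternative="greater")
print(f"t = {t_stat:.2f}, one-tailed p = {p_value:.4f}")
```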
236

A Study of the Decision Styles of Individuals Choosing Alternatives When There Is Uncertainty

Johnson, Thomas L. 21 May 1988 (has links)
There were two primary purposes for conducting this study. The first was to assess subject skill in decision making. The second, and equally important, purpose was to evaluate and predict subject decision-making performance based on cognitive and contingency style. Subjects were first categorized according to decision-making style using the Rowe (1985) "Decision Style Inventory III." The subjects were categorized, depending on their scores, into groups that possessed one of four styles. The style classifications used were directive, analytical, conceptual, and behavioral. It is possible for an individual subject to possess more than one style. To assess their decision-making skills, the subjects in each of the four style groups were then examined to determine the risk strategy used in a hypothetical problem. In the research design employed, each subject was given a problem along with choices for solving that problem. The subjects made a decision based on the many factors involved in their individual decision-making processes. The resulting decisions were examined on the basis of decision style and type of risk exposure. The subjects were selected from students majoring in Management Information Systems (MIS) or Business Administration at Mobile College in Mobile, Alabama. All subjects participating in the study had completed college courses in beginning statistics, mathematics through beginning algebra, and at least twelve hours in management information systems as well as one course in management theory. The study focused on the decision styles of the student decision makers and their tendencies to undertake certain types of risk based on those styles.
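The scoring rules of Rowe's Decision Style Inventory III are not reproduced here; the Python sketch below merely illustrates assigning a subject every style whose subscale score clears an invented threshold, allowing more than one style per subject as the study notes.

```python
# Illustrative sketch of categorizing a subject by decision style from four
# subscale scores. The threshold and scores are invented; the actual scoring
# rules of Rowe's Decision Style Inventory III are not reproduced here.
def dominant_styles(scores, threshold=75):
    """Return every style whose score meets the threshold (a subject may
    hold more than one style, as noted in the study)."""
    return [style for style, value in scores.items() if value >= threshold]

subject = {"directive": 68, "analytical": 82, "conceptual": 79, "behavioral": 60}
print(dominant_styles(subject))  # ['analytical', 'conceptual']
```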
237

The Dynamics of Complex Surfaces in n-Dimensions Using Computer Graphics

Johnson, Walter Sir Anthony, Jr. 01 January 1998 (has links)
Visualizing the dynamics of n-dimensional graphics is made possible by high-speed, high-quality computer graphics and special techniques. One can visualize the dynamics of a complex surface in n-dimensions through differential manifold segmentation theory and special techniques that utilize surface subdivision algorithms. Techniques such as collapsibility, decomposability, separation, and object projection allow a complex surface of multivariate composition to be defined in n-dimensions using computer graphics. These techniques treat n-dimensional manifolds as locally Euclidean, in that each of their points has some sufficiently small neighborhood that looks like n-dimensional Euclidean space. They recursively subdivide the complex surface into smaller parts until the projection of a part covers at most one pixel on the screen. The intensity of this pixel is set to the average intensity of the corresponding subarea in the parameter range, and the part of the surface corresponding to this subrange is then considered displayed. The process stops when the whole surface has been displayed. Surfaces in four-space exhibit properties that are prohibited in three-space; for example, non-orientable surfaces may be free of self-intersection in four-space. The goal of this dissertation is to marry classical Gaussian models, Euclidean n-space, and Markovian decision processes to interactive computer graphics, and to provide a formal geometric foundation for the dynamics of complex surfaces in n-dimensions. These structures are used to describe the dimensional shape properties of objects. Various methods, such as collapsing and decomposing, are used to make sense of the shapes of objects in a higher-dimensional space than the familiar three-dimensional world. This dissertation describes solutions to several problems associated with manipulating n-dimensional surfaces and presents visualization techniques for multivariate systems.
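The subdivision idea can be sketched schematically; in the Python fragment below, project() and intensity() are placeholder functions, not the dissertation's algorithms, and the recursion simply splits the parameter rectangle until a patch covers at most one pixel.

```python
# Schematic sketch of the subdivision rendering idea described above:
# recursively split the parameter domain until a patch projects to (at most)
# one pixel, then write the patch's average intensity to that pixel.

def render(patch, project, intensity, image):
    """patch = ((u0, u1), (v0, v1)) parameter rectangle.
    project(patch) -> set of (px, py) pixels the patch covers.
    intensity(patch) -> average intensity over the patch."""
    (u0, u1), (v0, v1) = patch
    pixels = project(patch)
    if len(pixels) <= 1:                 # patch covers at most one pixel
        if pixels:
            px, py = next(iter(pixels))
            image[py][px] = intensity(patch)
        return
    um, vm = (u0 + u1) / 2, (v0 + v1) / 2
    for sub in (((u0, um), (v0, vm)), ((um, u1), (v0, vm)),
                ((u0, um), (vm, v1)), ((um, u1), (vm, v1))):
        render(sub, project, intensity, image)   # recurse on the four quarters
```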
238

Planning Genetic Algorithm: Pursuing Meta-knowledge

Johnson, Maury E. 01 January 1999 (has links)
This study focuses on improving business planning by proposing a series of artificial intelligence techniques to facilitate the integration of the decision support system and expert system paradigms. The continued evolution of the national information infrastructure, open systems interconnectivity, and electronic data interchange makes the future inclusion of a back-end genetic algorithm approach increasingly plausible. By using a back-end genetic algorithm, meta-planning knowledge could be collected, extended to external data sources, and utilized to improve business decision making.
239

Faculty Attitudes Toward Educational Technology: An Extension of Bullard's Analysis of Selected Variables

Johnson, Sallie J. 01 January 2001 (has links)
This formative evaluation study continued the research conducted by Bullard (1998) on the attitudes of professors in teacher-preparation programs toward teaching with technology, the flexibility of using technology for instruction, and the status provided by using educational technology. This study sought information to indicate whether the professors' attitudes were related to selected variables. The variables analyzed were the professors' actual use of educational technology, gender, place of employment, rank, length of overall teaching experience, efficacy, institutional encouragement to use computers in instruction, and accreditation affiliation. One hundred thirty-one professors from six institutions of higher education with teacher-preparation programs were surveyed using the Faculty Instructional Computing Questionnaire (Faseyitan & Hirschbuhl, 1992). Using a stratified random sampling method, the institutions were selected from three southeastern states of the United States, with and without National Council for Accreditation of Teacher Education (NCATE) accreditation. Multiple regression analysis and analysis of variance were used to determine the composite and independent effects of the selected independent variables on the dependent variables: attitude toward educational technology and use of educational technology. Findings indicated attitudinal differences toward the flexibility of using educational technologies between NCATE and non-NCATE accredited institutions. Results, paralleling Bullard's study, indicated professors' attitudes toward educational technology were significantly affected by the composite set of variables rank, efficacy, and length of teaching experience. Efficacy was found to be the primary contributor to attitudes toward teaching with educational technology and toward the flexible use of educational technology in the classroom. No significant relationship was found between the status provided by using computers and the professors' rank, efficacy, or length of teaching experience. No attitudinal differences between genders or among the institutions surveyed were found. Although not directly tested, the professors' use of the Internet and electronic mail far exceeded their use of other computer-related instructional materials. Both studies indicated that professors use computers more in preparation for teaching than for actual classroom use. Additionally, professors indicated they would like to use computers more for instruction and believed computers used for instruction could improve student learning.
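As a rough illustration of the composite-effects analysis described, the Python sketch below fits an ordinary least squares model with several predictors of attitude; the column names and data file are hypothetical, not the study's instrument or data.

```python
# Sketch of the kind of composite-effect analysis described above: ordinary
# least squares with several predictors of attitude. The column names and the
# CSV file are hypothetical placeholders, not the study's actual data set.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("faculty_survey.csv")  # hypothetical survey data file
model = smf.ols("attitude ~ rank + efficacy + years_teaching", data=df).fit()
print(model.summary())  # composite fit plus each predictor's contribution
```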
240

Applying Genetic Algorithms in the Identification of Novel Behavior Patterns in Network Data Streams

Johnson, Todd A. 01 January 2005 (has links)
The National Strategy to Secure Cyberspace encourages individuals and organizations to identify vulnerabilities before a security breach occurs (PCIPB, 2003). Cabrera et al. identify the detection of novel attacks as one of the most elusive and significant problems in intrusion detection (Cabrera, 2000). This sentiment is reiterated by other computer security researchers (Endler, 1998; Erbacher, 2002; Ghosh, 1998; Lunt, 2000). A method to detect novel attacks has not been achieved because it implies the hopeless prerequisite of predicting the future. However, there has not been any research that attempts to automate the production of novel attacks. By automating the construction of novel attacks, the intrusion detection system (IDS) may be preemptively enhanced to recognize new attacks; the only difficulty is generating the novel attacks themselves. Motivated by this elusive and significant IDS vulnerability to unfamiliar attacks, the goal of this research was to create an evolutionary algorithm (EA) capable of creating original attacks. The EA, known as the Automated Vulnerability Detector (AVD), was designed to be capable of generating both known attacks and previously unknown attacks. It was believed that if these attacks could be discovered before they are used against the IDS, then the IDS could be upgraded proactively rather than retroactively. The results demonstrate that the AVD can evolve new denial-of-service attacks.
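The AVD implementation is not described in this abstract; the following Python sketch shows only a generic evolutionary loop with an invented fixed-length representation and a placeholder fitness function, to illustrate the evolve-and-evaluate cycle.

```python
# Generic evolutionary-algorithm skeleton illustrating the evolve-and-evaluate
# loop described above. The attack representation (a fixed-length list of
# integer "event codes") and the fitness function are invented placeholders;
# this is not the AVD implementation.
import random

def fitness(candidate):
    # Placeholder: in the AVD setting this would score how well a candidate
    # event sequence degrades the monitored service or evades the IDS.
    return -sum(candidate)

def evolve(pop_size=30, length=8, generations=50, mutation_rate=0.1):
    population = [[random.randint(0, 255) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)   # best candidates first
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [random.randint(0, 255) if random.random() < mutation_rate
                     else gene for gene in child]      # per-gene mutation
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

print(evolve())
```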
