141

Computer System Self-Defense Through Object Self/Non-Self Recognition

Dollens, James T. 01 January 2002 (has links)
Knowing that an object does not belong to an authorized set of objects is an important step in computer system defense. Dr. Stephanie Forrest of the University of New Mexico compared the process of computer system defense to the process used by living organisms to defend against diseases, viruses, and other foreign agents. Dr. Forrest's thesis was that a methodology for identifying self could be used in intrusion detection to detect non-self agents. An alternative to this external view is a system that contains its own self-defense mechanism. The project proposed that an internal function could be used to differentiate between self and non-self objects by creating unique identifiers for computer systems, much as human DNA differentiates individuals. This research developed the DNA Self-Defense Methodology, in which identification data are inserted into an object to identify it uniquely to the operating system on which it resides. This identification data, denoted the DNA Pattern, serves to create a unique copy of the object and an ownership token between the object and the operating system. The research project then focused on developing an instantiation of the methodology for single-node computer systems. Additionally, a proof-of-concept system was developed to test the functionality of certain features of the methodology. The results of the test demonstrated that, given additional research, practical application of the methodology is feasible.
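As a rough illustration of the self/non-self idea above, the sketch below tags an object with identification data bound to its host system and then verifies the tag. The abstract does not specify the tagging mechanism; the keyed HMAC used here is this sketch's assumption, not the methodology's actual design.

```python
# Minimal, illustrative sketch of self/non-self recognition. The HMAC
# scheme and the per-system secret are assumptions of this sketch, not
# the dissertation's actual DNA Pattern design.
import hmac
import hashlib

SYSTEM_SECRET = b"per-system secret known only to this OS instance"  # hypothetical

def attach_dna_pattern(obj_bytes: bytes) -> bytes:
    """Append a 'DNA pattern' that binds the object to this system."""
    tag = hmac.new(SYSTEM_SECRET, obj_bytes, hashlib.sha256).digest()
    return obj_bytes + tag

def is_self(tagged: bytes) -> bool:
    """Recognize 'self': the embedded tag must match this system's key."""
    obj_bytes, tag = tagged[:-32], tagged[-32:]
    expected = hmac.new(SYSTEM_SECRET, obj_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

program = b"...executable content..."
tagged = attach_dna_pattern(program)
assert is_self(tagged)                                  # recognized as self
tampered = tagged[:-1] + bytes([tagged[-1] ^ 1])        # flip one tag bit
assert not is_self(tampered)                            # rejected as non-self
```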
142

An Investigation of Data Integrity in Ada 95

Dorchak, Susan Fife 01 January 1996 (has links)
This dissertation investigates data integrity in Ada 95. The hypothesis presented is that Ada 95 programs must be designed under the control of the programmer in order for data entities to be protected from internal corruption. The designers of the language made a conscious decision to add object-oriented features by extending the existing definitions of Ada 83. While the new implementation provides the object-oriented features of inheritance and polymorphism, the language's implementation of these features, along with that of hierarchical libraries, introduces ambiguity in the mapping of object-oriented designs to language constructs that can result in data integrity problems. Coding techniques are presented for various program design dilemmas; they emphasize the protection and consistent use of data entities within and between the various components of an Ada 95 program. The results of testing the coding techniques indicate that different encapsulation organizations have different impacts on the various aspects of data integrity. During this testing, a compiler flaw was revealed with respect to the inheritance of private primitive operations. By investigating the language from the perspective of data integrity, it was found that the object-oriented paradigm, as well as the protection of critical data entities as dictated by the problem domain, can be achieved through a combination of Ada 95 features. This, however, depends entirely on increased programmer intervention in and control over the program code.
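The dissertation's coding techniques are expressed in Ada 95 (private types, child packages), which the abstract does not reproduce. As a cross-language analogy only, the Python sketch below shows the underlying principle being tested: a critical data entity guarded behind a narrow interface so clients cannot corrupt it.

```python
# Illustrative analogy only: this is not the dissertation's Ada 95 code.
# It shows encapsulation protecting a data entity from internal corruption
# by exposing a narrow, validated interface instead of raw mutable state.
class Account:
    def __init__(self, balance: int = 0) -> None:
        self._balance = balance          # "private" data entity

    def deposit(self, amount: int) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    @property
    def balance(self) -> int:            # read-only view for clients
        return self._balance

acct = Account()
acct.deposit(100)
print(acct.balance)   # 100
# acct.balance = -1   # AttributeError: no setter, so no corruption path
```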
143

An Investigation of Learners' Attitudes and Preferences that Relate to Participation in Internet-Based Instruction at Coastal Carolina University

Dorman, Joyce 01 January 2005 (has links)
The purpose of this study was to investigate learners' attitudes and preferences and how they relate to participation in Internet-based instruction at Coastal Carolina University. The study focused on Generation Y because they behave decidedly differently from previous generations. Also, in light of the ongoing fiscal challenges in colleges and universities, it is necessary to explore avenues that would attract more students to online learning environments (OLEs), for reasons that include cutting capital costs, attracting more students to distance education formats, and using a profile that catalogs traits favorable to OLEs during advisement. The goal of the study was to gather empirical evidence that would inform efforts to increase student enrollment in online distance education classes and illuminate attitudes and preferences relating to participation in Internet-based instruction. The researcher also examined how selected demographic variables such as age, gender, GPA, student rank, student status, academic major, marital status, and employment status shaped students' attitudes and preferences. To collect data, the researcher developed a survey instrument that adopted a five-point Likert-type scale. An expert panel of four individuals tested the validity of the instrument, and a reliability test and a factor analysis were carried out. A pilot study was conducted, and recommendations for changes to the survey were made prior to the actual study. Data collection took place primarily online via a dedicated WebCT server. Hard copies were available but were not used because all participants had Internet and WebCT access. Descriptive statistics were used to describe the data. Inferential statistics, such as t-tests and analysis of variance (ANOVA), were used to test the null hypotheses. Correlation analyses were run to examine relationships between attitudes and preferences, and linear regression was performed to establish the strength of the relationships between the variables. The outcomes of this study included a profile of traits, with emphasis on Generation Y, indicating compatibility with Internet-based instruction. The results showed significant differences in attitudes and preferences based on selected demographics and revealed existing relationships between attitudes and preferences.
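As a sketch of the kinds of inferential tests the study reports (t-tests and one-way ANOVA) applied to five-point Likert attitude scores, the snippet below uses invented scores and hypothetical demographic groupings; none of the data is drawn from the study.

```python
# Hypothetical Likert-scale attitude scores; invented for illustration.
from scipy import stats

attitudes_group_a = [4, 5, 3, 4, 4, 2, 5, 4]   # e.g., one gender group
attitudes_group_b = [3, 4, 4, 5, 3, 4, 4, 5]   # e.g., the other

t, p = stats.ttest_ind(attitudes_group_a, attitudes_group_b)
print(f"t-test: t = {t:.3f}, p = {p:.3f}")

# One-way ANOVA across student rank (hypothetical groups)
freshmen   = [3, 4, 3, 5, 4]
sophomores = [4, 4, 5, 4, 3]
seniors    = [2, 3, 4, 3, 3]
f, p = stats.f_oneway(freshmen, sophomores, seniors)
print(f"ANOVA: F = {f:.3f}, p = {p:.3f}")
```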
144

The Development of a Three-Year Plan to Integrate Computers and Mathematics in the Undergraduate Liberal Arts Curriculum

Dotson, Marilyn Knight 11 April 1989 (has links)
This was a study for the development of a plan for a computer-based mathematics curriculum at Belmont Abbey College. The development and phase-in of the new curriculum would take approximately three years. Because of the size and limited resources of the college, the proposal advocated the use of existing materials. Full implementation will require additional computer equipment; recognizing this financial constraint, two equipment proposals were developed to accompany the curriculum plan. Utilization of the plan required identification and acquisition of appropriate mathematics software for use in the classroom. Evaluation of software would be an ongoing activity beyond the projected three-year phase-in of the project. As a result of this study, it was determined that Belmont Abbey College would be able to integrate effective computer instruction into the mathematics curriculum. In view of this observation, the following recommendations were offered to Belmont Abbey College's administrators and colleagues: carefully analyze the curriculum to determine where applications of technology make sense; keep the technology simple, since classroom applications must take only a few seconds to implement if they are to be used; ensure the courseware is 'friendly,' with help screens and menus available at the touch of a key; treat training sessions and demonstrations as ongoing activities; and regard evaluation and development of courseware as continuous exercises, recognizing that the evaluation and development of course-specific software is as professionally important as publishing and research.
145

A Study of Delayed Time and Real Time Text-Based Computer-Mediated Communication Systems on Group Decision Making Performance

Dringus, Laurie P. 01 January 1991 (has links)
This study explored the dynamics of delayed time and real time computer-mediated communication (CMC) when small groups use text-based computer communication programs to reach closure on a priority setting problem task. This study was a preliminary investigation into the effects of computer-mediated communication mode on several measures: decision quality, gain-loss scores and utilization of resources, time to solution, text sequencing, and text readability. The problem related to identifying mode of transmission (delayed time or real time) as a variable that can directly impact computer-mediated discourse. Prior studies involving CMC and group decision making were performed using either delayed time or real time CMC mode (Kiesler, Siegel, & McGuire, 1984; Sproull & Kiesler, 1986). This study investigated group communication under both modes. Thirty-two (N = 32) three-person groups completed a priority setting problem online in a Unix environment that supports computer-mediated communication in delayed and real time modes. Ninety-six participants were randomly assigned to one of the two CMC modes (delayed time or real time). All participants completed the task from their home locations, communicating via modem connection to Nova University's VAX 8550 minicomputer running DEC's version of Unix, "Ultrix" Version 2.3. Participants were physically, and potentially geographically and across time zones, dispersed from the others with whom they participated in the experiment. There was no face-to-face interaction among group members. Transcripts of electronic mail (delayed time) activities and recordings of "Phone" (real time) computer conversation program activities were made during the experiment and later analyzed. It was hypothesized that CMC mode would make a difference in group decision making performance and that coordination of communication would be reduced more under delayed time CMC than under real time CMC. Additionally, it was hypothesized that delayed time groups would take longer to reach closure on the task than real time groups, that delayed time groups would prepare and share more text than real time groups, and that delayed time groups would produce higher text readability grade levels than real time CMC groups. Two of four hypotheses were supported. Bonferroni-protected univariate F analyses were performed at the .01 level of significance. Results indicated that while patterns of group process to reach closure were unique to the respective CMC modes, there were no significant differences between groups in regard to decision making quality (score on task). Group decision making is achievable through CMC despite time delays and the absence of face-to-face or voice communication. Delayed time groups took longer to reach closure on the task than real time groups. Real time groups exchanged more messages than delayed time groups. There were no significant differences between CMC groups in regard to the number of sentences and words exchanged. Delayed time groups produced text of higher readability quality. However, it was discovered that short text exchanges by groups in both CMC modes influenced the readability analysis, thus misrepresenting true text readability grade levels under CMC.
It was recommended that future research provide further insight into why users would choose delayed time computer communication over real time computer communication. This will become increasingly important to ascertain as more end users make use of interactive computer-mediated communication, now a standard feature of multi-user and multi-tasking operating system environments.
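The readability caveat above can be illustrated with the Flesch-Kincaid grade-level formula, one common measure (the study does not say which formula it used). The syllable counter below is a crude vowel-group heuristic; note how a one-word chat exchange yields a meaningless grade.

```python
# Flesch-Kincaid grade level on short chat-style exchanges; the syllable
# count is a rough vowel-group heuristic, adequate only for illustration.
import re

def syllables(word: str) -> int:
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

# Very short exchanges, typical of real time chat, yield unstable grades:
print(fk_grade("ok."))   # one word: the formula returns a negative "grade"
print(fk_grade("I agree with the ranking you proposed for item three."))
```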
146

An Expandable Markov Model for the Design of Intelligent Communicative Agents in Managed Health Care

Dune, Douglas T. 01 January 2000 (has links)
In the field of medicine, decisions are often difficult to make in the absence of clear symptoms, decisive test results, and adequate patient involvement. Medicine in most cases is still an art, using science as its foundation. Everyday practice relies on the case-study method, trial and error, and intuitive judgment. In some medical specialties there is heavy reliance on intangibles. For example, human motivation plays a significant role in complex medical decisions affected by many variables that remain unquantifiable and intangible; indeed, most variables that determine outcome are hidden, including human motivation. The complexity of medical information systems demands that new approaches be investigated that emphasize timeliness and efficiency. Medical information systems are no longer centralized; they are distributed over networks and the Internet. Interoperability has become a requirement if these heterogeneous systems are to exchange information and work together cooperatively. For this reason, the design of decision processes within the general domains of medicine requires further analysis, along with the establishment of a methodology for developing a flexible agent architecture for the creation of intelligent agent systems in medicine. Thus, this research provided the underlying theoretical framework for the design of interactive intelligent agents in the medical domain, examining the design of open and flexible architectures. In the last decade, rapid development of agent technologies has occurred; research on multi-agent systems sprang from earlier work in artificial intelligence and decision sciences. Complementing the study of agent technologies is the discipline of mathematical modeling. Adapted Markov models were applied to facilitate the methodology of the study, emphasizing the process by which real decisions are formalized rather than the solution of already formalized problems. Another important element of this dissertation was the use of clinical pathways, fundamental guidelines that are components of managed health care. Clinical pathways were at the core of the architecture and formed the basis of a suitably expandable and adaptive Markov model. The result of the model is a derived intelligent agent architecture for the medical domain. The methodology exploited the generality, flexibility, and normative power of Markov models, particularly fully observable Markov decision processes (FOMDP) and partially observable Markov decision processes (POMDP). Based on both the FOMDP and POMDP, an expandable observable Markov decision process (EOMDP) model was formulated. The formulated model was further revised and reformulated based on the phenomenological observation and measurement of clinical pathways. This approach is mathematically sound, computationally efficient, and intuitively appealing.
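For orientation, the sketch below runs value iteration on a toy fully observable MDP (FOMDP), the simpler of the two model classes the abstract builds on. The states, actions, transition probabilities, and rewards are invented stand-ins for clinical-pathway steps, not taken from the dissertation.

```python
# Toy FOMDP solved by value iteration. All states, actions, probabilities,
# and rewards are invented placeholders for clinical-pathway steps.
states = ["stable", "acute"]
actions = ["monitor", "treat"]
# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward
P = {
    "stable": {"monitor": [("stable", 0.9), ("acute", 0.1)],
               "treat":   [("stable", 0.95), ("acute", 0.05)]},
    "acute":  {"monitor": [("stable", 0.2), ("acute", 0.8)],
               "treat":   [("stable", 0.6), ("acute", 0.4)]},
}
R = {"stable": {"monitor": 1.0, "treat": 0.5},
     "acute":  {"monitor": -2.0, "treat": -1.0}}
gamma = 0.95  # discount factor

V = {s: 0.0 for s in states}
for _ in range(200):  # iterate the Bellman optimality update to convergence
    V = {s: max(R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in actions)
         for s in states}

# Greedy policy with respect to the converged value function
policy = {s: max(actions,
                 key=lambda a: R[s][a] + gamma * sum(p * V[s2]
                                                     for s2, p in P[s][a]))
          for s in states}
print(V, policy)
```

A POMDP extends this by maintaining a belief distribution over states rather than observing the state directly, which is where the abstract's hidden variables, such as patient motivation, enter the picture.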
147

Interactive Features for an HTML Tutorial in a Distance Learning Program

Eaton, Mark R. 01 January 1996 (has links)
Nova Southeastern University (NSU) has been delivering on-line courses to students through computer-mediated communications (CMC) since 1983. Like many institutions of higher education, NSU began investigating the use of the World Wide Web for distance education over the Internet. A research group was formed in 1993 to study Web tools, courseware development tools, and different strategies for delivering on-line instruction, course management, and conferencing. One of the goals of the courseware development team, a supporting activity to the research group, was to develop on-line instruction and resources on the hypertext markup language (HTML) that could be used by students in preparation for research activities on the Web. This study contributed to the research group by examining past research on learner control in computer-aided instruction as applied in traditional settings and by testing learner control designs in HTML tutorials. Based on a review of the literature, it was hypothesized that standard learner control techniques would help improve student-CAI interaction but would not significantly improve student-teacher interaction without additional support; HTML/CGI was thought to make such support possible. An experimental design was established that incorporated three treatments. The first treatment (control) was an on-line tutorial that used branching typical of hypertext documents on the Internet. The second treatment was the same tutorial enhanced with the content and context control commonly used in learner control strategies. The third treatment used CGI support to help improve communications between student and teacher. Subjects completed pretests, posttests, and surveys. Paired two-sample t-tests for means were conducted to investigate whether each on-line tutorial, having included some degree of learner control, contributed significantly to learning. It was found that all three versions of the tutorial produced significant learning. MANOVA was then used to compare the tutorial designs with one another in terms of their contributions to learning and attitudes. No significant differences were found between the tutorial designs: Group 3, whose design features were intended to enhance student-teacher interaction, did not show the greater effects on student achievement and attitudes that had been hypothesized.
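A CGI-backed student-teacher channel of the kind the third treatment used might look like the sketch below. The form field names, e-mail addresses, and local mail relay are all invented; the standard-library cgi module matches the era's approach, though it is deprecated in recent Python releases.

```python
#!/usr/bin/env python3
# Sketch of a CGI script relaying a tutorial question to the instructor.
# Field names, addresses, and the mail relay are hypothetical.
import cgi
import html
import smtplib
from email.message import EmailMessage

form = cgi.FieldStorage()
student = form.getfirst("student", "anonymous")   # hypothetical form fields
question = form.getfirst("question", "")

msg = EmailMessage()
msg["Subject"] = f"HTML tutorial question from {student}"
msg["From"] = "tutorial@example.edu"              # hypothetical addresses
msg["To"] = "teacher@example.edu"
msg.set_content(question)
with smtplib.SMTP("localhost") as relay:          # assumes a local mail relay
    relay.send_message(msg)

# CGI response: header block, blank line, then the document body
print("Content-Type: text/html")
print()
print(f"<html><body>Thanks, {html.escape(student)}; "
      "your question was forwarded to the instructor.</body></html>")
```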
148

Adaptation of Business Processes in SMEs: An Interpretive Study

Ehrlich, Donna M. 01 January 2007 (has links)
Small and medium enterprises (SMEs) contribute significantly to an economy. With the growth of the Internet and related communication technologies, many small and medium enterprises are trying to integrate their strategies and business processes with other firms in the supply chain. The goal of this dissertation is to study the adaptation of a small/medium enterprise to changes in its environment; specifically, the adaptation of internal business processes within an SME embedded in a manufacturing supply chain. A case study method was used within a grounded theory framework so that the continuous interplay between data collection and analysis at each phase would help inductive theory discovery. The findings suggest that while the SME actively adapts its internal processes, it also seeks to minimize dependence on a few large customers by diversifying its customer base. Because this diversification increases costs to the SME, the study explored why the SME strove for a diversified customer base. The SME under study used Long Term Agreements with its key partners. A Long Term Agreement creates a contractual obligation defining the expectations of the SME and the customer; these expectations allow the SME to develop a product and a process specific to the customer's needs while securing a future relationship with the customer. The SME reengineered manufacturing processes and designed information systems to support process adaptation. Each supply chain requires the SME to deliver product through a seamless flow of inventory, delivered as requested under the Long Term Agreement. The multiple demands from each supply chain, along with the required precision in delivery time, have increased the necessity of developing a precise manufacturing process. This in turn requires an information system to collect the data and define the business processes that address the complexity created by the multiple supply chains. Generalizing from the case study, it appears that small firms that are part of a supply chain adapt by using long term contracts with a smaller set of firms at the strategic level, by redesigning internal processes at the business process level, and by creating customized information systems at the IS level.
149

Using a Bayesian Belief Network for Going-Concern Risk Evaluation

Ejaz, Azad 01 January 2005 (has links)
An auditor's verdict on a client's financial health is delivered in the form of a going concern (GC) opinion. Although an auditor is not required to predict the financial future of a client, stakeholders take the GC opinion as a guideline on a company's financial health. The GC opinion has been a subject of much debate in the financial literature, as it is one of the most widely read parts of an audit report. Researchers and academicians believe that auditors have made costly mistakes in rendering GC opinions. Several factors have been identified as the root causes of these mistakes, including growing business complexities, insufficient auditor training, internal and external pressures, personal biases, economic considerations, and fear of litigation. To overcome these difficulties, researchers have been trying to devise effective audit tools to help auditors form accurate GC opinions on clients' financial futures. The introduction of ratio-based bankruptcy models using a variety of statistical techniques is a step in the right direction, and the results of such efforts, though not perfect, are encouraging. This study examined several popular ratio-based statistical models and their weaknesses and limitations. The author suggests a new model based on the robust Bayesian Belief Network (BBN) technique. Grounded in sound Bayesian theory, this model provides remedies for the reported deficiencies of the ratio-based techniques. The proposed system, instead of comparing a company's financial ratios with industry-wide ratios, measures the internal financial changes within a company during a particular year and uses the changing financial pattern to predict the financial viability of the company. Unlike other popular models, the proposed model takes various qualitative factors into consideration before delivering the GC verdict. The proposed system is verified and validated by comparing its results with those of the industry de facto Z-score model.
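The Bayesian reasoning behind such a model can be sketched as below. The abstract does not give the network's structure or probabilities, so the evidence nodes and numbers are invented, and a simple naive-Bayes update stands in for full belief propagation over a BBN.

```python
# Invented evidence nodes and probabilities; a naive-Bayes update stands
# in for the dissertation's full Bayesian Belief Network.
prior_distress = 0.10   # P(going-concern problem), assumed base rate

# (P(evidence | distress), P(evidence | healthy)) for observed indicators
evidence = {
    "liquidity_ratio_declined": (0.80, 0.30),
    "recurring_losses":         (0.70, 0.10),
    "management_plan_credible": (0.40, 0.85),  # a qualitative factor
}

num = prior_distress
den = 1.0 - prior_distress
for p_given_distress, p_given_healthy in evidence.values():
    num *= p_given_distress
    den *= p_given_healthy
posterior = num / (num + den)
print(f"P(going-concern problem | evidence) = {posterior:.2f}")
```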
150

An Evaluation of Program for Cooperative Cataloging (PCC) Records Used in Non-PCC Libraries

Ellett, Robert O., Jr. 01 January 2005 (has links)
The Program for Cooperative Cataloging (PCC), created in 1992 under the auspices of the Library of Congress (LC), provides bibliographic and authority records intended to meet the cataloging needs of all libraries. Participation in BIBCO, the Bibliographic Cooperative component of the PCC, remains limited to 46 institutions. The PCC introduced a bibliographic record standard, the core level record, which emphasizes a dependable description with full authority control while providing timely access. Time savings and efficiency have been observed for PCC libraries creating core level records; these libraries are thus able to devote more resources to cataloging difficult foreign language or esoteric material often needed by library users but previously unavailable and unknown because it sat in the cataloging arrearage, or backlog. However, no studies had examined whether non-PCC libraries accept PCC records as readily as they accept LC cataloging records in the OCLC Online Computer Library Center (OCLC) database. This study analyzed the acceptability of PCC records by examining how 72 academic, public, and special libraries edited them during a two-month study period. Findings pointed to the participants' inability to identify PCC records correctly. There was also some indication that editing of notes and non-access-point fields in bibliographic records continued to be a priority for some institutions. The most frequent significant change to the PCC records was the addition of Dewey decimal classification (DDC) numbers by public library participants and of LC classification numbers by academic and special library participants; this modification was the main difference observed between LC and PCC records. Overall, 65.3% of PCC records were used with no editing changes, and 97.4% of the MARC fields examined were not edited. Results revealed a correlation between the absence of a needed classification number and the personnel level handling copy cataloging. An analysis of editing changes in full versus core PCC records was presented. Recommendations were provided for library administrators, cataloging managers, OCLC, and the PCC Policy Operations Committee concerning authority verification, classification number verification, PCC record identification and the cataloging record source field, and the monitoring of copy cataloging work to promote efficiency.
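The field-level comparison behind such editing statistics might be sketched as follows. Records are simplified here to tag-to-value maps (real MARC fields carry indicators and subfields), and the sample data are invented; tag 082 holds a Dewey number, mirroring the most frequent change the study observed.

```python
# Simplified field-level diff between a PCC record and a local edited copy.
# Real MARC records have indicators and subfields; sample data is invented.
def diff_fields(original: dict, edited: dict) -> dict:
    added   = sorted(set(edited) - set(original))
    removed = sorted(set(original) - set(edited))
    changed = sorted(t for t in set(original) & set(edited)
                     if original[t] != edited[t])
    return {"added": added, "removed": removed, "changed": changed}

pcc_record = {
    "100": "Smith, Jane.",
    "245": "A study of cooperative cataloging /",
    "650": "Cataloging--Cooperation.",
}
local_copy = dict(pcc_record)
local_copy["082"] = "025.3"     # a public library adds a Dewey number
local_copy["650"] = "Cataloging--International cooperation."

print(diff_fields(pcc_record, local_copy))
# {'added': ['082'], 'removed': [], 'changed': ['650']}
```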
