  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
121

Data management for hospital administration

Soubliere, Jean Pierre January 1971
In hospitals, as in business, the literature offers ample evidence of successful implementations of specialized computer systems. Unfortunately, all attempts at designing large-scale, fully integrated hospital information systems have so far been unsuccessful. The missing link between the dedicated systems and the "total" systems appears to be the failure to apply the systems approach. To demonstrate the importance and practicality of this approach, it is used to outline and evaluate the criteria applicable in choosing a data management system for hospital administration.
/ Business, Sauder School of / Graduate
122

Accounting : from an information systems perspective

Matveief, Vladimir Anatole January 1970
The author extended the synthesis of the so-called accounting spreadsheet into a more compact and mathematically rigorous formulation. This formulation was applied to an example in the form of a computerized accounting information system. The systematic approach used bridges the communication gap between the accounting profession and the quantitatively oriented computer specialists who design computer-based accounting systems. The use of tensor analysis and coordinate transformations in accounting theory was also explored; the author believes this to be an important area for further research.
/ Business, Sauder School of / Graduate
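The spreadsheet-as-matrix formulation this abstract alludes to can be sketched roughly as follows. This is an illustrative reconstruction, not the thesis's own notation: the account names, amounts, and `post` helper are assumptions made for the example. The idea is that double-entry bookkeeping becomes a square matrix over the accounts, with balances falling out of row and column sums.

```python
# Sketch (illustrative, not from the thesis): the accounting spreadsheet
# as a matrix. T[i][j] holds the total amount debited to account i and
# credited to account j.
accounts = ["Cash", "Inventory", "Capital", "Sales"]
n = len(accounts)
T = [[0.0] * n for _ in range(n)]

def post(debit, credit, amount):
    """Post one double-entry transaction into the matrix."""
    T[accounts.index(debit)][accounts.index(credit)] += amount

post("Cash", "Capital", 1000.0)   # owner invests cash
post("Inventory", "Cash", 400.0)  # buy inventory for cash
post("Cash", "Sales", 250.0)      # cash sale

# Balance of account i = total debits (row sum) - total credits (column sum).
balances = [sum(T[i]) - sum(row[i] for row in T) for i in range(n)]
for name, b in zip(accounts, balances):
    print(f"{name}: {b:+.2f}")

# Double entry guarantees that all balances net to zero.
assert abs(sum(balances)) < 1e-9
```

Because every posting adds the same amount to one row and one column, the trial-balance identity holds by construction, which is the kind of rigor a matrix (and, more generally, tensor) formulation buys.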
123

Reliable client-server communication in distributed programs

Ravindran, K. January 1987
Remote procedure call (RPC) and shared variables are communication abstractions that allow the processes of a distributed program, often modelled as clients and servers, to communicate with one another across machine boundaries. A key requirement of these abstractions is to mask the machine and communication failures that may occur during client-server communication. In practice, many distributed applications can inherently tolerate failures in certain situations. If such application-layer information is available to the client-server communication layer (RPC and shared variables), the failure-masking algorithms in that layer can relax the constraints under which they would otherwise have to operate. This relaxation significantly simplifies the algorithms and the underlying message transport layer, and allows the formulation of efficient algorithms. This application-driven approach forms the backbone of the failure-masking techniques described in the thesis, as outlined below.

Orphan handling in RPCs: Using the application-driven approach, the thesis introduces a new technique of adopting the orphans caused by failures during RPCs. Adoption is preferable to orphan killing, because killing wastes any work already completed and requires rollback, which may be expensive and sometimes not meaningful. The thesis incorporates orphan adoption into two schemes of replicating a server: (i) a primary-secondary scheme, in which one replica of the server acts as the primary and executes RPCs from clients while the other replicas stand by as secondaries; when the primary fails, one of the secondaries becomes the primary, restarts the server execution from the most recent checkpoint, and adopts the orphan; (ii) a replicated-execution scheme, in which an RPC on the server is executed by more than one replica of the server; when any replica fails, the orphan generated by the failure is adopted by the surviving replicas. Both schemes employ call re-execution by servers based on the application-level idempotency properties of the calls.

Access to shared variables: Contemporary distributed programs deal with a new class of shared variables, such as information on name bindings, distributed load, and leadership within a service group. Since the consistency constraints on such system variables need not be as strong as those for user data, the access operations on these variables may be made simpler using this application-layer information. Along this direction, the thesis introduces an abstraction, called the application-driven shared variable, to govern access operations on such variables. The algorithms for the access operations on a variable use intra-server-group communication and enforce consistency of the variable only to the extent required by the application. The thesis describes complete communication models incorporating the application-driven approach to mask failures.
/ Science, Faculty of / Computer Science, Department of / Graduate
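The primary-secondary scheme with orphan adoption can be illustrated with a minimal sketch. Everything here (the `Replica` class, the account state, the simulated crash) is an illustrative assumption, not the thesis's implementation; the point is only the shape of the recovery path: failover, restart from checkpoint, then re-execution of the in-flight call rather than rollback.

```python
# Sketch (illustrative, not the thesis's code) of primary-secondary
# replication: a secondary adopts the orphan left by a failed primary
# by re-executing the idempotent call from the last checkpoint.
class Replica:
    def __init__(self, name, checkpoint):
        self.name = name
        self.state = dict(checkpoint)  # state restored from the checkpoint

    def execute(self, call):
        """Apply an idempotent call: repeating it is harmless."""
        key, value = call
        self.state[key] = value        # e.g. "set balance to v"
        return self.state[key]

checkpoint = {"balance": 100}
primary = Replica("primary", checkpoint)
secondary = Replica("secondary", checkpoint)

orphaned_call = ("balance", 250)  # RPC in flight when the primary fails

try:
    primary.execute(orphaned_call)
    # Simulated failure: the primary crashes before replying to the client,
    # so the partially completed computation becomes an orphan.
    raise RuntimeError("primary crashed before replying")
except RuntimeError:
    # Failover: the secondary becomes the primary, restarts from the
    # checkpoint, and *adopts* the orphan by re-executing the call.
    result = secondary.execute(orphaned_call)

print(result)
```

Idempotency is what makes adoption safe: re-executing "set balance to 250" on the new primary yields the same state whether or not the old primary got partway through it, so no rollback of the client is needed.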
124

An analysis of skill requirements in data processing environments

Mantha, Robert William January 1978
The purpose of this study was to examine the skills deemed useful to data processing managers and to systems analysts in data processing environments at varying levels of maturity. The subjects of the study were 35 data processing managers and 50 systems analysts from a sample of 35 companies of varying size and of varying experience with electronic data processing (EDP). Data were gathered by mail questionnaire. Two questionnaires were developed: one to measure an EDP organization's relative maturity in data processing, and one to measure EDP practitioners' perceived usefulness of 99 data processing skills in terms of their own job position. The results indicate that data processing managers and systems analysts of both more and less mature organizations perceived generalist skills as more useful than specialist skills. In particular, people, organization, and society skills were perceived as most useful to data processing managers, whereas people, organization, and systems skills were perceived as most useful to systems analysts. Model and computer skills were perceived as least useful to both groups of practitioners. Data processing managers of more mature organizations perceived people and society skills to be more useful than did their counterparts in less mature organizations. Finally, generalist skills were perceived to be more useful to data processing managers than to systems analysts, whereas specialist skills were perceived to be more useful to systems analysts than to data processing managers. The implication for university curricula in information systems is that universities should prepare their information systems graduates to solve people and organization problems rather than purely technical ones. The study nonetheless notes that a good technical background is necessary to function effectively as an EDP practitioner in the data processing community.
/ Business, Sauder School of / Graduate
125

The organisational effects of installing a distributed processing system

Lay, Peter Mark Quine January 1980
Bibliography: 238-248.
Since its introduction to business in 1952, computerised data processing has undergone substantial changes in both hardware and techniques. Miniaturisation, and the resulting fall in the cost of circuitry, has led to the widespread use of mini- and micro-computers, together with a large increase in the use of communication facilities. Initially, almost all organisations centralised their computer facilities at Head Office, and systems were run in batch mode. The need to service remote users was met by installing on-line facilities with unintelligent terminals, or alternatively by installing stand-alone computers at the remote locations. The business requirement for centralised reporting and control, however, led to processing units being installed at user sites and connected, via communications links, to a computer facility at Head Office. In this way distributed data processing evolved. This processing mode has important implications for the organisation in such areas as costs, staffing, planning, control, and systems design. This thesis therefore investigates the current (1980) trends in distributed processing. It specifically examines developments in hardware, software, and data communications, and assesses the criteria an organisation should consider in choosing between centralising and distributing its processing facilities. Through a field study, both successful and unsuccessful distributed installations are examined. Conclusions are then drawn and recommendations made to provide management with working guidelines for assessing the feasibility and practicality of distributed processing for its organisation.
The findings of the study are appropriate both for general management and DP management with only centralised computing experience, and for individuals offering professional computer consultancy services to existing or potential users.
126

An assessment of the impact of computers on the practices of chartered accountants with some reference to South Africa including an evaluation of current computer education for chartered accountants

Sulcas, Paul January 1974
The study is concerned with assessing the impact of computers on the practices of Chartered Accountants and relating the findings to the accounting profession in South Africa. An evaluation will also be made of current computer education for pre-qualifying and qualified Chartered Accountants (S.A.). No special attempt has been made to define a computer because it is considered that the principles dealt with in this study are applicable to a wide range of electronic data processing equipment possessing common characteristics, i.e. input and output devices, logical, arithmetic and storage units, and a control unit.
127

Droit d'auteur et co(régulation) : la politique du droit d'auteur sur l'internet [Copyright and co(regulation): copyright policy on the internet]

Benizri, Yohan-Avner. January 2006
No description available.
128

On the reliability of an object-based distributed system

Gherfal, Fawzi Fathi January 1985
No description available.
129

The planning of operations and the analysis of alternative information systems : a dynamic programming approach to different costing methods in accounting for inventories

Bailey, Andrew D. January 1972
No description available.
130

A conditionally optimal student sectioning algorithm

Braitsch, Raymond Joseph January 1973
No description available.
