731

Studies of reinforced concrete regions near discontinuities

Cook, William Digby. January 1987
A non-linear finite element computer program capable of predicting the complete response of two-dimensional reinforced concrete members was developed. This tool, which accounts for the stress-strain characteristics of cracked concrete, was used to predict the responses of a number of members containing discontinuities, including corbels, dapped-end beams, beams with web holes, and deep beams. The results of tests performed by the author, as well as tests performed by other researchers, were compared with the non-linear predictions. In addition, simple strut-and-tie models suitable for designing regions near discontinuities were developed. The predictions obtained with these models were compared with the non-linear finite element predictions and with the test results.
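As a rough illustration of the statics behind a strut-and-tie check, the sketch below works through a single corbel under a vertical load. The geometry, loads, and material values are invented for the example and are not taken from the thesis:

```python
import math

# Hedged illustration of basic strut-and-tie statics for a corbel
# (hypothetical geometry and loads; not the thesis's actual models).

V = 250.0    # factored vertical load on the corbel, kN (assumed)
a = 150.0    # shear span: load point to face of column, mm (assumed)
jd = 400.0   # internal lever arm between tie and strut node, mm (assumed)

# Moment equilibrium about the compression node gives the tie force.
T = V * a / jd                      # horizontal tie force, kN

# The diagonal strut carries the resultant of V and T.
theta = math.atan2(jd, a)           # strut inclination from horizontal
C = V / math.sin(theta)             # strut force, kN

# Required tie steel area for an assumed yield strength fy.
fy = 0.400                          # 400 MPa expressed in kN/mm^2
As = T / fy                         # required tie area, mm^2

print(f"Tie force T = {T:.1f} kN, strut force C = {C:.1f} kN")
print(f"Required tie steel As = {As:.0f} mm^2 (resistance factors omitted)")
```

In a real strut-and-tie design the node geometry, strut widths, and resistance factors must also be checked; the point here is only the equilibrium reasoning that fixes the tie and strut forces.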
732

Timing analysis for MOSFETs, an integrated approach

Dagenais, Michel R. January 1987
Timing and electrical verification is an essential part of the design of VLSI digital MOS circuits. It consists of determining the maximum operating frequency of a circuit, and verifying that the circuit will always produce the expected logical behavior at or under this frequency. This complex task requires considerable computer and human resources.

The classical simulation approach cannot be used to ensure the timing and electrical correctness of the large circuits that are now being designed; the huge number of possible states in large circuits renders this method impractical. Worst-case analysis tools alleviate the problem by restricting the analysis to a limited set of states corresponding to the worst-case operating conditions. However, existing worst-case analysis tools for MOS circuits present several problems. Their accuracy is inherently limited because they use a switch-level model. These procedures also have high computational complexity, because they resort to path enumeration to find the latest path in each transistor group. Finally, they lack the ability to analyze circuits with arbitrarily complex clocking schemes.

In this text, a new procedure for circuit-level timing analysis is presented. Because it works at the electronic circuit level, the procedure can detect electrical errors and attains an accuracy that is impossible to reach by other means. Efficient algorithms based on graph theory have been developed to partition the circuits in a novel way and to recognize series and parallel combinations. This enables the efficient computation of worst-case (earliest and latest) waveforms in the circuit, using specially designed algorithms. The new procedure automatically extracts the timing requirements from these waveforms and can compute the clocking parameters, including the maximum clock frequency, for arbitrarily complex clocking schemes.

A computer program was written to demonstrate the effectiveness of the new procedure and algorithms. It has been used to determine the clocking parameters of circuits using different clocking schemes. The accuracy obtained on these parameters is around 5 to 10% when compared with circuit-level simulations. The analysis time grows linearly with circuit size, at approximately 0.5 s per transistor on a MicroVAX II computer, which makes the program suitable for VLSI circuits.
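The thesis computes full worst-case waveforms at the electrical level, but the underlying graph idea, propagating latest arrival times through a partitioned delay network, can be sketched as a longest-path computation on a DAG. The node names and delays below are invented for illustration:

```python
from collections import defaultdict, deque

# Toy sketch of latest-arrival-time propagation on a delay DAG.
# Node names and delays are invented; the thesis computes full
# worst-case waveforms at the electrical level, not scalar delays.

edges = {  # (from, to): delay in ns
    ("in", "n1"): 1.2, ("in", "n2"): 0.8,
    ("n1", "n3"): 2.0, ("n2", "n3"): 1.5,
    ("n3", "out"): 0.9,
}

succ = defaultdict(list)
indeg = defaultdict(int)
nodes = set()
for (u, v), d in edges.items():
    succ[u].append((v, d))
    indeg[v] += 1
    nodes.update((u, v))

# Kahn's topological order, relaxing latest arrival times.
arrival = {n: 0.0 for n in nodes}
queue = deque(n for n in nodes if indeg[n] == 0)
while queue:
    u = queue.popleft()
    for v, d in succ[u]:
        arrival[v] = max(arrival[v], arrival[u] + d)
        indeg[v] -= 1
        if indeg[v] == 0:
            queue.append(v)

print(f"latest arrival at 'out': {arrival['out']:.1f} ns")
# -> 4.1 ns via in -> n1 -> n3 -> out
```

The path-enumeration approach criticized above would examine every input-to-output path; the topological relaxation shown here visits each edge once, which is why such analyses can scale linearly with circuit size.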
733

Scheduling customer service representatives for the Workers’ Compensation Board of British Columbia

Sanegre, Rafael. 11 1900
The Workers' Compensation Board of British Columbia (WCB) operates four call centers in the province devoted to compensation services. This thesis describes a review of operations at the call center in the head office, located in Richmond, B.C. We identified the need for a mathematical model to schedule telephone operators (called CSRs at WCB) so as to minimize the workforce required while ensuring the staffing levels needed to provide good customer service. We developed two integer programming models and built an interface that enables supervisors to use the model interactively. We compared the results from the model to one of their schedules and found that the model used five fewer CSRs than the current schedule. The potential savings demonstrated by this test convinced management that WCB should acquire and implement software to schedule staff.
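The thesis's two models are not reproduced here, but a classic set-covering shift-scheduling ILP of the same flavor can be sketched with the PuLP modeling library (an assumed dependency that ships with the CBC solver; the shifts, coverage sets, and demand figures below are invented):

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

# Hedged sketch of a set-covering shift-scheduling ILP, in the
# spirit of (but not identical to) the thesis's models.

periods = range(6)            # six planning periods in a day
shifts = {                    # shift -> set of periods it covers
    "early": {0, 1, 2},
    "mid":   {2, 3, 4},
    "late":  {3, 4, 5},
}
demand = [3, 4, 6, 6, 5, 2]   # minimum CSRs required per period

prob = LpProblem("csr_scheduling", LpMinimize)
x = {s: LpVariable(f"x_{s}", lowBound=0, cat="Integer") for s in shifts}

# Objective: minimize the total number of CSRs scheduled.
prob += lpSum(x.values())

# Coverage: in each period, the scheduled CSRs must meet demand.
for t in periods:
    prob += lpSum(x[s] for s in shifts if t in shifts[s]) >= demand[t]

prob.solve()
for s in shifts:
    print(f"{s}: {int(value(x[s]))} CSRs")
print(f"total: {int(value(prob.objective))}")
```

Each decision variable counts the CSRs assigned to a shift, and each constraint forces the shifts covering a period to meet that period's demand; real models add shift costs, breaks, and skill classes.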
734

Automatic text summarization using lexical chains : algorithms and experiments

Kolla, Maheedhar (University of Lethbridge, Faculty of Arts and Science). January 2004
Summarization is a complex task that requires understanding the document content to determine the importance of the text. Lexical cohesion is a method to identify connected portions of a text based on the relations between its words. Lexical cohesive relations can be represented using lexical chains: sequences of semantically related words spread over the entire text. Lexical chains are used in a variety of Natural Language Processing (NLP) and Information Retrieval (IR) applications. In this thesis, we propose a lexical chaining method that includes glossary relations in the chaining process. These relations enable us to identify topically related concepts, for instance dormitory and student, and thereby enhance the identification of cohesive ties in the text. We then present methods that use the lexical chains to generate summaries by extracting sentences from the document(s). Headlines are generated by filtering out the portions of the extracted sentences that do not contribute to the meaning of the sentence. The generated headlines can be used in real-world applications to skim through document collections in a digital library. Multi-document summarization is in growing demand with the explosive growth of online news sources. It requires identifying the several themes present in a collection to attain good compression and avoid redundancy. In this thesis, we propose methods to group portions of the texts of a document collection into meaningful clusters. Clustering enables us to extract the various themes of the document collection. Sentences from the clusters can then be extracted to generate a summary of the multi-document collection. Clusters can also be used to generate summaries with respect to a given query. We designed a system to compute lexical chains for a given text and use them to extract the salient portions of the document. Specific tasks considered are headline generation, multi-document summarization, and query-based summarization. Our experimental evaluation shows that effective summaries can be extracted for these tasks.

viii, 80 leaves : ill. ; 29 cm.
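As a naive sketch of the chaining idea, the code below greedily links nouns that share a WordNet synset or a direct hypernym. It uses NLTK (an assumed dependency, with the WordNet corpus downloaded) and omits the glossary relations and word-sense disambiguation that the thesis's method relies on:

```python
from nltk.corpus import wordnet as wn

# Naive lexical-chain sketch: greedily link nouns that share a synset
# or a direct hypernym with a word already in a chain. The thesis's
# chainer also uses WordNet glossary relations; this sketch does not.

def related(w1, w2):
    """True if w1 and w2 share a synset or a direct hypernym."""
    s1 = wn.synsets(w1, pos=wn.NOUN)
    s2 = wn.synsets(w2, pos=wn.NOUN)
    if set(s1) & set(s2):
        return True
    h1 = {h for s in s1 for h in s.hypernyms()}
    h2 = {h for s in s2 for h in s.hypernyms()}
    return bool(h1 & h2) or bool(set(s1) & h2) or bool(set(s2) & h1)

def build_chains(nouns):
    chains = []
    for w in nouns:
        for chain in chains:
            if any(related(w, member) for member in chain):
                chain.append(w)
                break
        else:                    # no existing chain accepted the word
            chains.append([w])
    return chains

nouns = ["student", "dormitory", "college", "truck", "car", "professor"]
for chain in build_chains(nouns):
    print(chain)
# Prints the greedy chains; the exact grouping depends on the
# WordNet version (e.g. "truck" and "car" share the hypernym
# "motor_vehicle", so they chain together).
```

Real chainers score chains by length and relation strength and commit to one sense per word; this greedy version is only meant to show how semantic relatedness groups words into chains.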
735

A method for economically evaluating and refining future pulpwood harvesting machines in conjunction with forest planning and management

Cosby, Walter Wayne. 12 1900
No description available.
736

A study of floating point arithmetic

Bookhart, Thomas Woodward. 12 1900
No description available.
737

Rigorous and reasonable error bounds for the numerical solution of dynamical systems

Kuhn, Wolfgang. 12 1900
No description available.
738

Kierkegaard and the computer : some recent contributions

Hogue, Stéphane. January 1990
This document is submitted with the permission and encouragement of the Department of Philosophy of McGill University in lieu of a conventional thesis. Briefly, it consists of a combined account and selective historical review of some uses of the computer in philosophy, and of a partial list of my computer-related contributions to Kierkegaard scholarship. The former deals generally with the creation, interrogation, and analysis of machine-readable forms of philosophical texts; the latter deals specifically with my own work of creating and analyzing Kierkegaard-related machine-readable texts.
739

A text editor based on relations

Fayerman, Brenda. January 1984
No description available.
740

An experimental-analytical investigation of hypoelastic models for plain and reinforced concrete

Bahlis, Jihad. January 1986
No description available.
