541

The good drawings D_r of the complete graph K_r

Rafla, Nabil H. January 1988 (has links)
No description available.
542

Computer software for the control of potato storage environment

Landry, Jacques-André January 1994 (has links)
No description available.
543

Automatic visual inspection of solder joints

Merrill, Paul A. January 1984 (has links)
No description available.
544

Methods and Tools for Practical Software Testing and Maintenance

Saieva, Anthony January 2024 (has links)
As software continues to envelop traditional industries, the need for increased attention to cybersecurity is higher than ever. Software security helps protect businesses and governments from financial losses due to cyberattacks and data breaches, as well as from reputational damage. In theory, securing software is relatively straightforward: it involves following certain best practices and guidelines to ensure that the software is secure. In practice, however, software security is often much more complicated. It requires a deep understanding of the underlying system and code (including potentially legacy code), as well as a comprehensive understanding of the threats and vulnerabilities that could be present. Additionally, software security also involves the implementation of strategies to protect against those threats and vulnerabilities, which may involve a combination of technologies, processes, and procedures. In fact, many real cyberattacks are caused not by zero-day vulnerabilities but by known issues that haven't been addressed, so real software security also requires ongoing monitoring and maintenance to ensure critical systems remain secure.

This thesis presents a series of novel techniques that together form an enhanced software maintenance methodology, from initial bug reporting all the way through patch deployment. We begin by introducing Ad Hoc Test Generation, a novel testing technique for the case in which a security vulnerability or other critical bug is not detected by the developers' test suite and is discovered post-deployment. In that situation, developers must quickly devise a new test that reproduces the buggy behavior, then test whether their candidate patch indeed fixes the bug without breaking other functionality, all while racing to deploy before attackers pounce on exposed user installations. This work builds on record-replay and binary rewriting to automatically generate and run targeted tests for candidate patches significantly faster and more efficiently than traditional test suite generation techniques like symbolic execution. Our prototype of this concept is called ATTUNE.

To construct patches, developers maintaining software may in some instances be forced to work directly with the binary because source code is no longer available. For these cases, this work presents a transformer-based model called DIRECT that recovers semantically meaningful names for variables and functions whose names have been lost, giving developers a facsimile of the source code that would otherwise be unavailable. In the event developers need even more support deciphering the decompiled code, we provide another tool called REINFOREST that allows developers to search for similar code, which they can use to further understand the code in question and as a reference when developing a patch.

After patches have been written, deployment remains a challenge. In some instances, deploying a patch for the buggy behavior may require supporting legacy systems where software cannot be upgraded without causing compatibility issues. To support these updates, this work introduces the concept of binary patch decomposition, which breaks a software release down into its component parts and allows software administrators to apply only the critical portions without breaking functionality.
We present a novel software patching methodology with which we can recreate bugs, develop patches, and deploy updates in the presence of the typical challenges of patching production software, including deficient test suites, lack of source code, lack of documentation, compatibility issues, and the difficulties of patching binaries directly.
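Purely as an illustration of the code-search idea mentioned above (not the REINFOREST implementation, which the abstract does not detail), the sketch below ranks a small corpus of known source snippets against a decompiled fragment using plain token-overlap similarity; all file names and snippets are invented.

```python
# Hypothetical sketch of code-similarity search for decompiled code:
# rank a corpus of known functions by similarity to a decompiled snippet
# so a developer can find reference implementations. REINFOREST itself is
# described only at a high level; token-set Jaccard similarity stands in
# here for whatever learned representation the real tool uses.
import re

def tokenize(code: str) -> set[str]:
    """Split source text into a set of identifier and number tokens."""
    return set(re.findall(r"[A-Za-z_][A-Za-z_0-9]*|\d+", code))

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the token sets of two code snippets."""
    ta, tb = tokenize(a), tokenize(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def search(query_snippet: str, corpus: dict[str, str], top_k: int = 3):
    """Return the top_k corpus entries most similar to the query snippet."""
    ranked = sorted(corpus.items(),
                    key=lambda kv: similarity(query_snippet, kv[1]),
                    reverse=True)
    return ranked[:top_k]

if __name__ == "__main__":
    decompiled = "int v1 = 0; while (v1 < n) { total += buf[v1]; v1++; }"
    corpus = {  # invented reference snippets
        "sum_array.c": "int sum(int *buf, int n){int i,total=0;for(i=0;i<n;i++)total+=buf[i];return total;}",
        "strlen.c":    "size_t my_strlen(const char *s){size_t i=0;while(s[i])i++;return i;}",
    }
    for name, _ in search(decompiled, corpus):
        print(name)
```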
545

The effect of sampling error on the interpretation of a least squares regression relating phosphorus and chlorophyll

Beedell, David C. (David Charles) January 1995 (has links)
No description available.
546

Computerized Flow Process Charting System and Applications

Griffin, George H. 01 January 1987 (has links) (PDF)
A computerized flow process charting application program, built in dBase III+, has been developed to aid in resource requirements planning and operations analysis. Traditional flow process charting has used the following data elements: assembly number, assembly sequence number, distance travelled, time required for the activity, and an activity symbol. The computerized system adds several variables to these in order to customize the application at Martin Marietta Electronic Systems. These additional variables include work center identification, machine number identification, lot sizes, setup and run times, and manufacturing specifications. Additionally, the circle, or operations, symbol has been expanded to differentiate between manual, process, and test activities. Resource requirements planning and analysis are accomplished by a series of reports in which a user defines search requirements and enters three independent equation variables for the calculations. The three variables are a realization (safety) factor, resource availability in hours per month, and monthly production demand. The resource requirements can be used in methods engineering, make-buy decisions, and resource planning. Sensitivity analyses can be easily accomplished by changing the input variables and/or data.
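The abstract does not give the resource-requirements equations themselves, so the sketch below shows one plausible form of the calculation, assuming required capacity = (run and setup workload × realization factor) ÷ monthly availability; the activity fields mirror the data elements listed above, and every name and number is illustrative rather than taken from the thesis or its dBase III+ code.

```python
# Hedged sketch of a resource-requirements calculation of the kind described:
# how many machines (or operators) a work center needs to meet monthly demand,
# given run time, setup time, lot size, a realization/safety factor, and
# availability in hours per month. The exact equations in the thesis are not
# stated in the abstract; this assumes a common textbook form.
from dataclasses import dataclass
from math import ceil

@dataclass
class Activity:
    work_center: str
    machine_id: str
    run_time_hours: float   # time required per assembly, in hours
    setup_hours: float      # setup time per lot, in hours
    lot_size: int

def required_machines(act: Activity,
                      monthly_demand: int,
                      realization_factor: float,
                      availability_hours_per_month: float) -> int:
    """Units of the resource needed at a work center for one activity."""
    lots = ceil(monthly_demand / act.lot_size)
    workload = (monthly_demand * act.run_time_hours
                + lots * act.setup_hours) * realization_factor
    return ceil(workload / availability_hours_per_month)

if __name__ == "__main__":
    drill = Activity("WC-10", "M-042", run_time_hours=0.25,
                     setup_hours=1.5, lot_size=50)
    print(required_machines(drill, monthly_demand=2000,
                            realization_factor=1.2,
                            availability_hours_per_month=160))  # -> 5
```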
547

A microcomputer-based budget allocation and planning algorithm for interdependent projects

Kladivko, Kurt A. January 1986 (has links)
A budget allocation and planning algorithm was developed for the economic evaluation of interdependent projects. Projects may be characterized by discrete point estimates or by continuous functions. The unique approach of this algorithm is that, if continuous functions are used, the analyst may specify a functional relationship between costs and returns. Budget planning techniques were incorporated into the algorithm to aid the decision maker in planning for future budgetary needs and in influencing the budget determination process. The algorithm utilizes a future worth of net return criterion in conducting the analyses. All projects under consideration need not have the same life; unequal-lived projects may be considered. The algorithm was coded in BASIC on an IBM Personal Computer. Two versions of the algorithm were developed: one that utilizes discrete point estimates to characterize projects, and one that utilizes continuous functions. The algorithm identifies the optimal combination of projects, given a budget profile and various project interdependencies. After the optimal portfolio of projects is identified, four sensitivity analyses may be run to analyze the effect of any uncertainty in the decision environment. An example is provided to illustrate the capabilities and limitations of the algorithm. / M.S.
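As a rough sketch of the discrete point-estimate mode described above (not the thesis's BASIC code), the following enumerates project combinations under a budget, screens them against simple interdependency constraints, and selects the portfolio with the largest future worth of net return; the project data, the constraint encoding, and the single-period budget standing in for a full budget profile are all assumptions made for illustration.

```python
# Hypothetical sketch: select the feasible combination of interdependent
# projects that maximizes future worth of net return under a budget limit.
# Project costs/returns and the interdependency lists below are invented.
from itertools import combinations

projects = {                     # name: (first cost, future worth of net return)
    "A": (40_000, 65_000),
    "B": (30_000, 48_000),
    "C": (25_000, 41_000),
}
budget = 70_000
mutually_exclusive = [("A", "B")]   # fund at most one project of each pair
prerequisites = [("C", "A")]        # C may be funded only if A is funded

def feasible(sel: set[str]) -> bool:
    """Check budget, mutual-exclusion, and prerequisite constraints."""
    if sum(projects[p][0] for p in sel) > budget:
        return False
    if any(a in sel and b in sel for a, b in mutually_exclusive):
        return False
    if any(c in sel and pre not in sel for c, pre in prerequisites):
        return False
    return True

best = max(
    (set(c) for r in range(len(projects) + 1)
            for c in combinations(projects, r) if feasible(set(c))),
    key=lambda sel: sum(projects[p][1] for p in sel),
)
print(best, sum(projects[p][1] for p in best))   # {'A', 'C'} 106000
```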
548

An object-oriented methodology and supporting framework for creating engineering software by dynamic integration

Woyak, Scott A. 26 October 2005 (has links)
Software design within the engineering community has generally been relegated to encoding algorithms for the purpose of executing them very rapidly. This is a very important purpose; however, substantially more is required to build an entire CAD application. Structure must be provided to the data maintained in the application. Various analyses must be integrated and coordinated in an orderly fashion. Interaction with the user must be managed. These topics have traditionally received secondary attention. The result has been engineering applications that are difficult to use, costly to create, and expensive to maintain or modify. The system created in this dissertation, the Dynamic Integration System, addresses these issues with respect to engineering-related software. Code constructed with Dynamic Integration System techniques anticipates future needs, such as integration, before those needs explicitly arise. This greatly reduces downstream costs and facilitates the development of engineering-related software. The Dynamic Integration System consists of two primary constructs: Dynamic Variables and dependency hierarchies. Dynamic Variables are used to model the key parameters in an application, while a dependency hierarchy is built from the relationships between Dynamic Variables. Using these constructs, issues such as integration and analysis coordination are automated by the underlying Dynamic Integration System facilities. / Ph. D.
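A hypothetical sketch of the two constructs named above, Dynamic Variables and a dependency hierarchy, is given below; the class and method names are invented, and the propagation scheme (lazy recomputation on access) is only one plausible realization of the automated coordination the abstract describes.

```python
# Sketch of a "dynamic variable" that knows its dependencies. Setting an
# independent variable marks everything downstream stale; reading a derived
# variable recomputes it from its dependencies on demand.
class DynamicVariable:
    def __init__(self, value=None, compute=None, depends_on=()):
        self._value = value
        self._compute = compute           # function of the dependencies' values
        self._depends_on = list(depends_on)
        self._dependents = []
        self._stale = compute is not None
        for dep in self._depends_on:
            dep._dependents.append(self)  # build the dependency hierarchy

    def set(self, value):
        """Assign an independent variable and invalidate everything downstream."""
        self._value = value
        self._invalidate_dependents()

    def get(self):
        """Recompute lazily from dependencies if a change has propagated here."""
        if self._stale:
            self._value = self._compute(*(d.get() for d in self._depends_on))
            self._stale = False
        return self._value

    def _invalidate_dependents(self):
        for dep in self._dependents:
            if not dep._stale:
                dep._stale = True
                dep._invalidate_dependents()

# Example: area depends on width and height; changing width updates area on demand.
width = DynamicVariable(2.0)
height = DynamicVariable(3.0)
area = DynamicVariable(compute=lambda w, h: w * h, depends_on=(width, height))
print(area.get())   # 6.0
width.set(5.0)
print(area.get())   # 15.0
```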
549

Development of a coupled finite element - boundary element program for a microcomputer

Brown, Steven Andrew 15 November 2013 (has links)
This thesis describes the development of a coupled finite element - boundary element program for a microcomputer. The steps outlined in the thesis include the adaptation of a mainframe-based boundary element code for use on a microcomputer, the verification of this program with sample problems, the development of an algorithm for coupling the Finite Element Method to the Boundary Element Method, the implementation of the coupling algorithm with finite element and boundary element codes, including the development of a Constant Strain Triangular finite element, and the verification of the coupled program with sample problems. Conclusions are drawn from the results presented, and suggestions are made for future research in this area. / Master of Science
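For reference, the sketch below implements the textbook plane-stress stiffness matrix of a Constant Strain Triangular element, K = t·A·BᵀDB, which is the standard formulation of the element type named above; the coordinates and material values in the example are illustrative, and the code is not drawn from the thesis.

```python
# Standard constant strain triangle (CST) element stiffness for plane stress.
import numpy as np

def cst_stiffness(xy, E, nu, t):
    """6x6 stiffness matrix of a constant strain triangle (plane stress).

    xy : (3, 2) array of nodal coordinates
    E  : Young's modulus, nu : Poisson's ratio, t : thickness
    """
    (x1, y1), (x2, y2), (x3, y3) = xy
    area = 0.5 * abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    b = np.array([y2 - y3, y3 - y1, y1 - y2])
    c = np.array([x3 - x2, x1 - x3, x2 - x1])
    # Strain-displacement matrix B: strains are constant over the element.
    B = np.zeros((3, 6))
    B[0, 0::2] = b
    B[1, 1::2] = c
    B[2, 0::2] = c
    B[2, 1::2] = b
    B /= 2.0 * area
    # Plane-stress constitutive matrix D.
    D = (E / (1.0 - nu**2)) * np.array([[1.0, nu, 0.0],
                                        [nu, 1.0, 0.0],
                                        [0.0, 0.0, (1.0 - nu) / 2.0]])
    return t * area * B.T @ D @ B

if __name__ == "__main__":
    K = cst_stiffness(np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]),
                      E=200e9, nu=0.3, t=0.01)
    print(K.shape)              # (6, 6)
    print(np.allclose(K, K.T))  # element stiffness matrix is symmetric
```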
550

Evaluation of accuracy and reliability of structural analysis and steel design software

Divecha, Dharmesh R. 04 December 2009 (has links)
The accuracy and reliability of three structural analysis and design programs were evaluated. The three programs considered were MicasPlus, STAAD III, and GTSTRUDL. Nine different test cases of three basic steel space frames were analyzed and designed using the programs. The nine cases were formed by modifying different modeling, analysis, and design parameters. A comparison of results obtained from the programs for each case was presented. It was found that the percentage differences in the analysis results obtained from the programs were typically less than five percent. For steel design, both MicasPlus and GTSTRUDL gave member sizes that were very close. It was also found that changing modeling parameters can have a significant effect on analysis results. This confirms the importance of accurate structural modeling. The functionality and usability of the three programs were also evaluated. After an extensive study of MicasPlus, a list of suggested modifications and additions to the program was presented. / Master of Science
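As a small illustration of the comparison reported above, the sketch below computes the percentage difference between analysis results from two programs relative to one taken as a baseline; the member labels and moment values are invented.

```python
# Illustrative percentage-difference comparison between two programs' results.
def pct_diff(baseline: float, other: float) -> float:
    """Percentage difference of 'other' relative to 'baseline'."""
    return abs(other - baseline) / abs(baseline) * 100.0

results = {               # member: (program 1, program 2) bending moments, kN*m
    "B1": (152.4, 150.1),
    "C3": (98.7, 101.2),
}
for member, (a, b) in results.items():
    print(f"{member}: {pct_diff(a, b):.1f}% difference")
```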
