  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
551

A computer program for the tentative selection of structural members

Rightmier, Lonnie J. 20 November 2012 (has links)
The subject of this project is the development of a computer program that assists in the preliminary selection of wooden beams for residential-scale buildings. The program is intended to assist the architectural designer; it does not provide comprehensive engineering. With this purpose in mind, the intention is to simplify the process of sizing wooden members, to generate graphic visualization, and to make clear the analytic and decision-making process incorporated in the program's construction. The user is presumed to be a designer or architectural student, and the design of the program is aimed at making the software effective for that audience. The project deals with such issues as logical sequence or flow, decision points, and the conceptual organization of display screens for the purpose of focusing the user's attention on vital information. / Master of Architecture
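The kind of preliminary sizing check such a program automates can be sketched in a few lines. This is a minimal illustration under simple assumptions (uniformly loaded simple-span beam, an invented allowable bending stress, and a hypothetical candidate list), not the program described in the project:

```python
# Compare the required section modulus (from the maximum bending moment)
# against candidate dimension-lumber sizes. All numeric values below are
# illustrative, not the project's actual data.

def required_section_modulus(w_plf, span_ft, fb_psi):
    """Section modulus (in^3) needed for a uniformly loaded simple beam."""
    m_inlb = w_plf * span_ft ** 2 / 8 * 12   # max moment w*L^2/8, in in-lb
    return m_inlb / fb_psi

def select_beam(w_plf, span_ft, fb_psi, candidates):
    """Return the first candidate (b, d) whose S = b*d^2/6 is adequate."""
    s_req = required_section_modulus(w_plf, span_ft, fb_psi)
    for b, d in candidates:                  # sorted smallest to largest
        if b * d ** 2 / 6 >= s_req:
            return (b, d)
    return None                              # nothing in the list works

# Nominal 2x sizes listed by actual dimensions in inches, smallest first.
sizes = [(1.5, 5.5), (1.5, 7.25), (1.5, 9.25), (1.5, 11.25)]
print(select_beam(120, 12, 1000, sizes))     # -> (1.5, 11.25)
```

The program described in the project layers graphic visualization and guided decision points on top of a check of this kind.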
552

An Ada library for positional board games

Mangolas, Athanassios Anastassios 08 June 2009 (has links)
This thesis describes the design and implementation of an Ada library for positional board games. The library consists of general software modules that use concepts from the model of positional board games presented in [Antoy 87]. The thesis shows that general software modules based on the mathematical concept of a board can be built and used by any positional board game program. It describes the data types used in the modules, presents informal and formal specifications of the operations on those types, and describes the implementation of the data types. It then presents the algorithms implementing the operations, shows how the library can be used in two positional board game programs, and justifies the claim of generality and simplicity of the model in [Antoy 87]. The programming language Ada is used both to express the formal specifications and to code the software modules. / Master of Science
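A rough analogue of such a board abstraction can be sketched in Python. The class and operation names here are illustrative, not the Ada library's actual interface:

```python
# A board as a mapping from positions to pieces, with game-independent
# operations. Any positional game layers its own rules on top of these.

class Board:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self._cells = {}                      # position -> piece

    def in_bounds(self, pos):
        r, c = pos
        return 0 <= r < self.rows and 0 <= c < self.cols

    def place(self, pos, piece):
        assert self.in_bounds(pos) and pos not in self._cells
        self._cells[pos] = piece

    def move(self, src, dst):
        assert src in self._cells and self.in_bounds(dst)
        self._cells[dst] = self._cells.pop(src)

    def piece_at(self, pos):
        return self._cells.get(pos)           # None if the cell is empty

b = Board(8, 8)
b.place((0, 4), "white king")
b.move((0, 4), (1, 4))
print(b.piece_at((1, 4)))                     # -> white king
```

The generality claim in the thesis corresponds to the fact that nothing in these operations is specific to any one game; legal moves and win conditions are supplied by each game program separately.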
553

The application of structure and code metrics to large scale systems

Canning, James Thomas January 1985 (has links)
This work extends the area of research termed software metrics by applying measures of system structure and measures of system code to three realistic software products. Previous research in this area has typically been limited to the application of code metrics such as lines of code, McCabe's cyclomatic number, and Halstead's software science variables. This research, however, also investigates the relationship of four structure metrics (Henry's Information Flow measure, Woodfield's Syntactic Interconnection Model, Yau and Collofello's Stability measure, and McClure's Invocation complexity) to various observed measures of complexity such as ERRORS, CHANGES, and CODING TIME. These metrics are referred to as structure measures since they measure control-flow and data-flow interfaces between system components. Spearman correlations between the metrics revealed that the code metrics were similar measures of system complexity, while the structure metrics typically measured different dimensions of software. Furthermore, correlating the metrics with observed measures of complexity indicated that the Information Flow metric and the Invocation measure typically performed as well as the three code metrics when project factors and subsystem factors were taken into consideration. However, no single metric was able to satisfactorily identify the variations in the data for a single observed measure of complexity. Trends between many of the metrics and the observed data were identified when individual components were grouped together. Code metrics typically formed groups of increasing complexity that corresponded to increases in the mean values of the observed data. The strength of the Information Flow metric and the Invocation measure is their ability to form a group of highly complex components that was found to be populated by outliers in the observed data. / Ph. D.
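The study's central statistical tool, Spearman rank correlation between two metrics measured on the same components, can be sketched in pure Python. The metric values below are invented for illustration and are not the study's data:

```python
# Spearman's rho: Pearson correlation computed on the ranks of the data,
# with tied values sharing their mean rank.

def rank(values):
    """Average ranks (1-based); ties share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1                 # mean of tied positions
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    rx, ry = rank(xs), rank(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

loc        = [120, 45, 300, 80, 210]          # lines of code per module
cyclomatic = [14, 6, 25, 9, 18]               # McCabe numbers, same modules
print(round(spearman(loc, cyclomatic), 3))    # -> 1.0 (identical rankings)
```

A correlation near 1 between two code metrics is exactly the pattern the study reports: the code metrics largely rank components the same way, while the structure metrics rank them differently.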
554

Multimodal Representations for Video

Suris Coll-Vinent, Didac January 2024 (has links)
My thesis explores the fields of multimodal and video analysis in computer vision, aiming to bridge the gap between human perception and machine understanding. Recognizing the interplay among various signals such as text, audio, and visual data, my research explores novel frameworks that integrate these diverse modalities in order to achieve a deeper understanding of complex scenes, with a particular emphasis on video analysis. As part of this exploration, I study diverse tasks such as translation, future prediction, and visual question answering, all connected through the lens of multimodal and video representations. I present novel approaches for each of these challenges, contributing across different facets of computer vision, from dataset creation to algorithmic innovations, and from achieving state-of-the-art results on established benchmarks to introducing new tasks. Methodologically, my thesis embraces two key approaches: self-supervised learning and the integration of structured representations. Self-supervised learning, a technique that allows computers to learn from unlabeled data, helps uncover inherent connections within multimodal and video inputs. Structured representations, on the other hand, serve as a means to capture the complex temporal patterns and uncertainties inherent in video analysis. By employing these techniques, I offer novel insights into modeling multimodal representations for video analysis, showing improved performance over prior work in all studied scenarios.
555

Structured arrows: a type-based framework for structured parallelism

Castro, David January 2018 (has links)
This thesis deals with the important problem of parallelising sequential code. Despite the importance of parallelism in modern computing, writing parallel software still relies on many low-level and often error-prone approaches. These low-level approaches can lead to serious execution problems such as deadlocks and race conditions. Due to the non-deterministic behaviour of most parallel programs, testing parallel software can be both tedious and time-consuming. A way of providing guarantees of correctness for parallel programs would therefore provide significant benefit. Moreover, even if we ignore the problem of correctness, achieving good speedups is not straightforward, since this generally involves rewriting a program to consider a (possibly large) number of alternative parallelisations. This thesis argues that new languages and frameworks are needed. These languages and frameworks must not only support high-level parallel programming constructs, but must also provide predictable cost models for these parallel constructs. Moreover, they need to be built around solid, well-understood theories that ensure that: (a) changes to the source code will not change the functional behaviour of a program, and (b) the speedup obtained by making the necessary changes is predictable. Algorithmic skeletons are parametric implementations of common patterns of parallelism that provide good abstractions for creating new high-level languages, and also support frameworks for parallel computing that satisfy these correctness and predictability requirements. This thesis presents a new type-based framework, based on the connection between structured parallelism and structured patterns of recursion, that provides parallel structures as type abstractions that can be used to statically parallelise a program.
Specifically, this thesis exploits hylomorphisms as a single, unifying construct to represent the functional behaviour of parallel programs, and to perform correct code rewritings between alternative parallel implementations, represented as algorithmic skeletons. This thesis also defines a mechanism for deriving cost models for parallel constructs from a queue-based operational semantics. In this way, we can provide strong static guarantees about the correctness of a parallel program, while simultaneously achieving predictable speedups.
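The unifying construct can be illustrated outside the thesis's own framework. Below is a hylomorphism sketched in Python (the names and the divide-and-conquer example are mine, not the thesis's code): an unfold (coalgebra) that divides the problem, fused with a fold (algebra) that combines results. Each "node" of the virtual call tree is a candidate site for a parallel task in a divide-and-conquer skeleton:

```python
# hylo f g = f . fmap (hylo f g) . g, specialised to a binary tree shape:
# the coalgebra produces either a leaf payload or two subproblems, and
# the algebra consumes leaves and combines node results. The intermediate
# tree is never materialised.

def hylo(algebra, coalgebra, seed):
    tag, payload = coalgebra(seed)
    if tag == "leaf":
        return algebra("leaf", payload)
    left, right = payload
    return algebra("node", (hylo(algebra, coalgebra, left),
                            hylo(algebra, coalgebra, right)))

# Coalgebra: split a list until single elements remain (the "divide").
def split(xs):
    if len(xs) == 1:
        return ("leaf", xs[0])
    mid = len(xs) // 2
    return ("node", (xs[:mid], xs[mid:]))

# Algebra: combine sub-results (the "conquer"). In a skeleton framework,
# the two recursive calls at each node could run as parallel tasks.
def add(tag, payload):
    return payload if tag == "leaf" else payload[0] + payload[1]

print(hylo(add, split, [1, 2, 3, 4, 5]))      # -> 15
```

Because the functional behaviour is fixed by the algebra and coalgebra alone, different parallel schedules of the node computations cannot change the result, which is the kind of correctness-preserving rewriting the thesis formalises.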
556

Evaluating Microsoft .NET technology: Implementation of an online store

Dou, Jie 01 January 2006 (has links)
The purpose of this project is to design, develop, and implement an e-commerce shopping cart system based on Microsoft .NET technology, and in doing so to evaluate ASP.NET.
557

Application of a Geographical Information System to Estimate the Magnitude and Frequency of Floods in the Sandy and Clackamas River Basins, Oregon

Brownell, Dorie Lynn 26 May 1995 (has links)
A geographical information system (GIS) was used to develop a regression model designed to predict flood magnitudes in the Sandy and Clackamas river basins in Oregon. Manual methods of data assembly, input, storage, manipulation, and analysis traditionally used to estimate basin characteristics were replaced with automated techniques using GIS-based computer hardware and software components. Separate GIS data layers representing (1) stream gage locations, (2) drainage basin boundaries, (3) hydrography, (4) water bodies, (5) precipitation, (6) land use/land cover, (7) elevation, and (8) soils were created and stored in a GIS database. Several GIS computer programs were written to automate the spatial analysis needed to estimate basin characteristic values from the various GIS data layers. Twelve basin characteristic parameters were computed and used as independent variables in the regression model. Streamflow data from 19 gaged sites in the Sandy and Clackamas basins were used in a log Pearson Type III analysis to define flood magnitudes at 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals. Flood magnitudes were used as dependent variables and regressed against different sets of basin characteristics (independent variables) to determine the most significant independent variables for explaining peak discharge. Drainage area, average annual precipitation, and percent area above 5000 feet proved to be the most significant explanatory variables for defining peak discharge characteristics in the Sandy and Clackamas river basins. The study demonstrated that a GIS can be successfully applied to the development of basin characteristics for a flood frequency analysis and can achieve the same level of accuracy as manual methods. Use of GIS technology reduced the time and cost associated with manual methods and allowed for more in-depth development and calibration of the regression model.
With the development of GIS data layers and the use of GIS-based computer programs to automate the calculation of explanatory variables, regression equations can be developed and applied more quickly and easily. GIS proved to be ideally suited for flood frequency modeling applications by providing advanced computerized techniques for spatial analysis and database management.
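The regression step described above can be sketched in outline: fit log-transformed peak discharge against a log-transformed basin characteristic at gaged sites, then predict at an ungaged basin. The data values below are invented for illustration, and only one explanatory variable (drainage area) is used, whereas the study evaluated twelve candidates:

```python
import math

# Least-squares fit of log10(Q) = a + b * log10(A), the usual form of a
# regional flood-frequency regression equation. All data are hypothetical.

def fit_loglog(areas, peaks):
    xs = [math.log10(a) for a in areas]
    ys = [math.log10(q) for q in peaks]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return intercept, slope

def predict(intercept, slope, area):
    """Peak discharge at an ungaged basin of the given drainage area."""
    return 10 ** (intercept + slope * math.log10(area))

areas = [10, 25, 60, 140, 300]        # drainage areas, sq mi (hypothetical)
peaks = [400, 850, 1700, 3400, 6200]  # 100-yr peak Q, cfs (hypothetical)
a, b = fit_loglog(areas, peaks)
print(round(predict(a, b, 100)))
```

In the study's GIS workflow, the explanatory variables on the right-hand side are exactly the basin characteristics computed automatically from the data layers.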
558

MODRSP: a program to calculate drawdown, velocity, storage and capture response functions for multi-aquifer systems

Maddock, Thomas, III, Lacher, Laurel J. January 1991 (has links)
MODRSP is a program for calculating drawdown, velocity, storage losses, and capture response functions for multi-aquifer ground-water flow systems. Capture is defined as the sum of the increase in aquifer recharge and decrease in aquifer discharge as a result of an applied stress from pumping [Bredehoeft et al., 1982]. The capture phenomena treated by MODRSP are stream-aquifer leakance, reduction of evapotranspiration losses, leakance from adjacent aquifers, flows to and from prescribed-head boundaries, and increases or decreases in natural recharge or discharge from head-dependent boundaries. The response functions are independent of the magnitude of the stresses; they depend on the type of partial differential equation, the boundary and initial conditions and the parameters thereof, and the spatial and temporal location of stresses. The aquifers modeled may have irregularly shaped areal boundaries and non-homogeneous transmissive and storage properties. For regional aquifers, the stresses are generally pumpages from wells. The utility of response functions arises from their capacity to be embedded in management models. The management models consist of a mathematical expression of a criterion to measure preference, and sets of constraints which act to limit the preferred actions. The response functions are incorporated into constraints that couple the hydrologic system with the management system (Maddock, 1972). MODRSP is a modification of MODFLOW (McDonald and Harbaugh, 1984, 1988). MODRSP uses many of the data input structures of MODFLOW, but there are major differences between the two programs; the differences are discussed in Chapters 4 and 5. An abbreviated theoretical development is presented in Chapter 2; a more complete development may be found in Maddock and Lacher (1991). The finite-difference discussion in Chapter 3 is a synopsis of the fuller treatment in McDonald and Harbaugh (1988).
Subprogram organization is presented in Chapter 4 with the data requirements explained in Chapter 5. Chapter 6 contains three example applications of MODRSP.
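The linearity property that makes response functions useful can be sketched briefly: because the governing equation is linear, the response to a unit stress can be superposed (convolved) with any pumping schedule, and the response function itself never depends on the stress magnitude. The unit-response values below are invented for illustration and are not MODRSP output:

```python
# Discrete linear superposition: capture in each period is the pumping
# schedule convolved with the unit-stress capture response function.

def superpose(unit_response, pumping):
    n = len(pumping)
    return [sum(pumping[k] * unit_response[t - k]
                for k in range(t + 1) if t - k < len(unit_response))
            for t in range(n)]

# Fraction of a unit pumping stress captured from a stream in periods
# 0, 1, 2, ... after the stress begins (hypothetical values).
unit = [0.05, 0.15, 0.25, 0.20]

# Doubling the pumping doubles the capture; the response is unchanged,
# which is why one set of response functions serves any stress scenario.
q1 = superpose(unit, [100, 100, 100])
q2 = superpose(unit, [200, 200, 200])
print(q1, q2)
```

This stress-independence is what allows the response functions to be computed once and then embedded as coefficients in the constraints of a management model.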
559

Accessing timesheets via internet through ASP and ODBC

Challa, Varshi 01 January 2000 (has links)
The purpose of this project is to develop a computerized timesheet application. Using this application, an employee of a company can log onto the company's Web site and fill out a timesheet from anywhere in the world. The project involved automating timesheet data entry and approval procedures using contemporary technologies including Active Server Pages (ASP), JavaScript, VBScript, Component Object Model (COM) components, and Open Database Connectivity (ODBC).
560

COMPUTER SIMULATION OF SURFACE GROUND MOTIONS INDUCED BY NEAR SURFACE BLASTS.

Barkley, Ross Charles. January 1982 (has links)
No description available.
