171

The Log Analysis in an Automatic Approach

Lei, Jianhui 10 1900 (has links)
<p>Large software systems tend to have complex architectures and many lines of source code. As software systems have grown in size and complexity, it has become increasingly difficult to deliver bug-free software to end-users. Most run-time system failures are directly caused by software defects; therefore, diagnosing software defects is an important but challenging task in software development and maintenance.</p> <p>A system log is one readily available source of information about a software system. Software developers use system logs to record program variable values, trace execution, report run-time statistics and print full-sentence messages, which makes system logs a helpful resource for diagnosing software defects. Conventional log analysis requires humans to examine the run-time information in system logs and apply their expertise to the software system, in order to determine the root cause of a software defect and work out a concrete solution. Complex software systems can generate thousands of system logs in a relatively short time frame, and analyzing such large amounts of information is extremely time-consuming. Automated techniques are needed to improve the efficiency and quality of the diagnostic process.</p> <p>This thesis presents an automated approach to the diagnosis of software defects that combines source code analysis, log analysis and sequential pattern mining to detect anomalies among system logs, diagnose reported system errors and narrow down the range of source code lines containing the root cause. We demonstrate by implementation that the methodology provides a feasible solution to the diagnostic problem.</p> / Master of Applied Science (MASc)
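As a rough illustration of the mining step this abstract describes, the sketch below mines frequent event transitions from logs of normal runs and flags transitions in a new run that were never frequent. The event names are hypothetical, and real sequential pattern mining handles longer patterns than the deliberately simplified bigrams used here:

```python
from collections import Counter

def mine_frequent_bigrams(sessions, min_support):
    """Count event bigrams across normal log sessions; keep those meeting min_support."""
    counts = Counter()
    for events in sessions:
        for pair in zip(events, events[1:]):
            counts[pair] += 1
    return {pair for pair, c in counts.items() if c >= min_support}

def anomalous_transitions(session, frequent):
    """Flag transitions in a new session that were not frequent in normal runs."""
    return [pair for pair in zip(session, session[1:]) if pair not in frequent]

# hypothetical log event sequences from runs known to be healthy
normal_runs = [
    ["open", "read", "close"],
    ["open", "read", "read", "close"],
    ["open", "read", "close"],
]
patterns = mine_frequent_bigrams(normal_runs, min_support=2)
print(anomalous_transitions(["open", "close", "read"], patterns))
```

Flagged transitions would then point back, via source code analysis, to the log statements (and thus code regions) involved.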
172

Reconfigurable Turbo Decoding for 3G Applications.

Chaikalis, Costas, Noras, James M. January 2004 (has links)
No / Software radio and reconfigurable systems allow the functionalities of the radio interface to be reconfigured. For the turbo decoding function in battery-powered devices such as 3GPP mobile terminals, it would be desirable to choose the optimum decoding algorithm: SOVA in terms of latency, and log-MAP in terms of performance. This paper shows that the two algorithms share common operations, making a reconfigurable SOVA/log-MAP turbo decoder with increased efficiency feasible. Moreover, the performance of the reconfigurable architecture can be improved at minimum cost by scaling the extrinsic information with a common factor. The implementation of the improved reconfigurable decoder within the 3GPP standard is also discussed for different scenarios. In each scenario various frame lengths are evaluated, while the four possible service classes are applied. For AWGN channels, the optimum algorithm is proposed according to the desired quality of service of each class, determined from latency and performance constraints. Our analysis shows the potential utility of the reconfigurable decoder, since there is an optimum algorithm for most scenarios.
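The shared-operations point can be made concrete with the textbook log-domain kernels: log-MAP uses the Jacobian logarithm max*, while SOVA-style (max-log-MAP) decoding keeps only the max term, and extrinsic scaling is a single multiply per LLR. A minimal sketch; the scale factor 0.7 is illustrative, not the paper's value:

```python
import math

def max_star(a, b):
    # Jacobian logarithm used in log-MAP: exact log-domain addition
    return max(a, b) + math.log1p(math.exp(-abs(a - b)))

def max_approx(a, b):
    # Max-log-MAP / SOVA-style approximation: drops the correction term
    return max(a, b)

def scale_extrinsic(llrs, factor=0.7):
    # Scale extrinsic LLRs by a common factor (illustrative value)
    return [factor * x for x in llrs]

print(max_star(1.0, 1.0))   # ln(e^1 + e^1) = 1 + ln 2 ≈ 1.693
print(max_approx(1.0, 1.0))
print(scale_extrinsic([2.0, -3.0]))
```

A reconfigurable decoder can switch between `max_star` and `max_approx` in the same datapath, which is the efficiency argument the paper builds on.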
173

A high gain multiband offset MIMO antenna based on a planar log-periodic array for Ku/K-band applications

Fakharian, M.M., Alibakhshikenari, M., See, C.H., Abd-Alhameed, Raed 27 March 2022 (has links)
Yes / An offset quad-element, two-port, high-gain, multiband multiple-input multiple-output (MIMO) planar antenna based on a log-periodic dipole array (LPDA) for Ku/K-band wireless communications is proposed in this paper. A single-element antenna is designed starting from Carrel's theory and then optimized with a 50-Ω microstrip feed-line with two orthogonal branches, which yields a mainly broadside radiation pattern and improves the diversity parameters. For experimental confirmation, the designed structure is printed on an RT-5880 substrate with a thickness of 1.57 mm. The total substrate dimensions of the MIMO antenna are 55 × 45 mm2. According to the measured results, the designed structure operates in the 1.3% (12.82-12.98 GHz), 3.1% (13.54-13.96 GHz), 2.3% (14.81-15.15 GHz), 4.5% (17.7-18.52 GHz), and 4.6% (21.1-22.1 GHz) frequency bands. Additionally, the proposed MIMO antenna attains a peak gain of 4.2-10.7 dBi with a maximum element isolation of 23.5 dB, without the use of any decoupling structure. Furthermore, analysis of MIMO performance metrics such as the envelope correlation coefficient (ECC) and mean effective gain (MEG) validates good characteristics and field-correlation performance over the operating bands. The proposed design is an appropriate option for multiband MIMO applications in various wireless systems in the Ku/K-bands.
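The quoted percentages are fractional bandwidths relative to each band's centre frequency. A quick check of the band edges reported above (results land close to the quoted figures; small differences come from rounding of the band edges):

```python
def fractional_bandwidth(f_low, f_high):
    """Percent bandwidth relative to the band's centre frequency."""
    centre = (f_low + f_high) / 2
    return 100 * (f_high - f_low) / centre

# measured band edges in GHz, taken from the abstract
bands = [(12.82, 12.98), (13.54, 13.96), (14.81, 15.15),
         (17.7, 18.52), (21.1, 22.1)]
for f1, f2 in bands:
    print(f"{f1}-{f2} GHz: {fractional_bandwidth(f1, f2):.1f}%")
```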
174

Development of an Instrument to Evidence Knowledge Abstractions in Technological/Engineering Design-Based Activities

Figliano, Fred Joseph 24 May 2011 (has links)
This document outlines the development of a Design Log Instrument (DLI) intended for use in identifying moments of abstraction as evidence of STEM content knowledge transfer. Many theoretical approaches to explaining knowledge transfer are rooted in the belief that transfer occurs through knowledge abstraction (Reed, Ernst, & Banerji, 1974; Gick & Holyoak, 1980, 1983). The DLI prompts participants to be reflective during technological/engineering design activities. During the development of this instrument, a three-phase multiple-case embedded design was used. Three distinct phases accommodated the collection and analysis of the data necessary for this investigation: Phase 1: Pilot Case Study, Phase 2: Establishing Content Validity, and Phase 3: Establishing Construct Validity. During Phase 3, DLI data were collected at each of seven work sessions from two design teams, each working through a different engineering problem. At the end of Phase 3, a comparison of abstractions found in DLI responses with observation data (audio/video transcripts) indicated the extent to which the DLI independently reflected the abstractions revealed in observations. This comparison showed that the DLI can reveal abstracted knowledge with 68% reliability. Further analysis showed ancillary correlations between the percentage of abstractions found per DLI reflective prompt and the percentage found per T/E design phase. Specifically, DLI reflective prompts 2 and 3 correlate with T/E design phases 3 and 4 (58% and 76%, respectively, of the total abstractions), which deal with design issues related to investigating the problem and developing alternative solutions. DLI reflective prompts 4 and 5 correlate with T/E design phases 5 and 6 (22% and 24%, respectively, of total abstractions), which deal with design issues related to choosing a solution and developing a prototype.
Findings also indicate that there are highs and lows of abstraction throughout the T/E design process. The implications of these highs and lows are that specific phases of the T/E design process can be targeted for research and instruction. By targeting specific T/E design phases, a researcher or instructor can increase the likelihood of fostering abstractions as evidence of STEM content knowledge transfer. / Ph. D.
175

Automated Detection of Surface Defects on Barked Hardwood Logs and Stems Using 3-D Laser Scanned Data

Thomas, Liya 15 November 2006 (has links)
This dissertation presents an automated detection algorithm that identifies severe external defects on the surfaces of barked hardwood logs and stems. The defects detected are at least 0.5 inch in height and at least 3 inches in diameter: severe, medium to large in size, and showing external surface rises. Hundreds of real log defect samples were measured, photographed, and categorized to summarize the main defect features and to build a defect knowledge base. Three-dimensional laser-scanned range data capture the external log shape and portray the bark pattern, defective knobs, and depressions. The log data are extremely noisy, contain missing values, and include severe outliers induced by loose bark that dangles from the log trunk. Because the circle model is nonlinear and presents both additive and non-additive errors, a new robust generalized M-estimator has been developed that differs from those proposed in the statistical literature for linear regression. Circle fitting is performed by standardizing the residuals via scale estimates calculated by means of projection statistics and incorporated in the Huber objective function to bound the influence of the outliers on the estimates. The projection statistics are based on 2-D radial-vector coordinates instead of the row vectors of the Jacobian matrix, as proposed in the statistical literature dealing with linear regression. This approach proves effective in that it makes the GM-estimator influence-bounded and thereby robust against outliers. Severe defects are identified through the analysis of 3-D log data using decision rules obtained from analyzing the knowledge base. Contour curves are generated from radial distances, which are determined by robust 2-D circle fitting to the log-data cross sections. The algorithm detected 63 of a total of 68 severe defects, while 10 non-defective regions were falsely identified as defects. In terms of area, the algorithm located 97.6% of the defect area and falsely identified 1.5% of the total clear area as defective. / Ph. D.
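The robust-fitting idea, standardizing residuals with a robust scale and bounding large residuals through the Huber function, can be sketched as an iteratively reweighted circle fit. This toy version uses a simple MAD scale in place of the projection statistics the dissertation develops, so it is an illustration of the general GM-estimation principle, not the dissertation's estimator:

```python
import math

def fit_circle_huber(points, k=1.345, iters=50):
    """Iteratively reweighted circle fit: radial residuals are standardized by
    a robust (MAD) scale and large ones are downweighted via Huber weights,
    bounding the influence of outliers such as loose-bark spikes."""
    # initialize from the centroid and mean radius
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    r = sum(math.hypot(p[0] - cx, p[1] - cy) for p in points) / len(points)
    for _ in range(iters):
        resid = [math.hypot(x - cx, y - cy) - r for x, y in points]
        mad = sorted(abs(e) for e in resid)[len(resid) // 2]
        s = 1.4826 * mad or 1e-12  # robust scale estimate
        # Huber weights: 1 inside the threshold, decaying outside it
        w = [1.0 if abs(e) <= k * s else k * s / abs(e) for e in resid]
        W = sum(w)
        cx = sum(wi * x for wi, (x, y) in zip(w, points)) / W
        cy = sum(wi * y for wi, (x, y) in zip(w, points)) / W
        r = sum(wi * math.hypot(x - cx, y - cy)
                for wi, (x, y) in zip(w, points)) / W
    return cx, cy, r
```

With points spread around a full cross section plus a few bark outliers, the outliers end up with small weights and the fitted radius tracks the true log surface.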
176

The LibX Edition Builder

Gaat, Tilottama 07 January 2009 (has links)
LibX is a browser plugin that allows users to access library resources directly from their browser. Many libraries that wished to adopt LibX needed to customize a version of LibX for their own institution, but most librarians did not possess the knowledge of XML, script execution, and the underlying LibX implementation required to create customized, functional LibX versions. Therefore, we have developed a web-based tool, the LibX Edition Builder, that empowers librarians to create their own customized LibX versions, called editions, effortlessly. The Edition Builder provides rich interactivity to its users by exploiting the ZK AJAX framework, whose components we adapted. It automatically detects relevant library resources based on several heuristics we developed, which reduces the time and effort required to configure these resources. We used sound software engineering techniques, such as agile development principles, code generation, and the model-view-controller design paradigm, to maximize the maintainability of the Edition Builder, which enables us to easily incorporate changing functional requirements. The LibX Edition Builder is currently used by over 800 registered users who have created over 400 editions. A custom log-based usability evaluation examining the interactions of our users over a 5-month period has shown that the Edition Builder can dramatically reduce the time needed to customize LibX editions and is being increasingly adopted by the library community. / Master of Science
177

Extra-light log trailer design

Wylezinski, Andrzej T. 02 February 2007 (has links)
A mechanical design of an ultra-light log trailer was performed. The design process necessitated the experimental measurement of the dynamic loads exerted on a log trailer and a comprehensive stress analysis of the structure. After an exhaustive survey of existing designs, two light-weight prototype log trailers were selected for study. Finite element models (FEM) were developed for each trailer using three-dimensional tapered unsymmetrical beam elements. Static stress analysis was performed to identify critical spots in the structures and to estimate the stresses at these locations; these spots were selected as strain gage placements for the experimental dynamic stress analysis. A special data acquisition system based on an STD bus computer was designed, assembled, programmed, and tested for the experimental force and strain measurements. Experimental stress analysis of each selected trailer was performed: dynamic loads and the resultant strains at critical locations were measured both while simulating extreme situations and during typical work cycles. The recorded service stress-time histories were then used to identify peak values of the maximum dynamic loading and the structural response. Stress distributions throughout the structures were obtained using the FEM models, and the recorded service load spectra were used to assess the fatigue life of each tested design. A survey of log trailers for fatigue cracks was conducted after the dynamic stress analyses indicated that a log trailer is most likely to fail due to repeated loading. The causes of fatigue cracking were then investigated through elastic shell-element FEM modeling. Finally, an efficient and economically feasible ultra-light design was developed based on the formulated recommendations. The most important features of the proposed design are: 1. reduced tare weight of 7,770 pounds; 2. sound structural integrity; 3. material of high strength and toughness used throughout; 4. improved fatigue resistance (e.g., welds replaced by elastic friction joints); 5. movable bunks; 6. replaceable bolsters and standards; 7. a constant tensioning device for load binders. / Ph. D.
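For the fatigue-life step, one standard way to turn a recorded service load spectrum into a life estimate is Palmgren-Miner linear damage accumulation; the abstract does not state which rule the thesis used, and the cycle counts below are purely illustrative:

```python
def miner_damage(spectrum):
    """Palmgren-Miner cumulative damage: sum of cycle ratios n_i / N_i,
    where n_i is the applied cycle count at a stress range and N_i the
    cycles-to-failure at that range from the material's S-N curve."""
    return sum(n / N for n, N in spectrum)

# (applied cycles, cycles-to-failure) per stress range -- illustrative numbers
spectrum = [(10_000, 2_000_000), (50_000, 10_000_000), (500, 100_000)]
D = miner_damage(spectrum)
print(D)      # damage fraction per recorded service block; failure predicted near D = 1
print(1 / D)  # number of such service blocks until predicted failure
```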
178

Propeller-induced jet impact on vegetated flow field: Complex coupled effect towards velocity profile

Pu, Jaan H. 12 October 2024 (has links)
Yes / The failure of swirling ship propellers in marine environments can lead to huge repair costs. One of the main causes of such failure is propellers tangling with vegetation, especially in shallow flow environments such as ports, harbours, or shipyards. To understand this issue, this study proposes an analytical approach that provides efficient predictions and a flow guideline for the co-existence of vegetation and propeller swirling effects. More specifically, we investigate a full-depth theoretical velocity profile to represent propeller-induced flow under submerged vegetation conditions. The paper first investigates a modified logarithmic-law approach to determine its suitability for representing the regional vegetated flow zone before implementing it in a three-layer analytical model. Using benchmark measurements from the literature, the modified log law was found to improve the near-bed velocity calculation by nearly 70% compared with an existing model. A propeller jet impact computation coupled into the vegetation analytical model was then investigated at different locations within the vegetated flow, i.e., the free-flow, water-vegetation interface, and vegetation-hindered zones, to study their complex velocity distribution patterns. The results show adequate validation against vegetated flow and measured propeller jet data from the literature, demonstrating the potential of the proposed analytical approach to represent a real-world propeller jet event submerged in flowing water with vegetation. The proposed method allows low-cost computation, compared with conventional numerical models, in representing the complex interaction of the propeller jet and vegetated flow.
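For reference, the classical logarithmic law that the paper's modified version builds on predicts velocity from the shear velocity and a roughness height. The vegetation modification itself is not reproduced here, and the parameter values below are illustrative:

```python
import math

def log_law(z, u_star, z0, kappa=0.41):
    """Classical logarithmic velocity profile: u(z) = (u*/kappa) * ln(z/z0),
    with shear velocity u*, roughness height z0, and von Karman constant kappa."""
    return (u_star / kappa) * math.log(z / z0)

# illustrative parameters: shear velocity 0.05 m/s, roughness height 1 mm
for z in (0.01, 0.05, 0.1, 0.2):
    print(f"z = {z:.2f} m: u = {log_law(z, 0.05, 0.001):.3f} m/s")
```

A vegetated-flow model replaces or blends this profile in the vegetation-hindered zone, which is the role of the paper's three-layer construction.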
179

Assessing the value of a histopathological cancer registry: completeness estimation by capture-recapture using log-linear modelling and Bayesian ecological Mtbh models

Bailly, Laurent 08 December 2011 (has links)
Introduction: Cancer population studies require reliable and complete baseline data, which should in theory be obtainable from a histopathology collection. Method: Since 2005, all histopathology laboratories in the Alpes-Maritimes have submitted the ADICAP codes of malignant invasive tumours together with patient identifiers. Completeness for breast and colorectal cancers in patients aged 50-75 was evaluated by capture-recapture, using log-linear modelling and Bayesian methods, based on three data sources recording screened, diagnosed, and treated cancers (the latter seen in multidisciplinary team meetings, Réunions de Concertation Pluridisciplinaire) and the cases common or not between sources. Results: A quality-assurance programme confirmed the reliability of the collected data. The estimated completeness was higher than 90% for breast and colorectal cancers in the 50-75 age group. The rates observed in the Alpes-Maritimes proved consistent with the rates estimated for France. The rates of pre-neoplastic cervical lesions (CIN) for the entire female population of the Alpes-Maritimes in 2006 were established, providing a baseline prior to HPV vaccination. Conclusion: A verified and validated histopathology data collection is useful for population-based cancer studies.
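The simplest capture-recapture setting illustrates the completeness logic: with two sources listing n1 and n2 cases, of which m are common to both, Chapman's estimator gives the total case count, and completeness is the registry count divided by that estimate. The thesis uses three sources with log-linear and Bayesian Mtbh models; the counts below are illustrative only:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased two-source capture-recapture estimator
    of the total number of cases."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

def completeness(registry_count, n1, n2, m):
    """Registry completeness: observed cases over the estimated total."""
    return registry_count / chapman_estimate(n1, n2, m)

# illustrative counts: 380 cases in source A, 350 in the registry, 345 common
N_hat = chapman_estimate(380, 350, 345)
print(N_hat)
print(completeness(350, 380, 350, 345))
```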
180

Penalized regression models for compositional data

Shimizu, Taciana Kisaki Oliveira 10 December 2018 (has links)
Compositional data consist of vectors known as compositions, whose components are positive, lie in the interval (0,1), and represent proportions or fractions of a whole, so that the components sum to one. Compositional data arise in many areas, such as geology, ecology, economics, and medicine. There is therefore great interest in new modeling approaches for compositional data, especially when covariates influence this type of data. In this context, the main objective of this thesis is to propose a new regression-modeling approach for compositional data. The central idea is to develop a method based on penalized regression, in particular the Lasso (least absolute shrinkage and selection operator), the elastic net, and the Spike-and-Slab Lasso (SSL), for estimating the model parameters. In particular, we develop this modeling for compositional data when the number of explanatory variables exceeds the number of observations, in the presence of large databases, and when there are constraints on the response variable and the covariates.
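The Lasso machinery referred to above reduces, for each coordinate, to a soft-thresholding step. A self-contained sketch of cyclic coordinate descent follows; it ignores the compositional sum-to-one constraint and the Spike-and-Slab extension the thesis develops, and assumes standardized columns and no intercept:

```python
def soft_threshold(x, lam):
    """Proximal operator of the L1 penalty: shrink toward zero, clip to zero."""
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

def lasso_coordinate_descent(X, y, lam, iters=100):
    """Lasso via cyclic coordinate descent on (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            # partial residual excluding coordinate j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / z
    return beta
```

On orthogonal columns the weak coefficient is driven exactly to zero, which is the variable-selection behaviour that motivates penalized regression when explanatory variables outnumber observations.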
