About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
181

Application of a Voice Coil Actuator for Punching Flexible Printed Circuit Boards

Chen, Po-tzu 30 August 2007 (has links)
Machinery used for punching flexible printed circuit boards (FPCBs) has traditionally relied on a rotary motor as the power source. Converting rotary motion to linear motion requires a chain of mechanical transmission components, and these components introduce unavoidable problems: bulky machinery, and backlash and friction generated during operation, all of which degrade the system's dynamic performance and precision. A voice coil actuator offers direct-drive linear output, fast response, and high thrust force, so this research applies a voice coil actuator to the punching of flexible printed circuit boards. In current industrial practice, the S-curve velocity profile is widely used for point-to-point intermittent motion because its jerk-limited characteristic reduces vibration and improves precision. This work therefore integrates S-curve velocity planning with the voice coil actuator, based on the characteristics of the punching task, and analyzes the dynamic performance of the whole system in a vertical linear-output application. The dependence of punching quality on the motion characteristics of the punching mechanism is then generalized from experimental results. The findings of this research can serve as a reference for designers applying similar linear actuators in vertical linear-output applications.
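
The jerk-limited S-curve motion referenced above can be made concrete with a short numerical sketch. The profile below integrates a piecewise-constant jerk schedule; the limit values (j_max, a_max, v_max) are assumed for illustration and are not taken from the thesis:

```python
import numpy as np

def s_curve_profile(j_max, a_max, v_max, dt=1e-4):
    """Jerk-limited S-curve velocity profile (acceleration half only).

    Integrates a piecewise-constant jerk schedule (+j_max, 0, -j_max)
    to reach v_max. The deceleration half and any constant-velocity
    cruise are omitted for brevity. Assumes a_max and v_max are both
    actually reached, i.e. v_max >= a_max**2 / j_max.
    """
    t_j = a_max / j_max                      # time to ramp accel up/down
    t_a = v_max / a_max - t_j                # constant-acceleration duration
    segments = [(t_j, +j_max), (t_a, 0.0), (t_j, -j_max)]
    t, v, a = [0.0], [0.0], [0.0]
    for dur, j in segments:
        for _ in range(int(round(dur / dt))):
            a.append(a[-1] + j * dt)         # integrate jerk -> acceleration
            v.append(v[-1] + a[-1] * dt)     # integrate acceleration -> velocity
            t.append(t[-1] + dt)
    return np.array(t), np.array(v), np.array(a)

# Assumed limits, for illustration only
t, v, a = s_curve_profile(j_max=5000.0, a_max=50.0, v_max=1.0)
print(f"peak velocity {v.max():.3f} m/s, peak accel {a.max():.1f} m/s^2")
```

Because the jerk is bounded, the acceleration trace is trapezoidal rather than stepped, which is precisely what limits vibration in the punching mechanism.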
182

The inverted CD4/CD8 ratio and associated parameters in 66-year-old individuals: the Swedish HEXA immune study

Strindhall, Jan, Skog, Mårten, Ernerudh, Jan, Bengner, M., Lofgren, S., Matussek, A., Nilsson, B. O., Wikby, A. January 2013 (has links)
The Swedish OCTO and NONA immune longitudinal studies identified and confirmed an immune risk profile (IRP) predictive of increased 2-year mortality in very old individuals, 86–94 years of age. The IRP was associated with persistent cytomegalovirus infection and characterized by an inverted CD4/CD8 ratio related to an expansion of terminally differentiated effector memory T cells (TEMRA phenotype). In the present HEXA immune longitudinal study, we examined a younger group of elderly individuals (n = 424, 66 years of age) in a population-based sample in the community of Jönköping, Sweden, to examine the relevance of findings previously demonstrated in the very old. The immunological monitoring included T cell subsets and CMV-IgG and CMV-IgM serology. The results showed a prevalence of 15% of individuals with an inverted CD4/CD8 ratio, which was associated with seropositivity to cytomegalovirus and increased levels of TEMRA cells. The proportion of individuals with an inverted CD4/CD8 ratio was significantly higher in men, whereas the numbers of CD3+CD4+ cells were significantly higher in women. In conclusion, these findings are very similar to those previously found in the Swedish longitudinal studies, suggesting that an immune profile previously identified in the very old also exists in the present sample of sexagenarians. It will therefore be important to examine clinical parameters, including morbidity and mortality, to assess whether this immune profile is also a risk profile associated with higher mortality in this sample. / Funding Agencies: Medical Research Council of South-East Sweden
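
The inversion criterion at the heart of the IRP is simple arithmetic over two T cell counts. A minimal sketch follows; the cell counts are hypothetical, and the threshold of 1.0 reflects the standard definition of an inverted ratio rather than study-specific data:

```python
def classify_immune_profile(cd4_count, cd8_count):
    """Flag an inverted CD4/CD8 ratio (ratio < 1), the core marker of the
    immune risk profile (IRP) described in the OCTO/NONA/HEXA studies.

    Counts are CD3+CD4+ and CD3+CD8+ T cells per microlitre of blood
    (hypothetical units for this sketch).
    """
    ratio = cd4_count / cd8_count
    return ratio, ratio < 1.0

# Hypothetical counts, for illustration only (cells/uL)
ratio, inverted = classify_immune_profile(cd4_count=450, cd8_count=600)
print(f"CD4/CD8 = {ratio:.2f} -> inverted: {inverted}")
```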
183

Database server workload characterization in an e-commerce environment

Liu, Fujian 19 December 2005
A typical E-commerce system that is deployed on the Internet has multiple layers that include Web users, Web servers, application servers, and a database server. As system use and user request frequency increase, Web and application servers can be scaled up by replication, and a load-balancing proxy can be used to route user requests to individual machines that perform the same functionality.

To address the increasing workload while avoiding replication of the database server, various dynamic caching policies have been proposed to reduce the database workload in E-commerce systems. However, the nature of the changes seen by the database server as a result of dynamic caching remains unknown. A good understanding of this change is fundamental for tuning a database server to get better performance.

In this study, the TPC-W (a transactional Web E-commerce benchmark) workloads on a database server are characterized under two different dynamic caching mechanisms, which are generalized and implemented as a query-result cache and a table cache. The characterization focuses on response time, CPU computation, buffer pool references, disk I/O references, and workload classification.

This thesis combines a variety of analysis techniques: simulation, real-time measurement, and data mining. The experimental results reveal some interesting effects that dynamic caching has on the database server's workload characteristics. The main observations are: (a) a dynamic cache can considerably reduce the CPU usage of the database server and the number of database page references when it is heavily loaded; (b) a dynamic cache can also reduce database reference locality, but to a smaller degree than that reported for file servers. The data classification results show that with a dynamic cache, the database server sees TPC-W profiles that look more like on-line transaction processing workloads.
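
To make the generalized query-result cache concrete, here is a minimal sketch: read-only query results are memoized and invalidated when their base table is written. The class, its single-table invalidation mapping, and the example queries are illustrative assumptions, not the mechanism as implemented in the thesis:

```python
import hashlib

class QueryResultCache:
    """Minimal sketch of a query-result cache for an E-commerce tier.

    Results of read-only SQL queries are memoized by query text; any
    write to a table invalidates every cached result that read from it.
    Associating each query with one table is a simplification for this
    sketch.
    """
    def __init__(self):
        self.results = {}        # query hash -> cached rows
        self.by_table = {}       # table name -> set of query hashes

    def _key(self, sql):
        return hashlib.sha1(sql.encode()).hexdigest()

    def get(self, sql):
        return self.results.get(self._key(sql))

    def put(self, sql, table, rows):
        key = self._key(sql)
        self.results[key] = rows
        self.by_table.setdefault(table, set()).add(key)

    def invalidate(self, table):
        """Call on any INSERT/UPDATE/DELETE against `table`."""
        for key in self.by_table.pop(table, set()):
            self.results.pop(key, None)

cache = QueryResultCache()
cache.put("SELECT * FROM item WHERE id = 7", "item", [("7", "book")])
print(cache.get("SELECT * FROM item WHERE id = 7"))  # cache hit
cache.invalidate("item")                              # a write to `item`
print(cache.get("SELECT * FROM item WHERE id = 7"))  # miss after the write
```

The effect the thesis measures follows directly from this structure: hits are served without touching the database, which reduces CPU usage and page references on the database server.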
184

Environmental impact assessment of the operation of conventional helicopters at mission level

Linares Bejarano, Carlos Andres 10 1900 (has links)
Helicopters play a unique role in modern aviation, providing a varied range of benefits to society and satisfying the need for fast mobility, particularly in metropolitan areas. However, environmental concerns associated with the operation of rotorcraft have increased due to the envisaged growth of air traffic. Even though helicopter operations represent a small percentage of the total greenhouse gas emissions resulting from all human activities, helicopters are categorised as a main source of local air pollution around airports and urban areas. New rotorcraft designs, innovative aero engines, and all-electric systems are being developed in order to diminish the impact that aviation has on the global and local environment. However, advanced rotorcraft designs and breakthrough technologies might take decades to enter service, and a large number of polluting rotorcraft remain in use and must be progressively replaced. Therefore, in the near term, improvements to minimise air quality degradation (around airports and metropolitan areas) may be possible from better use of existing rotorcraft by focusing on trajectory and mission profile management. In this research project, a parametric study was carried out to assess the environmental impact, in terms of fuel burn and emissions, of operating light single-engine helicopters under different flight conditions. The results of this assessment were used as a basis for single- and multi-objective optimisation for minimum fuel consumption and air pollutant emissions, with oxides of nitrogen, carbon monoxide, and unburnt hydrocarbons considered as trade-off parameters. To achieve this, a multidisciplinary assessment framework was developed to generate outputs for estimating the fuel burn and emissions during the operation of conventional helicopters. Simulink® Design Optimization™ software was incorporated into the framework to enhance the benefits of this tool. A baseline mission profile was proposed to validate the potential of mission profile management, and different case studies were carried out changing flight parameters at every segment of the baseline mission. The single- and multi-objective optimisations showed that favourable reductions in fuel burn may be attainable at the expense of a slight increase in NOx emissions over the entire mission. If reductions of more than 3% in block fuel burn are achievable in the short term for a single helicopter, the savings for air transport companies can be expected to be significant when mission profile management is applied across a whole fleet.
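
The single/multi-objective trade-off described above can be illustrated with a scalarized (weighted-sum) optimization over mission segments. Everything in the sketch (the segment list, the toy fuel and NOx models, and the weights) is an invented placeholder, not the thesis's rotorcraft performance framework:

```python
import numpy as np
from scipy.optimize import minimize

# Choose a cruise speed per mission segment to balance fuel burn against
# NOx emissions. The quadratic fuel model and linear NOx model below are
# invented stand-ins with plausible shapes, for illustration only.

SEG_DIST_KM = np.array([15.0, 40.0, 15.0])   # hypothetical mission segments

def fuel_burn(v):          # kg; bowl-shaped model with best range ~45 m/s
    return np.sum(SEG_DIST_KM * (0.5 + 0.0004 * (v - 45.0) ** 2))

def nox_emissions(v):      # g; toy model in which NOx grows with speed
    return np.sum(SEG_DIST_KM * 0.02 * v)

def weighted_objective(v, w=0.7):
    """Scalarized multi-objective: weight w on fuel, (1 - w) on NOx."""
    return w * fuel_burn(v) + (1.0 - w) * 0.1 * nox_emissions(v)

v0 = np.full(len(SEG_DIST_KM), 50.0)         # initial guess, m/s
res = minimize(weighted_objective, v0, bounds=[(20.0, 70.0)] * 3)
print("segment speeds (m/s):", np.round(res.x, 1))
print(f"fuel {fuel_burn(res.x):.1f} kg, NOx {nox_emissions(res.x):.0f} g")
```

Sweeping the weight w between 0 and 1 traces out a set of compromise solutions, which is the kind of trade-off curve a multi-objective study of fuel versus NOx produces.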
185

Determining housing need in rural Manitoba

Sumner, Kevan 14 October 2005 (has links)
With the aim of developing a housing needs assessment tool for rural Manitoba, this research investigates definitions of 'rural', the concept of housing need, and approaches to its assessment. The main question is: how can housing need be identified and quantified at the community or regional level? The response comes in the development of a community-based rural housing needs assessment guidebook (documented in Volume 2). The literature reviewed (Part 2) targets definitions of rural and an examination of methods of assessing housing need, the nature of housing need, trends in housing policy, housing services programming, and the determination of housing need at a local level; throughout, the focus is on how the relevant literature informs the design and development of the guidebook. The key informant interview process and related ethical considerations are presented in Part 3, along with the precedents review of prior housing needs assessments and guides. The results from these interviews are presented in Part 4, again with a focus on identifying how each informed development of the guidebook. Part 4 concludes with a description of the design of the housing needs assessment guidebook, addressing the structure and scope of the assessment process, key considerations and components included in the guide, the two-phase process that constitutes the main information-generating component of the tool, and the discrepancy model used to guide the user through the assessment process. The design of a sample survey (an optional component of the guide) is also briefly discussed, as are perceived limitations of the guide, including the need for a complementary strategic planning process that picks up where the assessment leaves off, which might merit a second guide detailing such further steps. Certain realities of housing needs assessment in rural Manitoba, and complicating aspects of the discrepancy model, are also discussed. Part 5 details the early stages of applying components of the housing needs assessment guide in the Minnedosa area of Manitoba; strengths and weaknesses of the components are identified, and associated refinements and changes are noted. It is concluded (Part 6) that the rural housing needs assessment guide is a viable means of identifying housing need in rural Manitoba. Possible benefits of its application are noted, along with cautions regarding further desirable or necessary research. / February 2006
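
At its core, a discrepancy model of the kind the guidebook uses expresses need as the gap between required and available housing. The sketch below uses invented categories and numbers purely for illustration; the guide's actual two-phase assessment process is considerably richer:

```python
def housing_discrepancy(required_units, available_units):
    """Discrepancy model sketch: need is the gap between required and
    available units in each housing category. Categories and counts
    here are invented for illustration, not the guidebook's model."""
    return {cat: required_units[cat] - available_units.get(cat, 0)
            for cat in required_units}

required = {"affordable rental": 120, "seniors": 45, "accessible": 20}
available = {"affordable rental": 80, "seniors": 45, "accessible": 5}
for category, gap in housing_discrepancy(required, available).items():
    status = "shortfall" if gap > 0 else "met"
    print(f"{category}: {gap:+d} ({status})")
```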
186

Low-Level Haskell Code: Measurements and Optimization Techniques

Peixotto, David 06 September 2012 (has links)
Haskell is a lazy functional language with a strong static type system and excellent support for parallel programming. The language features of Haskell make it easier to write correct and maintainable programs, but execution speed often suffers from the high levels of abstraction. While much past research focuses on high-level optimizations that take advantage of the functional properties of Haskell, relatively little attention has been paid to the optimization opportunities in the low-level imperative code generated during translation to machine code. One problem with current low-level optimizations is that their effectiveness is limited by the obscured control flow caused by Haskell's high-level abstractions. My thesis is that trace-based optimization techniques can be used to improve the effectiveness of low-level optimizations for Haskell programs. I claim three unique contributions in this work. The first contribution is to expose some properties of low-level Haskell code by looking at the mix of operations performed by the selected benchmarks and comparing them to the low-level code produced by traditional programming languages. The low-level measurements reveal that the control flow is obscured by indirect jumps caused by the implementation of lazy evaluation, higher-order functions, and the separately managed stacks used by Haskell programs. My second contribution is a study of the effectiveness of a dynamic binary trace-based optimizer running on Haskell programs. My results show that while viable program traces frequently occur in Haskell programs, the overhead associated with maintaining the traces in a dynamic optimization system outweighs the benefit gained from running them. To reduce the runtime overheads, I explore a way to find traces in a separate profiling step. My final contribution is to build and evaluate a static trace-based optimizer for Haskell programs. The static optimizer uses profiling data to find traces in a Haskell program and then restructures the code around the traces to increase the scope available to the low-level optimizer. My results show that we can successfully build traces in Haskell programs, and the optimized code yields a speedup over existing low-level optimizers of up to 86%, with an average speedup of 5% across 32 benchmarks.
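
The idea of finding traces in a separate profiling step can be sketched as greedy trace formation over an edge profile: start at the hottest block and follow the most frequent successor until the path cools off. The profile format and threshold below are assumptions for illustration, not the representation used by the thesis's optimizer:

```python
from collections import Counter, defaultdict

def build_trace(edge_profile, hot_threshold=1000):
    """Greedy sketch of profile-driven trace formation.

    Starting from the hottest basic block, repeatedly follow the most
    frequent successor edge until the path cools off or revisits a
    block. The profile is assumed to be a counter of (src, dst) block
    transitions, an illustrative format only.
    """
    succs = defaultdict(Counter)
    block_heat = Counter()
    for (src, dst), count in edge_profile.items():
        succs[src][dst] += count
        block_heat[src] += count

    trace = []
    block = block_heat.most_common(1)[0][0]      # hottest entry block
    while block not in trace:                    # stop on a cycle
        trace.append(block)
        if not succs[block]:
            break
        nxt, count = succs[block].most_common(1)[0]
        if count < hot_threshold:                # path has gone cold
            break
        block = nxt
    return trace

profile = Counter({("A", "B"): 9000, ("B", "C"): 8500,
                   ("B", "D"): 400, ("C", "A"): 8000})
print(build_trace(profile))                      # ['A', 'B', 'C']
```

Laying the blocks of such a trace out contiguously is what turns the indirect jumps of lazy evaluation into straight-line code that a conventional low-level optimizer can work on.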
187

Hydrologic Validation of Real-Time Weather Radar VPR Correction Methods

Klyszejko, Erika Suzanne January 2006 (has links)
Weather radar has long been recognized as a potentially powerful tool for hydrological modelling: a single radar station can provide detailed precipitation information over entire watersheds. The operational use of radar in water resources applications, however, has been limited. Interpretation of raw radar data requires several rigorous analytical steps and a solid understanding of the technology, and in general, hydrologists' lack of meteorological background and the persistence of systematic errors within the data have led to a common mistrust of radar-estimated precipitation values. As part of the Enhanced Nowcasting of Extreme Weather project, researchers at McGill University's J.S. Marshall Radar Observatory in Montreal have been working to improve real-time quantitative precipitation estimates (QPEs). The aim is to create real-time radar precipitation products for the water resource community that are reliable and properly validated. The validation of QPEs is traditionally based on how well observed measurements agree with data from a precipitation gauge network. Comparisons between radar and precipitation gauge quantities, however, can be misleading. Data from a precipitation gauge network represent a series of single-point observations taken near the ground surface, whereas radar estimates the average rate of precipitation over a given area (e.g., a 1-km grid cell) based on the intensity of reflected microwaves at altitudes exceeding 1 km. Additionally, both measurement techniques are susceptible to a number of sources of error that further confound efforts to compare the two. One of the greatest challenges facing radar meteorologists is the variation in the vertical profile of reflectivity (VPR). A radar unit creates a volumetric scan of the atmosphere by emitting microwave beams at several elevation angles. As a beam travels away from the radar, its distance from the ground surface increases, so different precipitation types are sampled at heights that vary with range (e.g., snow above the 0 °C level and rain below it). The difficulty lies in estimating the intensity of precipitation at the Earth's surface based on measurements taken aloft. Scientists at McGill University have incorporated VPR correction techniques into the algorithms used to automatically convert raw radar data into quantitative hydrological products. This thesis evaluates three real-time radar precipitation products from McGill University's J.S. Marshall Radar Observatory in the context of hydrological modelling. The C0 radar product consists of radar precipitation estimates that are filtered for erroneous data, such as ground clutter and anomalous precipitation; the C2 and C3 radar products use different VPR correction techniques to improve upon the C0 product. The WATFLOOD hydrological model is used to assess the ability of each radar product to estimate precipitation over several watersheds within the McGill radar domain. It is proposed that using a watershed as a sample area can reduce the error associated with sampling differences between radar and precipitation gauges and allow for the evaluation of a precipitation product over space and time. The WATFLOOD model is run continuously over a four-year period, using each radar product as precipitation input. Streamflow hydrographs are generated for 39 gauging stations within the radar domain, which includes parts of eastern Ontario, south-western Quebec, and northern New York and Vermont, and compared to observed measurements.
Streamflows are also modelled using distributed precipitation gauge data from 44 meteorological stations concentrated around the Montreal region. Analysis of selected streamflow events reveals that despite the non-ideal placement of precipitation gauges throughout the study area, distributed precipitation gauge data are able to reproduce hydrological events with greater accuracy and consistency than any of the provided radar products. Precipitation estimates within the McGill radar domain are found to be useful only within the Doppler range (120 km), in areas where the radar beam is unobstructed by physiographic or man-made features. Among the radar products, the C2 VPR-corrected product performed best for the greatest number of flood events throughout the study area.
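
The range-height geometry that creates the VPR problem, and the general form of a correction, can be sketched as follows. The beam-height formula is the standard 4/3-effective-Earth-radius propagation model; the exponential profile is an invented stand-in for a climatological VPR, not McGill's correction:

```python
import numpy as np

E_RADIUS_KM = 6371.0
KE = 4.0 / 3.0          # effective Earth radius factor, standard refraction

def beam_height_km(range_km, elev_deg, radar_alt_km=0.0):
    """Height of the radar beam centre above ground at a given range,
    using the 4/3-effective-Earth-radius propagation model."""
    theta = np.radians(elev_deg)
    r, ae = range_km, KE * E_RADIUS_KM
    return (np.sqrt(r**2 + ae**2 + 2 * r * ae * np.sin(theta))
            - ae + radar_alt_km)

def vpr_corrected_rate(rate_aloft, height_km, vpr):
    """Project a rain rate measured aloft down to the surface using a
    mean vertical profile vpr(h): the ratio of the reflectivity-derived
    rate at height h to its surface value."""
    return rate_aloft / vpr(height_km)

toy_vpr = lambda h: np.exp(-0.2 * h)     # assumed decay with height
h = beam_height_km(range_km=100.0, elev_deg=0.5)
print(f"beam height at 100 km: {h:.2f} km")
print(f"surface rate estimate: {vpr_corrected_rate(2.0, h, toy_vpr):.2f} mm/h")
```

Even at the lowest elevation angle, the beam sits well above 1 km at 100 km range, which is why an uncorrected product like C0 increasingly mismeasures surface precipitation with distance.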
188

A New Third Compartment Significantly Improves Fit and Identifiability in a Model for Ace2p Distribution in Saccharomyces cerevisiae after Cytokinesis

Järvstråt, Linnea January 2011 (has links)
Asymmetric cell division is an important mechanism for the differentiation of cells during embryogenesis and cancer development. Saccharomyces cerevisiae divides asymmetrically and is therefore used as a model system for understanding the mechanisms behind asymmetric cell division. Ace2p is a transcription factor in yeast that localizes primarily to the daughter nucleus during cell division; its distribution is visualized using a fusion protein with yellow fluorescent protein (YFP) and confocal microscopy. Systems biology provides a new approach to investigating biological systems through the use of quantitative models, and the localization of Ace2p in yeast during cell division has previously been modelled using ordinary differential equations. Herein, such modelling is evaluated. A 2-compartment model for the localization of Ace2p in yeast post-cytokinesis, proposed in earlier work, was found to be insufficient when new data were included in the model evaluation. Ace2p localization in the dividing yeast cell pair before cytokinesis was investigated using a similar approach and was likewise found not to explain the data to a significant degree. A 3-compartment model is therefore proposed, and its improvement over the 2-compartment model is statistically significant. Simulations of the 3-compartment model predict a fast decrease in the amount of Ace2p in the cytosol close to the nucleus during the first seconds after each bleaching of the fluorescence; experimental investigation of the cytosol close to the nucleus could test whether these fast dynamics are present. The parameters in the model were estimated using the profile likelihood approach in combination with global optimization by simulated annealing, and confidence intervals were found for the parameters of the 3-compartment model of Ace2p localization post-cytokinesis. In conclusion, the profile likelihood approach has proven a good method of estimating parameters, and the new 3-compartment model allows for reliable parameter estimates in the post-cytokinesis situation. A new Matlab implementation of the profile likelihood method is appended.
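
The structure of such a compartment model can be written down as a small linear ODE system. In the sketch below, the compartment roles (daughter nucleus, nearby cytosol, remaining cytosol) and all rate constants are illustrative assumptions, not the thesis's fitted model:

```python
import numpy as np
from scipy.integrate import solve_ivp

def three_compartment(t, y, k):
    """Linear 3-compartment exchange model for a protein such as Ace2p:
    nucleus <-> nearby cytosol <-> remaining cytosol. First-order rate
    constants k are in 1/s; total protein amount is conserved."""
    nuc, near, far = y
    k_ne, k_en, k_nf, k_fn = k
    d_nuc = k_en * near - k_ne * nuc
    d_near = k_ne * nuc - k_en * near + k_fn * far - k_nf * near
    d_far = k_nf * near - k_fn * far
    return [d_nuc, d_near, d_far]

y0 = [0.0, 1.0, 0.0]                     # e.g. right after a bleach pulse
rates = (0.05, 0.5, 0.2, 0.1)            # assumed values, for illustration
sol = solve_ivp(three_compartment, (0.0, 30.0), y0, args=(rates,),
                t_eval=np.linspace(0.0, 30.0, 7))
print(np.round(sol.y, 3))                # rows: nucleus, near, far cytosol
```

Fitting such a system to fluorescence data is where the profile likelihood approach enters: each rate constant is swept while the others are re-optimized, giving the confidence intervals reported in the thesis.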
189

Aktiebolaget Marlot: Creating a graphic profile and a Wordpress site

Thorsell, Johan January 2012 (has links)
Marlot Company Limited is a company in the household services sector. It is a new company whose owners, who were also the clients for this project, felt that it needed a distinct platform through which to communicate with current and prospective customers. The assignment was to create such a platform, consisting of a graphic profile and a Wordpress site for the company. The task of the graphic profile was to position the company distinctly in the market and to visualize the concepts the company wants to be associated with. The purpose of the Wordpress site was to create a showcase for Marlot towards its customers that the company itself could manage after completion. These two products were created with the help of a pilot study that included an analysis in which competitors in the market were surveyed and reports on the household services sector were studied. The result is an appealing and modern graphic profile that stands out in the household services sector, and a Wordpress site through which the company can communicate with its market in a distinct and appealing manner.
