501

Applying an unfolding model to the stages and processes of change

Beever, Rob, 02 January 2008
The purpose of this study was to use the graded unfolding model (GUM) (Roberts, 1995; Roberts & Laughlin, 1996) to examine the interaction between the stages of change (SOC) and the processes of change (POC) for smoking cessation (SC). Although an abundance of research has examined the transtheoretical model (TTM) and SC, the POC remain one of the least investigated dimensions of the TTM. Only one study has applied an item response theory model, the GUM, to the examination of the SOC and POC (Noel, 1999). This study attempted to replicate and extend the findings of Noel (1999) and provides additional external validity evidence for the SOC and the POC for SC.

The TTM posits that people undergoing change will use different processes and strategies as they proceed through the SOC and that each POC appears to reach peak use at a different stage. Thus, the POC appear to follow an inverse-U-shaped pattern (Noel, 1999).

Responses to the SOC measure and the 40-item POC measure for SC were collected from young adults. Analysis of the data using the GGUM (Roberts, 2000) demonstrated the applicability of the GUM and provided additional external validity evidence for the POC for SC. More specifically, six POC were ordered as expected according to the results of longitudinal studies. Four POC were found to be out of order; however, this could be due to sample characteristics or to reduced item validity (because of changes in smoking laws, some items may no longer be valid). Helping Relationships and Stimulus Control appeared together out of order. This finding replicates Noel (1999), and further research is needed to examine the ordering of these POC. The GUM was also found to fit the POC data better than other item response theory models.
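For context, the generalized graded unfolding model (GGUM; Roberts, 2000) used in the analysis assigns response probabilities that are single-peaked around the item location, which is what produces the inverse-U-shaped usage pattern described above. A standard statement of the item response function, sketched here in the usual GGUM notation (details may differ from the thesis), is

    P(Z_i = z \mid \theta_j) =
      \frac{\exp\{\alpha_i[z(\theta_j - \delta_i) - \sum_{k=0}^{z}\tau_{ik}]\}
            + \exp\{\alpha_i[(M - z)(\theta_j - \delta_i) - \sum_{k=0}^{z}\tau_{ik}]\}}
           {\sum_{w=0}^{M}\big(\exp\{\alpha_i[w(\theta_j - \delta_i) - \sum_{k=0}^{w}\tau_{ik}]\}
            + \exp\{\alpha_i[(M - w)(\theta_j - \delta_i) - \sum_{k=0}^{w}\tau_{ik}]\}\big)}

where θ_j is the respondent's location on the latent continuum, δ_i the item location, α_i the item discrimination, τ_ik the threshold parameters (with τ_i0 = 0), z = 0, ..., C the observed response category, and M = 2C + 1. Because the probability depends on θ_j only through (θ_j - δ_i), the expected item score peaks when the respondent's stage location is closest to the item's location, which is exactly the single-peaked (inverse-U) pattern of process use the thesis examines.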
502

Modelling retention time in a clearwell

Yu, Xiaoli, 23 September 2009
Clearwells are large water reservoirs often used at the end of the water treatment process as chlorine contact chambers. The contact time required for microbe destruction is provided by the residence time within the clearwell. The residence time distribution can be determined from tracer tests and is one of the key factors in assessing the hydraulic behaviour and efficiency of these reservoirs. This work evaluates whether the two-dimensional, depth-averaged, finite element model River2DMix can adequately simulate the flow pattern and residence time distribution in clearwells. One question in carrying out this modelling is whether or not the structural columns in the reservoir need to be included, as inclusion of the columns increases the computational effort required.

In this project, the residence time distribution predicted by River2DMix was compared to the results of tracer tests in a scale model of the Calgary Glenmore water treatment plant northeast clearwell, for which prototype tracer test results were available. The clearwell has a serpentine baffle system and 122 square structural columns distributed throughout the flow. A comparison of the flow patterns in the hydraulic and computational models was also made. The hydraulic model tests were carried out with and without columns in the clearwell.

The 1:19 scale hydraulic model was developed on the basis of Froude number similarity and the maintenance of minimum Reynolds numbers in the flow through the serpentine system and the baffle wall at the entrance to the clearwell. Fluorescent tracer slug injection tests were used to measure the residence time distribution in the clearwell. Measurements of tracer concentration were taken at the clearwell outlet using a continuous flow-through fluorometer system. Flow visualization with dye was also carried out to identify and assess the dead zones in the flow. It was found necessary to ensure that the flow in the scale model was fully developed before starting the tracer tests, and determining the required flow development time to ensure steady-state results from the tracer tests became an additional objective of the work. Tests were carried out at scale-model flows of 0.85, 2.06, and 2.87 L/s to reproduce the 115, 280, and 390 ML/day flows seen in the prototype tracer tests.

Scale-model results for the residence time distribution matched the prototype tracer test data well. However, approximately 10.5 hours of flow development was required at the lowest flow rate tested (0.85 L/s) before steady-state conditions were reached and baffle factor results matched prototype values. At the intermediate flow, baffle factor results from the scale model matched the prototype well after only 1 h of flow development time, with only the Morrill dispersion index improving towards prototype values as flow development time increased (at 5 h). Similar results were seen at the highest flow tested. For fully developed flow, there was little change in the baffle factor and dispersion index results in the scale model as the flow rate varied.

With the addition of columns to the scale model, there was no significant change in the baffle factor compared to the case without columns, but there was up to a 13.9% increase in the dispersion index relative to the tests without columns for fully developed flow. Further, the residence time distribution results from the scale-model tests without columns matched the entire residence time distribution found in the prototype tests. For the model with columns, however, the residence time distribution matched the prototype curve well at early times but departed significantly from it later in the tests. It appears the major effect of adding columns within a model clearwell is to increase the dispersion index and the proportion of the clearwell that operates as a mixed reactor.

The results also showed good agreement between the physical model tests and the River2DMix simulations of the scale-model tests for both the flow pattern and the residence time distributions. This indicates that a two-dimensional, depth-averaged computer model such as River2DMix can provide representative simulation results where the inlet flow is expected to be mixed quickly throughout the depth of flow in the clearwell.
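As a rough illustration of the scaling and the tracer-curve metrics referred to above, the sketch below (assumed variable names and example usage, not code from the thesis) shows how Froude-number similarity converts the prototype flows of 115, 280, and 390 ML/day into the quoted scale-model flows, and how the baffle factor and Morrill dispersion index are typically obtained from a measured tracer curve:

    import numpy as np

    LAMBDA = 19.0  # geometric scale ratio of the 1:19 hydraulic model

    def model_flow_L_per_s(prototype_ML_per_day):
        """Froude-number similarity: discharge scales with the length ratio to the 5/2 power."""
        q_prototype_L_per_s = prototype_ML_per_day * 1e6 / 86400.0
        return q_prototype_L_per_s / LAMBDA ** 2.5

    # The prototype flows quoted above reproduce the stated scale-model flows:
    for q in (115, 280, 390):
        print(f"{q} ML/day -> {model_flow_L_per_s(q):.2f} L/s")  # ~0.85, 2.06, 2.87 L/s

    def rtd_metrics(t, c, t_theoretical):
        """Baffle factor (t10/T) and Morrill dispersion index (t90/t10) from a tracer curve c(t)."""
        f = np.cumsum(c) / np.sum(c)   # cumulative fraction of tracer recovered at the outlet
        t10 = np.interp(0.10, f, t)    # time for 10% of the tracer to pass
        t90 = np.interp(0.90, f, t)    # time for 90% of the tracer to pass
        return {"t10": t10, "t90": t90,
                "baffle_factor": t10 / t_theoretical,
                "morrill_index": t90 / t10}

Here t_theoretical is the theoretical detention time (reservoir volume divided by flow rate); a baffle factor closer to 1 indicates flow closer to plug flow, while a larger Morrill index indicates more dispersion and mixed-reactor behaviour, consistent with the effect of the columns described above.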
503

Applicability of multiplicative and additive hazards regression models in survival analysis

Sarker, Sabuj, 12 April 2011
Background: Survival analysis is sometimes called time-to-event analysis. The Cox model is widely used in survival analysis, where the covariates act multiplicatively on an unknown baseline hazard. However, the Cox model requires the proportionality assumption, which limits its applications. The additive hazards model has been used as an alternative to the Cox model, where the covariates act additively on an unknown baseline hazard.

Objectives and methods: In this thesis, the performance of the Cox multiplicative hazards model and the additive hazards model is demonstrated, and both models are applied to the transfer, lifting and repositioning (TLR) injury prevention study. The TLR injury prevention study was a retrospective, pre-post intervention study that utilized a non-randomized control group. There were 1,467 healthcare workers from six hospitals in Saskatchewan, Canada, who were injured from January 1, 1999 to December 1, 2006. De-identified data sets were received from the Saskatoon Health Region and Regina Qu'Appelle Health Region. Time to repeated TLR injury was considered the outcome variable. The models' goodness of fit was also assessed.

Results: Of the 1,467 individuals, 263 had repeated injuries during the study period: 149 (56.7%) in the control group and 114 (43.3%) in the intervention group. Nurses and nursing aides accounted for the highest proportion of repeated TLR injuries (84.8%) among occupations. Back, neck and shoulders were the most common body parts injured (74.9%). These covariates were significant in both the Cox multiplicative and the additive hazards models. The intervention group had 27% fewer repeated injuries than the control group in the multiplicative hazards model (HR = 0.63; 95% CI = 0.48-0.82; p-value = 0.0002). In the additive model, the hazard difference between the intervention and control groups was 0.002.

Conclusion: Both the multiplicative and additive hazards models showed similar results, indicating that the TLR injury prevention intervention was effective in reducing repeated injuries. The additive hazards model is not widely used, but its covariate coefficients are easy to interpret in an additive manner. The additive hazards model should be considered when the proportionality assumption of the Cox model is doubtful.
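For reference, the two specifications being compared take the following standard forms (a sketch in conventional notation, not copied from the thesis):

    \lambda(t \mid x) = \lambda_0(t)\,\exp(\beta^{\top} x)        % Cox: covariates act multiplicatively on the baseline hazard
    \lambda(t \mid x) = \lambda_0(t) + \beta^{\top} x             % additive hazards: covariates act additively on the baseline hazard

Under the Cox model an exponentiated coefficient is a hazard ratio (here HR = 0.63 for the intervention), whereas under the additive model a coefficient is a hazard difference (here 0.002 between the intervention and control groups), which is why the two sets of estimates are reported on different scales even though they support the same conclusion.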
504

Equity valuation: A quantitative study of valuation using the Dividend Discount Model and the Residual Income Model in relation to the P/B ratio as a reference value

Sotkasiira, Monica; Enberg, Fredrik, January 2012
No description available.
505

Wide-Band and Scalable Equivalent Circuit Model for Multiple Quantum Well Laser Diodes

Kim, Jae Hong, 20 May 2005
This dissertation presents a wide-band lumped-element equivalent circuit model and a building-block-based scalable circuit model for multiple quantum well laser diodes. The wide-band multiple-resonance model represents two important laser diode characteristics, input reflection and electrical-to-optical transmission, within a single model, and shows good agreement with measurements of selected commercial discrete laser diodes. The validity of the proposed building-block-based modeling approach is demonstrated using a numerically derived scalable rate equation. Since success in circuit design depends largely on the availability of accurate device models, the proposed models offer improved accuracy, simple implementation, and short design time in practical applications.
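For orientation, equivalent-circuit laser models of this kind are usually derived from the single-mode laser rate equations; a textbook form (standard notation, not necessarily the exact formulation used in the dissertation) is

    \frac{dN}{dt} = \frac{I}{qV} - \frac{N}{\tau_n} - v_g\, g(N)\, S
    \frac{dS}{dt} = \Gamma\, v_g\, g(N)\, S - \frac{S}{\tau_p} + \frac{\Gamma\, \beta_{sp}\, N}{\tau_n}

where N and S are the carrier and photon densities, I the injection current, V the active-region volume, τ_n and τ_p the carrier and photon lifetimes, v_g the group velocity, g(N) the optical gain, Γ the confinement factor, and β_sp the spontaneous-emission factor. Linearizing these equations about a bias point yields the small-signal elements of a lumped equivalent circuit, and scaling the active volume and current with the number of identical sections is the usual basis for building-block-style scalability.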
506

The construction of cross-market risk model - with application in a Taiwan-China two-market model

Liou, Siang-yi, 15 July 2010
This study constructs a cross-market risk model based upon local multi-factor risk models of the Taiwan and China equity markets. We employ world, country, industry, and global risk factors to build a structural model that explains the relationship between local factors across markets by further decomposing local factor returns. Under this structure, the model allows each local market to adopt different local factors rather than forcing all local markets to use one parsimonious set of factors. The model can therefore provide both in-depth and broad-coverage analysis of international equity portfolios. This methodology was first introduced by Barra as the Integrated Model. Moreover, we build a simple portfolio and its corresponding benchmark to illustrate the power of the model. Once the contents of a portfolio are decided, the model provides not only risk estimation and decomposition in advance but also performance attribution against the benchmark after the portfolio is realized. The analytical viewpoint can also easily be changed to different numeraire perspectives. The results demonstrate that the model is practical and flexible for international equity portfolio analysis.
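As a sketch of the structure being described (standard multi-factor notation; the thesis's exact factor set and estimation details may differ), each local market has its own factor model and the integration step links local factors through global factors:

    r = X f + u, \qquad V = X F X^{\top} + \Delta            % local market: returns, exposures, factor and specific risk
    f = Y g + e \quad\Rightarrow\quad F = Y G Y^{\top} + E   % local factor returns decomposed onto world/country/industry factors g
    \sigma_{active} = \sqrt{(w - w_b)^{\top} V (w - w_b)}    % active risk of a portfolio w against its benchmark w_b

This is the sense in which each market can keep its own local factor set while cross-market relationships are captured through the shared global factors, and in which the realized portfolio return can afterwards be attributed to the same factor structure.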
507

Automatic Construction of Model Testing Case: Methodology and Prototype

Lin, Chien-Ping, 22 July 2010
Software testing is a vital, and costly, part of the software development process, usually carried out at the coding stage. With the increased use of the Unified Modeling Language (UML) and the Model Driven Architecture (MDA) approach in systems analysis and design, Model-Based Testing has been discussed as a prominent way to reduce the cost of software testing. Prior research proposed an integrated method that uses artifacts from the Platform Independent Model (PIM) to construct test paths and generate test cases. This study develops a methodology that extracts information from the PIM (e.g., sequence diagrams and class diagrams) to generate test cases directly. The research methodology is articulated using the design science research methodology, and a usability evaluation is performed to demonstrate its applicability. With this methodology, test cases can be generated easily, thereby reducing the cost and enhancing the efficiency of Model-Based Testing.
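A minimal sketch of the idea (hypothetical data structures and example, not the prototype built in this study): a sequence diagram is reduced to an ordered list of messages in which alternative combined fragments introduce branches, and each combination of branches yields one test path.

    from dataclasses import dataclass
    from itertools import product
    from typing import List

    @dataclass
    class Message:
        caller: str
        callee: str
        operation: str

    @dataclass
    class AltFragment:
        branches: List[List[Message]]  # each branch is an alternative message sequence

    def test_paths(steps):
        """Expand a list of Messages and AltFragments into concrete test paths."""
        options = []
        for step in steps:
            options.append(step.branches if isinstance(step, AltFragment) else [[step]])
        return [[msg for segment in choice for msg in segment] for choice in product(*options)]

    # Hypothetical example: an enrollment interaction with one alt fragment.
    diagram = [
        Message("User", "CourseController", "enroll(courseId)"),
        AltFragment([[Message("CourseController", "Course", "addStudent()")],
                     [Message("CourseController", "User", "reject(reason)")]]),
    ]
    for i, path in enumerate(test_paths(diagram), 1):
        print(f"Test case {i}:", " -> ".join(m.operation for m in path))

Each printed path corresponds to one generated test case; in the full methodology, class-diagram information (parameter types and constraints) would additionally supply the concrete test data for each step.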
508

Study on Architecture-Oriented Product Lifecycle Management Model

Chuang, Shun-Ju, 12 June 2012
A new product progresses through a sequence of stages from its introduction to the market through the end of its profitability; this sequence is known as the "product life cycle" (PLC). However, before a product enters the market, during the conceive phase, there are three major elements: product data conception, product data collaboration, and product data management. These elements allow a company to track product development through portfolio management and to maintain sales product configuration and market documentation through a centralized database. In essence, product lifecycle management (PLM) helps a company increase product revenues by shortening the design phase and reducing the prototyping costs caused by changes. Currently, PLM uses a process-oriented model whose strategies are inefficient due to a lack of comprehensive consideration and the fast rate of change in market needs. As a result, resources are wasted and product competitiveness is affected when the entire process is altered. Thus, this research presents the architecture-oriented product lifecycle management model (AOPLMM) as a solution to the deficiencies of the process-oriented model. The goal of AOPLMM is to establish product design and development based on the multiple needs of the market. Overall, AOPLMM integrates the design organization (covering areas such as requirement specification, intellectual property rights, and green environmental design) and the information system for product development knowledge, defining the responsibilities of each division and the collaboration among them. AOPLMM includes six fundamental diagrams: the "architecture hierarchy diagram," the "framework diagram," the "component operation diagram," the "component connection diagram," the "structure-behavior coalescence diagram," and the "interaction flow diagram." Constructing an efficient model through an information management system has several advantages, such as flexibility in product design, immediate response to market needs, simplification of product development, lower enterprise cost, and increased competitiveness. Therefore, an architecture-oriented model is the clear choice for product lifecycle management.
509

Integrated Multi-Well Reservoir and Decision Model to Determine Optimal Well Spacing in Unconventional Gas Reservoirs

Ortiz Prada, Rubiel Paul, December 2010
Optimizing well spacing in unconventional gas reservoirs is difficult due to complex heterogeneity, large variability and uncertainty in reservoir properties, and a lack of data, all of which increase production uncertainty. Previous methods are either suboptimal because they do not consider subsurface uncertainty (e.g., statistical moving-window methods) or too time-consuming and expensive for many operators (e.g., integrated reservoir characterization and simulation studies). This research has focused on developing and extending a new technology for determining the optimal well spacing in tight gas reservoirs that maximizes profitability. To achieve the research objectives, an integrated multi-well reservoir and decision model that fully incorporates uncertainty was developed. The reservoir model is based on reservoir simulation technology coupled with geostatistical and Monte Carlo methods to predict production performance in unconventional gas reservoirs as a function of well spacing and different development scenarios. The variability in discounted cumulative production was used for direct integration of the reservoir model with a Bayesian decision model (developed by other members of the research team) that determines the optimal well spacing and hence the optimal development strategy. The integrated model includes two development stages with a varying Stage-1 time span. The integrated tools were applied to an illustrative example in the Deep Basin (Gething D) tight gas sands in Alberta, Canada, to determine optimal development strategies. The results showed that a Stage-1 length of one year, starting at 160-acre spacing with no further downspacing, is the optimal development policy, and that extending Stage 1 beyond one year provides no economic benefit. These results are specific to the Berland River (Gething) area and should not be generalized to other unconventional gas reservoirs. However, the proposed technology provides insight into both the value of information and the ability to incorporate learning in a dynamic development strategy. The new technology is expected to help operators determine, early in the reservoir's life, the combination of primary and secondary development policies that profitably maximizes production and minimizes the number of uneconomical wells. I anticipate that this methodology will be applicable to other tight and shale gas reservoirs.
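The flavour of the two-stage comparison can be shown with a small Monte Carlo sketch (all economics and distributions here are hypothetical placeholders, not the study's Gething D inputs): uncertainty about reservoir quality is sampled from a prior, Stage-1 production reveals that quality, and a policy of conditional downspacing is compared with developing only at the initial spacing.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    # Uncertain reservoir-quality multiplier drawn from a lognormal prior (hypothetical).
    quality = rng.lognormal(mean=0.0, sigma=0.5, size=n)

    npv_initial_well = 2.0 * quality - 1.2   # $MM per well at the initial spacing (hypothetical economics)
    npv_infill_well = 1.1 * quality - 1.2    # incremental $MM per infill well after downspacing (hypothetical)

    stay = npv_initial_well                  # policy A: no downspacing
    # Policy B: after Stage 1 the operator has learned the quality and infills only where it pays out.
    downspace = npv_initial_well + np.where(npv_infill_well > 0, npv_infill_well, 0.0)

    print("E[NPV], initial spacing only:     %.2f $MM/well" % stay.mean())
    print("E[NPV], conditional downspacing:  %.2f $MM/well" % downspace.mean())

The gap between the two expectations is an estimate of the value of the Stage-1 information; in the actual study this comparison is made with simulated discounted cumulative production for each spacing and a Bayesian decision model rather than the toy payoffs used here.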
510

Factors of Successful and Unsuccessful e-Projects - A Case Study of a Non-Profit Education and Training Institute

Kao, Yi-Chih, 21 July 2005
The study aims to find the key factors of information system success in a non-profit education and training institute. The institute under study has five centers providing on-the-job training and long-term cultivation in the area of information technology. Facing the pressure of continuing growth and increasing competition, the institute began its e-projects many years ago. After several years of system development and implementation, a survey of users' satisfaction showed that only some of the systems were acceptable. This study adopts the Technology Acceptance Model (TAM) as the primary research framework, along with the service quality model, and identifies five success factors: perceived usefulness, perceived ease of use, system quality, information quality, and service quality. User satisfaction is taken as the dependent variable of the framework. To differentiate the successful system from the unsuccessful one, a curriculum management system and a knowledge management system were selected to represent the successful and unsuccessful systems, respectively. The case study method and a questionnaire survey were both used as research methods. In the case study, the development and implementation background, process, and results were investigated. The survey was used to verify the differences between the two systems on the five factors mentioned previously. The results of the questionnaire survey and statistical tests clearly distinguish the two systems and support the case study findings. The results of this study can serve as a guideline for improving users' satisfaction and information system success.
