41 |
Multi-Dimensional Error Analysis of Nearshore Wave Modeling Tools, with Application Toward Data-Driven Boundary Correction. Jiang, Boyang (December 2010)
As forecasting models become more sophisticated in their physics and in their depiction of nearshore hydrodynamics, they also become increasingly sensitive to errors in the inputs. These input errors include mis-specification of input parameters (bottom friction, eddy viscosity, etc.), errors in input fields, and errors in the specification of boundary information (lateral boundary conditions, etc.). Errors in input parameters can be addressed with fairly straightforward parameter estimation techniques, while errors in input fields can be somewhat ameliorated by the physical linkage between the scales of the bathymetric information and the associated model response. Evaluation of errors on the boundary is less straightforward and is the subject of this thesis.
The model under investigation here is the Delft3D modeling suite, developed at Deltares (formerly Delft Hydraulics) in Delft, the Netherlands. Coupling the wave (SWAN) and hydrodynamic (FLOW) models requires care at the lateral boundaries in order to balance run time against error growth. To this end, we use a perturbation method and spatio-temporal analysis methods such as Empirical Orthogonal Function (EOF) analysis to determine the various scales of motion in the flow field and the extent of their response to imposed boundary errors. From the swirl-strength examinations, we find that the higher EOF modes are affected more by the lateral boundary errors than the lower ones.
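As a hedged illustration of the EOF step described above, the following Python sketch decomposes a space-time field into EOF modes via the SVD and compares modes between a reference run and a boundary-perturbed run. The array shapes, variable names, and synthetic stand-in fields are assumptions for illustration, not the thesis's data, and the swirl-strength diagnostic itself is not reproduced.

```python
import numpy as np

def eof_decompose(field):
    """EOF analysis of a space-time field.

    field : array of shape (n_times, n_points), e.g. snapshots of a flow
            variable on the model grid (shapes here are illustrative only).
    Returns spatial modes, principal-component time series, and the
    fraction of variance captured by each mode.
    """
    anomalies = field - field.mean(axis=0)            # remove the temporal mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    modes = vt                                        # rows: spatial EOF patterns
    pcs = u * s                                       # columns: time coefficients
    var_frac = s**2 / np.sum(s**2)                    # variance explained per mode
    return modes, pcs, var_frac

# Hypothetical experiment: compare EOF modes of a reference run against a run
# with a perturbed lateral boundary (both fields are synthetic stand-ins).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0, 200)[:, None]
x = np.linspace(0.0, 1.0, 500)[None, :]
base = (np.sin(2 * np.pi * t / 5.0) * np.sin(np.pi * x)
        + 0.3 * np.cos(2 * np.pi * t / 3.0) * np.sin(2 * np.pi * x)
        + 0.05 * rng.standard_normal((200, 500)))
perturbed = base + 0.1 * rng.standard_normal((200, 500))   # imposed "boundary error"

modes_b, _, var_b = eof_decompose(base)
modes_p, _, var_p = eof_decompose(perturbed)
for k in range(4):
    similarity = abs(np.dot(modes_b[k], modes_p[k]))   # 1.0 means the mode is unchanged
    print(f"mode {k}: variance fraction {var_b[k]:.3f}, mode similarity {similarity:.3f}")
```

In this toy setup the low-variance (higher-index) modes are dominated by the added perturbation, loosely mirroring the kind of mode-by-mode sensitivity comparison described in the abstract.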
|
42 |
An Analysis of Traceability in Requirements Documents. YAMAMOTO, Shuichiro; TAKAHASHI, Kenji (20 April 1995)
No description available.
|
43 |
none. Huang, Ya-Yao (04 July 2002)
none
|
44 |
Multi-step-ahead prediction of MPEG-coded video source traffic using empirical modeling techniques. Gupta, Deepanker (12 April 2006)
In the near future, multimedia will form the majority of Internet traffic, and the most popular standard used to transport and view video is MPEG. The MPEG media content data takes the form of a time series of frame/VOP sizes. This time series is extremely noisy, and analysis shows that it has very long-range time dependency, making it even harder to predict than a typical time series. This work is an effort to develop multi-step-ahead predictors for the moving averages of frame/VOP sizes in MPEG-coded video streams.

In this work, both linear and non-linear system identification tools are used to solve the prediction problem, and their performance is compared. Linear modeling is done using Auto-Regressive Exogenous (ARX) models, and for non-linear modeling, Artificial Neural Networks (ANN) are employed. The ANN architectures used in this work are the Feed-forward Multi-Layer Perceptron (FMLP) and the Recurrent Multi-Layer Perceptron (RMLP).

Recent research by Adas (October 1998), Yoo (March 2002), and Bhattacharya et al. (August 2003) has shown that multi-step-ahead prediction of individual frames is very inaccurate. Therefore, for this work, we predict the moving average of the frame/VOP sizes instead of individual frames/VOPs. Several multi-step-ahead predictors are developed using the aforementioned linear and non-linear tools for two-, four-, six-, and ten-step-ahead predictions of the moving average of the frame/VOP size time series of MPEG-coded video source traffic.

The capability to predict future frame/VOP sizes, and hence the bit rates, will enable more effective bandwidth allocation mechanisms, assisting in the development of the advanced source control schemes needed to control multimedia traffic over wide area networks such as the Internet.
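As a rough illustration of the linear side of this approach, the sketch below fits a least-squares autoregressive model to the moving average of a frame-size series and produces recursive multi-step-ahead forecasts. It omits the exogenous inputs of a full ARX model and uses synthetic stand-in data; the window length, model order, and variable names are assumptions, not the thesis's actual settings.

```python
import numpy as np

def moving_average(x, window):
    """Simple moving average of the frame/VOP size series."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

def fit_ar(y, order):
    """Least-squares fit of y[t] = a1*y[t-1] + ... + ap*y[t-p] + c."""
    rows = [y[t - order:t][::-1] for t in range(order, len(y))]
    X = np.column_stack([np.array(rows), np.ones(len(rows))])
    coeffs, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
    return coeffs                     # AR weights followed by the intercept

def predict_ahead(y_hist, coeffs, steps):
    """Recursive multi-step-ahead prediction from the last observed values."""
    order = len(coeffs) - 1
    hist = list(y_hist[-order:])      # most recent `order` observations
    preds = []
    for _ in range(steps):
        x = np.array(hist[::-1] + [1.0])
        y_next = float(np.dot(coeffs, x))
        preds.append(y_next)
        hist = hist[1:] + [y_next]    # feed the prediction back in
    return preds

# Illustrative data: a noisy stand-in for a frame-size series (not real MPEG traffic).
rng = np.random.default_rng(1)
frames = 2000 + 500 * np.sin(np.arange(1000) / 12.0) + 200 * rng.standard_normal(1000)
y = moving_average(frames, window=12)
coeffs = fit_ar(y[:-10], order=8)
print(predict_ahead(y[:-10], coeffs, steps=10))   # forecasts up to ten steps ahead
```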
|
45 |
Higher-Dimensional Properties of Non-Uniform Pseudo-Random Variates. Leydold, Josef; Leeb, Hannes; Hörmann, Wolfgang (January 1998)
In this paper we present the results of a first empirical investigation of how the quality of non-uniform variates is influenced by the underlying uniform RNG and the transformation method used. We use well-known standard RNGs and transformation methods to the normal distribution as examples. We find that, except for transformed density rejection methods, which do not seem to introduce any additional defects, the quality of the underlying uniform RNG can be both increased and decreased by transformations to non-uniform distributions. (author's abstract) / Series: Preprint Series / Department of Applied Statistics and Data Processing
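The following minimal sketch illustrates the kind of pipeline the paper studies: a simple underlying uniform generator (here a minimal-standard LCG, chosen only for illustration) transformed to the normal distribution by the Box-Muller method, followed by a crude moment check. The paper's actual RNGs, transformation methods, and higher-dimensional quality tests are not reproduced here.

```python
import numpy as np

def lcg(n, seed=12345, a=16807, m=2**31 - 1):
    """Minimal-standard Lehmer LCG: a stand-in 'underlying uniform RNG'."""
    out = np.empty(n)
    x = seed
    for i in range(n):
        x = (a * x) % m
        out[i] = x / m
    return out

def box_muller(u):
    """Transform pairs of uniform variates into standard normal variates."""
    u1, u2 = u[0::2], u[1::2]
    r = np.sqrt(-2.0 * np.log(u1))
    return np.concatenate([r * np.cos(2 * np.pi * u2), r * np.sin(2 * np.pi * u2)])

z = box_muller(lcg(100_000))
# Crude quality check: compare sample moments with the theoretical N(0, 1) values.
print(f"mean {z.mean():+.4f} (expect 0), var {z.var():.4f} (expect 1), "
      f"P(|Z|>2) {np.mean(np.abs(z) > 2):.4f} (expect about 0.0455)")
```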
|
46 |
Sustainability and company performance: Evidence from the manufacturing industry. Chen, Lujie (January 2015)
This dissertation approaches the question of sustainability and its influence on company performance, with special focus on the manufacturing industry. In the contemporary production environment, manufacturing operations must take into account not only profit but also environmental and social performance in order to ensure the long-term development of the company. Companies have to decide whether they should allocate resources to environmental and social practices in order to improve their competitive advantage. Consequently, in decision-making processes concerning operations, it is important for companies to understand how to coordinate profit, people, and planet. The objective of this dissertation was to investigate the current situation regarding manufacturers' sustainable initiatives, and to explore the relationship between these sustainable practices and companies' performance, including financial, operational, innovation, environmental, and social performance.

First, a structured literature review was conducted to identify sustainability factors considered to be important in the decision making of manufacturing operations. The findings were synthesized into a conceptual model, which was then adopted as the basis for designing the survey instrument used in this dissertation. Drawing on Global Reporting Initiative (GRI) reports, empirical research was performed to explore the relationship between environmental management practices and company performance. Interestingly, the findings showed that many environmental management practices had a strong positive impact on innovation performance. Sustainability disclosures and financial performance were further analyzed using extended data from the GRI reports. The results also showed that several sustainability performance indicators, such as product responsibility, human rights, and society, displayed a significant and positive correlation with return on equity in the sample companies.

In order to further explore the research area and to verify these findings, a triangulation approach was adopted and new data were collected via a survey of medium-sized and large companies in the Swedish manufacturing industry. The results indicated that the sustainable improvement practices had a positive impact on company performance. Some environmental and social improvement practices had a direct and positive correlation with product and process innovation. Furthermore, the findings suggested that better cooperation with suppliers on environmental work could help to strengthen the organizational green capabilities of the focal companies. When considering a company's general approach to implementing sustainable practices, some interesting findings emerged. There were few significant differences in sustainable practices when comparing different manufacturing sectors or different countries and regions. However, the results showed that Swedish manufacturing companies often place higher priority on implementing economic and environmental sustainability practices than on social ones.

This dissertation contributes to the literature on manufacturing sustainability. The study expands the understanding of how environmental, social, and economic perspectives, as a triple bottom line, can influence company performance and, to a certain extent, the supply chain. Identifying and understanding such relationships gives companies the opportunity to integrate sustainability into their manufacturing operations strategy in order to sustain their manufacturing operations over the long term.
|
47 |
Designing structural election models for new policy analysis. Kretschman, Kyle James (20 June 2011)
This dissertation focuses on designing new structural election models and applying modern estimation techniques to quantify policy reform questions. All three chapters use models based on individual decision-making and estimate the parameters using a novel data set of U.S. House of Representatives elections. These models provide new opportunities to analyze and quantify election policy reforms.
The first chapter utilizes a unique compilation of primary election expenditures to test whether general election voters value the primary nomination signal. While producing new results on the relationships between primary and general elections and between candidate characteristics and vote shares, this model allows me to show that campaign finance reform can have an unintended consequence: a limit on expenditures would have little effect on the competitiveness of elections but would substantially decrease voter turnout in U.S. House elections. In contrast, a mandatory public funding policy is predicted to increase both competitiveness and voter turnout.
The second chapter examines why unopposed candidates spend massive amounts on their campaigns. The postulated answer is that U.S. House of Representatives candidates are creating a barrier to entry to discourage challengers from opposing them in the next election. This barrier reduces competition in the election and limits the voters' choices. An unbalanced panel of congressional districts is used to quantify how an incumbent's expenditure in previous elections affects the probability of running unopposed in a later election.
The third chapter estimates the value of a congressional seat based on observed campaign expenditures. Campaign expenditures are modeled as bids in an asymmetric all-pay auction. The model produces predictions of how much a candidate should spend based on the partisanship leaning of each district. The predictions and observed expenditures are then used to estimate the value of a congressional seat. Along with analyzing how expenditures would change under new campaign finance reforms, this model is capable of quantifying the effect of redistricting. After the 2010 Census results become available, the majority of states will redraw their congressional districts, changing the distribution of partisan votes. This model can be used to quantify the effect that the change in voter distribution has on campaign expenditures.
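As a stylized illustration of the auction idea in the third chapter, the sketch below samples from the textbook mixed-strategy equilibrium of a two-candidate, complete-information all-pay auction with asymmetric seat values. It is not the chapter's estimated structural model; the seat values are hypothetical, and the equilibrium used is the standard Hillman-Riley-type characterization.

```python
import numpy as np

def simulate_all_pay(v_strong, v_weak, n=200_000, seed=0):
    """Draw expenditures from the mixed-strategy equilibrium of a two-player
    complete-information all-pay auction with values v_strong >= v_weak:
    the strong bidder mixes uniformly on [0, v_weak]; the weak bidder stays
    out (bids 0) with probability 1 - v_weak/v_strong and otherwise mixes
    uniformly on [0, v_weak]."""
    rng = np.random.default_rng(seed)
    b_strong = rng.uniform(0.0, v_weak, n)
    stay_out = rng.uniform(size=n) < (1.0 - v_weak / v_strong)
    b_weak = np.where(stay_out, 0.0, rng.uniform(0.0, v_weak, n))
    return b_strong, b_weak

v1, v2 = 10.0, 4.0     # hypothetical seat values for the favored and underdog candidates
b1, b2 = simulate_all_pay(v1, v2)
print(f"mean spending, favored:  {b1.mean():.3f} (theory {v2 / 2:.3f})")
print(f"mean spending, underdog: {b2.mean():.3f} (theory {v2**2 / (2 * v1):.3f})")
print(f"share of races the underdog leaves uncontested: {np.mean(b2 == 0):.3f} "
      f"(theory {1 - v2 / v1:.3f})")
```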
|
48 |
Simulation of naturally fractured reservoirs using empirical transfer function. Tellapaneni, Prasanna Kumar (30 September 2004)
This research utilizes imbibition experiments and X-ray tomography results for modeling fluid flow in naturally fractured reservoirs. Conventional dual porosity simulation requires a large number of runs to quantify transfer function parameters for history matching purposes. In this study, empirical transfer functions (ETFs) are derived from imbibition experiments, which reduces the uncertainty in modeling the transfer of fluids from the matrix to the fracture. The ETF approach is applied in two phases. In the first phase, imbibition experiments are numerically solved using the diffusivity equation with different boundary conditions. Usually only the oil recovery in imbibition experiments is matched, but with the advent of X-ray CT, the spatial variation of saturation can also be computed; matching this variation can lead to more accurate reservoir characterization. In the second phase, the imbibition-derived empirical transfer functions are used in developing a dual porosity reservoir simulator. The results from this study are compared with published results. The study reveals the impact of uncertainty in the transfer function parameters on flow performance and reduces the computations needed to obtain the transfer function required for dual porosity simulation.
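A minimal sketch of what an imbibition-derived empirical transfer function might look like is given below, assuming an Aronofsky-type exponential recovery form fitted to experimental recovery data. The functional form, variable names, and data points are illustrative assumptions rather than the ETF actually derived in this study.

```python
import numpy as np
from scipy.optimize import curve_fit

def etf_recovery(t, r_inf, lam):
    """Assumed empirical transfer function: cumulative imbibition recovery vs. time."""
    return r_inf * (1.0 - np.exp(-lam * t))

# Hypothetical imbibition-experiment data: time (hours) vs. recovered oil fraction.
t_data = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
r_data = np.array([0.00, 0.08, 0.17, 0.27, 0.37, 0.44, 0.47, 0.48])

(r_inf, lam), _ = curve_fit(etf_recovery, t_data, r_data, p0=[0.5, 0.05])
print(f"ultimate recovery ~ {r_inf:.3f}, rate constant ~ {lam:.4f} 1/hr")

def transfer_rate(t):
    """Matrix-to-fracture transfer rate implied by the fitted curve (its time derivative),
    the quantity a dual porosity simulator would draw on at each time step."""
    return r_inf * lam * np.exp(-lam * t)
```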
|
49 |
PVIT: A task-based approach for design and evaluation of interactive visualizations for preferential choice. Bautista, Jeanette Lyn, 05 1900
In decision theory, the process of selecting the best option is called preferential choice. Many personal, business, and professional preferential choice decisions are made every day. In these situations, a decision maker must select the optimal option among multiple alternatives. In order to do this, she must be able to analyze a model of her preferences with respect to the objectives that are important to her. Prescriptive decision theory suggests several ways to effectively develop a decision model. However, these methods often end up too tedious and complicated to apply to complex decisions that involve many objectives and alternatives.

In order to help people make better decisions, an easier, more intuitive way to develop interactive models for the analysis of decision contexts is needed. The application of interactive visualization techniques to this problem is an opportune solution. A visualization tool to help in preferential choice must take into account important aspects from both the field of Information Visualization and that of Decision Theory. There exist some proposals that claim to aid preferential choice, but key tasks and steps from at least one of these areas are often overlooked. Another missing element in these proposals is an adequate user evaluation. In fact, the concept of a good evaluation in the field of information visualization is a topic of debate, since the goals of such systems stretch beyond what can be concluded from traditional usability testing. In our research we investigate ways to overcome some of the challenges faced in the design and evaluation of visualization systems for preferential choice.

In previous work, Carenini and Lloyd proposed ValueCharts, a set of visualizations and interactive techniques to support the inspection of linear models of preferences. We now identify the need to consider the decision process in its entirety, and to redesign ValueCharts in order to support all phases of preferential choice. We present our task-based approach to the redesign of ValueCharts, grounded in recent findings from both Decision Analysis and Information Visualization. We propose a set of domain-independent tasks for the design and evaluation of interactive visualizations for preferential choice. We then use the resulting framework as a basis for an analytical evaluation of our tool and alternative approaches. Finally, we use an application of the task model in conjunction with a new blend of evaluation methods to assess the utility of ValueCharts.
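For concreteness, the snippet below sketches the kind of additive (linear) preference model that ValueCharts is designed to let users inspect: each alternative receives a weighted sum of single-attribute values. The objectives, weights, value functions, and alternatives are all hypothetical and are not taken from the thesis.

```python
# Minimal additive (linear) preference model of the kind such tools visualize.
# All objectives, weights, value functions, and alternatives below are hypothetical.

weights = {"price": 0.5, "distance": 0.3, "rating": 0.2}   # importance weights, sum to 1

# Single-attribute value functions mapping raw attribute levels to [0, 1].
value_functions = {
    "price":    lambda x: 1.0 - (x - 50) / 150,    # cheaper is better, range 50-200
    "distance": lambda x: 1.0 - x / 10.0,          # closer is better, range 0-10 km
    "rating":   lambda x: x / 5.0,                 # higher is better, range 0-5
}

alternatives = {
    "Hotel A": {"price": 120, "distance": 1.5, "rating": 4.0},
    "Hotel B": {"price": 80,  "distance": 6.0, "rating": 3.5},
    "Hotel C": {"price": 180, "distance": 0.5, "rating": 4.8},
}

def total_value(alt):
    """Weighted sum of single-attribute values: the linear model being inspected."""
    return sum(weights[a] * value_functions[a](alt[a]) for a in weights)

for name, alt in sorted(alternatives.items(), key=lambda kv: -total_value(kv[1])):
    print(f"{name}: {total_value(alt):.3f}")
```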
|
50 |
Empirical Likelihood Based Confidence Intervals for the Difference between Two Sensitivities of Continuous-scale Diagnostic Tests at a Fixed Level of Specificity. Yao, Suqin (28 November 2007)
Diagnostic testing is essential to distinguish non-diseased individuals from diseased individuals. The sensitivity and specificity are two important indices for the diagnostic accuracy of continuous-scale diagnostic tests. If we want to compare the effectiveness of two tests, it is of interest to construct a confidence interval for the difference of the two sensitivities at a fixed level of specificity. In this thesis, we propose two empirical likelihood based confidence intervals (HBELI and HBELII) for the difference of two sensitivities at a predetermined specificity level. Simulation studies show that when correlation between the two test results exists, HBELI and HBELII intervals perform better than the existing bootstrap based BCa, BTI and BTII intervals due to shorter interval lengths. However, when there is no correlation, BCa, BTI and BTII intervals outperform HBELI and HBELII intervals due to better coverage probability in most simulation settings.
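For illustration, the sketch below computes a simple percentile-bootstrap interval for the difference of two sensitivities at a fixed specificity, in the spirit of the existing bootstrap-based intervals the thesis uses for comparison; it is not the empirical likelihood construction proposed in the thesis, and the paired test data are simulated.

```python
import numpy as np

def sens_at_spec(healthy, diseased, spec):
    """Sensitivity of a continuous-scale test at a fixed specificity level."""
    threshold = np.quantile(healthy, spec)          # cut-off giving the target specificity
    return np.mean(diseased > threshold)

def boot_ci_diff(h1, d1, h2, d2, spec=0.9, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI for sensitivity(test 1) - sensitivity(test 2)."""
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        ih = rng.integers(0, len(h1), len(h1))      # resample non-diseased subjects
        idx = rng.integers(0, len(d1), len(d1))     # resample diseased subjects (paired)
        diffs[b] = (sens_at_spec(h1[ih], d1[idx], spec)
                    - sens_at_spec(h2[ih], d2[idx], spec))
    return np.quantile(diffs, [alpha / 2, 1 - alpha / 2])

# Simulated paired test results (correlated across the two tests), illustration only.
rng = np.random.default_rng(42)
n_h, n_d = 200, 150
shared_h, shared_d = rng.standard_normal(n_h), rng.standard_normal(n_d)
h1 = shared_h + 0.5 * rng.standard_normal(n_h)
h2 = shared_h + 0.5 * rng.standard_normal(n_h)
d1 = 1.5 + shared_d + 0.5 * rng.standard_normal(n_d)
d2 = 1.0 + shared_d + 0.5 * rng.standard_normal(n_d)
print(boot_ci_diff(h1, d1, h2, d2, spec=0.9))
```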
|