151

Linear-time algorithms for graphs with bounded branchwidth

Christian, William Anderson, Jr January 2003 (has links)
We present an algorithmic framework (including a single data structure) that is extended into linear-time algorithms to solve several NP-complete graph problems (namely INDEPENDENT SET, MAXIMUM CUT, GRAPH COLORING, HAMILTONIAN CYCLE, and DISJOINT PATHS). The linearity is achieved assuming a branch decomposition of the instance graph is provided. We then modify the framework to create a multithreaded framework that uses the existing problem-specific extensions without any revision. Computational results for the serial and parallel algorithms are provided. In addition, we present a graphical package called JPAD that can display a graph and a branch decomposition, show their relationship to each other, and be extended to run and display the progress and results of algorithms on graphs or on branch decompositions.
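The framework idea can be illustrated with a minimal sketch (ours, not the thesis's code): a single tree data structure for a branch decomposition, traversed once in post-order, with the problem-specific behavior supplied as leaf/merge callbacks. The real extensions maintain dynamic-programming tables indexed by the states of each edge's middle set; the hypothetical hook below just counts graph edges to keep the sketch short.

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class BDNode:
    edge: Optional[tuple] = None       # graph edge stored at a leaf
    left: Optional["BDNode"] = None    # children of an internal node
    right: Optional["BDNode"] = None

def solve(node: BDNode,
          leaf_table: Callable[[tuple], Any],
          merge: Callable[[Any, Any], Any]) -> Any:
    """One post-order pass; linear in the number of decomposition nodes."""
    if node.edge is not None:
        return leaf_table(node.edge)
    return merge(solve(node.left, leaf_table, merge),
                 solve(node.right, leaf_table, merge))

# Toy branch decomposition of a triangle on vertices a, b, c.
tree = BDNode(left=BDNode(edge=("a", "b")),
              right=BDNode(left=BDNode(edge=("b", "c")),
                           right=BDNode(edge=("a", "c"))))

# Problem-specific "extension": here it just counts edges (prints 3).
print(solve(tree, leaf_table=lambda e: 1, merge=lambda x, y: x + y))
```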
152

Dynamic estimation for serial manipulators

Kapadia, Behram January 1995 (has links)
This thesis considers the development of an accurate dynamic model for two industrial manipulators. The accuracy of available instrumentation for recording sensor-motion data required the application of practical methods for improving the estimation bounds. This thesis presents some methods for improving the estimation accuracy of the reduced-order dynamic model, particularly with regard to the importance of a correct formulation for the base parameter set and the dependence of parameter estimation accuracy on the torque sensitivity. / The thesis also experimentally implements an algorithm for trajectory optimization for estimation by the LMS adaptive law. Practical methods are suggested for carrying out the optimization. / Results are presented based on experimentation conducted on an electric and a hydraulic manipulator; a brief description is also provided of the software that was developed for this purpose. Identification results in one case were seen to closely approximate those previously determined by explicit measurements by Khatib et al. Finally, a comparison between ordinary PD control and feedforward control using the available model shows promise for the practical implementation of model-based control. However, a comparison of the control performance between a direct-drive manipulator and a geared manipulator indicates that unmodelled dynamics play a central role in increasing the error bounds of the identified parameters. (Abstract shortened by UMI.)
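As a rough illustration of the identification setting described above, the sketch below estimates a linear-in-parameters model tau = Y(q, qd, qdd) * theta by batch least squares and by the LMS adaptive law. The regressor here is synthetic stand-in data; in the thesis it would be built from the manipulator's rigid-body dynamics along the excitation trajectory, and the step size mu is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_params = 500, 6
Y = rng.normal(size=(n_samples, n_params))           # stand-in regressor rows
theta_true = rng.uniform(0.5, 2.0, n_params)         # "true" base parameters
tau = Y @ theta_true + 0.05 * rng.normal(size=n_samples)  # noisy torque data

# Batch least-squares estimate of the base parameter set.
theta_ls, *_ = np.linalg.lstsq(Y, tau, rcond=None)

# LMS adaptive law: theta <- theta + mu * y_k * (tau_k - y_k . theta).
theta_lms = np.zeros(n_params)
mu = 0.01  # assumed step size, small enough for stability here
for y_k, tau_k in zip(Y, tau):
    theta_lms += mu * y_k * (tau_k - y_k @ theta_lms)

print("LS  estimation error:", np.linalg.norm(theta_ls - theta_true))
print("LMS estimation error:", np.linalg.norm(theta_lms - theta_true))
```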
153

Using Social Media to Predict Traffic Flow under Special Event Conditions

Ni, Ming 07 December 2013 (has links)
Social media is a great resource of user-generated content. Public attention, opinions, and hot topics can be captured in social media, which provides the ability to predict human-related events. Since social media can be retrieved in real time with no construction or maintenance cost, traffic operations authorities could treat social media data as another type of sensor for traffic demand. In this thesis, we aim to use social media information to assist traffic flow prediction under special event conditions. Specifically, a short-term traffic flow prediction model incorporating tweet features is developed to forecast the incoming traffic flow prior to sports game events. Both tweet rate features and semantic features are included in the prediction model. We examine and compare the performance of four regression methods, namely the autoregressive model, neural networks, support vector regression, and k-nearest neighbors, with and without social media features. In the end, we show the benefit gained by including social media information in the prediction model and its computational efficiency for potential practical applications.
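A minimal sketch of the comparison described above, using synthetic data: predict the next flow value from lagged flows, with and without a tweet-rate feature, and compare two of the four regressors (k-nearest neighbors and SVR). The data-generating process, feature lags, and hyperparameters are illustrative assumptions, not the thesis's setup.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 600
# Synthetic tweet-rate series with occasional event surges.
tweet_rate = rng.poisson(5, n).astype(float)
tweet_rate[np.arange(n) % 96 > 88] += 40.0
# Synthetic flow that reacts to the previous interval's tweet rate.
flow = 200.0 + 10.0 * np.roll(tweet_rate, 1) + rng.normal(0.0, 15.0, n)

t = np.arange(3, n)
X_base = np.column_stack([flow[t - 1], flow[t - 2], flow[t - 3]])  # lagged flows only
X_social = np.column_stack([X_base, tweet_rate[t - 1]])            # plus tweet-rate feature
y = flow[t]

for name, X in [("lags only", X_base), ("lags + tweets", X_social)]:
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, shuffle=False)
    for model in (KNeighborsRegressor(n_neighbors=5),
                  make_pipeline(StandardScaler(), SVR(C=100.0))):
        mae = mean_absolute_error(yte, model.fit(Xtr, ytr).predict(Xte))
        print(f"{name:13s}  {type(model).__name__:10s}  MAE = {mae:.1f}")
```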
154

Proximity sensor network for sensor based manipulation

Damianakis, John. January 1997 (has links)
A Proximity Sensor Network (PSN) consisting of four infrared (IR) sensors was developed in order to track, grasp, or manipulate objects with robots. The work is motivated by the need for local high-bandwidth sensors at the robot's end effector to provide feedback during the pre-contact stage. Two types of amplitude-based IR sensors were designed, an "Electrically Biased Sensor" (EBS) and a "Photon Biased Sensor" (PBS). The PBS sensor has a diameter of 5.55 mm and a range of approximately 9.0 cm. The EBS sensor has a diameter of 7.15 mm and a range of approximately 11.2 cm. Both sensors are robust and inexpensive since they were constructed using low-cost, off-the-shelf components. The design of the sensor heads, the signal processing electronics, and the sensor characteristics are discussed.
155

A validation of the enterprise management engineering approach to knowledge management systems engineering

Mac Garrigle, Ellen F. 18 June 2014 (has links)
Knowledge management is one of the current "buzzwords" gaining popularity on an almost daily basis within the business world. Much attention has been paid to the theory and justification of knowledge management (KM) as an effective business and organizational practice. However, much less attention has been paid to the more specific issues of effective implementation of knowledge management, or to the financial benefit that could potentially result from an effective system implementation. As the concept of KM becomes more generally accepted, knowledge management systems (KMS) are becoming more prevalent. A KMS is often considered simply another information system to be designed, built, and supported by the IT department. In actual implementation, many KM system development efforts are not successful. There is frequently a perception that strict adherence to development processes produces an excessive time lag, rigor, and formality which will "disrupt" the desired free flow of knowledge. Professor Michael Stankosky of GWU has posited a more flexible variation of the usual systems engineering (SE) approach, tailored specifically to the KM domain and known as Enterprise Management Engineering© (EME). This approach takes the four major pillars of KM identified by GWU research in this area (Leadership, Organization, Technology, and Learning) and adapts eighteen key SE steps to accommodate the more flexible and imprecise nature of "knowledge".

Anecdotal study of successful KMS developments has shown that many of the more formal processes imposed by systems engineering (such as defining strategic objectives before beginning system development) serve a useful purpose. Consequently, an integrated systems engineering process tailored specifically to the KM domain should lead to more successful implementations of KM systems. If this is so, organizations that have followed some or all of the steps in this process will have designed and deployed more "successful" KMS than those organizations that have not done so. To support and refine this approach, a survey was developed to determine the usage of the 18 steps identified in EME. The results were then analyzed against an objective financial measurement of organizational KM to determine whether a correlation exists. This study is intended to test the efficacy of the EME approach to KM implementation.

For the financial measurement data, the subject list of organizations for this study used a measure of intangible valuation developed by Professor Baruch Lev of NYU called Knowledge Capital Earnings© (KCE). This is the amount of earnings that a company with good "knowledge" has left over once its earnings based on tangible financial and physical assets have been subtracted from overall earnings. KCE can then be used to determine the Knowledge Capital (KC) of an organization. This in turn provides two quantitative measures (one relative, one absolute) that can be used to define a successful knowledge company.

For this study, Lev's research from 2001 was updated using more recent financial data. Several of these organizations completed a survey instrument based upon the 18 points of the EME approach. The results for the 18 steps were compared against each other and against each organization's KC scores. The results show that there is a significant correlation between EME and the relative KC measurement, and that select EME steps correlate significantly with a high KC value. Although this study, being the first validation effort, does not show provable causation, it does demonstrate a quantifiable correlation between EME and successful KM implementation. This in turn should contribute to the slim body of objective knowledge on the design, deployment, and measurement of KM systems.
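The KCE/KC arithmetic described above can be sketched in a few lines; the rates of return on tangible assets and the capitalization rate below are illustrative assumptions in the spirit of Lev's method, not the study's actual figures.

```python
# Knowledge Capital Earnings: what remains of (normalized) earnings after
# subtracting the normal returns attributable to tangible assets.
normalized_earnings = 1_200.0   # $M, smoothed past/forecast earnings (made-up figure)
physical_assets = 5_000.0       # $M book value (made-up figure)
financial_assets = 3_000.0      # $M book value (made-up figure)

r_physical, r_financial = 0.07, 0.045   # assumed normal returns on tangibles
discount_rate = 0.105                   # assumed knowledge-capital capitalization rate

kce = normalized_earnings - (r_physical * physical_assets
                             + r_financial * financial_assets)
kc = kce / discount_rate                # capitalized Knowledge Capital
print(f"KCE = ${kce:.0f}M, Knowledge Capital = ${kc:.0f}M")
```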
156

Designing menus of extended warranty contracts under technological change and competition.

Laksana, Kamonkan. January 2007 (has links)
Thesis (Ph.D.)--Lehigh University, 2007. / Adviser: Joseph C. Hartman.
157

Comparison of analytical hierarchy process with design morphology optimization.

Leung, Lawrence Chin-Pang. Unknown Date (has links)
Thesis (Ph.D.)--University of Houston, 2007. / (UMI)AAI3273791. Source: Dissertation Abstracts International, Volume: 68-07, Section: B, page: 4753. Adviser: Lawrence H. Schulze.
158

Mixed Integer Linear Programming for Time-Optimal Cyclic Scheduling of High Throughput Screening Systems

Sahin, Deniz 08 June 2018 (has links)
High Throughput Screening (HTS) systems are highly technological, fully automated plants used to analyze thousands of biochemical substances, providing a basis for the drug discovery process. As the operation of these systems is remarkably expensive, scheduling the processes of such complex systems is critical to HTS companies. Since the processing time affects the throughput and efficiency of the system, a time-optimal schedule that yields high throughput must be developed. In this thesis, a Mixed Integer Programming model is presented that minimizes the overall processing time and therefore maximizes the throughput of the system. To generate the mathematical model, the principles of job-shop scheduling and cyclic scheduling are utilized. The results of the study are supported by an experiment conducted at the High Throughput Screening plant at Washington University in St. Louis. In conclusion, the model generated a time-optimal cyclic schedule that improves the total processing time of the system by 3 minutes for 25 batches. Projecting the model to experiments run with hundreds of batches is expected to yield greater improvements in the overall processing time of the system.
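For readers unfamiliar with the modeling style, here is a minimal disjunctive job-shop MILP sketch (makespan minimization for a toy instance) in PuLP. It is not the thesis's cyclic HTS model; the job data and big-M value are made up for illustration.

```python
import pulp

# Toy data: job -> ordered list of (machine, processing time); purely illustrative.
jobs = {
    "J1": [("M1", 3), ("M2", 2)],
    "J2": [("M2", 4), ("M1", 1)],
    "J3": [("M1", 2), ("M2", 3)],
}
BIG_M = 1000  # assumed upper bound on any start time

prob = pulp.LpProblem("jobshop_makespan", pulp.LpMinimize)
start = {(j, k): pulp.LpVariable(f"s_{j}_{k}", lowBound=0)
         for j, ops in jobs.items() for k in range(len(ops))}
makespan = pulp.LpVariable("Cmax", lowBound=0)
prob += makespan  # objective: minimize the makespan

# Precedence of operations within each job, and makespan definition.
for j, ops in jobs.items():
    for k in range(1, len(ops)):
        prob += start[(j, k)] >= start[(j, k - 1)] + ops[k - 1][1]
    prob += makespan >= start[(j, len(ops) - 1)] + ops[-1][1]

# Disjunctive (no-overlap) constraints on each machine via big-M and binaries.
on_machine = {}
for j, ops in jobs.items():
    for k, (m, p) in enumerate(ops):
        on_machine.setdefault(m, []).append((j, k, p))
for m, oplist in on_machine.items():
    for a in range(len(oplist)):
        for b in range(a + 1, len(oplist)):
            (j1, k1, p1), (j2, k2, p2) = oplist[a], oplist[b]
            y = pulp.LpVariable(f"y_{m}_{j1}{k1}_{j2}{k2}", cat="Binary")
            prob += start[(j1, k1)] + p1 <= start[(j2, k2)] + BIG_M * (1 - y)
            prob += start[(j2, k2)] + p2 <= start[(j1, k1)] + BIG_M * y

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", pulp.value(makespan))
```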
159

Structured Expert Judgment Elicitation of Use Error Probabilities for Drug Delivery Device Risk Assessment

Zampa, Nicholas Joseph 17 August 2018 (has links)
In the pharmaceutical industry, estimating the probability of occurrence of use errors and use-error-causes (henceforth referred to as use error probabilities) when developing drug delivery devices is hindered by a lack of data, ultimately limiting the ability to conduct robust usability risk assessments. The lack of reliable data is the result of small sample sizes and the difficulty of reproducing actual use environments in simulated use studies, which compromises the applicability of observed use error rates. Further, post-market surveillance databases and internal complaint databases are limited in their ability to provide reliable data for product development. Inadequate usability risk assessment hinders drug delivery device manufacturers' understanding of safety and efficacy risks. The current industry and regulatory paradigm with respect to use error probabilities is to de-emphasize them, focusing instead on assessing the severity of harms. However, the de-emphasis of use error probabilities is not rooted in a belief that probability estimates inherently lack value. Rather, the status quo reflects the absence of suitable methodologies for estimating use error probabilities.

In instances in which data is lacking, engineers and scientists may turn to structured expert judgment elicitation methodologies, in which subjective expert opinions are quantified and aggregated in a scientific manner. This research is a case study in adapting and applying one particular structured expert judgment methodology, Cooke's Classical model, to human factors experts for estimating use error probabilities for a drug delivery device. Results indicate that a performance-weighted linear pooling of expert judgments significantly outperforms any one expert and an equal-weighted linear pooling. Additionally, this research demonstrates that a performance-weighted linear pooling of expert judgments is statistically accurate, robust to the choice of experts, and robust to the choice of elicitation questions. Lastly, this research validates the good statistical accuracy of a performance-weighted linear pooling of experts on a new set of use error probabilities, indicating that good expert performance translates to use error probability estimates for different devices. Through structured expert judgment elicitation according to Cooke's Classical model, this research demonstrates that it is possible to reinstate use error probability estimates, with quantified uncertainty, in usability risk assessments for drug delivery devices.
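A simplified sketch of performance-weighted linear pooling: each expert's (5%, 50%, 95%) quantile judgments for one use-error probability are turned into a piecewise-linear CDF, and the pooled CDF is their weighted mixture. The quantile values, the performance weights (which in Cooke's Classical model come from calibration and information scores on seed questions), and the 10% support overshoot are all assumptions for illustration.

```python
import numpy as np

quantiles = np.array([0.05, 0.50, 0.95])
# Hypothetical quantile judgments from three experts for one use-error probability.
experts = np.array([
    [1e-4, 5e-4, 2e-3],
    [5e-5, 2e-4, 1e-3],
    [2e-4, 1e-3, 5e-3],
])
# Assumed normalized performance weights (would be computed from seed questions).
w = np.array([0.6, 0.3, 0.1])

# Common support with an assumed 10% overshoot below/above the extreme quantiles.
lo, hi = experts.min() * 0.9, experts.max() * 1.1
grid = np.linspace(lo, hi, 2000)

# Piecewise-linear CDF per expert, then the weighted linear pool (mixture of CDFs).
cdfs = np.array([np.interp(grid, np.r_[lo, e, hi], np.r_[0.0, quantiles, 1.0])
                 for e in experts])
pooled_cdf = w @ cdfs
pooled_median = grid[np.searchsorted(pooled_cdf, 0.5)]
print(f"pooled median use-error probability: {pooled_median:.2e}")
```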
160

Slow Pyrolysis Experiments for High Yields of Solid Carbon

LeBlanc, Jeffrey 22 March 2017 (has links)
Coal and biomass slow pyrolysis reactions were investigated using thermogravimetric analysis close-coupled to gas chromatography (TG-GC). The pyrolysis mass balance via this system was closed to >99 wt. %. Parallel in-situ Diffuse Reflectance Infrared Fourier-Transform Spectroscopy pyrolysis experiments were used to explain the mechanistic relationship between functional groups and volatile products. Gas and tar evolution profiles correspond to the loss of surface oxygenated functional groups and increases in char aromaticity during pyrolysis. Various pyrolysis conditions, including heating rate, particle size, and reaction confinement, were investigated for their effect on secondary pyrolysis reactions via TG-GC. The investigation demonstrated that increasing the residence time of tar at the solid-gas interface by 0.23-0.31 seconds results in a 2.1-2.5 wt. % decrease in tar production with a commensurate 0.6-5.7 wt. % increase in solid product, a 40 wt. % increase in CH4, and a 10-30 wt. % increase in H2 between 510 and 575 °C. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) measured the molecular weight distribution (MWD) of the pyrolysis tar product to be between 200 and 550 amu. Gas chromatography-mass spectrometry (GC-MS) was used to identify 120 distinct species in pyrolysis tar. Tar products from the different reaction conditions show that extended residence time of pyrolysis tars at the solid-gas interface decreased the average MWD, decreased the H/C ratio, and resulted in a more expansive speciation of nitrogen and sulfur species in the tar. Further investigations of tar show that coal tar vaporizes by 1000 °C without producing secondary gas products or coke. Biomass was found to produce a 40 wt. % char product plus CO2, CO, CH4, C2H4, C2H6, and H2. The closed mass balance indicates that the measured product distributions and profiles from slow pyrolysis are absolute and that the error may be directly calculated. These are used to estimate the rates, kinetic parameters, and number of reactions during pyrolysis.
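As one illustration of extracting kinetic parameters from slow-pyrolysis mass-loss data, the sketch below simulates a single first-order reaction under a constant heating rate and recovers the Arrhenius parameters by linearization. The single-reaction model, rate parameters, and heating rate are assumptions for illustration, not the thesis's kinetic scheme.

```python
import numpy as np

R = 8.314                          # J/(mol K)
A_true, E_true = 1.0e10, 150.0e3   # assumed pre-exponential (1/s) and activation energy (J/mol)
beta = 10.0 / 60.0                 # assumed heating rate: 10 K/min expressed in K/s

T = np.linspace(500.0, 800.0, 30001)       # temperature program (K)
k = A_true * np.exp(-E_true / (R * T))     # first-order rate constant
alpha = np.zeros_like(T)                   # conversion
for i in range(1, len(T)):                 # Euler integration of d(alpha)/dT = (k/beta)(1 - alpha)
    alpha[i] = alpha[i - 1] + (k[i - 1] / beta) * (1.0 - alpha[i - 1]) * (T[i] - T[i - 1])

# Recover E and A from a linear fit of ln[(dalpha/dT)/(1 - alpha)] against 1/T.
dadT = np.gradient(alpha, T)
mask = (alpha > 0.05) & (alpha < 0.95)
slope, intercept = np.polyfit(1.0 / T[mask], np.log(dadT[mask] / (1.0 - alpha[mask])), 1)
print(f"E ~ {-slope * R / 1e3:.0f} kJ/mol, A ~ {beta * np.exp(intercept):.1e} 1/s")
```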
