  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Factory design layout: A combined artificial intelligence and simulation approach

Chierotti, Marco, 1960- January 1991 (has links)
This thesis proposes a methodology for factory layout synthesis that combines symbolic and numerical approaches. The layout model is represented as a loosely packed arrangement of rectangles with sides parallel to two orthogonal directions. Solutions must satisfy a set of qualitative and quantitative constraints and must optimize material handling costs. Qualitative constraints are specified using a rule-based approach. Quantitative constraints and material handling costs are incorporated into a numerical, nonlinear optimization problem. Optimal solutions are generated by inserting one device at a time in all admissible locations on the floorplan. The resulting partial layouts are optimized with respect to material handling costs. The best partial solution obtained so far is then the starting point for the next object insertion. The process stops when all devices have been placed. To illustrate the methodology, three case studies are presented.
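The abstract does not give the details of the insert-and-optimize loop; a minimal sketch of the greedy procedure it describes might look like the following, where the grid of candidate locations, the no-overlap constraint, and the rectilinear cost model are illustrative assumptions rather than the thesis's actual formulation:

```python
from itertools import product

def handling_cost(layout, flows):
    """Material handling cost: flow-weighted rectilinear distance between placed devices."""
    cost = 0.0
    for (a, b), flow in flows.items():
        if a in layout and b in layout:
            (xa, ya), (xb, yb) = layout[a], layout[b]
            cost += flow * (abs(xa - xb) + abs(ya - yb))
    return cost

def greedy_layout(devices, flows, grid=range(5)):
    """Insert one device at a time in all admissible locations;
    keep the best partial layout as the start of the next insertion."""
    layout = {}
    for dev in devices:
        best = None
        for x, y in product(grid, grid):
            if (x, y) in layout.values():  # qualitative constraint: no overlap
                continue
            trial = {**layout, dev: (x, y)}
            c = handling_cost(trial, flows)
            if best is None or c < best[0]:
                best = (c, trial)
        layout = best[1]
    return layout

flows = {("A", "B"): 10, ("B", "C"): 3}
print(greedy_layout(["A", "B", "C"], flows))
```

The high-flow pair A/B ends up adjacent, as the cost model rewards; the actual thesis adds rule-based qualitative constraints and a nonlinear optimization step that this sketch omits.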
192

A systems thinking approach to cloud computing that complements the NIST guidance

Brow, Raymond A. 08 April 2014 (has links)
The move to cloud computing as mandated by the US federal CIO (Kundra, 2010) is one of the key initiatives expected to provide relief from US federal IT budget concerns. NIST was commissioned to provide guidance for the move. Federal agencies expressed concern that the guidance provided to them was deficient, and GAO (2012) further stated that more planning was required. There is no research investigating the possible role systems thinking could play in complementing the NIST guidance and enhancing the planning. This study presents systems thinking as a complementary option to the NIST guidance. Using a mixed method, this study first demonstrates quantitatively, through content analysis, that the NIST documentation does not take a systems thinking approach. It then qualitatively explores a systems thinking framework to supplement the NIST guidance. The framework is established from a thorough review of current scholarship in the areas of systems thinking and cloud computing, using the established tools and methods of content analysis. The review of cloud computing demonstrates the diversity, complexity, and uncertainty of the technology; systems thinking is shown to address just such situations. Based upon the research, a systems thinking framework is created that demonstrates how systems thinking could supplement the NIST guidance on moving to the cloud. Through the application of the framework, US federal agencies could more confidently manage the risk of moving US federal assets to the cloud and thereby gain a firmer foothold in arresting the IT budget concerns.
193

Designing complex, interactive, architectural systems with CIAS-DM: A model-based, human-centered, design & analysis methodology

Manganelli, Joseph Charles 04 March 2014 (has links)
The built environment increasingly contributes to improving human health, well-being, and performance in measurable, predictable, and tailorable ways. Achieving high-performance environmental systems requires real-time-interactive sensing, monitoring, actuation, and communication subsystems, as well as real-time interactions of these environmental systems with their users and other internal and external systems. Developing the theories, constructs, methods, and tools necessary for designing such high-performance, complex, interactive systems is an active area of research.

This dissertation focused on methods and tools for representing the cognitive and physical affordances of complex, interactive, architectural systems (CIAS). The Complex, Interactive, Architectural Systems Design Methodology (CIAS-DM) was proposed as a method and tool for helping designers uncover and document the scope of proposed CIAS. CIAS-DM was evaluated qualitatively. This project used the design of a 'smart' mattress in a patient room 'smart' bed/mattress/over-the-bed table ecosystem as the basis for a series of design cases. Fourteen clinicians participated as subject matter experts. Four research associates participated as raters. The results of evaluating CIAS-DM indicate that it is useful for scoping CIAS design challenges. The contributions of this dissertation are: 1) identifying and characterizing CIAS; 2) introducing Systems Modeling Language (SysML) and Cognitive Work Analysis (CWA) representational and analytic methods into architecture; 3) mapping constructs and methods from CWA into SysML; and 4) providing these methods and tools in an integrated package appropriate for those designing CIAS.
194

Stochastic Inventory Modeling and Integration to Multi-criteria Risk Decision-making for Interdependent Economic and Infrastructure Systems

Resurreccion, Joanna Z. 02 May 2013 (has links)
Extreme risks associated with natural and man-made disasters involve disruptions in the production of goods or provision of services in interdependent systems. The reduced supply of critical goods and services will degrade "as planned" production outputs and create ripple effects of direct and indirect disruptions. Input-output modeling evaluates the propagation of disaster consequences by quantifying the associated economic risks of disruption, namely economic loss and inoperability, for multi-sectoral economic regions. The thesis enhances the reliability of these risk estimates by formulating a stochastic inventory-based risk assessment model using a multi-objective optimization framework for minimizing (i) economic losses and (ii) sector inoperability. The research utilizes inventory-to-sales ratio data from the Bureau of Economic Analysis to model uncertainty in the levels of finished goods inventory, and the beta distribution to integrate uncertainty in decision-maker preferences associated with the multi-objective framework. The framework focuses on the development of a holistic, flexible, and scalable decision support system, through a Dynamic Cross Prioritization Plot (DCPP), for identifying inventory enhancement opportunities among critically disrupted systems, applicable to different regions and disaster scenarios.
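The ripple effects the abstract describes are the core of inoperability input-output modeling, where direct disruptions c propagate through an interdependency matrix A to give total inoperability q = (I - A)⁻¹ c. A two-sector sketch with made-up interdependency and disruption values (the thesis's actual data come from the Bureau of Economic Analysis):

```python
# Inoperability input-output model: q = A q + c, solved as q = (I - A)^(-1) c.
# A captures inter-sector dependencies; c is the direct disruption to each sector.

def solve_2x2(A, c):
    """Solve (I - A) q = c for a 2x2 interdependency matrix A by Cramer's rule."""
    m = [[1 - A[0][0], -A[0][1]],
         [-A[1][0], 1 - A[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    q0 = (m[1][1] * c[0] - m[0][1] * c[1]) / det
    q1 = (m[0][0] * c[1] - m[1][0] * c[0]) / det
    return [q0, q1]

A = [[0.0, 0.3],   # sector 0 depends 30% on sector 1's output
     [0.2, 0.0]]   # sector 1 depends 20% on sector 0's output
c = [0.10, 0.00]   # a disaster directly disrupts 10% of sector 0's output
q = solve_2x2(A, c)
print(q)  # sector 1 suffers indirect inoperability despite no direct hit
```

The indirect loss in sector 1 (about 2.1% here) is exactly the ripple effect whose mitigation via finished-goods inventory the thesis optimizes.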
195

Multiple targets detection and tracking with Doppler effect in noisy acoustic wireless sensor networks

An, Youngwon Kim 18 May 2013 (has links)
This dissertation presents a new detection algorithm and a high-speed, high-accuracy tracker for tracking single and multiple ground vehicles in noisy acoustic wireless sensor networks (WSNs). For tracking ground vehicles, acoustic WSNs are regarded as useful and are increasingly available due to recent advances in low-cost micro-sensor technology. WSN applications require a low computational and communication load on each sensor node, as each node runs on a low-power battery.

The new detection algorithm of this dissertation naturally accounts for the Doppler effect, an important consideration for tracking higher-speed targets. The tracking system used in this study employs Kalman filtering (KF), with the weighted sensor position centroid used as the target position measurement. The weighted centroid makes the tracker independent of the detection model and renders it near-optimal, at least within the detection parameters used in the single-target study.

The new approach for single-target tracking contrasts with previous approaches that employed more sophisticated tracking algorithms with higher computational complexity and used a power-law detection model. The power-law detection model, which neglects the Doppler effect, is valid only for low-speed targets and is susceptible to mismatch with detection by the sensors in the field. The tracking model also makes it possible to study various environmental effects on track accuracy, such as the Doppler effect, signal collision, signal delay, and different sampling times. The tracking model is shown to be highly accurate for a single moving target in both linear and accelerated motion. The computing speed is estimated to be 50~100 times faster than the previous, more sophisticated methods, and track accuracy compares very favorably.

As WSN systems face multiple targets in the real world, the study has been expanded to multiple-target tracking, including environmental noise mitigation. Localizing and tracking multiple targets that undergo merging and splitting in noisy acoustic WSNs requires a new approach to detection and tracking. The Doppler effect is included in the detection model, and an efficient noise mitigation algorithm is developed. A new rule-based tracking algorithm is also developed, which guarantees reliable tracking of multiple targets in noisy acoustic WSNs with very low computational complexity and high track accuracy. The tracking system guarantees much lower computational complexity than, and track accuracy comparable to, the more sophisticated algorithms of previous work.
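The weighted-centroid measurement that the tracker feeds to the Kalman filter can be sketched as follows; the energy-based weight function is an illustrative assumption, since the abstract does not specify how the weights are computed:

```python
def weighted_centroid(sensors):
    """Estimate the target position as the centroid of the detecting
    sensors' positions, weighted by each sensor's received signal energy."""
    total = sum(e for _, _, e in sensors)
    x = sum(px * e for px, _, e in sensors) / total
    y = sum(py * e for _, py, e in sensors) / total
    return x, y

# (x, y, received_energy) for the sensors that detected the target
detections = [(0.0, 0.0, 1.0), (10.0, 0.0, 3.0), (0.0, 10.0, 1.0)]
print(weighted_centroid(detections))  # pulled toward the high-energy sensor
```

Because the measurement uses only sensor positions and received energies, it needs no knowledge of the detection model, which is the independence property the abstract highlights.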
196

Small Business Leadership and Organizational Culture, Job Satisfaction and Performance: Correlational Research

Frazier, Eugene 09 May 2013 (has links)
This study investigated the relationships among organizational leadership styles (criterion variables) and organizational culture, employee job satisfaction, and organizational performance (predictor variables). The research method was quantitative, using a correlational design to investigate the relationships among the variables. The data indicated that relationships existed between the criterion variable, transformational leadership, and the predictor variables, culture and employee job satisfaction. Correlation tests and regression analysis reflected the existence of significant relationships. The MLQ survey instrument facilitated data collection on leadership styles and on employee job satisfaction. In addition, the MLQ survey facilitated the collection of data concerning outcomes or behaviors of the leadership styles; outcomes included cultural artifacts, extra effort from employees, and employee job satisfaction. The amended survey also included supplementary demographic questions. Transformational and transactional leadership positively correlated with employee job satisfaction, which in turn correlated with organizational performance. The findings of this study supported the literature review findings. The results may illuminate the relationships among organizational leadership styles, organizational culture, and organizational job satisfaction and performance, thereby facilitating organizational success.
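The correlation tests at the heart of this design compute the Pearson product-moment coefficient between paired scores; a minimal sketch with entirely hypothetical Likert-scale data (the study's actual MLQ responses are not reproduced here):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical scores: transformational leadership vs. employee job satisfaction
leadership = [3.2, 4.1, 2.5, 4.8, 3.9]
satisfaction = [3.0, 4.3, 2.8, 4.6, 3.7]
print(round(pearson_r(leadership, satisfaction), 3))
```

A coefficient near +1 on such data is what the study reports qualitatively as a significant positive relationship; significance testing and regression would follow in practice.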
197

Statistical characterization of sensors for sensor fusion

Smith, Jeffrey David January 1996 (has links)
It is standard practice in advanced telerobotic systems to use a detailed model of the operating environment to plan tasks and drive simulations. How well these systems operate is highly dependent upon the accuracy of this model. As objects are moved by manipulators, inaccuracies in the world model will accumulate, progressively reducing the overall system utility. To compensate for errors in the world model, sensory information on the location of objects within the operating area must be used to drive world model updates. Every sensory process (e.g. laser range-finding) provides information of limited accuracy, and this accuracy function must be taken into account during the data fusion. We measured and characterized sensor inaccuracies for video cameras, laser range-finders and kinematic positioning information, and used them as the foundation of a sensor fusion system to produce more accurate position measurements of objects in the workspace. By studying the resulting fused sensor error characteristics, we show how sensor sub-systems for telerobotic systems can be designed, and how fused sensor information can be used to increase manipulator accuracy.
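Fusing measurements whose accuracies differ, as this abstract describes, is commonly done by inverse-variance weighting; a minimal sketch under that assumption (the specific variance values are illustrative, not the thesis's measured sensor characterizations):

```python
def fuse(measurements):
    """Fuse independent measurements of one quantity by inverse-variance
    weighting. Each measurement is (value, variance); the fused variance
    is smaller than any individual one."""
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)
    return fused_value, fused_var

# camera (noisy) and laser range-finder (accurate) estimates of one coordinate
camera = (10.4, 4.0)
laser = (10.0, 0.5)
value, var = fuse([camera, laser])
print(value, var)  # the estimate lies much closer to the laser reading
```

This is why characterizing each sensor's accuracy function matters: the weights, and hence the fused world-model update, follow directly from the measured error statistics.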
198

Code design and multiuser detection for code division multiple access systems with continuous phase modulation

Papasakellariou, Aristides January 1996 (has links)
The proliferation of wireless communications services combined with limited spectrum availability has made bandwidth utilization a major performance measure. Consequently, the technique for allocating bandwidth to multiple signals and the bandwidth occupied by each signal are issues of paramount importance. For voice and bursty data communications, code-division multiple access provides excellent bandwidth management. The objective of producing constant-envelope signals with compact spectral characteristics is most effectively accomplished using continuous phase modulation. The purpose of this study is to examine detection issues for signals that combine the above techniques. For a synchronous system, the reliable operation of a single-user receiver without power control requires spreading codes that exhibit minimal mutual interference. Signal memory is essential for good performance and precludes the existence of orthogonal codes. Code design is examined for two signal formats that offer different spectral and error rate characteristics. A recursive algorithm that provides the structure and maximum number of codes is presented for both signal formats. Moreover, the code performance is evaluated for an asynchronous system with power control. To avoid the performance limitations of the single-user receiver in the presence of interference, and the disadvantages of power control, multiuser detectors are considered for both synchronous and asynchronous systems. The optimum coherent multiuser detector is briefly analyzed, and its computational complexity is shown to be prohibitively large for practical applications. For this reason, the emphasis is placed on suboptimum detectors with linear complexity and near-optimum performance. The choice of an appropriate set of decision statistics is crucial for this objective, and conventional detectors, if applicable, perform poorly.
Two linear complexity detection methods that can be applied to both signal formats are proposed for each system. The individual code design to optimize the error rate for a specific receiver complexity is determined and substantial gains are achieved over antipodal signaling. Moreover, the spectral and error rate performance are largely independent and impressive capacity improvements are obtained over conventional systems for a modest increase in the complexity of the receiver.
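The classic example of a linear-complexity multiuser detector is the decorrelating detector, which applies the inverse of the code cross-correlation matrix to the matched-filter outputs before slicing; the abstract does not name its two proposed detectors, so the following two-user sketch (with made-up cross-correlation and amplitudes) illustrates the general idea rather than the thesis's specific designs:

```python
def decorrelate_2users(y, R):
    """Decorrelating detector for 2 synchronous users: b_hat = sign(R^-1 y),
    where R is the normalized code cross-correlation matrix."""
    det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
    z0 = (R[1][1] * y[0] - R[0][1] * y[1]) / det
    z1 = (R[0][0] * y[1] - R[1][0] * y[0]) / det
    return [1 if z > 0 else -1 for z in (z0, z1)]

rho = 0.7                    # strong cross-correlation between the two codes
b = [+1, -1]                 # transmitted bits
A = [1.0, 3.0]               # user 2 is much stronger (a near-far situation)
# noiseless matched-filter outputs: y_k = A_k * b_k + rho * A_j * b_j
y = [A[0] * b[0] + rho * A[1] * b[1],
     A[1] * b[1] + rho * A[0] * b[0]]
R = [[1.0, rho], [rho, 1.0]]
print(decorrelate_2users(y, R))  # recovers [1, -1] despite the stronger interferer
```

Note that a conventional detector, sign(y), would decide [-1, -1] here and lose user 1 to the interference, which is the single-user-receiver limitation the abstract points to.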
199

System identification for robust control

Zhang, Huipin January 1998 (has links)
In the design of a robust control system, one needs a nominal model together with a quantitative bound on the uncertainty that results from under-modeling and disturbances. In this thesis we do not intentionally seek a nominal model and a quantitative bound; instead, the uncertainty is directly parameterized so that the resulting uncertain model family can be characterized by means of a real parameter vector of at most unit length. This is an innovative approach to control-oriented system identification, since it is not in accordance with the general philosophy of robust identification. However, it is applicable to the robust synthesis problem by taking advantage of a convex parameterization of robust controllers that simultaneously stabilize the uncertain models in the family. The robust performance problem becomes tractable since it can be converted into a quasi-convex optimization problem with Linear Matrix Inequality (LMI) constraints. The relation between optimal robust performance and the uncertainty is studied by analyzing explicit bounds on the maximal robust margin. Model (in)validation is a complement to system identification; in our approach it is an integral ingredient of the process of obtaining robust control-oriented system models. A single model is not invalidated if it is inside the ellipsoid, and thus the intersection of the ellipsoids is not invalidated. In order to make the unfalsified model set (the intersection) fit in our framework, we can compute an optimal ellipsoid bounding the intersection of the ellipsoids. (Abstract shortened by UMI.)
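The (in)validation test described above reduces to ellipsoid membership: a candidate parameter vector survives if it lies inside every data set's ellipsoid, i.e. in their intersection. A minimal sketch of that check, with illustrative centers and shape matrices (the thesis derives these ellipsoids from data, which this sketch does not attempt):

```python
def inside_ellipsoid(theta, center, P):
    """True if (theta - c)^T P (theta - c) <= 1 for a 2-D ellipsoid
    with center c and positive-definite shape matrix P."""
    d = [theta[0] - center[0], theta[1] - center[1]]
    q = (d[0] * (P[0][0] * d[0] + P[0][1] * d[1])
         + d[1] * (P[1][0] * d[0] + P[1][1] * d[1]))
    return q <= 1.0

# two data sets, each defining an ellipsoid of unfalsified models
e1 = ([0.0, 0.0], [[1.0, 0.0], [0.0, 4.0]])
e2 = ([0.5, 0.0], [[2.0, 0.0], [0.0, 1.0]])
theta = [0.3, 0.1]
# theta is not invalidated only if it lies inside every ellipsoid (the intersection)
print(all(inside_ellipsoid(theta, c, P) for c, P in (e1, e2)))
```

Since the intersection of ellipsoids is generally not itself an ellipsoid, the thesis bounds it by an optimal outer ellipsoid so the unfalsified set fits the unit-parameter-vector framework.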
200

Nonlinear multicriteria optimization and robust optimality

Das, Indraneel January 1997 (has links)
This dissertation attempts to address two important problems in systems engineering, namely multicriteria optimization and robustness optimization. In fields ranging from engineering to the social sciences, designers are very often required to make decisions that attempt to optimize several criteria or objectives at once. Mathematically this amounts to finding the Pareto optimal set of points for these constrained multiple criteria optimization problems, which happen to be nonlinear in many realistic situations, particularly in engineering design. Traditional techniques for nonlinear multicriteria optimization suffer from various drawbacks. The popular method of minimizing weighted sums of the multiple objectives suffers from the deficiency that choosing an even spread of 'weights' does not yield an even spread of points on the Pareto surface, and further, this spread is often quite sensitive to the relative scales of the functions. A continuation/homotopy-based strategy for tracing out the Pareto curve tries to make up for this deficiency, but unfortunately requires exact second-derivative information and, in general, cannot be applied to problems with more than two objectives. Another technique, goal programming, requires prior knowledge of feasible goals, which may not be easily available for more than two objectives. Normal-Boundary Intersection (NBI), a new technique introduced in this dissertation, overcomes all of the difficulties inherent in the existing techniques by introducing a better parametrization of the Pareto set. It is rigorously proved that NBI is completely independent of the relative scales of the functions and is quite successful in producing an evenly distributed set of points on the Pareto set given an evenly distributed set of 'NBI parameters' (comparable to the 'weights' in minimizing weighted sums of objectives).
Further, this method can easily handle more than two objectives while retaining the computational efficiency of continuation-type algorithms, which is an improvement over homotopy techniques for tracing the trade-off curve. Various aspects of NBI including computational issues and its relationships with minimizing convex combinations and goal programming are discussed in this dissertation. Finally some case studies from engineering disciplines are performed using NBI. The other facet of this dissertation deals with robustness optimization, a concept useful in quantifying the stability of an optimum in the face of random fluctuations in the design variables. This robustness optimization problem is presented as an application of multicriteria optimization since it essentially involves the simultaneous minimization of two criteria, the objective function value at a point and the dispersion in the function values in a neighborhood of the point. Moreover, a formulation of the robustness optimization problem is presented so that it fits the framework of constrained, nonlinear optimization problems, which is an improvement on existing formulations that deal with either unconstrained nonlinear formulations or constrained linear formulations.
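The weighted-sum deficiency that motivates NBI is easy to observe in a tiny experiment: for a bi-objective problem whose objectives have different scales, an even spread of weights gives a badly uneven spread of Pareto points. A minimal sketch on a discretized one-variable design space (the two objective functions are illustrative, not from the dissertation's case studies):

```python
# Minimize w*f1 + (1-w)*f2 for evenly spaced weights w and observe that the
# resulting Pareto points are NOT evenly spread when f1 and f2 have
# different scales -- the deficiency NBI was designed to overcome.

def f1(x):
    return x ** 2

def f2(x):
    return 10.0 * (x - 1.0) ** 2   # deliberately 10x the scale of f1

xs = [i / 1000.0 for i in range(1001)]       # discretized design space [0, 1]
weights = [i / 10.0 for i in range(11)]      # evenly spaced weights
pareto_x = [min(xs, key=lambda x: w * f1(x) + (1 - w) * f2(x)) for w in weights]
gaps = [round(a - b, 3) for a, b in zip(pareto_x, pareto_x[1:])]
print(gaps)  # gaps between consecutive Pareto points are far from uniform
```

The minimizers cluster tightly near one end of the trade-off curve and leave a huge gap at the other; rescaling f2 reshuffles the spread entirely, whereas NBI's parametrization is proved independent of such scaling.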
