About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
281

A predictable real-time system for control and instrumentation

Martel, Sylvain. January 1996 (has links)
Many research and commercial environments need to measure fast, time-varying signals and to control subsystems (e.g., actuators) in real time with high bandwidth and minimum latency. Existing commercial systems for data acquisition and control have many shortcomings and behave in a non-deterministic manner. Determinism, or predictability, is a key element of high-performance real-time systems, which must always meet specific deadlines under tight synchronization. In this thesis, a new approach to very high-performance, predictable real-time acquisition and control is proposed, theoretically analyzed, implemented in hardware, and experimentally tested. The resulting system is highly adaptable and reconfigurable, and has been applied to a number of problem areas, including micro-robot control via a high-performance parallel computer architecture and cardiac electropotential mapping. Indeed, the resulting cardiac mapping system is, so far as we know, the highest-resolution system produced to date.
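As background on what "always meeting deadlines" entails, one standard textbook way to reason about predictability for periodic tasks is the Liu and Layland rate-monotonic utilization bound. The sketch below is a hedged illustration only; the task set is invented and this is not the thesis's own hardware mechanism.

```python
# Rate-monotonic schedulability check (Liu & Layland bound): a classic test
# for whether periodic tasks always meet their deadlines. Illustrative only;
# the task parameters below are invented, not taken from the thesis.
tasks = [
    # (worst-case execution time, period) in microseconds
    (50, 250),    # e.g. ADC sampling task
    (120, 1000),  # e.g. actuator control loop
    (300, 5000),  # e.g. logging / host communication
]

utilization = sum(c / t for c, t in tasks)
n = len(tasks)
bound = n * (2 ** (1 / n) - 1)

print(f"U = {utilization:.3f}, RM bound = {bound:.3f}")
if utilization <= bound:
    print("Guaranteed schedulable under rate-monotonic priorities.")
else:
    print("Bound inconclusive; an exact response-time analysis would be needed.")
```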
282

Use of Multi-Fidelity and Surrogate Models to Reduce the Cost of Developing Physics-Based Systems

Hebert, James L. 10 April 2015 (has links)
Building complex physics-based systems in a timely, cost-effective manner, such that they perform well, meet diverse user needs, and exhibit no bad emergent behaviors, is a challenge. To meet these requirements, the solution is to model the physics-based system before building it. Modeling and simulation capabilities for these types of systems have advanced continuously during the past 20 years thanks to progress in the application of high-fidelity computational codes that are able to model the real physical performance of system components. The problem is that it is often too time-consuming and costly to model complex systems end-to-end using these high-fidelity computational models alone. Missing are good approaches to segment the modeling of complex system performance and behaviors, keep the model chain coherent, and model only what is necessary. Current research efforts have shown that using multi-fidelity and/or surrogate models may offer alternative methods of performing the modeling and simulation needed to design and develop physics-based systems more efficiently. This study demonstrates that it is possible to reduce the number of high-fidelity runs, allowing the use of classical systems engineering analyses and tools that would not be possible if only high-fidelity codes were employed. The study advances the systems engineering of physics-based systems by reducing the number of time-consuming high-fidelity models and simulations that must be used to design and develop the systems. It produced a novel approach to the design and development of complex physics-based systems using a mix of variable-fidelity physics-based models and surrogate models, and it shows that this combination of increasing-fidelity models enables computationally and cost-efficient modeling and simulation of these complex systems and their components. The study presents an example of the methodology for the analysis and design of two physics-based systems: a Ground Penetrating Radar (GPR) and a Nuclear Electromagnetic Pulse Bounded Wave System.
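The abstract does not give the author's specific model chain; as a hedged illustration of the general surrogate idea, the sketch below fits a cheap radial-basis surrogate to a handful of expensive "high-fidelity" evaluations and then queries the surrogate many times in place of further expensive runs. The toy high-fidelity function, sample counts, and kernel width are illustrative assumptions, not the dissertation's models.

```python
import numpy as np

def high_fidelity(x):
    # Stand-in for an expensive physics code (illustrative assumption only).
    return np.sin(3 * x) + 0.5 * x**2

# A few expensive high-fidelity samples across the design space.
x_train = np.linspace(0.0, 2.0, 6)
y_train = high_fidelity(x_train)

def fit_rbf_surrogate(x, y, eps=2.0):
    # Solve for Gaussian radial-basis-function weights: Phi @ w = y.
    phi = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)
    return np.linalg.solve(phi, y)

def rbf_predict(x_new, x, w, eps=2.0):
    phi = np.exp(-(eps * (x_new[:, None] - x[None, :])) ** 2)
    return phi @ w

w = fit_rbf_surrogate(x_train, y_train)

# Many cheap surrogate evaluations replace further high-fidelity runs,
# e.g. to scan the design space for a minimum.
x_dense = np.linspace(0.0, 2.0, 201)
y_hat = rbf_predict(x_dense, x_train, w)
print("surrogate minimum near x =", x_dense[np.argmin(y_hat)])
```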
283

From waste to resource: a systems-based approach to sustainable community development through equitable enterprise and agriculturally-derived polymeric composites

Teipel, Elisa 23 October 2014 (has links)
Rural communities in developing countries are among the most vulnerable to the plight of requiring repeated infusions of charitable aid over time. Micro-business opportunities that effectively break the cycle of poverty in resource-rich countries in the developing world are limited. However, a strong model for global commerce can break the cycle of donor-based economic supplements and limited local economic growth. Sustainable economic development can materialize when a robust framework combines engineering with the generous investment of profits back into the community. This research presents a novel, systems-based approach to sustainable community development in which a waste-to-resource methodology catalyzes the disruption of rural poverty.
The framework developed in this thesis was applied to the rural communities of Cagmanaba and Badian, Philippines. An initial assessment of these communities showed that community members are extremely poor, but they possess an abundant natural resource: coconuts. The various parts of the coconut offer excellent potential value in global commerce. Today the sale of coconut water is on the rise, and coconut oil is an established $3 billion annual market that is also growing rapidly.
Since these current industries harvest only two parts of the coconut (meat and water), the 50 billion coconuts that grow annually leave behind approximately 100 billion pounds of coconut shell and husk as agricultural waste. Coconuts thus provide an opportunity to create and test a waste-to-resource model. Intensive materials analysis, research, development, and optimization proved that coconut shell, currently burned as a fuel or discarded as agricultural waste, can be manufactured into high-grade coconut shell powder (CSP), which can serve as a viable filler in polymeric composites.
This framework was modeled and tested as a case study in a manufacturing facility known as a Community Transformation Plant (CTP) in Cagmanaba, Philippines. The CTP enables local creation of globally viable products from agricultural waste. This researcher seeks to encourage the propagation of CTPs throughout developing communities worldwide, each profiting from its own waste-to-resource value.
284

Using social media content to inform agent-based models for humanitarian crisis response

Wise, Sarah 21 August 2014 (has links)
Crisis response is a time-sensitive problem with multiple concurrent and interacting subprocesses, applied around the world in a wide range of contexts and with access to varying levels of resources. The movement of individuals, with their shifting patterns of need and, frequently, disrupted normal support systems, poses challenges to responders trying to understand what is needed, where, and when. Unfortunately, crises frequently occur in parts of the world that lack the infrastructure to respond to them and the information to tell responders where to target their efforts. In light of these challenges, researchers can make use of new data sources and technologies, combining the resulting information products with simulation techniques to gain knowledge of the situation and to explore the various ways in which a crisis may develop. These new data sources, including social media such as Twitter and volunteered geographic information (VGI) from groups such as OpenStreetMap, can be combined with authoritative data sources to create rich, synthetic datasets, which may in turn be subjected to processes such as sentiment analysis and social network analysis. Further, these datasets can be transformed into information that supports powerful agent-based models (ABMs). Such models can capture the behavior of heterogeneous individuals and their decision-making processes, allowing researchers to explore the emergent dynamics of crisis situations. To that end, this research explores the gathering, cleaning, and synthesis of diverse data sources as well as the information that can be extracted from such synthetic data sources. Further, the work presents a rich, behaviorally complex agent-based model of an evacuation effort. The case study deals with the 2012 Colorado Wildfires, which threatened the city of Colorado Springs and prompted the evacuation of over 28,000 persons over the course of four days. The model itself explores how a synthetic population with automatically generated synthetic social networks communicates about and responds to the developing crisis, utilizing real evacuation order information as well as a model of wildfire development to which the individual agents respond. This research contributes to the study of data synthesis, agent-based modeling, and crisis development.
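The dissertation's model is far richer than can be shown here, but a minimal sketch of the underlying agent-based idea, agents embedded in a synthetic social network deciding to evacuate as warnings spread, might look like the following. The network structure, warning thresholds, and seeding schedule are illustrative assumptions, not the study's data.

```python
import random

random.seed(42)

class Resident:
    """A household agent that evacuates once enough of its contacts have warned it."""
    def __init__(self, agent_id, threshold):
        self.id = agent_id
        self.threshold = threshold      # warnings needed before evacuating
        self.warnings = 0
        self.evacuated = False
        self.contacts = []              # synthetic social network links

    def receive_warning(self):
        self.warnings += 1

    def step(self):
        # Evacuate and pass the warning on to social contacts.
        if not self.evacuated and self.warnings >= self.threshold:
            self.evacuated = True
            for contact in self.contacts:
                contact.receive_warning()

# Build a small synthetic population with random social ties.
population = [Resident(i, threshold=random.randint(1, 3)) for i in range(200)]
for agent in population:
    agent.contacts = random.sample(population, k=5)

# An official evacuation order seeds warnings in part of the population.
for agent in random.sample(population, k=20):
    agent.receive_warning()

for tick in range(10):
    for agent in population:
        agent.step()
    print(f"hour {tick}: {sum(a.evacuated for a in population)} households evacuated")
```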
285

Systems Engineering Knowledge Asset (SEKA) Management for Higher Performing Engineering Teams: People, Process and Technology toward Effective Knowledge-Workers

Shelby, Kenneth R., Jr. 22 March 2014 (has links)
Systems engineering teams' value creation for enterprises is slower than possible due to inefficiencies in communication, learning, common-knowledge collaboration, and leadership conduct. This dissertation outlines the surrounding people, process, and technology dimensions for higher-performing engineering teams. It describes a true-experiment investigation of opportunities to improve communication, learning, and common-knowledge collaboration.
The art and practice of systems engineering contributes business value by orchestrating large numbers of knowledge workers, as engineering teams, in the achievement of complex goals. During the creation of new systems, engineering team performance modulates the business efficiency with which those complex goals are realized. Higher-performing engineering teams share a vision that provides purpose, rely on personal knowledge combined with their collaborators' knowledge to unleash potential, leverage common knowledge in their team mental models, and execute synergistically. Why do non-high-performing teams exist? Culture change is hard, and humans prefer the familiar; without leadership and systematic enablement, teams usually do not naturally acquire the traits of high-performing teams.
This research investigates a unique information-technology-based Systems Engineering Knowledge Asset (SEKA) management mechanism. The selected mechanism integrates multiple techniques for improved collaboration efficacy. The research methodology was a modified true experiment with dual posttest only, using an A group and a B group for comparative control. Research findings reflect, with 99% confidence, that SEKA represented in 3-way Multiple Informational Representations Required of Referent (MIRRoR) knowledge constructs improves systems engineering teams' consumption of a common knowledge base.
Engineering teams can consume a set of information, and thereby build knowledge in common with their collaborators, in a shorter period. More common knowledge facilitates an increased ability to collaborate, and increased collaboration accelerates team learning, leading to shorter systems delivery schedules, lower cost to produce, and earlier actionable intelligence. Shorter delivery times increase customer satisfaction, lower costs improve profit margin potential, and earlier actionable intelligence supports "left of boom" intervention.
286

A Risk Analysis Tool for Evaluating ROI of TRA for Major Defense Acquisition Programs

Bailey, Reginald U. 22 January 2015 (has links)
The U.S. DoD budget has grown to over half a trillion dollars annually. Unfortunately, the majority of these acquisitions do not satisfy their initial objectives in terms of cost, schedule, and technical performance. The U.S. DoD attributes these shortfalls in part to the use of immature technologies within these programs. In the early 2000s, the U.S. DoD endorsed, and later mandated, the use of Technology Readiness Assessments (TRAs) and knowledge-based practices as a tool for managing program acquisition risk. Unfortunately, the expense of implementing TRAs can be significant, especially when programs include knowledge-based practices such as prototyping, performance specifications, test plans, and technology maturity plans. What has been the economic impact of these TRA practices on the acquisition performance of the U.S. Army, Navy, and Air Force? The conundrum that exists today is that there is no commonly accepted approach for determining the economic value of TRAs. This study provides a model for the valuation of TRAs in assessing the risk of technical maturity. It provides a framework to evaluate the economic benefits of performing Technology Readiness Assessments on acquisition performance, using cost and technology maturity risks to derive economic benefits, which can then be input into valuation techniques such as benefit/cost ratio, return on investment percentage, net present value, and real options analysis.
(Keywords: TRA, Knowledge-Based Acquisition, B/CR, ROI%, NPV, ROA)
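As a hedged illustration of the valuation techniques named above (the cash-flow figures are invented and not drawn from the study), the sketch below computes a benefit/cost ratio, ROI percentage, and net present value for a hypothetical TRA investment.

```python
# Hypothetical figures for illustration only (not from the dissertation).
tra_cost = 2.0e6                                   # up-front cost of performing the TRA
annual_benefits = [0.8e6, 1.2e6, 1.5e6, 1.5e6]     # avoided rework / risk reduction per year
discount_rate = 0.07

# Benefit/cost ratio and simple ROI on undiscounted totals.
total_benefit = sum(annual_benefits)
bcr = total_benefit / tra_cost
roi_pct = 100 * (total_benefit - tra_cost) / tra_cost

# Net present value discounts each year's benefit back to today.
npv = -tra_cost + sum(
    b / (1 + discount_rate) ** (year + 1)
    for year, b in enumerate(annual_benefits)
)

print(f"B/CR = {bcr:.2f}, ROI = {roi_pct:.0f}%, NPV = ${npv:,.0f}")
```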
287

An object-oriented approach to distributed network management

Torrente, Salvatore January 1995 (has links)
Network management is concerned with monitoring, controlling and coordinating network elements for reliable end-to-end customer services. This thesis presents an object-oriented approach to distributed network management of heterogeneous network elements across multiple telecommunication network service provider domains. Specifically, we present a prototype network management approach using distributed object database management systems as a repository and manager of a standard network information model, and we present the results of object interaction across distributed object-oriented databases. In particular, we provide data which illustrates the advantages of an active and dynamic network management environment over static management information bases for fast and efficient telecommunications Operations, Administration, Maintenance and Provisioning (OAM&P) of end-to-end network services. Finally, we expand on distributed object-oriented systems and their role in future information networking architectures.
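The abstract does not detail the information model itself; as a hedged sketch of the object-oriented idea it contrasts with static management information bases, the Python fragment below models network elements as managed objects that push state-change notifications to subscribed managers. The class names, attributes, and event format are illustrative assumptions, not the thesis's model.

```python
class ManagedObject:
    """Base class for elements in a simple network information model."""
    def __init__(self, name):
        self.name = name
        self.observers = []           # managers subscribed to state changes

    def subscribe(self, callback):
        self.observers.append(callback)

    def notify(self, event):
        # Push events to managers instead of waiting for them to poll a static MIB.
        for callback in self.observers:
            callback(self.name, event)


class NetworkElement(ManagedObject):
    """A managed network element exposing an operational state."""
    def __init__(self, name):
        super().__init__(name)
        self.operational_state = "enabled"

    def set_state(self, state):
        self.operational_state = state
        self.notify({"operationalState": state})


def alarm_handler(element_name, event):
    # A manager-side callback reacting to state changes end to end.
    print(f"ALARM from {element_name}: {event}")


switch = NetworkElement("edge-switch-01")
switch.subscribe(alarm_handler)
switch.set_state("disabled")
```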
288

Building a systems level theory of IS integration in mergers and acquisitions

Reinicke, Bryan Alan, January 2007 (has links)
Thesis (Ph.D.)--Indiana University, Kelley School of Business, 2007. / Source: Dissertation Abstracts International, Volume: 68-09, Section: A, page: 3956. Adviser: Carol V. Brown. Title from dissertation home page (viewed May 5, 2008).
289

Autonomous rendezvous in Martian orbit using the nonlinear methods of sliding-mode control and feedback linearization.

Lafleur, Jean-Roch. Unknown Date (has links)
Thesis (M.Sc.A.)--Université de Sherbrooke (Canada), 2008. / Title from title screen (viewed February 1, 2007). In ProQuest Dissertations and Theses. Also published in print.
290

An integrated trusted processes framework for consumer-facing B2B networks

Doshi, Chintan January 2008 (has links)
With the advent of the Internet, cooperating enterprises are increasingly sharing consumer data in order to deliver consumers information-rich online experiences with value-added services. At the same time, technological advances in web services standards based on a Service Oriented Architecture (SOA) design have enabled heterogeneous Business-to-Business (B2B) integration between enterprises. End-to-end business processes that span multiple enterprises can be developed using the Business Process Execution Language (BPEL). However, such processes must address issues that normally do not arise for processes within a single enterprise. A framework is needed that supports a SOA-enabled business process management approach but also has the technical infrastructure to address issues related to identity, privacy, compliance, monitoring, and people interaction. In this thesis, we accomplish this using a framework that supports the design, implementation, and management of trusted B2B processes defined using the BPEL standard and deployed into a Circle of Trust architecture as specified by the Liberty Alliance federated identity standards. A key contribution of our thesis is the extension of the Circle of Trust architecture with a new entity that introduces components to help businesses manage their B2B processes. Two use-case scenarios involving information-rich B2B processes were implemented to evaluate our framework.
