161

On nanoferroelectric domain structures and distributions of defects in ferroelectrics

Hong, Liang, 洪亮 January 2010 (has links)
published_or_final_version / Mechanical Engineering / Doctoral / Doctor of Philosophy
162

On the hunt for willing sellers : the U.S. Army's land acquisition process

Fitzsimmons, Michael James 17 November 2010 (has links)
To maintain high levels of proficiency and readiness, the U.S. Army trains its soldiers on military bases across the country. However, the Army currently possesses insufficient land on which to train, necessitating the expansion of current bases. This paper explores the Army's land acquisition policies, using as case studies the ongoing expansions at Fort Carson in Colorado and Fort Polk in Louisiana. Fort Carson, which announced expansion plans in 2006, faced strong opposition, and the project has ground to a halt. In early 2009 Fort Polk announced a 100,000-acre expansion; it has employed a broad public outreach program and promised on numerous occasions not to use eminent domain to acquire privately held land. As a result, the Polk expansion has proceeded much more smoothly. Drawing on lessons learned from the pair of case studies, this paper then presents a list of best practices the Army can use for future land acquisition projects. / text
163

Numerical Study of Coherent Structures within a legacy LES code and development of a new parallel Framework for their computation

Giammanco, Raimondo R 22 December 2005 (has links)
The understanding of the physics of Coherent Structures and their interaction with the remaining fluid motions is of paramount interest in turbulence research. Indeed, it has recently been suggested that separating and understanding the different physical behavior of Coherent Structures and the "incoherent" background might very well be the key to understanding and predicting turbulence. Available understanding of Coherent Structures shows that their size is considerably larger than the turbulent macro-scale, making the application of Large Eddy Simulation to their simulation and study permissible, with the advantage of being able to study their behavior at higher Re and in more complex geometries than a Direct Numerical Simulation would normally allow. The original purpose of the present work was therefore the validation of the use of Large Eddy Simulation for the study of Coherent Structures in shear layers, and its application to different flow cases to study the effect of the flow topology on the nature of the Coherent Structures. However, during the investigation of the presence of Coherent Structures in numerically generated LES flow fields, the aging in-house Large Eddy Simulation (LES) code of the Environmental & Applied Fluid Dynamics Department showed a series of limitations and shortcomings that led to the decision to relegate it to the status of a legacy code (from now on indicated as the VKI LES legacy code) and to discontinue its development. A new natively parallel LES solver has since been developed in the VKI Environmental & Applied Fluid Dynamics Department, in which all the shortcomings of the legacy code have been addressed and modern software technologies have been adopted both for the solver and the surrounding infrastructure, delivering a complete framework based exclusively on Free and Open Source Software (FOSS) to maximize portability and avoid any dependency on commercial products.
The new parallel LES solver retains some basic characteristics of the old legacy code to provide continuity with the past (finite differences, staggered grid arrangement, multi-domain technique, grid conformity across domains), but improves on almost all the remaining aspects: the flow can now have all three directions of inhomogeneity, against only two in the past; the pressure equation can be solved using a three-point stencil for improved accuracy; and the viscous and convective terms can be computed using the Computer Algebra System Maxima to derive discretized formulas in an automatic way. For the convective terms, High Resolution Central Schemes have been adapted to the three-dimensional staggered grid arrangement from a collocated two-dimensional one, and a system of Master-Slave simulations has been developed to run in parallel a Slave simulation (on 1 Processing Element) that generates the inlet data for the Master simulation (n - 1 Processing Elements). The code can perform automatic run-time load balancing and domain auto-partitioning, has embedded documentation (Doxygen), and has a CVS repository (version management) for the convenience of new and old developers. As part of the new framework, a set of Visual Programs has been provided for IBM Open Data eXplorer (OpenDX), a powerful FOSS flow visualization and analysis tool, intended as a replacement for the commercial Tecplot™, together with a bug-tracking mechanism via Bugzilla and cooperative forum resources (phpBB) for developers and users alike. The new M.i.O.m.a. (MiOma) solver is ready to be used for Coherent Structures analysis in the near future.
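The high-resolution central schemes adapted in this work can be illustrated in their simplest setting. Below is a minimal one-dimensional sketch of a semi-discrete central-upwind step with a minmod slope limiter for a scalar conservation law; the 1-D periodic grid, the Burgers flux, and all function names are illustrative assumptions for this listing, not the solver's actual three-dimensional staggered-grid discretization:

```python
def minmod(a, b):
    """Slope limiter: returns 0 at extrema, else the smaller-magnitude slope."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def kt_step(u, dx, dt, flux=lambda q: 0.5 * q * q):
    """One explicit Euler step of a semi-discrete central-upwind scheme for
    u_t + f(u)_x = 0 on a periodic 1-D grid (Burgers flux by default)."""
    n = len(u)
    # Limited slopes in each cell
    s = [minmod(u[(i + 1) % n] - u[i], u[i] - u[i - 1]) for i in range(n)]

    def num_flux(i):
        # Reconstructed states on each side of the face i + 1/2
        ul = u[i] + 0.5 * s[i]
        ur = u[(i + 1) % n] - 0.5 * s[(i + 1) % n]
        a = max(abs(ul), abs(ur))  # local wave speed (f'(u) = u for Burgers)
        return 0.5 * (flux(ul) + flux(ur)) - 0.5 * a * (ur - ul)

    F = [num_flux(i) for i in range(n)]
    # Conservative update: cell average changes by the flux difference
    return [u[i] - dt / dx * (F[i] - F[i - 1]) for i in range(n)]
```

Because the update is written in flux-difference form, the scheme conserves the discrete integral of u exactly on a periodic grid, which is the property that makes central schemes attractive for capturing sharp gradients without a characteristic decomposition.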
164

The development and characterisation of a conformal FDTD method for oblique electromagnetic structures

Hao, Yang January 1998 (has links)
No description available.
165

A modelling approach to individualised computer aided learning for geometric design

Abbas, Ayman January 1997 (has links)
No description available.
166

An investigation into support for early human computer interaction design activities

Stone, Deborah K. January 2001 (has links)
No description available.
167

Investigation of high-speed optical transmission in the presence of nonlinearities

Thiele, Hans Joerg January 2000 (has links)
No description available.
168

Activation and silencing of α globin expression

Tufarelli, Cristina January 2000 (has links)
No description available.
169

網際網路資源標識與位址解析技術研究 / Research on Internet Resource Identifiers and Address Resolution Technologies

黃勝雄, Huang, Kenny Unknown Date (has links)
The Internet is an information space comprising resources, network protocols, resource identifiers, and addresses. The resources include not only physical resources but also virtual resources such as information services. How to build an efficient identifier-mapping mechanism and discover these resources is crucial for the efficient use of all kinds of resources. The DNS is the most prevalent means of initiating a network transaction; it is the core of the Internet and effectively forwards messages to their desired destinations. With the growth of the Internet, more and more applications and services are being created with new identifier requirements, and the DNS simply is not capable of identifying and resolving these resources. This study describes DNS technology and its disadvantages, and evaluates several technical approaches. It proposes a new Partial Match mechanism for resource naming and addressing, which improves on the DNS service features and supports the demand for advanced resource identification and resolution. The main contributions of this research are: 1. Research on the equivalence of simplified and traditional Chinese characters, with a proposed solution based on the IDN-Admin algorithm to implement the equivalence of simplified Chinese (SC) and traditional Chinese (TC) characters, and the design of a TC/SC conversion algorithm. 2. Evaluation of a directory service model within the DNS infrastructure. 3. A proposed internationalized domain name (IDN) administrative guideline for managing locality IDN implementation and operation. 4. A proposed Partial Match mechanism and multifaceted model; the Partial Match mechanism enhances naming service features and integrates backward with the existing DNS infrastructure. 5. An exploration of business considerations for multilingual name identifiers.
170

Domain adaptation for classifying disaster-related Twitter data

Sopova, Oleksandra January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Doina Caragea / Machine learning is the subfield of artificial intelligence that gives computers the ability to learn without being explicitly programmed, as it was defined by Arthur Samuel, the American pioneer in the fields of computer gaming and artificial intelligence who was born in Emporia, Kansas. Supervised machine learning is focused on building predictive models given labeled training data. Data may come from a variety of sources, for instance, social media networks. In our research, we use Twitter data, specifically user-generated tweets about disasters such as floods, hurricanes, terrorist attacks, etc., to build classifiers that could help disaster management teams identify useful information. A supervised classifier trained on data (training data) from a particular domain (i.e., disaster) is expected to give accurate predictions on unseen data (test data) from the same domain, assuming that the training and test data have similar characteristics. Labeled data is not easily available for a current target disaster. However, labeled data from a prior source disaster is presumably available, and can be used to learn a supervised classifier for the target disaster. Unfortunately, the source disaster data and the target disaster data may not share the same characteristics, and the classifier learned from the source may not perform well on the target. Domain adaptation techniques, which use unlabeled target data in addition to labeled source data, can be used to address this problem. We study single-source and multi-source domain adaptation techniques, using the Naïve Bayes classifier. Experimental results on Twitter datasets corresponding to six disasters show that domain adaptation techniques improve the overall performance as compared to basic supervised learning classifiers.
Domain adaptation is crucial for many machine learning applications, as it enables the use of unlabeled data in domains where labeled data is not available.
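The single-source setting described above can be sketched as a multinomial Naïve Bayes classifier trained on labeled source-disaster tweets and then adapted with one round of self-training on unlabeled target tweets. The whitespace tokenization, the single self-training pass, and all names here are simplifying assumptions for illustration, not the thesis's actual algorithm:

```python
from collections import Counter
import math

class NaiveBayes:
    """Multinomial Naïve Bayes with Laplace (add-one) smoothing."""

    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.prior = {c: labels.count(c) / len(labels) for c in self.classes}
        self.counts = {c: Counter() for c in self.classes}
        for doc, y in zip(docs, labels):
            self.counts[y].update(doc.split())
        self.vocab = {w for c in self.classes for w in self.counts[c]}
        return self

    def predict(self, doc):
        def log_posterior(c):
            total = sum(self.counts[c].values())
            return math.log(self.prior[c]) + sum(
                math.log((self.counts[c][w] + 1) / (total + len(self.vocab)))
                for w in doc.split())
        return max(self.classes, key=log_posterior)

def self_train(source_docs, source_labels, target_docs):
    """Label the unlabeled target tweets with a source-trained model,
    then retrain on the union of source data and pseudo-labeled target data."""
    clf = NaiveBayes().fit(source_docs, source_labels)
    pseudo = [clf.predict(d) for d in target_docs]
    return NaiveBayes().fit(source_docs + target_docs, source_labels + pseudo)
```

The pseudo-labeling step is what lets the unlabeled target data shift the word statistics toward the target disaster's vocabulary, which is the core intuition behind the domain adaptation techniques the abstract evaluates.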
