381

Study of Complementary Coded MIMO-CDMA Systems and Design of Columnwise Complementary Codes

Chen, Guan-ting 02 September 2005 (has links)
In this thesis, we design a new class of orthogonal complementary codes, called columnwise complementary codes. Their advantages are: (1) Like traditional orthogonal complementary codes, they possess perfect complementary auto-correlation and cross-correlation properties. (2) Different types of columnwise complementary codes can be chosen to suppress frequency-selective fading and time-selective fading. (3) Compared with complete complementary codes, they can support more users. (4) The generating method for columnwise complementary codes is easy to understand. (5) They can be used in multi-rate transmission systems. (6) Generality: complete complementary codes and 2D OVSF codes are subsets of columnwise complementary codes. We also evaluate the bit error rate in a frequency-selective Rayleigh fading channel and a time-selective fading channel by computer simulation.
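As context for the complementary property claimed in item (1), the sketch below checks the flock-wise sum of aperiodic auto-correlations for a standard Golay complementary pair; the pair, the function names, and the test loop are illustrative assumptions, not the thesis's columnwise construction.

```python
import numpy as np

def aperiodic_correlation(a, b, shift):
    """Aperiodic correlation of sequences a and b at a given integer shift."""
    n = len(a)
    if shift >= 0:
        return sum(a[i] * b[i + shift] for i in range(n - shift))
    return aperiodic_correlation(b, a, -shift)

def complementary_correlation(flock_x, flock_y, shift):
    """Sum of element-wise aperiodic correlations between two flocks
    (sets of sequences). For an ideal complementary code set this sum is
    zero at every nonzero shift (auto-correlation case)."""
    return sum(aperiodic_correlation(x, y, shift)
               for x, y in zip(flock_x, flock_y))

# Golay complementary pair used as a toy flock (hypothetical example).
flock = [np.array([1, 1, 1, -1]), np.array([1, 1, -1, 1])]

for s in range(-3, 4):
    print(s, complementary_correlation(flock, flock, s))
# Prints 8 at shift 0 and 0 at every nonzero shift.
```

For a full complementary code set, the corresponding flock-wise sum of cross-correlations between two different users' flocks would be zero at every shift, including zero.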
382

A Feasible Evaluation and Analysis of Visual Air Quality Index in Urban Areas

Chang, Kuo-chung 21 July 2006 (has links)
This research analyzed weather information (temperature, wind velocity, visibility, and total cloudiness) from the Taipei and Kaohsiung weather stations of the Central Weather Bureau, together with air pollution data from the air quality monitoring stations of the Environmental Protection Administration, Executive Yuan (Shihlin, Jhongshan, Wanhua, Guting, Songshan, Nanzih, Zuoying, Cianjin, and Siaogan), to evaluate by statistical analysis the feasibility of using visibility as an ambient air quality index. For the Taipei metropolis, visibility between 1983 and 1992 remained steady at 5~11 kilometers; after 1993 it gradually increased to 6~16 kilometers, indicating that visual air quality in Taipei has improved year by year. For the Kaohsiung metropolis, visibility has decreased year by year from 10~24 kilometers to 2~12 kilometers, and the decline was particularly obvious after 1993. When the air quality index in the metropolis exceeds 100, visibility is categorized as "poor", meaning visibility within 3 kilometers. When the air quality index ranges between 76~100, visibility is categorized as "medium", meaning visibility within 4 kilometers. When the air quality index ranges between 50~75, visibility is categorized as "good", meaning visibility within 7 kilometers. When the air quality index ranges between 20~49, visibility is categorized as "excellent", meaning visibility beyond 7 kilometers.
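As a compact restatement of the index-to-visibility thresholds above, a minimal sketch follows; the function name and the handling of boundary and out-of-range values are assumptions, not part of the thesis.

```python
def visibility_category(aqi):
    """Map an air quality index value to the visibility level quoted in
    the abstract (values outside the reported ranges are unclassified)."""
    if aqi > 100:
        return "poor"       # visibility within 3 km
    if 76 <= aqi <= 100:
        return "medium"     # visibility within 4 km
    if 50 <= aqi <= 75:
        return "good"       # visibility within 7 km
    if 20 <= aqi <= 49:
        return "excellent"  # visibility beyond 7 km
    return "unclassified"   # not covered by the reported ranges

print(visibility_category(85))  # -> "medium"
```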
383

Single molecule investigating Rhodamine B dilute solution at confocal and TIR configurations

Wei, Yi-chung 18 January 2007 (has links)
The motion of dye molecules in solution is strongly influenced by Brownian motion caused by stochastic collisions with the solvent, which results in fluorescence intensity fluctuations. This thesis studies the fluorescence intensity fluctuations of a dilute dye (Rhodamine B) in methanol solution ( - ), under confocal and total internal reflection (TIR) microscopy configurations. Five parameters are used to probe the fluorescence characteristics: (1) The difference between the confocal and TIR configurations. The configuration influences the laser focusing area and consequently the intensity distribution. The effective focusing volume in the confocal configuration is an ellipsoid, while in the TIR configuration it is a disk near the interface with a depth of 100-200 nm; this gives the TIR configuration lower background and the ability to handle higher concentrations. (2) Concentration. We vary the concentration from much less than one molecule to more than one molecule in the effective focusing area and observe the change in the burst intensity distribution. (3) Focus position. By changing the focusing position, we study how the effective focusing region changes. (4) Excitation intensity. (5) Fluorescence correlation spectroscopy (FCS). Our results indicate that the TIR configuration exhibits lower background and is suitable for higher-concentration solutions. In addition, when the average number of dye molecules in the focusing area is much less than 1, the FCS amplitude no longer follows 1/N but instead becomes proportional to N, where N is the concentration.
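For context on the 1/N scaling mentioned at the end of the abstract, the standard single-component FCS model for free 3D diffusion through a Gaussian confocal volume (a textbook expression, not a result derived in this thesis) is

\[
G(\tau) = \frac{1}{N}\left(1 + \frac{\tau}{\tau_D}\right)^{-1}\left(1 + \frac{\tau}{\kappa^{2}\tau_D}\right)^{-1/2},
\]

where N is the mean number of molecules in the detection volume, \(\tau_D\) the diffusion time, and \(\kappa\) the axial-to-lateral aspect ratio of the volume; the amplitude \(G(0) = 1/N\) is the scaling that the observed proportionality to N departs from.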
384

Improving the Treated Water for Water Quality and Good Tastes from Traditional and Advanced Water Treatment Plants

Han, Chia-Yun 19 July 2007 (has links)
The purpose of this research is to compare the treated-water quality of two traditional water treatment plants (WTPs) and three advanced water treatment plants (AWTPs), and to investigate treated drinking water in the distribution systems of the Kaohsiung area, in order to promote consumer confidence. Samples were taken from the treated water of five major water supply plants (designated WF1, WF2, WF3, WF4 and WF5) and from tap water at the user's end. During ROC years 91-92 (2002-2003), while the Kaohsiung plants were still at the traditional treatment stage, WF1 and WF2 were sampled 8 times and WF3, WF4 and WF5 were sampled twice; from ROC year 93 to 94 (2004-2005), after the upgrade to advanced treatment, WF1 through WF5 were each sampled 8 times. The parameters tested, chosen for their influence on operating conditions, included pH, odor (threshold odor number, TON), total trihalomethanes (THMs), haloacetic acids (HAAs), ammonia nitrogen (NH3-N), hardness, total dissolved solids (TDS), alkalinity, total organic carbon (TOC), calcium ion, flavor profile analysis (FPA), and observation of suspended matter on boiling, for treated water from the two WTPs, the three AWTPs, and tap water at the user's end of the distribution system. The results show that treated water from the advanced plants is of better quality than that from the traditional plants. The items showing improvement were THMs, HAAs, hardness, TON, 2-MIB, TOC, alkalinity and calcium ion concentration, with improvement efficiencies of 47%, 29%, 43%, 11%, 29%, 15%, 14% and 34%, respectively; improvement was insignificant for TDS, NH3-N, pH and FPA. Six items met the current Taiwan drinking water standards (odor < 3 TON; THMs < 0.1 mg/L; NH3-N < 0.1 mg/L; TDS < 600 mg/L; hardness < 400 mg as CaCO3/L; 6.0 < pH < 8.5), and HAAs met the USEPA Stage 1 standard (HAAs < 80 µg/L). In the boiling experiment, the observations of suspended matter were compared against the TDS and hardness results; for water treated by the advanced process, the amount of suspended matter decreased greatly as boiling time increased, showing that advanced treatment can largely resolve the white suspended matter and white precipitate problems seen when boiling water from the traditional plants. In the contour maps of water quality, Gushan District, Lingya District, Qianzhen District, Xiaogang District, Fongshan City and Daliao Shiang showed higher concentration profiles across the four seasons (spring, summer, fall and winter) and the two rainfall seasons (wet and dry) in the water supply systems. These contour maps can give distribution system administrators clear information about which areas show high concentrations, so that water quality management plans can prioritize effective solutions (such as flushing pipelines, replacing pipelines, or changing the water flow).
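The abstract does not define the improvement efficiency explicitly; one plausible reading, assumed in the sketch below together with hypothetical THMs values, is the relative reduction of each parameter between the traditional and advanced treatment stages.

```python
def improvement_efficiency(traditional_mean, advanced_mean):
    """Relative reduction of a parameter after the upgrade, in percent
    (assumed definition; the thesis may compute it differently)."""
    return (traditional_mean - advanced_mean) / traditional_mean * 100.0

# Hypothetical THMs means (mg/L) chosen only to illustrate the formula.
print(round(improvement_efficiency(0.060, 0.032)))  # ~47 %
```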
385

A biomedical engineering approach to investigating flow and wall shear stress in contracting lymphatics

Dixon, James Brandon 16 August 2006 (has links)
Collecting microlymphatics play a vital role in promoting lymph flow from the initial lymphatics in the interstitial spaces to the large transport lymph ducts. In most tissues, the primary mechanism for producing this flow is the spontaneous contraction of the lymphatic wall. Individual units, known as lymphangions, are separated by valves that help prevent backflow when the vessel contracts, thus promoting flow through the lymphatic network. Lymphatic contractile activity is inhibited by flow in isolated lymphatics; however, there are virtually no in situ measurements of lymph flow in these vessels. Initially, a high-speed imaging system was set up to image in situ preparations at 500 fps. These images were then manually processed to extract information on lymphocyte velocity (-4 to 10 mm/sec), vessel diameter (25 to 165 µm), and particle location. Fluid modeling was performed to obtain reasonable estimates of wall shear stress (-8 to 17 dynes/cm²). One of the difficulties encountered was the time-consuming nature of manual particle tracking. Using previously captured images, an image correlation method was developed to automate lymphatic flow measurements and to track wall movements as the vessel contracts. Using this method, the standard error of prediction was 0.4 mm/sec for velocity measurements and 7.0 µm for diameter measurements. It was found that the actual physical quantity measured by this approach lies somewhere between the spatially averaged velocity and the maximum velocity of a Poiseuille flow model.
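To make the shear-stress estimate concrete, the sketch below applies the Poiseuille relation referred to in the last sentence; the viscosity value and the assumption that the tracked particle velocity approximates the centerline (maximum) velocity are illustrative and not taken from the thesis.

```python
# Wall shear stress from a Poiseuille (parabolic) velocity profile.
# Assumptions (not from the thesis): lymph viscosity ~ 1 cP and the
# measured particle velocity approximates the centerline velocity u_max.

MU = 0.01  # dynamic viscosity in poise (1 cP = 0.01 P), assumed value

def wall_shear_stress(u_max_mm_s, diameter_um):
    """Poiseuille wall shear stress in dyn/cm^2: tau_w = 4*mu*u_max/D."""
    u_max = u_max_mm_s * 0.1   # mm/s -> cm/s
    d = diameter_um * 1e-4     # um  -> cm
    return 4.0 * MU * u_max / d

# Example with values inside the ranges reported in the abstract.
print(wall_shear_stress(u_max_mm_s=5.0, diameter_um=80.0))  # ~2.5 dyn/cm^2
```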
386

Modeling correlation in binary count data with application to fragile site identification

Hintze, Christopher Jerry 30 October 2006 (has links)
Available fragile site identification software packages (FSM and FSM3) assume that all chromosomal breaks occur independently. However, under a Mendelian model of inheritance, homozygosity at fragile loci implies pairwise correlation between homologous sites. We construct correlation models for chromosomal breakage data in situations where either partitioned break count totals (per-site single-break and double-break totals) are known or only overall break count totals are known. We derive a likelihood ratio test and Neyman's C(α) test for correlation between homologs when partitioned break count totals are known, and outline a likelihood ratio test for correlation using only break count totals. Our simulation studies indicate that the C(α) test using partitioned break count totals outperforms the other two tests for correlation in terms of both power and level. These studies further suggest that the power for detecting correlation is low when only break count totals are reported. Results of the C(α) test for correlation applied to chromosomal breakage data from 14 human subjects indicate that detection of correlation between homologous fragile sites is problematic due to the sparseness of breakage data. Simulation studies of the FSM and FSM3 algorithms using parameter values typical for fragile site data demonstrate that neither algorithm is significantly affected by fragile site correlation. Comparison of simulated fragile site misclassification rates in the presence of zero-breakage data supports previous studies (Olmsted 1999) suggesting that FSM has lower false-negative rates and FSM3 has lower false-positive rates.
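As background on the first of the tests named above (a textbook form, not the thesis's specific derivation), a likelihood ratio test of the no-correlation null compares maximized log-likelihoods via

\[
\Lambda = 2\left[\ell(\hat{\theta}) - \ell(\hat{\theta}_0)\right] \;\overset{a}{\sim}\; \chi^{2}_{q},
\]

where \(\ell\) is the log-likelihood, \(\hat{\theta}\) and \(\hat{\theta}_0\) are the unrestricted and null-restricted maximum likelihood estimates, and q is the number of parameters fixed under the null (q = 1 for a single correlation parameter).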
387

An investigation into the predictive performance of pavement marking retroreflectivity measured under various conditions of continuous wetting

Pike, Adam Matthew 25 April 2007 (has links)
This thesis research investigated the predictive performance of pavement marking retroreflectivity measured under various conditions of continuous wetting. The researcher compared the nighttime detection distance of pavement markings in simulated rain conditions with the retroreflectivity of the same pavement markings in several continuous wetting conditions. Correlation analyses quantified the predictive performance of the retroreflectivity values resulting from the continuous wetting conditions. The researcher measured the retroreflectivity of 18 pavement marking samples under 14 different conditions. The American Society for Testing and Materials (ASTM) has three standards for measuring the retroreflectivity of pavement markings: dry (E-1710), recovery (E-2177), and continuous wetting (E-2176). Using the three ASTM standard conditions resulted in three sets of retroreflectivity data, and variations of the continuous wetting standard produced an additional 11 sets of continuous wetting data. The researcher also incorporated detection distance values measured for the same 18 pavement marking samples under three different simulated nighttime rainfall conditions: high (0.87 in/hr), medium (0.52 in/hr), and low (0.28 in/hr) flow rates, chosen to simulate typical rainfall rates in the state of Texas. The correlation analyses measured both the linear and the logarithmic relationship between the detection distance and the retroreflectivity of the pavement markings. A pavement marking's retroreflectivity is typically used as a detection distance performance indicator; therefore, a high degree of correlation between retroreflectivity and detection distance would be desired, since it would indicate that a measured retroreflectivity value provides a good indication of the expected detection distance. The researcher conducted analyses for several subgroups of the pavement markings based on marking type or characteristics. Dry, recovery, and all of the continuous wetting retroreflectivity data were correlated with the detection distances. The correlation values found during this thesis research did not show a high degree of correlation for most of the subgroups analyzed, indicating that measured retroreflectivity would not provide very good predictive performance for the detection distance of pavement markings in rainy conditions.
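The sketch below illustrates the kind of linear versus logarithmic correlation comparison described above; the arrays are placeholders, not data from the thesis.

```python
import numpy as np

# Placeholder measurements (not from the thesis): wet retroreflectivity
# (mcd/m^2/lx) and nighttime detection distance (ft) for several markings.
retro = np.array([30.0, 55.0, 80.0, 120.0, 200.0, 310.0])
detection = np.array([90.0, 140.0, 170.0, 210.0, 260.0, 300.0])

# Linear relationship: Pearson correlation of the raw values.
r_linear = np.corrcoef(retro, detection)[0, 1]

# Logarithmic relationship: correlate detection distance with log(retro).
r_log = np.corrcoef(np.log(retro), detection)[0, 1]

print(f"linear r = {r_linear:.3f}, logarithmic r = {r_log:.3f}")
```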
388

Technology and Economics Affecting Unconventional Reservoir Development

Flores Campero, Cecilia P. 15 January 2010 (has links)
Worldwide, unconventional resources are important sources of oil and gas at a time when most conventional resources are declining and demand for hydrocarbons is growing. Masters' (1979) concept of the energy resource triangle suggests that the exploitation of unconventional reservoirs is particularly sensitive to both technology and commodity price parameters. In the United States, production from unconventional reservoirs has been stimulated by a combination of federal tax credits, technical development programs supported by government agencies and private organizations, and high commodity prices. In this work, the effect of technology and different economic events on selected unconventional oil and gas plays in the United States was evaluated according to the Resource Triangle Theory (RTT). Studies conducted on the Austin Chalk (our textbook case) and seven other unconventional plays in the United States support the RTT concept that high prices and better technologies result in more drilling activity and more oil and gas production from unconventional reservoirs. Two approaches were employed to support the RTT concept: a correlation study and forecasting graphs. In the first, correlations of commodity prices and technology with drilling activity showed that periods of high commodity prices coincide with increases of approximately 75% in producing unconventional wells in the plays selected for this study. The second shows that high prices and technological advances also translate into additional oil and gas production and reserves. This behavior was observed through the analysis of a series of production decline curves using a VBA program in Excel that computes oil and gas production volumes and their corresponding economic values under specific conditions. The results indicate that a maximum value of approximately $50 billion of oil plus gas would have been possible using conventional hydraulic fracturing technology alone. Moreover, subsequent episodes of high commodity prices have allowed the introduction of new technologies that have boosted oil and gas production from the plays even further. Notable examples are horizontal and multilateral wells, which have opened up additional areas for development such as the Barnett Shale and the Bakken Shale; the use of horizontal wells has also revived older plays such as the Austin Chalk. The combination of horizontal well technology and water fracturing technology has led to a dramatic increase in the development of both oil and gas from shale reservoirs. Current production schemes suggest that the plays could produce an additional $320 billion when producing at rates higher than 5 BOE/day. Our results confirm the concept of the resource triangle: natural gas and oil can be produced from low-quality resources when either product prices increase or better technology becomes available. The seven oil and gas plays studied in this research are demonstrative examples.
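To illustrate the kind of decline-curve and economic-cutoff calculation the abstract describes (recast here in Python rather than the thesis's Excel/VBA tool, with hypothetical well parameters), a minimal sketch is:

```python
import math

def exponential_decline(qi, di, months):
    """Arps exponential decline: monthly rates q(t) = qi * exp(-di * t)."""
    return [qi * math.exp(-di * t) for t in range(months)]

def well_value(rates_boe_per_day, price_per_boe, econ_limit_boe_per_day=5.0):
    """Sum revenue only while the rate exceeds the economic limit
    (the 5 BOE/day cutoff mentioned in the abstract)."""
    days_per_month = 30.4
    return sum(q * days_per_month * price_per_boe
               for q in rates_boe_per_day if q > econ_limit_boe_per_day)

# Hypothetical single-well example: 300 BOE/day initial rate, 3%/month
# nominal decline, $60/BOE flat price (assumed values, not thesis inputs).
rates = exponential_decline(qi=300.0, di=0.03, months=240)
print(f"undiscounted well value: ${well_value(rates, 60.0):,.0f}")
```

Summing such per-well values across a play, under the price and technology scenarios of interest, is one way to arrive at aggregate figures of the kind quoted above.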
389

Nonlinear Analysis of Stock Correlations among East Asian Countries, and The U.S., Japan, and Germany

Huang, Hsiao-wen 14 July 2008 (has links)
With the gradually increasing interdependence of international political and economic environments, some Asian countries' financial market reforms adopted progressive policies toward liberalization and internationalization. The integration of international financial markets has therefore attracted many scholars to investigate related topics in international stock markets. Granger and (1993) documented that most economic variables have nonlinear characteristics. Chelley-Steeley (2004) uses a smooth transition regression model to explore the financial market integration of regional and global markets among emerging and developed countries; the smooth transition regression model allows for the possibility of nonlinear changes in the regression parameters. This paper applies the smooth transition regression model to reinvestigate Chelley-Steeley's (2004) study of the nonlinear relationships of stock markets among several East Asian countries and the United States, Japan and Germany. The main difference between our model and Chelley-Steeley's is that we relax his assumption of a constant market index correlation between two countries by allowing an autoregressive process for the market index correlation. Empirical evidence from the linear model, the original non-linear model and our non-linear extension shows that our extension outperforms the other two models in terms of goodness of fit.
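For reference, a logistic smooth transition regression of the general form used in this literature (a standard specification, not the exact model estimated in the thesis) is

\[
y_t = \phi' x_t + \theta' x_t \, G(s_t; \gamma, c) + \varepsilon_t,
\qquad
G(s_t; \gamma, c) = \bigl[1 + \exp\{-\gamma (s_t - c)\}\bigr]^{-1},
\]

where \(s_t\) is the transition variable, c the threshold, and \(\gamma\) controls how smoothly the coefficients move between the two regimes.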
390

Studies on N-Heterocyclic Compounds

Armugam, S 03 1900 (has links)
The thesis entitled "Studies on N-Heterocyclic Compounds: (a) Reaction of 5,6,7,8-Tetrahydroisoquinolines with Vilsmeier Reagent and (b) Amide Induced in situ Alkylation of 5,6-Dihydroisoquinolines" is presented in two parts. Part I involves a study of the Vilsmeier reaction of 4-cyano-1,3-dihydroxy-5,6,7,8-tetrahydroisoquinoline derivatives, while Part II concerns the in situ alkylation of 1-alkyl-4-cyano-3-methoxy-5,6-dihydroisoquinolines in the presence of KNH2/liquid NH3.
