  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

An Investigatory Study of Relationships Among Selected Theoretical Components of Letter-Writing Fluency

Reutzel, Pamela C. 01 May 2015 (has links)
Research showing that letter-writing fluency is a foundation for attending to higher-level thinking skills in writing calls for further investigation of what the components of letter-writing fluency actually are and how they relate to writing efficiency. This hierarchical multiple regression study entailed two parts. First, results of assessments of three selected subskills of letter-writing fluency were analyzed to determine how much variance they contribute to letter-writing fluency in 49 kindergarten students in December of their kindergarten year. The first assessed subskill was letter-naming fluency (LNF), which has previously been shown to be predictive of reading ability. The other two subskills focus on critical features of letters: (a) letter construction of lowercase letters using physical manipulation and placement of critical features, and (b) critical feature production (CFP) in the form of writing pseudo-letters made up of the same critical features as Roman alphabet letters. Because LNF was suspected to be a strong indicator of letter-writing fluency, the other two subskills, critical feature identification and CFP, were also analyzed to determine how much variance they accounted for in LNF. LNF, CFP, and letter construction together accounted for 49% of the variance in letter-writing fluency. LNF accounted for 39% and thus correlated most strongly with writing fluency. Letter construction using critical features and writing of pseudo-letters together added 10% more to the explained variance in letter-writing fluency. Critical feature identification and CFP accounted for 20% of the variance in LNF. This study has implications for letter-writing instruction in early childhood education classrooms, including a strong emphasis on letter-naming activities in the early stages of letter writing. Exploratory, developmentally sensitive instruction that involves early writers in identifying, manipulating, and writing the basic critical features of letters may also be beneficial. These instructional options are worthy of further research.
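The variance-partitioning logic of a hierarchical regression like the one described above can be sketched in a few lines. The snippet below is only an illustration with synthetic data and hypothetical variable names (lnf, cfp, construction); it shows how the incremental R² from adding the critical-feature subskills after LNF would be computed, not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scores for 49 hypothetical kindergarten students (illustration only).
n = 49
lnf = rng.normal(30, 10, n)            # letter-naming fluency
cfp = rng.normal(15, 5, n)             # critical feature production
construction = rng.normal(10, 4, n)    # letter construction with critical features
writing_fluency = 0.6 * lnf + 0.3 * cfp + 0.2 * construction + rng.normal(0, 8, n)

def r_squared(predictors, y):
    """R^2 of an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_step1 = r_squared([lnf], writing_fluency)                      # LNF alone
r2_step2 = r_squared([lnf, cfp, construction], writing_fluency)   # add critical-feature subskills

print(f"R^2, LNF only:            {r2_step1:.2f}")
print(f"R^2, all three subskills: {r2_step2:.2f}")
print(f"Delta R^2 from critical-feature subskills: {r2_step2 - r2_step1:.2f}")
```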
42

Analyzing Selected Mapping for Peak-to-Average Power Reduction in OFDM

Baxley, Robert John 20 April 2005 (has links)
Orthogonal frequency division multiplexing (OFDM) has become a popular modulation method in high-speed wireless communications. By partitioning a wideband fading channel into flat narrowband channels, OFDM is able to mitigate the detrimental effects of multipath fading using a simple one-tap equalizer. However, in the time domain OFDM signals suffer from large envelope variations, which are often characterized by the peak-to-average ratio (PAR). High-PAR signals, like OFDM, require that transmission amplifiers operate at very low power efficiencies to avoid clipping. In this thesis we review the most popular OFDM PAR-reduction techniques and demonstrate that selected mapping (SLM) is a particularly promising reduction technique. In an SLM system, an OFDM symbol is mapped to a set of quasi-independent equivalent symbols and the lowest-PAR symbol is then selected for transmission. The tradeoff for PAR reduction in SLM is computational complexity, as each mapping requires an additional inverse fast Fourier transform (IFFT) operation in the transmitter. In addition to an overview of current SLM work, we present a thorough analysis of SLM as well as several novel SLM proposals. First, we derive the closed-form expression for the expected PAR in an SLM system. The expected PAR can be thought of as a metric of PAR-reduction capability. Second, we provide a power analysis of SLM to determine whether the computational power costs outweigh the power saved through PAR reduction. Through this analysis, we show that SLM is capable of several watts of net power savings when used in a wireless transmission system. Third, we propose that a PAR threshold be set in SLM; such thresholding leads to significant complexity decreases. Fourth, we derive the maximum likelihood (ML) and maximum a posteriori (MAP) detection metrics for blind SLM (BSLM) and threshold BSLM, respectively. Lastly, we demonstrate that by using monomial phase sequences in SLM, blind phase sequence detection is possible with a single FFT operation in the receiver.
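The SLM idea summarized above can be shown in a minimal sketch: generate several quasi-independent candidates by multiplying the frequency-domain symbol with random phase sequences, take one extra IFFT per candidate, and transmit the candidate with the lowest PAR. This is an illustrative toy example with synthetic QPSK data, not the implementation analyzed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

N = 256          # subcarriers
D = 8            # number of SLM candidate mappings
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def par_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

X = rng.choice(qpsk, N)                          # one OFDM symbol in the frequency domain
phases = rng.choice([1, -1], size=(D, N))        # D random +/-1 phase sequences
candidates = np.fft.ifft(phases * X, axis=1)     # one extra IFFT per candidate

pars = np.array([par_db(c) for c in candidates])
best = np.argmin(pars)                           # lowest-PAR candidate is transmitted

print(f"PAR of original symbol : {par_db(np.fft.ifft(X)):.2f} dB")
print(f"PAR of best SLM symbol : {pars[best]:.2f} dB (candidate {best})")
```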
43

A PAPR Reduction Scheme Without Side Information in Pilot-Aided OFDM Systems

Kuo, Keng-wei 26 August 2010 (has links)
High peak-to-average power ratio (PAPR) is one of the major drawbacks in orthogonal frequency division multiplexing (OFDM) systems. In recent years, various methods have been proposed to improve PAPR performance. The selected mapping (SLM) scheme is perhaps the most popular one because it provides outstanding PAPR-reduction performance. In addition, the subcarrier magnitude remains the same in the SLM scheme. However, there are two major shortcomings in the SLM scheme. First, it requires a number of inverse fast Fourier transforms (IFFTs) to produce candidate signals, dramatically increasing the computational complexity. Second, side information has to be transmitted to the receiver to indicate the candidate signal that results in the best PAPR, leading to a decrease in bandwidth utilization. To overcome these two drawbacks, this thesis proposes a novel SLM scheme that does not need side information. The proposed scheme is based on a low-complexity SLM scheme [C.-P. Li, S.-H. Wang, and C.-L. Wang, "Novel low-complexity SLM schemes for PAPR reduction in OFDM systems," IEEE Trans. Signal Process., vol. 58, no. 5, pp. 2916-2921, May 2010] in a pilot-aided OFDM system. Simulation experiments are conducted to verify the performance of the proposed scheme. It is shown that the bit error rate (BER) performance of the proposed scheme is very similar to that of the traditional SLM scheme with perfect knowledge of the side information. Therefore, the proposed scheme not only has the advantages of low complexity and high bandwidth utilization, but also has superior BER performance.
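PAPR-reduction schemes such as the one proposed here are normally compared through the complementary cumulative distribution function (CCDF) of the PAPR over many OFDM symbols, i.e. the probability that the PAPR exceeds a threshold. The sketch below estimates that CCDF for plain OFDM and for a conventional SLM stage using synthetic QPSK data; it is only an evaluation harness, not the proposed side-information-free scheme.

```python
import numpy as np

rng = np.random.default_rng(2)
N, D, trials = 256, 8, 2000
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

orig, slm = [], []
for _ in range(trials):
    X = rng.choice(qpsk, N)
    orig.append(papr_db(np.fft.ifft(X)))
    phases = rng.choice([1, -1], size=(D, N))
    slm.append(min(papr_db(c) for c in np.fft.ifft(phases * X, axis=1)))

orig, slm = np.array(orig), np.array(slm)
for gamma in (6.0, 7.0, 8.0):
    print(f"P(PAPR > {gamma} dB): plain OFDM {np.mean(orig > gamma):.3f}, "
          f"conventional SLM {np.mean(slm > gamma):.3f}")
```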
44

Pressure and doping effects on the anomalous phase transition in ternary superconductor Bi2Rh3Se2

Chen, Ching-Yuan 23 July 2012 (has links)
Bi2Rh3Se2 has been known as a charge-density-wave (CDW) superconductor, with a superconducting critical temperature Tc of about 0.7 K and a CDW phase transition at about 250 K. Since there is no definite proof that the anomaly at around 250 K arises from a charge-density wave, we sought to provide further evidence on whether the superconductor has CDW properties through electrical resistivity measurements under different applied pressures. Bi2Rh3Se2 was prepared by the solid-state reaction method with heating in a quartz tube. After the sample was synthesized, its quality was checked by XRD, MPMS, and specific-heat measurements, which confirmed that the sample quality was good. Furthermore, temperature-dependent resistivity measurements (2-340 K) under pressures up to 22.23 kbar were performed on the ternary superconductor Bi2Rh3Se2 to study the possible coexistence of CDW order and superconductivity. Interestingly, the resistive anomaly at Ts ~ 250 K shifts to higher temperature with increasing pressure. This experimental finding is not consistent with a traditional CDW transition. Moreover, temperature-dependent transmission electron microscopy (TEM) electron diffraction gives evidence of a structural phase transition from space group C1 2/m 1 (above ~250 K) to P1 2/m 1 (below ~250 K). Finally, Co doping was carried out to examine the effect of chemical pressure on this phase transition. The result is opposite to that of applied physical pressure: the transition shifts to lower temperature as more Co is incorporated into the sample.
45

Novel Low-Complexity SLM Schemes for PAPR Reduction in OFDM Systems

Lee, Kun-Sheng 10 August 2008 (has links)
Selected mapping (SLM) schemes are commonly employed to reduce the peak-to-average power ratio (PAPR) in orthogonal frequency division multiplexing (OFDM) systems. It has been shown that the computational complexity of the traditional SLM scheme can be substantially reduced by adopting conversion vectors, obtained as the inverse fast Fourier transform (IFFT) of the phase rotation vectors, in place of the conventional IFFT operations [21]. Unfortunately, the elements of the phase rotation vectors corresponding to the conversion vectors in [21] do not generally have equal magnitude, and thus a significant degradation in the bit error rate (BER) performance is incurred. This problem can be remedied by utilizing conversion vectors having the form of a perfect sequence. This work presents three novel classes of perfect sequence, each of which comprises certain base vectors and their cyclic-shifted versions. Three novel low-complexity SLM schemes are then proposed based upon the unique structures of these perfect sequences. It is shown that while the PAPR performance of the proposed schemes is marginally poorer than that of the traditional SLM scheme, the three schemes achieve an identical BER performance and have a substantially lower computational complexity.
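The conversion-vector approach rests on a standard DFT identity: multiplying the frequency-domain symbol element-wise by a phase rotation vector is equivalent to circularly convolving the already-computed time-domain signal with the IFFT of that vector. The sketch below merely verifies this equivalence with a generic unit-magnitude rotation vector; the actual complexity saving in the schemes above comes from choosing conversion vectors with only a few nonzero elements (the perfect-sequence structure), which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 64

# Frequency-domain OFDM symbol and an arbitrary unit-magnitude phase rotation vector.
X = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], N) / np.sqrt(2)
P = np.exp(1j * rng.uniform(0, 2 * np.pi, N))

# Conventional SLM candidate: one extra IFFT of the rotated symbol.
candidate_freq = np.fft.ifft(P * X)

# Conversion-vector route: circular convolution of the existing time-domain
# signal x with the conversion vector p = IFFT(P).
x = np.fft.ifft(X)
p = np.fft.ifft(P)
idx = np.arange(N)
candidate_time = np.array([np.sum(x * p[(k - idx) % N]) for k in idx])

print("max |difference|:", np.max(np.abs(candidate_freq - candidate_time)))  # ~1e-16
```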
46

Mass Spectrometry-based Methods for the Detection and Characterization of Protein-Tyrosine Nitration

Seeley, Kent W. 01 January 2013 (has links)
Protein tyrosine nitration (PTN) is a posttranslational modification resulting from oxidative/nitrosative stress that has been implicated in a wide variety of disease states. Characterization of PTN is challenging due to several factors, including its low abundance in a given proteome, preferential site modification, multiple target site proximity within unique peptide sequences, and analytical method and instrument limitations. Current analytical techniques are insufficiently sensitive to identify endogenous nitration sites without incorporating either nitrotyrosine or target protein enrichment. However, enrichment proficiency can also be inadequate. Chemical derivatization of the nitro moiety can be incomplete or result in undesirable byproduct formation, while immunoaffinity proficiency is contingent upon antibody specificity. To overcome analytical method and enrichment deficiencies, we aimed to develop a comprehensive nitroproteome-specific workflow using molecular methods combined with mass spectrometry. Our approach was to systematically address all relevant factors contributing to PTN, such as primary sequence, protein conformation, solvent accessibility, and nitrating agent concentration. Our ultimate goal was to increase mass spectrometric sensitivity for PTN identification. All putative nitroprotein/nitropeptide identifications were then subjected to rigorous validation by either manual spectrum analysis or peptide synthesis. We further developed MS methods for quantitation of nitropeptides from complex mixtures with minimal sample processing. Successful application of our nitroproteome-specific mass spectrometric workflow is expected to provide powerful tools for comprehensive PTN investigation that will elucidate its role in the onset and progression of a variety of disease states as well as facilitate discovery of therapeutic targets.
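For context on the MS side, nitration of tyrosine replaces one aromatic hydrogen with a nitro group, shifting the peptide mass by a fixed, well-known amount (about +44.985 Da). The sketch below computes that shift from standard monoisotopic atomic masses and applies it to a hypothetical peptide mass to show the expected precursor m/z values; the peptide mass is an arbitrary illustrative number, not data from this work.

```python
# Standard monoisotopic atomic masses (Da).
H, N, O = 1.007825, 14.003074, 15.994915
PROTON = 1.007276

# Nitration replaces one aromatic H on tyrosine with an NO2 group.
nitration_shift = N + 2 * O - H
print(f"nitration mass shift: +{nitration_shift:.4f} Da")   # ~ +44.9851 Da

# Hypothetical unmodified peptide monoisotopic mass (illustrative value only).
peptide_mass = 1500.7000
nitro_mass = peptide_mass + nitration_shift

for z in (1, 2, 3):
    mz_unmod = (peptide_mass + z * PROTON) / z
    mz_nitro = (nitro_mass + z * PROTON) / z
    print(f"z={z}: unmodified m/z {mz_unmod:.4f}, nitrated m/z {mz_nitro:.4f}")
```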
47

An Economic Evaluation of Selected Treatments for Avian Botulism in Waterfowl on Utah Marshes, 1953-54

Smith, Donald A. 01 January 1955 (has links)
Each year thousands of western waterfowl succumb to disease, predators, mechanical injury, and other decimating factors. Based on a review of records, it is conservatively estimated that an average of 25,000 ducks have succumbed to botulism on western marsh areas annually. In a recent study, the United States Fish and Wildlife Service valued each duck and goose at $8.00 (McLeod, 1950). Applying this value to the estimated annual numerical loss, a total of $200,000 has been lost each season in mortality of western waterfowl from botulism. Control of this malady would reduce annual waterfowl and monetary losses. Prevention and cure are the only means of controlling botulism in wild ducks. At present, no economical preventive measure exists and control is based on curing stricken birds. The purpose of this study was to evaluate the cost of treatment and the rate of recovery of birds stricken with botulism when treated by selected methods. The four treatments selected for evaluation were: (1) hospital inoculation, (2) fresh water, (3) field inoculation, and (4) no treatment (control). Research included a comprehensive evaluation of factors thought to be pertinent in botulism control, such as age, sex, species, body condition, degree of affliction, reaction to various amounts of antitoxin, and reaction to the selected treatment methods. This study was conducted during the botulism outbreaks of 1953 and 1954 and was confined to state-owned marshlands of Utah.
48

Volatile Organic Compounds and Antioxidants in Olive Oil: Their Analysis by Selected Ion Flow Tube Mass Spectrometry

Davis, Brett Murray January 2007 (has links)
The application of Selected Ion Flow Tube Mass Spectrometry (SIFT-MS) to the analysis of olive oil shows several distinct advantages over more conventional analysis techniques. The two areas of olive oil quality examined in this thesis are the analysis of volatile organic compounds (VOCs) and the assessment of antioxidant activity. VOCs are responsible for the aroma and much of the taste of olive oil, while antioxidants afford some protection from harmful reactions involving radical species inside the body by scavenging radicals when olive oil is ingested. The VOCs of olive oil are used by sensory panel judges to classify oils by their degree of suitability for human consumption. The major parameters used for this evaluation are the strengths of any defects and the degree of fruitiness. A defect is an indication of an undesired process that has occurred in the oil, while fruitiness is a fragile attribute which denotes a good-quality oil and is easily masked by defects. SIFT-MS was used to measure the strengths of the olive oil defects rancid, winey, musty, fusty, and muddy. Great potential was demonstrated for all defects except musty, and the concentrations of VOCs in olive oil headspace were correlated with the peroxide value, a measure of the degree of oil oxidation. A study aimed at correlating the strength of the fruitiness attribute, as determined by a sensory panel, with the concentrations of VOCs in olive oil headspace was unsuccessful. The SIFT-MS Total Oxyradical Scavenging Capacity (TOSC) assay was used to measure olive oil antioxidants. Because it is conducted in an emulsion, this assay measures all antioxidants in the oil, not only those removed by extraction with a solvent. SIFT-MS TOSC assay results were found to correlate well with those of the widely used Folin-Ciocalteu assay and with the total concentration of phenolic compounds present in olive oil. Discrepancies between the two assays were most likely due to hydrophobic antioxidants, which are measured by the SIFT-MS TOSC assay but not by the other tests.
49

SIFT-MS: development of instrumentation and applications.

Francis, Gregory James January 2007 (has links)
Data are presented for a range of experiments performed using a selected ion flow tube (SIFT) instrument operated at room temperature (~298 K) with carrier gas pressures typically in the range of 0.3-0.6 Torr. The majority of the experiments discussed were performed on a Voice100 instrument that has not been described in detail previously. The Voice100 is a novel instrument designed particularly for quantitative trace gas analysis using the SIFT-MS technique. A mixture of helium and argon carrier gases is employed in the Voice100 flow tube. By mixing carrier gases, the flow dynamics and diffusion characteristics of a flow tube are altered compared to classic single-carrier-gas models. First, therefore, optimal flow conditions for the operation of a Voice100 are characterised. The diffusion of an ion in a mixture of carrier gases is then characterised using theoretical models and experimental techniques. This research requires that a new parameter, Mp, be defined to describe the mass discrimination of an ion in the non-field-free region near the downstream ion sampling orifice. Furthermore, a new method is described for the simultaneous measurement of rate coefficients for the reactions of H₃O⁺·(H₂O)n (n = 1, 2, 3) ions with analytes. Rate coefficients and branching ratios for the reactions of SIFT-MS precursor ions with specific analytes related to four individual applications are presented. For each application, the kinetic parameters are determined so as to facilitate the quantitative detection of the analytes relevant to that application. The GeoVOC application involves the measurement of hydrocarbon concentrations in the headspace of soil and water across a range of humidities. Alkyl esters are investigated to allow for the quantitative detection of each compound in fruits and vegetables. Chemical warfare agents, their surrogates, and precursor compounds are studied, which allows for the quantitative or semi-quantitative detection of a range of highly toxic compounds. Finally, 17 compounds classified by the US-EPA as hazardous air pollutants are studied, enabling SIFT-MS instruments to replicate sections of the TO-14A and TO-15 methods.
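Quantitative SIFT-MS analysis of the kind described above rests on pseudo-first-order ion-molecule kinetics: in the low-conversion limit, the analyte number density follows from the product-to-precursor ion signal ratio, the rate coefficient, and the reaction time. The sketch below is a deliberately simplified, illustrative calculation with made-up signal values; it ignores the differential diffusion and mass discrimination (Mp) corrections that the thesis characterises.

```python
# Simplified SIFT-MS quantification in the low-conversion limit:
#   [M] ~ (product ion signal / precursor ion signal) / (k * t_r)

K_B = 1.380649e-23          # Boltzmann constant, J/K
TORR_TO_PA = 133.322

def trace_gas_density(i_product, i_precursor, k, t_r):
    """Analyte number density (cm^-3) from ion count rates (counts/s),
    rate coefficient k (cm^3 s^-1), and reaction time t_r (s)."""
    return (i_product / i_precursor) / (k * t_r)

# Illustrative (made-up) values, roughly typical in magnitude.
i_precursor = 1.0e6         # H3O+ count rate, counts/s
i_product = 2.0e3           # product ion count rate, counts/s
k = 2.5e-9                  # rate coefficient, cm^3 s^-1
t_r = 5.0e-3                # reaction time, s

density = trace_gas_density(i_product, i_precursor, k, t_r)

# Convert to a mixing ratio using the carrier gas number density at 0.5 Torr, 298 K.
n_total = (0.5 * TORR_TO_PA) / (K_B * 298.0) * 1e-6   # molecules per cm^3
print(f"analyte density : {density:.3e} cm^-3")
print(f"mixing ratio    : {density / n_total * 1e9:.1f} ppb")
```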
50

Is a Chimpanzee Better at Generating Returns Than a Professional Fund Manager? : A Comparative Study of the Historical Returns of Managed Funds and Randomly Generated Portfolios

Thaarup, Mattias, Örjes, David January 2013 (has links)
Background: Investors have several options to choose from when the goal is to achieve the highest yield at the lowest cost and risk. Stocks are a common investment option but are also associated with risks. Portfolios are usually constructed from several different assets to reduce the unsystematic risk of the investment. Funds are similar to composite stock portfolios; the big difference is that they are traded in their entirety and investors cannot affect the fund's content. The problem remains that, whether you choose stocks or mutual funds, there is still uncertainty as to how the future will unfold. Which stocks will yield a high return, and which will bring losses? This is a problem that all investors have to deal with, and economic theory seeks to create models and mathematical estimates that forecast the future. Studies indicate that the opposite of such economic models can provide at least the same rate of return, for example by letting a monkey, baby, dog, or other non-analytical agent choose the shares in the portfolio. Objective: Our aim was to investigate the possibility of providing equal or higher returns than actively managed funds; the study also examines how many shares a portfolio should contain when random selection is the acting factor. Delimitation: The study does not take into consideration commissions, dividends, transaction costs, taxes, or issues other than those stated. Method: The study covers the years 2003-2013 and was performed by assembling a total of fifteen portfolios according to three different portfolio sizes: 10, 15, and 20 shares. Five portfolios were randomly composed for each of the three portfolio categories and then compared against ten professionally managed funds, as well as an index, over the same measurement period. Both the ten funds and the stock composition of the fifteen portfolios were randomly reselected for each of the ten measurement periods. Conclusion: The managed funds outperformed the OMXSPI index by 2.8%, but the study found that randomly assembled portfolios deliver a significantly higher return than the managed funds. The twenty-share portfolio composition was found to provide the most representative results, as this portfolio type had the lowest volatility and hence the lowest spread in the results.
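The comparison described in the method can be mimicked with a small Monte Carlo sketch: draw equal-weighted random portfolios of 10, 15, and 20 stocks from a universe of return series and compare their mean return and volatility with an index. All numbers below are synthetic and purely illustrative; the study itself used actual fund, stock, and OMXSPI data for 2003-2013.

```python
import numpy as np

rng = np.random.default_rng(4)

n_stocks, n_periods = 200, 10          # synthetic stock universe, ten measurement periods
# Synthetic annual returns: 8% drift, 25% idiosyncratic spread (illustration only).
stock_returns = rng.normal(0.08, 0.25, size=(n_periods, n_stocks))
index_return = stock_returns.mean(axis=1)

def random_portfolio_returns(size, n_portfolios=5):
    """Equal-weighted portfolios of `size` randomly chosen stocks, re-drawn each period."""
    out = np.empty((n_portfolios, n_periods))
    for i in range(n_portfolios):
        for t in range(n_periods):
            picks = rng.choice(n_stocks, size=size, replace=False)
            out[i, t] = stock_returns[t, picks].mean()
    return out

print(f"index:              mean {index_return.mean():.3f}, std {index_return.std():.3f}")
for size in (10, 15, 20):
    r = random_portfolio_returns(size)
    print(f"{size:2d}-share portfolios: mean {r.mean():.3f}, std {r.std():.3f}")
```

As in the study, the larger random portfolios show the lowest spread of outcomes, since idiosyncratic risk averages out as holdings are added.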
