About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
291

Iterative receiver techniques for coded multiple access communication systems

Reed, Mark C January 1999 (has links)
The introduction of cellular wireless systems in the 1980s has resulted in a huge demand for personal communication services. This demand has made larger-capacity systems necessary, and has been partially satisfied by the introduction of second-generation digital systems. New third-generation systems are now undergoing standardisation and will require even more efficient utilisation of the spectrum if the high-bandwidth features and larger capacity are to become a reality. Motivated by these growing requirements, we discuss methods of achieving large improvements in spectral efficiency and performance. Multiple-user communication over a channel can only be achieved with some form of diversity. In this work we point out that the efficient utilisation of the dimensions of space, time, and frequency will ultimately maximise the system capacity of a multiple-user system. We apply our receiver techniques solely to the base-station design, where capacity limitations are currently present; we note, however, that some of these techniques could also be applied at the mobile terminal receiver. We primarily focus our attention on the direct-sequence code-division multiple-access (DS/CDMA) channel, since this channel is inherently interference-limited by other users in the cell of interest. We exploit a powerful new channel coding technique named "turbo coding" for its iterative decoding approach. We show how the inner convolutional code of a turbo-code encoder can be substituted with the CDMA channel. By "iterative detection/decoding" or "turbo equalisation" at the receiver, we achieve performance results in which the interference from other users approaches complete removal. We develop and analyse a new, low-complexity iterative interference canceller/decoder. This receiver has per-user complexity that is linear in the memory of the channel and independent of the number of users in the system.
We extend this receiver to more realistic channels that are asynchronous and include multi-path, and we incorporate spatial diversity by using an antenna array at the receiver. The CDMA channel we study exclusively uses randomly generated spreading codes. With this channel model we still achieve single-user performance (no interference from other users), with a 10 log L gain from L antenna elements and a gain of up to 10 log P from P multi-path components. With any new receiver design, sensitivity to channel parameter errors is of paramount interest. We find that the sensitivity of our receiver to the induced parameter errors is low, as desired for a realisable receiver design. Finally, we investigate the application of this new iterative interference canceller/decoder to a number of other interference channels: the intersymbol interference (ISI) channel, partial response signalling (PRS), and continuous phase modulation (CPM). For these channels, excellent performance improvement is generally achieved by the iterative interference canceller/decoder solution. / Thesis (PhD)--University of South Australia, 1999
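The cancellation idea at the heart of such receivers can be sketched as a soft parallel interference canceller. The following is a minimal illustration of the general technique only — it omits the convolutional decoding that the thesis couples into each iteration, and the parameters and code construction are assumptions for the example:

```python
import numpy as np

def soft_pic(r, S, amps, noise_var, iters=5):
    """Iterative soft parallel interference cancellation for a synchronous
    CDMA channel.  r: received chip vector; S: (chips x users) matrix of
    unit-norm spreading codes; amps: per-user amplitudes.  Each iteration
    rebuilds every interferer from the current soft symbol estimates,
    subtracts it, and re-derives a soft decision through tanh."""
    K = S.shape[1]
    soft = np.zeros(K)                        # soft bit estimates in [-1, 1]
    for _ in range(iters):
        new = np.empty(K)
        for k in range(K):
            others = np.array([j for j in range(K) if j != k])
            interference = S[:, others] @ (amps[others] * soft[others])
            y = S[:, k] @ (r - interference)  # matched filter after cancellation
            new[k] = np.tanh(amps[k] * y / noise_var)
        soft = new
    return np.sign(soft)                      # hard decisions
```

With near-orthogonal codes the soft estimates harden over a few iterations, which is the behaviour the iterative detection/decoding loop exploits.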
292

Development of a high throughput fluorescent screening assay for genetic recoding

Cardno, Tony Stuart, n/a January 2007 (has links)
The development of new drug therapies traditionally requires mass screening of thousands, if not millions, of substances to identify lead compounds, which are then further optimised to increase potency. Screening the large pharmaceutical compound libraries can be incredibly expensive, and the industry has responded by miniaturising assays to smaller formats, enabling compound screening to be automated and, importantly, eliminating assay reagents that are a major contributing cost of running large screens. A potential target for such an approach is the genetic recoding site of viruses like HIV-1 and SARS. These viruses use programmed recoding of the genetic code to regulate the translation of proteins required for viable virus production. For example, HIV-1 uses a -1 frameshift mechanism to regulate the ratio of the Gag to the Pol proteins, crucial for viable virus formation. Studies of recoding, including readthrough of premature termination codons, have most recently used bicistronic reporters with different combinations of enzymes. The most widely used plasmid bicistronic reporter utilises a dual-luciferase arrangement comprising firefly luciferase and Renilla luciferase reporters flanking the DNA being studied. Both luciferase enzymatic reporters emit light in response to their respective substrates, and the cost of these substrates is the major obstacle to using luciferase reporters for high throughput screening. My study aimed to design and develop a bicistronic assay for genetic recoding that was amenable to high throughput screening. The luciferase reporters were replaced with Green Fluorescent Protein (GFP) reporters, which do not require the addition of substrates.
The development of a dual GFP assay required the appropriate selection of GFP fluorophores, the best arrangement of the GFPs to maximise the ratio of relative fluorescence intensity signal to background, and the optimisation of the cells and growth conditions, DNA transfection, plate reader selection, and optical filter sets. Cassettes encoding protein linkers were also incorporated into the design of the constructs to separate the fluorescent proteins spatially, facilitating unimpaired folding into their functional units within the fusion protein. The assay was further improved by moving from transient transfection to stably expressing cell lines. A viable assay was almost achieved for 96- (and 384-) well plates, with a Z′ factor compatible with the assay being suitable for high throughput screening. The assay was used to test a small collection of compounds known to interact with the ribosome, and compounds known in the literature to affect frameshifting. This proof of concept was important, since it showed that the assay, with its various modifications, optimisations and miniaturisation steps, still retained the capability of correctly measuring -1 frameshifting efficiency at the HIV-1 recoding site, and of recording compound-induced modulation of that efficiency. The compounds cycloheximide and anisomycin, for example, were shown to decrease -1 frameshifting, albeit at some expense to overall protein synthesis. The dual GFP assay was also shown to accurately measure changes in frameshift efficiency brought about by mutations to the frameshift element, and it would additionally be suitable for the detection and study of compounds, like the recently reported PTC-124 (currently undergoing phase II clinical trials for Duchenne muscular dystrophy and cystic fibrosis), that increase readthrough of a UGA premature stop codon mutation.
The dual GFP assay developed in this study costs at most one tenth as much as a comparable dual-luciferase assay, largely due to the removal of assay substrates and transfection reagents. The assay has a robust Z′ factor comparable to that of the dual-luciferase assay, and would substantially decrease the cost of high throughput screening in situations where a bicistronic reporter is required. The HIV-1 frameshift element is such a site.
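The Z′ factor used to judge screening suitability is the standard statistic of Zhang, Chung and Oldenburg (1999). A minimal calculation, on hypothetical plate-reader values invented for illustration, might look like:

```python
import numpy as np

def z_prime(pos, neg):
    """Z' factor for positive and negative control wells:
    Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 are conventionally taken to indicate an assay
    suitable for high throughput screening."""
    pos = np.asarray(pos, dtype=float)
    neg = np.asarray(neg, dtype=float)
    spread = 3.0 * (pos.std(ddof=1) + neg.std(ddof=1))
    return 1.0 - spread / abs(pos.mean() - neg.mean())

# Hypothetical fluorescence ratios from control wells on one plate
positive_controls = [100.0, 102.0, 98.0, 101.0, 99.0]
negative_controls = [10.0, 11.0, 9.0, 10.0, 10.0]
```

A tight, well-separated pair of control distributions gives Z′ close to 1; widening either distribution drags the value down toward the 0.5 cut-off.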
293

Aspects of language shift in a Hong Kong Chiu Chow family

Cheung, Y. Y., Vivian. January 2006 (has links)
Thesis (M. A.)--University of Hong Kong, 2006. / Title proper from title frame. Also available in printed format.
294

A case study of child-directed speech (CDS) to a Cantonese child living in Australia

Wong, Shuk-wai, Connie Waikiki, January 2006 (has links)
Thesis (M. A.)--University of Hong Kong, 2006. / Title proper from title frame. Also available in printed format.
295

Code Inspection

Krishnamoorthy, Shyaamkumaar January 2009 (has links)
Real-time systems, used in many day-to-day applications, require time-critical execution of tasks. Worst-Case Execution Time (WCET) analysis is performed to establish an upper bound on the time they can take to execute. This work aims to perform a static analysis of industry-standard code segments to provide valuable information to aid in choosing the right approach to WCET analysis. Any code segment can be analysed syntactically to gain some insight into the effects that a particular coding syntax format may have. With a focus on functions and looping statements, valuable information about the inspected code segments can be obtained. For this purpose, code segments from CC Systems, Västerås, were inspected. Scope graphs generated by SWEET, the Swedish Execution Time tool, were used extensively to aid this work. It was found that syntactical analysis could be performed effectively for the code segments analysed as part of this task.
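To illustrate what a purely syntactic inspection can extract (SWEET's scope graphs capture far more structure than this), a crude sketch might count loop constructs and function definitions in a C fragment. The C source and the regular expressions here are illustrative assumptions, not the tool's actual analysis:

```python
import re

# A small, made-up C fragment to inspect
C_SOURCE = """
int sum(int *a, int n) {
    int s = 0;
    for (int i = 0; i < n; i++) { s += a[i]; }
    while (s > 100) { s /= 2; }
    return s;
}
"""

def loop_stats(src):
    """Crude syntactic counts of constructs relevant to WCET flow analysis:
    loop headers and function definitions found by pattern matching."""
    return {
        "for": len(re.findall(r"\bfor\s*\(", src)),
        "while": len(re.findall(r"\bwhile\s*\(", src)),
        "do": len(re.findall(r"\bdo\s*\{", src)),
        # return-type + name + parameter list + opening brace
        "functions": len(re.findall(r"\b\w+\s+\w+\s*\([^;)]*\)\s*\{", src)),
    }
```

Counts like these feed the choice of loop-bound analysis; a real flow analysis would of course work on a parsed representation rather than raw text.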
296

Att vara tvåspråkig : En studie av gymnasieelever med turkisk bakgrund och deras syn på att vara tvåspråkig (Being bilingual: a study of upper-secondary students with a Turkish background and their view of being bilingual)

Arisoy, Fatma January 2009 (has links)
The purpose of the thesis is to examine how Turkish students understand their bilingualism. I have tried to explain bilingualism as a phenomenon and to give an insight into the national steering documents, on the basis of different concepts used by researchers.

The method consists of qualitative interviews of a standardised character. Five upper-secondary school students participated and discussed the questions independently.

The result of the interviews is the participants' view of bilingualism, and the answers varied. My research question was answered through the interviews and the literature, and the result is connected to relevant theories.
297

Code optimization and detection of script conflicts in video games

Yang, Yi 11 1900 (has links)
Scripting languages have gained popularity in video games for specifying the interactive content in a story. Game designers do not necessarily possess programming skills and often demand code-generating tools that can transform textual or graphical descriptions of interactions into scripts interpreted by the game engine. However, in event-based games, this code generation process may lead to potential inefficiencies and conflicts if there are multiple independent sources generating scripts for the same event. This thesis presents solutions to both perils: transformations to eliminate redundancies in the generated scripts and an advisory tool to provide assistance in detecting unintended conflicts. By incorporating traditional compiler techniques with an original code-redundancy-elimination approach, the code transformation is able to reduce code size by 25% on scripts and 14% on compiled byte-codes. With the proposed alternative view, the advisory tool is suitable for offering aid to expose potential script conflicts.
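One classic ingredient of this kind of redundancy elimination is local value numbering over three-address code. The sketch below is a generic illustration of that compiler technique, not the thesis's actual transformation, and it assumes each temporary is assigned exactly once (an SSA-like form):

```python
def eliminate_redundancies(instrs):
    """Local value numbering: the first computation of an identical
    (op, operands) expression is kept; later duplicates are dropped and
    their destinations aliased to the original result."""
    available = {}    # (op, canonical args) -> variable holding the value
    replaced = {}     # eliminated destination -> surviving variable
    out = []
    for dest, op, args in instrs:
        key = (op, tuple(replaced.get(a, a) for a in args))
        if key in available:
            replaced[dest] = available[key]   # redundant: emit nothing
        else:
            available[key] = dest
            out.append((dest, op, key[1]))
    return out
```

On a generated script where the same sub-expression is emitted by several independent behaviours, this is the kind of pass that shrinks the output without changing its meaning.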
298

Interference cancellation for short-code DS-CDMA in the presence of channel fading

Dutta, Amit K. 21 August 1997 (has links)
Interference from other adjacent users in wireless applications is a major problem in direct-sequence code-division multiple access (DS-CDMA). This is also known as the near-far problem, where a strong signal from one user interferes with other users. The current approach to dealing with the near-far problem in DS-CDMA systems is to use strict transmitter power control. An alternative approach is to use near-far-resistant receivers. A practical near-far-resistant receiver structure is the adaptive decorrelating detector, since it avoids complex matrix inversion. The existing CDMA standard known as IS-95 uses a long signature code sequence; for simplicity, however, the adaptive multi-user receiver uses a short signature code sequence. The problem is that adaptive receivers lose near-far resistance as the number of users in the system increases. This thesis describes a novel multistage decision feedback cancellation (DFC) scheme that is immune to the near-far problem. The new DFC structure is constructed using three different adaptive algorithms: least mean squares (LMS), recursive least squares (RLS), and the linearly constrained constant modulus (LCCM) algorithm. It is found that the LMS adaptive algorithm provides the best result considering its simple hardware complexity. It is also found that the LMS adaptive receiver, together with the DFC structure, provides better bit synchronisation capability to the overall system. Since the receiver is near-far resistant, the LMS adaptive receiver with the decision feedback cancellation structure also performs better in the presence of Rayleigh fading. / Graduation date: 1998
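The LMS update underlying the preferred adaptive structure is standard. A minimal sketch of an LMS adaptive FIR filter identifying an unknown channel follows; the tap count, step size, and channel are illustrative assumptions, not the thesis's receiver parameters:

```python
import numpy as np

def lms_filter(x, d, num_taps, mu):
    """Least-mean-squares adaptive FIR filter: adapt weights w so that
    w . [x[n], x[n-1], ...] tracks the desired signal d[n]."""
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # most recent sample first
        y = w @ u                              # filter output
        e[n] = d[n] - y                        # estimation error
        w += mu * e[n] * u                     # LMS weight update
    return w, e
```

The appeal noted in the abstract is visible here: the update is one multiply-accumulate pass per tap, with no matrix inversion, which is what keeps the hardware complexity low relative to RLS.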
299

Joint convolutional and orthogonal decoding of interleaved-data frames for IS-95 CDMA communications

Rabinowitz, David 29 February 1996 (has links)
IS-95, an interim standard proposed for future digital personal communications systems, uses two levels of encoding of digital data, for error control and for compatibility with code-division multiple-access (CDMA) transmission. The data is first convolutionally encoded; the resulting symbols are interleaved, and groups of symbols are then encoded as orthogonal Walsh sequences. These two encodings are traditionally decoded in separate sequential steps. By combining the decoding and feeding the final decision of the second-level decoder back to the first-level decoder, it is possible to reduce the error rate of the decoder. Each Walsh sequence encodes six non-adjacent symbols of the convolutional code. The receiver computes an estimate of the probability that each of the sixty-four possible Walsh sequences was sent, and uses this as an estimate for each of the convolutional symbols that specified the Walsh sequence. Since the convolutional symbols are non-adjacent, it is likely that the actual values of some of the earlier symbols will have been determined by the final decoder before later symbols specifying the same Walsh sequence are used by the convolutional decoder. Knowledge of the values of these symbols can be used to adjust the probability estimates for that Walsh sequence, improving the likelihood that future convolutional symbols will be correctly decoded. Specific metrics for estimating the probability that each convolutional symbol was sent were tested with and without the proposed feedback, and error rates were estimated from extensive computer simulations. It was found that applying feedback does improve error rates. Analytical methods were also applied to help explain the effects. / Graduation date: 1996
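The 64-ary orthogonal modulation described above can be sketched with a Sylvester-Hadamard construction, whose rows are the 64 Walsh sequences. This is a generic illustration of Walsh encoding and correlation decoding, not the IS-95 implementation or the thesis's feedback scheme:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of
    two); its rows are the n mutually orthogonal Walsh sequences in +/-1
    form."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

WALSH = hadamard(64)   # 64 orthogonal length-64 sequences

def walsh_encode(symbols6):
    """Map six binary symbols to the Walsh row they index."""
    idx = int("".join(str(b) for b in symbols6), 2)
    return WALSH[idx]

def walsh_decode(received):
    """Correlate against all 64 rows; the largest correlation wins."""
    idx = int(np.argmax(WALSH @ received))
    return [int(b) for b in format(idx, "06b")]
```

The correlation vector `WALSH @ received` is exactly the set of sixty-four sequence probabilities (up to scaling) that the combined decoder re-weights once earlier convolutional symbols become known.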
