351

Applying the Canadian Association of Social Workers Code of Ethics in uniquely-situated northern geographical locations: are there factors in practice environments that impact adherence to the 2005 code?

Wilson Marques, Louise 16 September 2010 (has links)
The purpose of this study was to explore social work practitioners' familiarity with and interpretation of the 2005 Canadian Association of Social Workers (CASW) Code of Ethics (“Code”) to determine whether specific sections of the existing Code enable social work practitioners to deliver ethical service in uniquely situated geographical locations in northern Manitoba. A qualitative, exploratory research methodology was employed. In-person interviews were conducted with six social workers who had practiced in a northern Manitoba setting for at least three years post-graduation. Once given the opportunity to read and interpret the 2005 Code, five of six respondents reported that the intent of the Code is reflective of their practice, although this appeared to contradict the practice realities they described. All reported that dual roles and potential conflicts of interest are very difficult to avoid when practicing social work in the North. Other key findings indicate that study participants believe social workers in the North are not familiar with the 2005 Code; lack education, knowledge, discussion and accessibility regarding the Code; and see little application of the Code in practice environments. The emergence of the Manitoba College of Social Workers (MCSW), through legislation, will require all practicing social workers to adhere to the 2005 Code of Ethics. All six participants reported factors in northern, rural, remote and isolated environments that affect their ability to adhere to the 2005 Code. When social workers are required to register to use the title of Social Worker, the MCSW will be in a position to recognize the environmental factors of practicing in northern, rural, remote and isolated environments. The MCSW may wish to consider that requiring adherence to the 2005 Code may not be feasible in northern, rural, remote and isolated practice areas because of the specific factors identified throughout this study. In future studies, consideration and flexibility on the part of the Psychology/Sociology Research Ethics Board (PSREB) may be necessary to balance the protection of research subjects against the need to understand, through research, the realities of social work practice. Secondly, researchers interested in expanding or replicating the findings of this study may wish to consider disclosing and providing the interview questions before the actual interviews. Additionally, future research into ethical practice in the social work profession would provide a broader understanding of whether all social workers practice under the existing requirements set out in current MIRSW by-laws and, in the future, the provincial legislation that will apply to every social worker in Manitoba. This study was limited to six participants; broader research to more fully investigate the practical realities of applying the 2005 Code in northern environments could inform how the new legislation is implemented.
352

Visualization and analysis of assembly code in an integrated comprehension environment

Pucsek, Dean W. 26 June 2013 (has links)
Computing has reached a point where it is visible in almost every aspect of one’s daily activities. Consider, for example, a typical household: there may be a desktop computer, a game console, a tablet computer, and smartphones, built using different types of processors and instruction sets. To support the pervasive and heterogeneous nature of computing there have been many advances in programming languages, hardware features, and increasingly complex software systems. One task that is shared by all people who work with software is the need to develop a concrete understanding of foreign code so that tasks such as bug fixing, feature implementation, and security audits can be conducted. To do this, tools are needed to help present the code in a manner that is conducive to comprehension and allows knowledge to be transferred. Current tools for program comprehension are aimed at high-level languages and do not provide a platform for assembly code comprehension that is extensible both in terms of the supported environment and the supported analyses. This thesis presents ICE, an Integrated Comprehension Environment, developed to support comprehension of assembly code while remaining extensible. ICE is designed to receive data from external tools, such as disassemblers and debuggers, which is then presented in a series of visualizations: Cartographer, Tracks, and a Control Flow Graph. Cartographer displays an interactive function call graph, while Tracks displays a navigable sequence diagram. Support for new visualizations is provided through the extensible implementation, enabling analysts to develop visualizations tailored to their needs. Evaluation of ICE is completed through a series of case studies that demonstrate different aspects of ICE relative to currently available tools. / Graduate / 0984 / dpucsek@uvic.ca
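The abstract does not spell out how ICE derives its Control Flow Graph from the disassembler data it receives, so the following is only a minimal sketch of the standard construction, assuming a simplified instruction record and a handful of x86-style mnemonics (none of which are taken from ICE itself): instructions are split into basic blocks at branch targets, and blocks are linked by branch and fall-through edges.

```python
from dataclasses import dataclass
from collections import defaultdict
from typing import Optional

@dataclass
class Insn:
    addr: int                 # instruction address
    mnemonic: str             # e.g. "mov", "jmp", "jne", "ret" (assumed names)
    target: Optional[int]     # branch target address, if any

def build_cfg(insns):
    """Group instructions into basic blocks and connect them with control-flow edges."""
    insns = sorted(insns, key=lambda i: i.addr)
    branching = ("jmp", "je", "jne", "ret")
    # Leaders: the first instruction, every branch target, and every instruction after a branch.
    leaders = {insns[0].addr}
    for i, insn in enumerate(insns):
        if insn.mnemonic in branching:
            if insn.target is not None:
                leaders.add(insn.target)
            if i + 1 < len(insns):
                leaders.add(insns[i + 1].addr)
    # Slice the instruction stream into basic blocks at each leader.
    blocks, current = {}, []
    for insn in insns:
        if insn.addr in leaders and current:
            blocks[current[0].addr] = current
            current = []
        current.append(insn)
    if current:
        blocks[current[0].addr] = current
    # Edges: explicit branch targets plus fall-through to the next block.
    edges = defaultdict(set)
    starts = sorted(blocks)
    for idx, start in enumerate(starts):
        last = blocks[start][-1]
        if last.target is not None:
            edges[start].add(last.target)
        if last.mnemonic not in ("jmp", "ret") and idx + 1 < len(starts):
            edges[start].add(starts[idx + 1])
    return blocks, edges
```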
353

Call admission control for multimedia CDMA

Dimitriou, Nikos January 2000 (has links)
No description available.
354

Grammatical constraints on child bilingual code mixing

Sauvé, Deanne. January 2000 (has links)
This study examined structural constraints on early child code mixing. Constraints are widely attested in adult bilinguals (Myers-Scotton, 1993; Poplack, 1980). It has been argued that these constraints preserve the structural properties of both languages. It is uncertain whether constraints on early child code mixing are the same as constraints on adult code mixing. The present analysis was based on Poplack's two structural constraints: the free morpheme and the equivalence constraints. Ten French-English bilingual subjects were observed at 4 time periods, between approximately 2;00 and 3;06 years of age. The children's utterances containing elements from both languages were analysed for violations of Poplack's constraints. The violation rate was extremely low, less than 2% of the total mixed utterances. These results corroborate Lanza (1997), Vihman (1998), Allen et al. (2000), and Paradis et al. (2000), who likewise found that structural constraints on code mixing are operational from early in acquisition.
355

Widely linear minimum variance channel estimation with application to multicarrier CDMA systems

Abdallah, Saeed. January 2007 (has links)
Conventional Minimum-Variance (MV) channel estimation is affected by two sources of error, namely the finite number of samples used to estimate the covariance matrix and the asymptotic bias due to interference and additive noise. On the other hand, widely linear (WL) filtering has been shown to improve the estimation of improper complex signals. Researchers have recently demonstrated that the application of WL processing principles can significantly improve the performance of subspace-based channel estimation algorithms. However, in contrast to MV estimation algorithms, subspace-based algorithms assume knowledge of the total number of users in the system and must be coupled with a sophisticated user-enumeration algorithm at the expense of increased complexity. In this work, in an effort to combine the practical advantages of MV channel estimation algorithms with the performance of WL filters, we propose a widely linear version of the MV channel estimator in the context of multicarrier (MC) CDMA systems employing real modulation. We use numerical simulations to demonstrate that the widely linear minimum-variance algorithm yields more accurate channel estimates than the conventional MV algorithm. By considering two simplified transmission/reception models, we also show analytically that the widely linear estimator on average reduces both types of error.
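As the abstract notes, widely linear filtering exploits improper (for example, real-valued BPSK) modulation by processing the received signal jointly with its complex conjugate. The NumPy snippet below is only a generic illustration of that idea applied to a minimum-variance (Capon-style) filter built from an augmented sample covariance; the array shapes, the signature vector, and the unit-response constraint are assumptions for illustration, not the estimator derived in the thesis.

```python
import numpy as np

def mv_weights(R, c):
    """Minimum-variance (Capon/MVDR) weights for signature c given covariance R."""
    Rinv_c = np.linalg.solve(R, c)
    return Rinv_c / (c.conj().T @ Rinv_c)

def widely_linear_mv_weights(samples, c):
    """Widely linear variant: each observation is stacked with its conjugate so the
    filter can exploit the improperness of real-valued (e.g. BPSK) modulation."""
    # samples: (N, M) complex observations; c: (M,) user signature (assumed shapes).
    aug = np.hstack([samples, samples.conj()])       # (N, 2M) augmented observations
    R_aug = aug.conj().T @ aug / aug.shape[0]        # augmented sample covariance
    c_aug = np.concatenate([c, c.conj()])            # augmented constraint vector
    return mv_weights(R_aug, c_aug)
```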
356

The code of ethics in organizations of the health network: a tool for regulating conduct?

Poirier, Yves 12 1900 (has links)
In Quebec, section 233 of the Act respecting health services and social services, chapter S-4.2, requires every health institution to have a code of ethics which, in essence, must set out the rights of users and fix the expected conduct of staff. The legislator wished to improve staff conduct as early as the beginning of the 1990s and considered designating an oversight body to ensure it. That constraint was not retained, and 20 years later the intention of ensuring expected conduct is still not subject to constraints or controls, even though it remains desired. In 2003, however, the Minister put in place a process of ministerial visits to residential care settings, and to date some 150 institutions have been visited. These teams looked, among other things, at the role of the code of ethics in supporting the management of these institutions. They have not been able to rely on the code of ethics as the foundation for clinical, organizational and management decisions in the organizations of Quebec's health and social services network. At this point it must be observed that the mandatory code of ethics ranks among the many constraints faced by these organizations. Institutions must go through an accreditation process every three years, and the code of ethics is not a dynamic element retained in that quality-standards validation process either. Moreover, a Quebec journal specializing in health management devoted an entire issue of 15 articles to “ethics and behaviour”, and the code of ethics is absent from it except for two articles that address it specifically. Is it really ethics that this code is about, or is it rather professional conduct (déontologie), especially since the legislator wants above all to ensure appropriate behaviour on the part of employees and the other persons practising their professions? Would a code of conduct not be more appropriate for achieving the intended ends? This question is answered in this thesis, which examines the concepts of ethics, professional conduct, codes, and the regulation of behaviour. In addition, detailed analyses of 35 current codes of ethics from various institutions and regions of Quebec are presented. The literature gives the conditions for a code to succeed; beyond the importance to be given to the values stated by the organization, it also addresses the sanctions to be provided for when those values are not respected, and these must be clear and applied. Finally, many organizations now speak of a code of conduct, and that term would be entirely appropriate to meet the wish of the legislator, who wants to ensure irreproachable conduct from employees and the other persons working in these institutions. That is the conclusion of this work, stated in the form of a recommendation. / Quebec’s Health and Social Services Law, ch. S-4.2, art. 233, requires that every health institution have a code of ethics that, in essence, sets out the rights of patients and the manner in which staff are expected to conduct themselves. The legislator had hoped that improvements in the conduct of personnel would begin to be seen at the start of the 1990s, and wanted to set up a watchdog body to ensure that progress was made.
In the end, no such body was created, and 20 years later, even though they are still very much wished for, constraints and controls over staff conduct remain sorely lacking. In 2003 the Minister of Health and Social Services began a series of official visits to hospitals which to date have covered 150 institutions, and in each of these visits the minister’s teams have, with the backing of the hospitals’ administrators, made a point of looking at how each institution’s code of ethics is working. The general consensus of administrators, however, is that no health institution in Quebec has been able to use the ethics code as a basis for making clinical, organizational or managerial decisions. On the contrary, having a mandatory ethics code is seen by many as a hindrance, one among many that the institutions have to deal with. Every three years each institution goes through a process of re-accreditation to ensure it complies with government standards of quality, but its ethics code is not considered an important and dynamic element in this re-evaluation. One example of this blind spot: when a Quebec periodical specializing in health-care management published a special issue on “ethics and behaviour,” only two of its 15 articles specifically mentioned the notion of a code of ethics. This raises the question: is “ethics” too general a term? Given that the legislator’s goal is to ensure proper behaviour on the part of staff and others who exercise their profession in the institutions – in other words, a preoccupation with professional ethics – would it not be more appropriate to instead refer to a “code of conduct”? This question is addressed in this thesis, through an examination of the concepts of ethics, professional ethics, codes and regulation of behaviour. As well, a detailed analysis of 35 ethics codes in diverse institutions throughout Quebec is presented. The academic literature provides ways of measuring the success of a code of ethics, and besides the importance given to institutional values, there is also the question of sanctions to impose when those values are not respected. Values must be clear to be properly applied. Finally, many organizations now refer to “codes of conduct” – a highly appropriate term, given that the legislator’s goal is to ensure that the conduct of employees and other personnel in health establishments is beyond reproach. This, in fact, is my conclusion, spelled out in the form of a recommendation.
357

JQuery - a tool for combining query results and a framework for building code perspectives

Markle, Lloyd 11 1900 (has links)
In this dissertation we identify two problems with current integrated development environments (IDEs) and present JQuery as a tool to address these issues. The first problem is that IDE views answer low level questions and do not provide a mechanism to combine results to answer complex higher level questions. Even relatively simple questions force the developers to mentally combine results from different views. The second problem is that IDEs do not provide an easy way to create perspectives on project specific concerns such as naming conventions or annotations. Most IDEs do offer support for creating custom perspectives but the effort required to create a perspective is considerably more than the benefit a custom perspective provides. JQuery is an Eclipse plugin which generates code views using an expressive query language. We have redesigned JQuery to support a number of new user interface (UI) features and add a more flexible architecture with better support for extending the UI. To address the first problem, we have added multiple views to JQuery where each view supports drag and drop of results, selection linking, and regular expression search. These features enable a user to combine results from different views to answer more complex higher level questions. To address the second problem, we can leverage the fact that JQuery is built on an expressive query language. Through this query language we are able to define project specific concerns such as naming conventions or annotations and then create views and perspectives for these concerns through the JQuery UI.
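JQuery's actual query language, which the abstract describes only as "expressive", is not shown here. Purely to illustrate the kind of project-specific concern it lets a developer turn into a code view, the hypothetical Python stand-in below filters a toy fact base of methods for a naming-convention violation; the fact-base fields and the convention itself are invented for the example and are not JQuery syntax or data.

```python
# A toy "fact base" of code elements. JQuery stores such facts in a logic database
# and expresses the query in its own query language; this Python is only a stand-in.
methods = [
    {"class": "OrderService", "name": "testCreateOrder", "annotations": ["Test"]},
    {"class": "OrderService", "name": "createOrder",     "annotations": []},
    {"class": "LegacyUtil",   "name": "testHelper",      "annotations": []},
]

# Project-specific concern: any method whose name starts with "test" must carry @Test.
convention_violations = [
    m for m in methods
    if m["name"].startswith("test") and "Test" not in m["annotations"]
]
print(convention_violations)  # methods that a custom "naming convention" view would flag
```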
358

The bilingual assessment of cognitive abilities in French and English

Lacroix, Serge 11 1900 (has links)
In this study, the role that language plays in the expression of intelligence, in bilingualism, and in the process of assessing selected cognitive abilities was explored. The primary purpose of the study was to determine whether individuals who are allowed to move from one language to another when responding to test items produce results that differ from those obtained by bilingual examinees assessed in one language only. The results indicate that the Experimental Group obtained significantly higher results than the Control Group on all the tests and subtests used. The Experimental Group code-switched more frequently, and the examiners only code-switched with that group. The frequency of code-switching behaviours explains, in large part, the differences noted in the results, as very few other sources of difference were identified, even when groups were compared on sex, first language, and relative proficiency in French and in English.
359

Development of a high throughput fluorescent screening assay for genetic recoding

Cardno, Tony Stuart, n/a January 2007 (has links)
The development of new drug therapies traditionally requires mass screening of thousands if not millions of substances to identify lead compounds, which are then further optimised to increase potency. Screening large pharmaceutical compound libraries can be incredibly expensive, and the industry has responded by miniaturising assays to smaller formats, enabling compound screening to be automated and, importantly, eliminating assay reagents that are a major contributing cost of running large screens. A potential target for such an approach is the genetic recoding site of viruses like HIV-1 and SARS. These viruses use programmed recoding of the genetic code to regulate the translation of proteins required for viable virus production. For example, HIV-1 uses a -1 frameshift mechanism to regulate the ratio of the Gag to the Pol proteins, which is crucial for viable virus formation. Studies of recoding, including readthrough of premature termination codons, have most recently used bicistronic reporters with different combinations of enzymes. The most widely used plasmid bicistronic reporter utilises a dual luciferase arrangement comprising firefly luciferase and Renilla luciferase reporters flanking the DNA being studied. Both luciferase enzymatic reporters emit light in response to their respective substrates, and the cost of these substrates is the major issue with using luciferase reporters for high throughput screening. My study aimed at designing and developing a bicistronic assay suitable for genetic recoding that was amenable to high throughput screening. The luciferase reporters were replaced with Green Fluorescent Protein (GFP) reporters that do not require the addition of substrates. The development of a dual GFP assay required the appropriate selection of GFP fluorophores, the best arrangement of the GFPs to maximise the ratio of relative fluorescence intensity signal to background, and the optimisation of cells and growth conditions, DNA transfection, plate reader selection, and optical filter sets. Cassettes encoding protein linkers were also incorporated into the design of the constructs to separate the fluorescent proteins spatially, facilitating unimpaired folding into their functional units within the fusion protein. The assay was further improved by moving from transient transfection to stably expressing cell lines. A viable assay was almost achieved for 96 (and 384) well plates, with a Z′ factor compatible with the assay being suitable for high throughput screening. The assay was used to test a small collection of compounds known to interact with the ribosome and compounds known in the literature to affect frameshifting. This proof of concept was important, since it showed that the assay, with the various modifications, optimisations and miniaturisation steps, still retained the capability of correctly measuring the -1 frameshifting efficiency at the HIV-1 recoding site, and of recording compound-induced modulations to the frameshifting efficiency. The compounds cycloheximide and anisomycin, for example, were shown to decrease -1 frameshifting, albeit at some expense to overall protein synthesis.
The dual GFP assay was also shown to measure accurately changes in frameshift efficiency brought about by mutations to the frameshift element, and it would additionally be suitable for the detection and study of compounds, like the recently reported PTC-124 (currently undergoing phase II clinical trials for Duchenne Muscular Dystrophy and cystic fibrosis), that increase readthrough of a UGA premature stop codon mutation. The dual GFP assay developed in this study costs at most 1/10th as much as a comparable dual luciferase assay, largely due to the removal of assay substrates and transfection reagents. The assay has a robust Z′ factor comparable to that of the dual luciferase assay, and would substantially decrease the costs of high throughput screening in situations where a bicistronic reporter is required; the HIV-1 frameshift element is one such case.
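The Z′ factor cited in the abstract is the standard assay-robustness statistic of Zhang, Chung and Oldenburg (1999), computed from the means and standard deviations of positive and negative control wells; values above roughly 0.5 are conventionally taken to indicate an assay fit for high-throughput screening. A minimal sketch with hypothetical control readings (not data from the thesis):

```python
import numpy as np

def z_prime(positive_controls, negative_controls):
    """Z' factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.

    Values above ~0.5 are conventionally considered robust enough
    for high-throughput screening.
    """
    pos = np.asarray(positive_controls, dtype=float)
    neg = np.asarray(negative_controls, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Hypothetical fluorescence readings from control wells on one plate:
print(z_prime([980, 1010, 995, 1002], [105, 98, 110, 101]))
```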
360

Maximal ratio combining for iterative multiuser decoding

Lin, Tao. Unknown Date (has links)
Modern communications have become far more than point-to-point calling, and wireless communication is part of everyday life. Driven by ever-growing demand for high-data-rate communication, multiple-access techniques are of interest for allowing multiple users to share limited resources such as frequency, time and space. Commercially introduced in 1995, Code-Division Multiple-Access (CDMA) quickly became one of the world's fastest-growing wireless technologies. However, CDMA is subject to some limiting factors, such as multiple-access interference (MAI), which dramatically affects the capacity of the wireless system and degrades performance. Fortunately, these effects can be alleviated by applying advanced signal processing techniques such as multiuser detection (MUD), which potentially provides a large increase in system capacity, enhances spectral efficiency, and relaxes requirements for power control. / Further improvements of MUD can be obtained through joint multiuser detection/decoding; however, this is a very complex approach. Inspired by Turbo codes and iterative decoding, Turbo-MUD and iterative multiuser decoding have been proposed. The main objective of this research is to analyse the existing iterative techniques applied to Turbo multiuser decoding for coded CDMA systems and to propose new decoder structures that improve system performance. / In this thesis, we observe that many of the iterative multiuser decoding algorithms in the literature focus on exchanging information obtained within the most recent iteration. However, if correlations over iterations are low, then in principle the bit error rate (BER) performance can be improved by combining signal estimates over iterations. Inspired by this idea, iterative maximal ratio combining (MRC) is proposed in this thesis for application to iterative decoding structures. With this approach, all previous estimates are recursively weighted and combined to refine the current signal estimates. The derivation of the corresponding weighting factors is based on the statistics of the decoder outputs over iterations, which leads to maximizing the resultant signal-to-noise ratio (SNR) of each current signal estimate. It is shown that the recursive MRC scheme can be widely applied to many existing iterative structures and provides significantly improved system performance with acceptable computational complexity. In addition, the analytic and numerical results illustrate that the performance gain from applying MRC is inversely proportional to the correlation of the decoder estimates across iterations. The more correlated the signal estimates are over consecutive iterations, the slower system convergence will be if MRC is employed over all iterations; MRC over only a few initial iterations, where correlation is low, provides faster convergence. A truncated MRC is therefore suggested, which provides better performance while maintaining low computational complexity. Simulation results based on Monte Carlo averaging demonstrate that the performance of the proposed techniques is better than that of many existing algorithms in the literature. / Thesis (MA(Telecommunications))--University of South Australia, 2005.
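Maximal ratio combining itself is the textbook rule of weighting each branch by its amplitude-to-noise-variance ratio, which maximizes the SNR of the combined signal when the branch noises are uncorrelated. The sketch below applies that rule to a list of per-iteration soft estimates; the signal model, variable names and normalization are assumptions for illustration, not the weight derivation given in the thesis.

```python
import numpy as np

def mrc_combine(estimates, gains, noise_vars):
    """Maximal ratio combining of per-iteration soft estimates.

    estimates  : list of arrays, one soft-estimate vector per iteration
                 (assumed model: x_k = gain_k * s + noise_k)
    gains      : effective signal amplitude of each iteration's estimate
    noise_vars : residual noise-plus-interference variance of each estimate

    Each branch is weighted by gain / noise variance, which maximizes the SNR
    of the combined estimate when the branch noises are uncorrelated.
    """
    gains = np.asarray(gains, dtype=float)
    weights = gains / np.asarray(noise_vars, dtype=float)
    combined = sum(w * x for w, x in zip(weights, estimates))
    # Scale so the signal component of the combined estimate has unit amplitude.
    return combined / (weights * gains).sum()
```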
