721

Parametric WCET Analysis

Bygde, Stefan January 2013 (has links)
In a real-time system, it is crucial to ensure that all tasks of the system meet their deadlines. A missed deadline in a real-time system means that the system has not been able to function correctly; if the system is safety critical, this could potentially lead to disaster. To ensure that all tasks keep their deadlines, the Worst-Case Execution Time (WCET) of these tasks has to be known. Static analysis analyses a safe model of the hardware together with the source or object code of a program to derive an estimate of the WCET. This estimate is guaranteed to be equal to or greater than the real WCET, which is ensured by calculations that at every step estimate the time exactly or conservatively. In many cases, however, the execution time of a task or a program is highly dependent on the given input, so the estimated worst case may correspond to some input or configuration which is rarely (or never) used in practice. For such systems, where execution time is highly input dependent, a more accurate timing analysis which takes input into consideration is desired. In this thesis we present a method, based on abstract interpretation and counting of semantic states of a program, that gives a WCET in terms of some input to the program. The WCET is thus expressed as a formula over the input rather than as a constant, so once the input is known, the resulting WCET may be considerably tighter than the absolute, global WCET. Our research also investigates how this analysis can remain safe when arithmetic operations cause integers to wrap around, whereas the common assumption in static analysis is that variables can take any integer value. Our method has been implemented as a prototype and as part of a static WCET analysis tool in order to gain experience with the method and to evaluate its different aspects. Our method shows that it is possible to obtain very complex and detailed information about the timing of a program, given its input.
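As a hypothetical illustration of the idea (not the thesis's actual tool or analysis), consider a loop whose bound depends on an input n: a parametric analysis would report a WCET formula in n rather than one constant, and the constants below are invented for the sketch.

```python
# Minimal sketch of a parametric WCET: a formula over program inputs
# rather than a single constant. For a loop bounded by input n, the
# analysis might derive WCET(n) = c_setup + n * c_iter. The cycle
# costs below are invented for illustration, not from the thesis.

C_SETUP = 12   # assumed cycles of loop setup (illustrative)
C_ITER = 37    # assumed worst-case cycles per iteration (illustrative)

def parametric_wcet(n: int) -> int:
    """WCET bound, in cycles, as a function of the input n."""
    return C_SETUP + n * C_ITER

# Once the actual input is known, the parametric bound can be far
# tighter than a global constant computed for the largest possible n.
print(parametric_wcet(10))     # bound for n = 10
print(parametric_wcet(10**6))  # global-style bound for a huge input
```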
722

A formal theory of cohesive discourse

Van Wolkenten, Raymond 08 1900 (has links)
No description available.
723

Polymer degradation in sliding elastohydrodynamic lubrication

Walker, David Leroy 12 1900 (has links)
No description available.
724

Digital calculation of real time velocity profiles using ultrasonics

Cohen, Joseph P. 12 1900 (has links)
No description available.
725

Effect Size and Moderators of Effects for Token Economy Interventions

Soares, Denise 2011 December 1900 (has links)
There is a clear call to use evidence-based practice (EBP) in schools, and a growing knowledge base of practices that have proven effective in helping students achieve in educational settings. In addition, the current trends of Positive Behavior Supports (PBS) and Response to Intervention (RtI) advocate preventative and proactive strategies. Token economies (TE) are one intervention that is proactive and flexible enough to use with students across a wide range of behaviors and settings. According to Higgins, Williams, and McLaughlin, a token economy is an effective way to improve classroom behavior. Unfortunately, limited recent research is available that evaluates the effects and moderators of token economies in classroom settings. The purpose of this investigation was to meta-analyze the single-case research on TE implemented in schools; it is the first to offer effect size analysis and to identify moderators. The use of TEs has been widely established as an evidence-based intervention in prisons, psychiatric hospitals, and school settings. However, very few articles discuss the size of effects to expect, the essential elements required, or the practical implementation issues within a classroom. Many myths surround the use of a TE; for example, many assume a token system is effective only for individuals, yet TE is effective for groups as well as individuals. In an age of accountability and emphasis on preventative evidence-based practice, evidence for using a TE and how to implement one is needed in our literature. Empirical evidence for the use of a token economy in a classroom is presented along with suggested implementation ideas. Twenty-four studies were included in this meta-analysis, with an overall combined Tau-U effect size of .78, indicating improvement between phases A and B, with a 90% confidence interval of [.72, .83]. Tau-U effect sizes ranged from .35 to 1.0. TE is effective with all ages evaluated (ages 3-15); however, statistically significant results indicated it was more effective with ages 6-15. Active ingredients (i.e., procedural steps) were evaluated, combined, and reported. Results indicate that TE is an evidence-based intervention to increase academic readiness behaviors and to decrease inappropriate behaviors.
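For readers unfamiliar with the statistic, the following is a minimal sketch of the basic A-versus-B nonoverlap computation (Tau) that underlies Tau-U; full Tau-U additionally corrects for baseline trend, which this sketch omits. The phase data are invented for illustration, not taken from the included studies.

```python
# Basic Tau (A vs. B nonoverlap): compare every baseline point with
# every intervention point and count improving vs. deteriorating pairs.
# Full Tau-U also adjusts for trend within phase A; omitted here.

def tau_ab(phase_a: list[float], phase_b: list[float]) -> float:
    """(# improving pairs - # deteriorating pairs) / total pairs."""
    pos = sum(1 for a in phase_a for b in phase_b if b > a)
    neg = sum(1 for a in phase_a for b in phase_b if b < a)
    return (pos - neg) / (len(phase_a) * len(phase_b))

baseline = [2, 3, 2, 4]          # phase A: on-task intervals (invented)
intervention = [5, 6, 5, 7, 6]   # phase B: after token economy (invented)
print(round(tau_ab(baseline, intervention), 2))  # 1.0 = complete nonoverlap
```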
726

Unbundling a Big Deal: Comparing Three Data Sources to Come to Decisions

Dawson, Diane January 2014 (has links)
Academic libraries in Canada acquire many “big deal” journal packages through a national consortium, the Canadian Research Knowledge Network (CRKN). Recently, negotiations with the American Chemical Society (ACS) broke down and it appeared that member libraries would need to negotiate with ACS individually for the Web Editions bundle of 40+ titles. It soon became clear that the University Library, University of Saskatchewan would likely no longer be able to afford the entire package, and tough decisions would need to be made. Usage data on each title were readily available – but is that enough evidence? Working under the common assumption that the primary users of this package are the Chemistry Department researchers, a citation analysis was conducted on which ACS journals these users recently published in and cited in their articles. The Chemistry Department was kept informed of developments in the ACS/CRKN situation, and expressed interest and concern about the outcome. In an effort to continue to engage chemistry researchers and offer them a voice in the process, a survey of their opinions on each ACS title was also conducted. It was hoped that combining data from these three discrete sources – usage statistics, citation analyses, and user feedback – would enable us to arrive at the most conscientious, evidence-based decisions possible. This presentation will discuss the outcome of this thorough analysis and consider the benefits and challenges of this comprehensive methodology and whether it is practical in every situation. / Slides from "Featured Paper" presentation to STS Research Forum, American Library Association Annual Conference, Las Vegas, June 2014
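A schematic of this kind of triangulation is sketched below; the titles, column names, metrics, and equal weighting are all invented for illustration and are not the study's actual data or scoring method.

```python
# Illustrative sketch: merge three evidence sources per journal title
# and combine them into one score. All values here are invented.
import pandas as pd

usage = pd.DataFrame({"title": ["Journal A", "Journal B"], "downloads": [950, 120]})
cites = pd.DataFrame({"title": ["Journal A", "Journal B"], "dept_citations": [210, 15]})
survey = pd.DataFrame({"title": ["Journal A", "Journal B"], "importance": [4.6, 2.1]})

merged = usage.merge(cites, on="title").merge(survey, on="title")

# Rank-normalize each source so no single metric dominates, then average.
for col in ["downloads", "dept_citations", "importance"]:
    merged[col + "_rank"] = merged[col].rank(pct=True)
rank_cols = [c for c in merged.columns if c.endswith("_rank")]
merged["score"] = merged[rank_cols].mean(axis=1)

print(merged.sort_values("score", ascending=False)[["title", "score"]])
```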
727

Trace elements in a porphyry copper deposit by atomic absorption spectrometry

Jeon, Gyoo Jeong January 1983 (has links)
This study concerns the origin of a porphyry copper deposit. For the study, 25 rock samples were taken from a porphyry copper deposit and analyzed for five trace elements: strontium, barium, copper, zinc, and vanadium. The results indicated that correlations with elevation are not apparent for copper, zinc, and vanadium content, but definite correlations appear between strontium and barium content. This study determined that strontium is a trace element that could be used as a possible exploration guide to porphyry copper mineralization. It also showed that the distribution of trace elements demonstrates a regional difference in the strontium and barium contents of the sample locations between the Dome Peak Quadrangle area and the Agnes Mountain area of the Cloudy Pass batholith; the volcanics of the former possess higher strontium and barium concentrations.
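As a rough sketch of the kind of element-pair correlation check behind such a conclusion (the concentrations below are invented, not the study's measurements):

```python
# Illustration of testing whether two trace elements covary across
# samples. The ppm values are invented for this sketch.
import numpy as np

sr = np.array([410.0, 520.0, 300.0, 615.0, 480.0])  # strontium, ppm (invented)
ba = np.array([700.0, 880.0, 540.0, 990.0, 810.0])  # barium, ppm (invented)

r = np.corrcoef(sr, ba)[0, 1]  # Pearson correlation coefficient
print(f"Sr-Ba correlation: r = {r:.2f}")  # r near 1 -> the elements covary
```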
728

The communication of West Nile virus risk: a newspaper analysis

Watts, Dorian E. 01 September 2011 (has links)
The purpose of this research was to understand how the risks associated with West Nile virus (WNV) were presented by the Winnipeg Free Press. A detailed content analysis was completed of all Winnipeg Free Press articles and Manitoba Health news releases between 1999 and 2008 containing information related to West Nile virus. Additional data included interviews with government and media representatives. Several recurring frames, including blame, controversy, rights and fairness, risk, and uncertainty, were found in the newspaper data. Over time there was a decrease in both the coverage and prominence of WNV-related issues in the Winnipeg Free Press. Among the sources used by the media, the provincial government was found to be the most common in this context. Reporting of WNV-related issues by the Winnipeg Free Press has been relatively clear and balanced despite some initial alarmist coverage surrounding the uncertainty of the arrival of WNV.
729

Automatic identification of segments in written texts

Sardinha, Antonio Paulo Berber January 1997 (has links)
No description available.
730

Estimation of survival of left truncated and right censored data under increasing hazard

Shinohara, Russell. January 2007 (has links)
When subjects are recruited through a cross-sectional survey, they have already experienced the initiating event of interest, say the onset of a disease. Under this method of recruitment, subjects with a longer duration of the disease have a higher chance of being selected, and it follows that censoring in such a case is not non-informative. Applying standard techniques for right-censored data therefore introduces a bias to the analysis; this is referred to as length-bias. This paper examines the case where the subjects are assumed to enter the study at a uniform rate, allowing for the analysis in a more efficient unconditional manner. In particular, a new method for unconditional analysis is developed based on the framework of a conditional estimator. This new method is then applied to several data sets and compared with the conditional technique of Tsai [23].
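A small simulation conveys why cross-sectional recruitment is length-biased; the uniform onset rate matches the abstract's assumption, while the exponential duration distribution and all constants are invented for illustration.

```python
# Simulation of length bias: disease onsets occur uniformly in time,
# but a subject is recruited only if their disease duration spans the
# survey date, so long durations are overrepresented in the sample.
import random

random.seed(1)
SURVEY_TIME = 100.0
true_durations, sampled_durations = [], []

for _ in range(100_000):
    onset = random.uniform(0.0, SURVEY_TIME)  # uniform entry rate
    duration = random.expovariate(1.0)        # true duration ~ Exp(1), mean 1
    true_durations.append(duration)
    if onset + duration >= SURVEY_TIME:       # still prevalent at the survey
        sampled_durations.append(duration)

print(sum(true_durations) / len(true_durations))        # ~1.0 (population mean)
print(sum(sampled_durations) / len(sampled_durations))  # ~2.0 (length-biased)
```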
