About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
451

Street Codes, Routine Activities, Neighborhood Context, and Victimization: An Examination of Alternative Models

McNeeley, Susan January 2013 (has links)
No description available.
452

Investigating global positioning system helibowl antenna performance sensitivity with variation in design parameters

Surathu, Mahesh January 1999 (has links)
No description available.
453

A translator converting symbolic microprogram into microcodes

Lin, Wen-Tai January 1981 (has links)
No description available.
454

Implementation of forth with floating point capabilities on an 8085 system

Graham, Douglas R. January 1985 (has links)
No description available.
455

Automatic Source Code Transformation To Pass Compiler Optimization

Kahla, Moustafa Mohamed 03 January 2024 (has links)
Loop vectorization is a powerful optimization technique that can significantly improve the runtime of loops. This optimization depends on functional equivalence between the original and optimized code versions, a requirement typically established through the compiler's static analysis. When this condition cannot be verified, the compiler misses the optimization. Manually rewriting the source code to pass an already missed compiler optimization is time-consuming, given the multitude of potential code variations, and demands a high level of expertise, making it impractical in many scenarios. In this work, we propose a novel framework that takes the code blocks the compiler failed to optimize and transforms them into code blocks that pass the compiler optimization. We develop an algorithm to efficiently search for a code structure that passes the compiler optimization (weakly verified through a correctness test). We focus on loop-vectorize optimization inside OpenMP directives, where the introduction of parallelism adds complexity to the compiler's vectorization task and is shown to hinder optimization. Furthermore, we introduce a modified version of TSVC, a loop vectorization benchmark, in which all original loops are executed within OpenMP directives. Our evaluation shows that our framework enables "loop-vectorize" optimizations that the compiler initially failed to apply, resulting in speedups of up to 340× in the optimized blocks. Furthermore, applying our tool to HPC benchmark applications, which are already built with optimization and performance in mind, demonstrates that our technique successfully enables extended compiler optimization, accelerating 15 loops and reducing the end-to-end execution time of the three applications by up to 1.58 times.
/ Master of Science / Loop vectorization is a powerful technique for improving the performance of specific sections in computer programs known as loops. In particular, it executes instructions from different iterations of a loop simultaneously, and this parallelism provides a considerable speedup in the loop's runtime. To apply this optimization, the code needs to meet certain conditions, which are usually checked by the compiler. However, sometimes the compiler cannot verify these conditions, and the optimization fails. Our research introduces a new approach to fix these issues automatically. Normally, fixing the code manually to meet these conditions is time-consuming and requires high expertise. To overcome this, we have developed a tool that can efficiently find ways to make the code satisfy the conditions needed for optimization. Our focus is on a specific type of code that uses OpenMP directives to split a loop across multiple processor cores and run its parts simultaneously; adding this parallelism makes the code more complex for the compiler to optimize. Our tests show that our approach successfully improves the speed of computer programs by enabling optimizations initially missed by the compiler. This results in significant speed improvements for specific parts of the code, sometimes up to 340 times faster. We have also applied our method to well-optimized computer programs, and it still managed to make them run up to 1.58 times faster.
456

THE SEVEN LAWS OF NOAH OR NOVAK: AN ANALYSIS OF DAVID NOVAK’S ACCOUNTS OF NATURAL LAW

Milevsky, Jonathan 16 November 2017 (has links)
This thesis identifies two accounts within David Novak’s Jewish natural law theory. In the earlier account, Novak locates natural law within the Noahide commandments; in the later account, he also locates it within the reasons for the commandments and rabbinic enactments. The change between these accounts is marked by a shift in his description of rationality. The norms of the Noahide code are originally described as known strictly by reference to themselves. As he begins grounding the norms in the imago Dei, that knowledge becomes dependent on a “cultural heritage,” by which Novak comes to mean an explanation based on a doctrine of creation. By comparing the original presentation of the later account with its more developed iteration, and by highlighting the features unique to each account, it becomes possible to identify components of the later account that are added to his subsequent treatment of the Noahide code, as well as facets of the earlier account that are later added to his discussion of the reasons for the commandments and rabbinic enactments. These efforts at reconciliation include the normative content incorporated into the later account, the metaphysical background added to the later treatment of the Noahide code, the mediating concept of personhood, the phenomenological retrieval of the Noahide commandments, and the argument for minimal and maximal claims. Finally, this thesis analyzes the relationship between Novak’s natural law theory and his view of redemption. As Novak’s natural law theory becomes less dependent on reason and more heavily based on a doctrine of creation, his treatment of redemption shifts from being associated with a period of greater human understanding to a time characterized by God’s accomplishments on humanity’s behalf; I argue that there is a parallel between those concepts. I then draw on that parallel to show that Novak’s natural law is compatible with, and perhaps inseparable from, his covenantal thought. / Thesis / Doctor of Philosophy (PhD)
457

A Design Language for Scientific Computing Software in Drasil / Design Language for SCS

MacLachlan, Brooks January 2020 (has links)
Drasil is a framework for generating high-quality documentation and code for Scientific Computing Software (SCS). Despite the tendency of SCS code to follow an Input-Calculate-Output design pattern, there are many design variabilities in SCS. Drasil should therefore allow its users to specify a design for their generated program. To this end, a language that encodes the design variabilities present in SCS was implemented in Drasil. Drasil's code generator was updated to generate code based on a user's design choices. A Generic Object-Oriented Language, GOOL, from which Python, Java, C#, or C++ code can be generated, was implemented. Drasil's code generator targets GOOL, enabling the choice of programming language to be included in the design language. Other choices included in the language relate to modularity, the generation of record classes to bundle variables, the inlining of constants, the code types used to represent different mathematical spaces, which external library to use to solve ODEs, the behaviour when a constraint is violated, and the presence of documentation, logging, or a controller. The design language is implemented as a record type in Haskell and is designed to be understandable, extensible, and usable. A design specification can be easily changed to generate a different version of the same program, which would often require time-consuming refactoring if the program were written manually. During the regular build process of Drasil, working code is generated in all four target languages for three examples of SCS, each using a different design specification. / Thesis / Master of Applied Science (MASc)
458

A TRACE/PARCS Coupling, Uncertainty Propagation and Sensitivity Analysis Methodology for the IAEA ICSP on Numerical Benchmarks for Multi-Physics Simulation of Pressurized Heavy Water Reactor Transients

Groves, Kai January 2020 (has links)
The IAEA ICSP on Numerical Benchmarks for Multi-Physics Simulation of Pressurized Heavy Water Reactor Transients was initiated in 2016 to facilitate the development of a set of open-access, standardized, numerical test problems for postulated accident scenarios in a CANDU-style reactor. The test problems include a loss of coolant accident resulting from an inlet header break, a loss of flow accident caused by a single pump trip, and a loss of regulation accident due to inadvertently withdrawn adjusters. The Benchmark was split into phases, which included stand-alone physics and thermal-hydraulics transients, coupled steady-state simulations, and coupled transients. This thesis documents the results that were generated through an original TRACE/PARCS coupling methodology developed specifically for this work. There is a strong emphasis on development methods and step-by-step verification throughout the thesis, to provide a framework for future research in this area. In addition to the Benchmark results, studies on the propagation of fundamental nuclear data uncertainty and sensitivity analysis of coupled transients are reported in this thesis. Two Phenomena and Key Parameter Identification and Ranking Tables were generated for the loss of coolant accident scenario, to provide feedback to the Benchmark Team and to add to the body of work on uncertainty/sensitivity analysis of CANDU-style reactors. Some important results from the uncertainty analysis concern changes in the uncertainty of figures of merit, such as integrated core power and the magnitude and timing of peak core power, between small- and large-break loss of coolant accidents. The analysis shows that the mean and standard deviation of the integrated core power and maximum integrated channel power are very close between a 30% header break and a 60% header break, despite the peak core power being much larger in the 60% break case.
Furthermore, it shows that there is a trade-off between the uncertainty in the time of the peak core power and the uncertainty in its magnitude: smaller breaks show a smaller standard deviation in the magnitude of the peak core power but a larger standard deviation in when that power is reached during the transient, and vice versa for larger breaks. From the results of the sensitivity analysis study, this thesis concludes that parameters related to coolant void reactivity and shutoff rod timing and effectiveness have the largest impact on loss of coolant accident progressions, while parameters that can have a large impact in other transients or reactor designs, such as fuel temperature reactivity feedback and control device incremental cross sections, are less important. / Thesis / Master of Science (MSc) / This thesis documents McMaster’s contribution to an International Atomic Energy Agency Benchmark on Pressurized Heavy Water Reactors that closely resemble the CANDU design. The Benchmark focuses on the coupling of thermal-hydraulics and neutron physics codes and the simulation of postulated accident scenarios. This thesis contains selected results from the Benchmark, comparing the results generated by McMaster to those of other participants. It also documents additional work that was performed to propagate fundamental nuclear data uncertainty through the coupled transient calculations and obtain an estimate of the uncertainty in key figures of merit. This work was beyond the scope of the Benchmark and is a unique contribution to the open literature. Finally, sensitivity studies were performed on one of the accident scenarios defined in the Benchmark, the loss of coolant accident, to determine which input parameters contribute most to the variability of key figures of merit.
459

Widely linear minimum variance channel estimation with application to multicarrier CDMA systems

Abdallah, Saeed January 2007 (has links)
No description available.
460

Mandatory Uniform Dress Code Implementation and the Impact on Attendance, Achievement, and Perceptions of Classroom Environment

Ward, Ella Porter 24 April 1999 (has links)
One of the many attempts to solve problems that plague America's schools is the implementation of uniform dress code policies. Those who favor uniforms contend that uniforms will increase attendance, enhance academic achievement, and improve classroom environment. Prior research studies (Behling, 1991; Hughes, 1996; Hoffler-Riddick, 1998) on the effects of mandatory school uniforms have been inconclusive in their findings. The purpose of this study was to examine the impact of mandatory uniform dress codes on student attendance, student achievement, and teachers' perceptions of classroom environment in two middle schools. The dependent variables were student attendance, student achievement, and teachers' perceptions of classroom environment. The independent variables were gender, race/ethnicity, and time/years of teaching experience. Descriptive statistics and Analysis of Variance were used to analyze the data. Repeated measures Analysis of Variance was used to analyze the attendance data in School A for three consecutive years. Analysis of Variance was used to analyze the attendance and achievement data in School B for two consecutive years. A self-report questionnaire was designed to measure teachers' perceptions of the impact of uniforms on four domains of classroom environment: student attendance, student behavior, student achievement, and students' self-image. Three-way Analysis of Variance was used to analyze the data collected from the questionnaire. The results of this study showed no statistically significant differences in overall student attendance or achievement in School A, while student achievement in School B improved after the change to school uniforms. Differences between race/ethnicity and gender with respect to attendance after uniform implementation were inconsistent in Schools A and B, and absences increased in School A after the second year with uniforms.
Based on the results of the Uniform Survey administered to teachers in both schools, the perception of classroom environment after uniforms was generally positive. Teachers overwhelmingly supported the uniform policy, but they were inconsistent in their opinions of the overall impact on classroom environment. Teachers in School A felt that student achievement and student self-image improved after the implementation of school uniforms, but they saw no improvements in student attendance or behavior. Teachers in School B felt that student attendance declined after the first year of uniform implementation; however, they felt that there were improvements in student behavior, student achievement, and student self-image. Future research should examine the impact of mandatory uniform dress codes on school climate, students' self-esteem, and the perceptions of parents, students, and members of the community. / Ed. D.
