  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
521

Production and comparative ecology of euphausiids in the Gulf of St. Lawrence.

Berkes, Fikret January 1973 (has links)
No description available.
522

Development in North East People's Republic of China: An Analysis of Enterprise Performance 1995-2002.

Weiss, John A., Geng, Xiao January 2007 (has links)
Regional disparities within China are now an important policy question. Recently the three provinces of the north-east region have been identified as priority areas for regional development, along with the western part of the country. The north-east is the old industrial heartland of the country and its economy is based around heavy industry, mineral extraction and state-owned enterprises. This paper uses a unique database on medium and large-scale enterprises to establish how far enterprise performance in the north-east differs from the national average and the reasons for any such differences. It finds that even allowing for industrial structure and ownership, performance in the north-east is significantly below that in the rest of the country. This is attributed to aspects of the investment climate in the region.
523

A neuropsychological investigation of verbal and nonverbal fluency: perspectives on asymmetries in frontal lobe functioning

Demakis, George J. 06 June 2008 (has links)
The neuropsychological construct of fluency refers to productivity under timed and limited search conditions. Verbal fluency tasks, such as the Controlled Oral Word Association Test (COWAT), require production of words under specific constraints (i.e. words that begin with "F"), whereas nonverbal fluency tests, such as the Design Fluency Test (DFT), require generation of unique and unnameable designs. Asymmetric deficits have been obtained, with verbal impairments associated with left frontal pathology and nonverbal impairments with right frontal involvement. These fluency constructs, however, have not been systematically examined in conjunction with one another, or with other frontally-mediated tasks. Therefore, this experiment evaluated verbal and nonverbal fluency, and their relationship to tasks purported to be sensitive to frontal lobe functioning. A double dissociation was predicted; the COWAT would be associated with left but not right frontal tasks, but the reverse pattern would be evident for the DFT. Left frontal tasks included the Trail-Making Tests and the Stroop Color-Word Test, whereas right frontal tasks consisted of the Ruff Figural Fluency Test (RFFT) and newly created measures of facial accuracy and intensity. Bilateral motor tasks (i.e. dynamometer and finger tapping) were also evaluated. In this sample of male college students (N = 60), multivariate analyses (MANOVAs) indicated that the left-hemisphere variables successfully discriminated subjects classified as verbally fluent or verbally nonfluent, but did not do so with nonverbal fluency. The opposite pattern was obtained with right-hemisphere variables, which was due to the RFFT rather than the facial measures. Measures of facial accuracy and intensity, as well as motor tasks, were minimally related to either of the fluency measures. Additionally, univariate analyses (ANOVAs) revealed that verbally fluent subjects performed better than verbally nonfluent subjects on Trails B, but not A, and on selected components of the Stroop. These findings are discussed in regard to their capacity to demonstrate asymmetries in frontal lobe functioning. Relationships between the presumed frontal measures and fluency constructs are also used to examine the unique ways in which fluency measures may challenge the frontal lobes in higher-order cognitive processing. Recommendations for the development of improved verbal and nonverbal fluency measures are provided. / Ph. D.
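The group-discrimination analyses described above can be illustrated with a minimal sketch: a one-way MANOVA testing whether a set of left-hemisphere task scores jointly differs between verbally fluent and verbally nonfluent subjects. The variable names and simulated data below are illustrative assumptions, not the study's actual dataset or software.

```python
# A minimal sketch of the kind of MANOVA used to test whether left-hemisphere
# task scores discriminate verbally fluent from verbally nonfluent groups.
# Column names (trails_b, stroop_cw, fluency_group) are invented for the example.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "fluency_group": rng.choice(["fluent", "nonfluent"], size=n),
    "trails_b": rng.normal(60, 15, n),    # seconds to complete Trails B
    "stroop_cw": rng.normal(45, 10, n),   # Stroop color-word score
})

# Multivariate test: do the left-hemisphere measures jointly differ by group?
fit = MANOVA.from_formula("trails_b + stroop_cw ~ fluency_group", data=df)
print(fit.mv_test())
```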
524

A multiple case study research to determine and respond to management information needs using Total-Factor Productivity Measurement

Pineda, Antonio J. 08 August 2007 (has links)
This study (1) determines the information managers commonly need to make decisions and initiate actions to improve performance, based on selected case studies, (2) investigates and explains the features and issues involved with how the different versions of TFPM address these information needs, and (3) develops a teaching model of TFPM. Based on the literature review, interviews with experts, and experiences with applications, the features and differences of the available TFPM versions were explained, providing sample applications whenever necessary. Using four selected cases, common user information needs were identified and compared with results of previous surveys. Alternative TFPM applications for each case were developed and evaluated using Archer's (1978) Design Process as implemented with VPC's (1990) PRFORM software. Based on the evaluations of the TFPM applications in each of the case studies, a teaching TFPM model was developed incorporating the features of the available TFPM versions that most appropriately responded to the common information needs. Some other features not portrayed in the available TFPM versions were added to facilitate portrayal, understanding, and acceptance for new users. There are basically two models of TFPM: the Productivity Indices (PI) Model and the Profitability = Productivity + Price Recovery (PPPR) Model. I proved that as implemented with discrete variables, Gollop's Model is equivalent to the PPPR Model. Various versions of these two models feature differences in deflation, aggregation of outputs, inputs, and/or organizational units, treatment of capital, computation of dollar effects of changes in performance, and how to use TFPM for planning. The common information needs identified were (1) measures of a firm's past performance using physical productivity related to profitability; (2) measures of individual organizational units' productivity aggregated into plant, division, or firm-level productivity; (3) partial measures to explain what factors drive the total performance measures; and (4) evaluations of plans/budgets to ensure performance improvement. Based on the evaluations of possible TFPM versions appropriate for each application, REALST stands out as the most advanced and flexible version. However, it has become too complicated for first-time users. Hence, the teaching TFPM model I have developed is a simplified version of REALST. / Ph. D.
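The PPPR idea named above can be seen in a toy single-output, single-input sketch: profitability (revenue over cost) factors exactly into a productivity term (quantity ratio) and a price-recovery term (price ratio); in growth-rate (log) terms the relation becomes additive, matching "Profitability = Productivity + Price Recovery". The function and figures below are invented for illustration and are not drawn from the study's REALST or PRFORM applications.

```python
# Toy PPPR decomposition for one output and one input. All numbers are made up.
def pppr_decomposition(q0, p0, x0, w0, q1, p1, x1, w1):
    """Period-0 vs period-1 output quantity/price (q, p) and input quantity/price (x, w)."""
    profitability_change  = (p1 * q1 / (w1 * x1)) / (p0 * q0 / (w0 * x0))
    productivity_change   = (q1 / x1) / (q0 / x0)    # quantity effect only
    price_recovery_change = (p1 / w1) / (p0 / w0)    # price effect only
    return profitability_change, productivity_change, price_recovery_change

prof, prod, price = pppr_decomposition(q0=100, p0=10.0, x0=50, w0=8.0,
                                        q1=115, p1=10.5, x1=52, w1=8.6)
print(prof, prod, price)
# Profitability change equals productivity change times price-recovery change:
print(abs(prof - prod * price) < 1e-12)
```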
525

A Management Paradigm for FPGA Design Flow Acceleration

Tavaragiri, Abhay 21 July 2011 (has links)
Advances in FPGA density and complexity have not been matched by a corresponding improvement in the performance of the implementation tools. Knowledge of incremental changes in a design can lead to fast turnaround times for implementing even large designs. A high-level overview of an incremental productivity flow, focusing on the back-end FPGA design, is provided in this thesis. This thesis presents a management paradigm that is used to capture the design-specific information in a format that is reusable across the entire design process. A C++-based internal data structure stores all the information, whereas XML is used to provide an external view of the design data. This work provides a vendor-independent, universal format for representing the logical and physical information associated with FPGA designs. / Master of Science
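As a rough sketch of what an XML "external view" of design data might look like, the snippet below builds a hypothetical fragment mixing logical information (ports, nets) with physical information (placement sites). The element and attribute names are invented for illustration; the thesis defines its own schema and internal C++ structures.

```python
# Hypothetical export of an internal design database to an XML external view.
# Element/attribute names (design, module, placement, net) are assumptions.
import xml.etree.ElementTree as ET

design = ET.Element("design", name="example_top", device="xc5vlx110t")

mod = ET.SubElement(design, "module", name="fir_filter")
ET.SubElement(mod, "placement", site="SLICE_X10Y20")    # physical information
ET.SubElement(mod, "port", name="clk", direction="in")  # logical information

net = ET.SubElement(design, "net", name="clk_net")
ET.SubElement(net, "pin", module="fir_filter", port="clk")

print(ET.tostring(design, encoding="unicode"))
```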
526

Methodology for structured VHDL model development

Gummadi, Ram 17 March 2010 (has links)
The Rapid Prototyping of Application Specific Signal Processors (RASSP) program seeks an improvement in the time required to take a design from concept to fielded prototype or to upgrade an existing design, with similar improvements in design quality and life-cycle cost. The term Rapid System Prototyping signifies the need to develop systems in significantly less time or with significantly less effort, and thus provides a solution to the main problem facing the design community. Entire systems are synthesized from models in hardware description languages (HDLs). The goal of this thesis is to provide a methodology for rapidly creating a database that can be reused, thus decreasing design cost and time for both current and future projects. To demonstrate the methodology, this thesis describes the development of VHDL primitives supporting digital signal processing (DSP) and image processing operations for two RASSP-specific applications: 1) a synthetic aperture radar (SAR) image processor and 2) an automatic target recognition (ATR) image processing algorithm. Different techniques are investigated to populate these VHDL libraries using commercial tools. The thesis proposes techniques for solving some problems related to the use of commercial tools to generate VHDL code. It includes a full implementation of the SAR processor algorithm developed from DSP primitives. / Master of Science
527

Methods of estimating repeatability and most probable producing ability in beef cattle

Thompson, Carl Eugene January 1971 (has links)
The purpose of this study was to determine the most accurate method of calculating the most probable producing ability (MPPA) of beef cows based on one, two or three records. A preliminary analysis of preweaning records on 9,515 calves from 4 Angus herds enrolled in the Virginia Beef Cattle Improvement Association program was conducted on average daily gain, 205-day weight and index value. Adjusted 205-day weight was selected for further study. To obtain a realistic repeatability value of adjusted 205-day calf weight, needed for calculating MPPA, separate estimates were first obtained for each herd by intraclass correlation. These values were 0.34, 0.00, 0.35 and 0.14. The individual herd sums of squares and degrees of freedom were then pooled to obtain an overall estimate of 0.31. Repeatability was also calculated by the correlation of adjacent records (r = 0.48) and by the regression of the second record on the first (b = 0.51). The pooled intraclass correlation coefficient (0.31) was selected as the best for the calculation of all MPPAs. MPPAs were calculated by the method of Lush (1949), using the dam's progeny record: (1) as a deviation from the herd-year average, and (2) as a deviation from the sire progeny average within years. MPPAs were calculated by both methods based on each of one, two and three records, and each of these was correlated with the average of all subsequent records (maximum of six). The first, second and third MPPA values by the herd-year method yielded correlation coefficients of 0.18, 0.10 and 0.04 with the average of all subsequent records, respectively. Corresponding sire-year correlation coefficients were 0.23, 0.31 and 0.36, respectively. The differences between the correlation coefficients obtained by the two methods for MPPA one, two and three were significant at the level of P < .10, P < .05 and P < .0005, respectively, indicating that the sire-year method is more accurate. The variance was also lower for the sire-year method (427 ± 44.2 pounds) than for the herd-year method (428 ± 55.2 pounds). Thus, the author has concluded from these results that the sire-year method of calculating most probable producing ability is a more accurate measure of the true productivity of beef cows. / Ph. D.
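The MPPA referred to above follows Lush's standard regression of a cow's own average record toward a group average, weighted by repeatability and the number of records. A minimal sketch is given below, using the pooled repeatability estimate of 0.31 and invented example weights; the herd-year and sire-year methods differ only in which group average the cow's records are deviated from.

```python
# Sketch of the Lush (1949) most probable producing ability calculation.
# Example weights are invented; 0.31 is the pooled repeatability reported above.
def mppa(cow_records, group_average, repeatability=0.31):
    """MPPA = group average + n*r / (1 + (n - 1)*r) * (cow average - group average)."""
    n = len(cow_records)
    cow_average = sum(cow_records) / n
    weight = (n * repeatability) / (1 + (n - 1) * repeatability)
    return group_average + weight * (cow_average - group_average)

# Cow with two adjusted 205-day weights (470 and 500 lb) in a group averaging 450 lb:
print(round(mppa([470, 500], group_average=450), 1))
```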
528

Human factors display considerations for real-time position measurement system: construction applications

Gibson, Michael P. 25 April 2009 (has links)
Proper layout and positioning of building components is essential for an accurate and productive construction process. Knowing the specific location and position where a building component belongs in real time (instantaneously) would enhance the craftsperson's performance significantly (Beliveau, 1992a). The Real-time Position Measurement (RtPM) system was developed to increase productivity and accuracy in the construction industry by providing the craftsperson real-time position and measurement information of specific building locations or components. This measurement information is relayed to the craftsperson through the human interface, which consists of graphical navigational maps and alphanumeric displays. Two experiments were conducted. Experiment A tested the graphical computer interfaces and Experiment B compared the graphical interfaces versus an alphanumeric interface. Both experiments simulated building component positioning tasks. Subjects acted as construction workers whose jobs were to accurately place simulated building components on specified target locations utilizing the RtPM system. Participants used the RtPM's navigational maps, displays and/or coordinate system to place several anchor bolts onto specific locations on the concrete piers. The experiments provided data on the speed and accuracy of positioning tasks while the subject was using the RtPM system. The data consisted of the time required to accomplish each building component positioning task and the accuracy of the subject's placement of the receiver pole on the target point. The results of Experiment A indicate that a heading-up map is better for navigational tasks using the RtPM system than a vector-up map. The treatment conditions which incorporated a heading-up map gave lower navigational times than those using a vector-up map. Subjects were also accurate across all treatment conditions in Experiment A. The results of Experiment A suggest that participants interpreted the navigational information across all conditions in the same manner, thus achieving similar accuracies. The results of Experiment B indicate that the graphical interface provides good directional information and a high degree of accuracy when subjects perform positioning tasks using the RtPM system. The alphanumeric format, while providing a high degree of accuracy, did not provide a fast and effective way of navigation using the RtPM system. Graphical displays should be incorporated into the new software design for the RtPM system. These navigational experiments should provide a means for designing future RtPM surveying system software for the human user. Humans are critical components in most systems and are often neglected during the design process. The results from this research could significantly enhance the effectiveness with which humans interact with the RtPM system. / Master of Science
529

Systems Dynamics Simulation To Improve Timber Harvesting System Management

McDonagh, Kieran D. 06 December 2002 (has links)
Two computer simulation models were developed to address harvest system-to-stand assignment and wood flow variability problems in the southeast United States. The Harvest System Assignment (HSA) model is used to evaluate the impact of a particular stand assignment on harvest system effectiveness and is designed to assist with harvest system assignment decisions. Four general harvesting systems (manual, mechanized, shovel, and cut-to-length) can be modeled to harvest timber, from standing trees to processed logs loaded onto trucks. Model testing showed that as terrain, tract and system characteristics changed, the effectiveness of each of the four systems varied. The most effective system can be determined for any combination of terrain, tract and system characteristics. The model output shows production potential as well as cost per unit, and identifies the causes and magnitude of inefficiency. The Machine Allocation (MA) model is used to evaluate the potential of a given machine combination and is designed as a research tool to investigate the cause and impact of machine interactions. This model has a defined system structure and can incorporate up to five machines for each of three phases in the harvesting operation: felling, skidding and processing. Particular system configurations can be evaluated and possible improvements to machine combination determined. The HSA model is a widely applicable tool that will be available for industry in the southeastern United States. It has utility for training of personnel and for operational use. The MA model is a detailed tool that will be used in a research capacity to advance harvesting system management. / Master of Science
530

A comparison of human performance on computer text editing tasks using windowed and non-windowed strategies

O'Keefe, Timothy J. January 1987 (has links)
Software packages which use windows have become increasingly popular in the last two years. Their popularity derives from the belief that windows will improve productivity by decreasing task completion time. However, two studies (Silver, 1985; Davies, Bury and Darnell, 1985) have found this not to be the case. In fact, one of the studies (Davies et al., 1985) found that task completion time was increased when using windows. It is thought that performance using window systems is a function of the number of responses required to be executed as well as the amount of information which must be found and used to complete a task. The purpose of the present study was to determine under what conditions, in terms of memory load and task complexity, performance using windows and non-window strategies differed. Forty-eight subjects were placed in one of four environments and each performed six editing tasks which varied on complexity and memory load level. Human performance in one windowed environment was compared to three non-window strategies. These three strategies, note-taking, memorizing, or switching between files, were included to allow comparisons in terms of working memory and number and types of responses executed. The tasks required subjects to locate information from a supplementary file and type it into a main file. The three memory load levels which were used required subjects to find either 2, 4 or 8 pieces of information. The two complexity levels referred to the placement of needed information in the supplementary file; whether or not information was located in close proximity to other needed information. Results indicated that it made little difference which system was used in the low memory load condition. However, as memory load increased, more subjects were found to make errors in the non-window conditions. More responses were executed in the Switch condition than in the Window or Memorization conditions in the high memory load condition. Mental workload was also found to be higher in the Memorization and Switch conditions than in the Window and Notes conditions as memory load increased. Nevertheless, there was no significant interaction for task completion times between Memory Load and Environment. This was thought to be due to a failure to adequately load working memory as well as a failure of a test of verbal and spatial ability to account for individual differences. It was concluded that the benefits of windows are not apparent until one's working memory capacity is exceeded. As memory load increases beyond this point, it is thought that memorization will quickly become an inefficient strategy due to limitations of memory capacity. As memory load continues to increase, a switching strategy should become inefficient due to limitations of both memory and response capacities. A strategy of note-taking should not become inefficient until a large memory load is placed upon the user. This is because note-taking is a well-learned, uncomplicated response. The benefits of windows include a reduction in the number of responses, errors, and mental workload due to their ability to reduce the amount of mental resources required by providing the user with a very efficient and accurate memory aid. / Ph. D.
