About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
61

Design-build vs design-bid-build: a procurement method selection framework

Stauffer, Griffin K. 08 1900 (has links)
Proper procurement method selection is an integral part of project success. Better-informed owners are able to select the project delivery systems that best suit their needs. This study uses utility theory to construct a framework that assists in the procurement decision-making process. Through expert weighting of important procurement criteria, real-world projects were used to develop an overall threshold against which future owners can compare their subsequent projects. This threshold, which marks the boundary between Design-Build and Design-Bid-Build, can be used to measure an owner's propensity to use either procurement method. It is fully tailorable to any owner, as owner-specific inputs are used. This ability to objectify the largely subjective procurement decision-making process allows owners to create a predictable, measurable trend, thereby improving their overall decision-making ability. / US Navy (USN) author.
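The weighted-criteria scoring that the abstract describes can be sketched as a simple weighted-sum utility compared against a Design-Build/Design-Bid-Build threshold. The criteria names, weights, and threshold below are illustrative placeholders, not the study's actual values.

```python
# Hypothetical sketch of a utility-theory procurement framework: experts
# weight each criterion, a project is rated on each criterion, and the
# weighted sum is compared to an owner-specific threshold separating
# Design-Build (DB) from Design-Bid-Build (DBB).

def procurement_score(ratings, weights):
    """Weighted-sum utility of a project (ratings and weights keyed by criterion)."""
    return sum(weights[c] * ratings[c] for c in weights)

def recommend(ratings, weights, threshold):
    """Return 'DB' if the project's score exceeds the threshold, else 'DBB'."""
    return "DB" if procurement_score(ratings, weights) > threshold else "DBB"

# Illustrative expert weights (sum to 1) and one project's ratings in [0, 1].
weights = {"schedule_pressure": 0.4, "scope_definition": 0.35, "owner_experience": 0.25}
project = {"schedule_pressure": 0.9, "scope_definition": 0.3, "owner_experience": 0.6}
# score = 0.4*0.9 + 0.35*0.3 + 0.25*0.6 = 0.615
```

Because the weights and threshold are owner-specific inputs, the same mechanism tailors the decision boundary to any owner, as the abstract notes.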
62

A Restricted Analysis of the Relationship between Property Tax Assessments and Electric Utility Earnings in Denton

Rudd, Edwin Derle 05 1900 (has links)
The primary purpose of this study is to compare the equitableness of utility earnings with that of property taxes as sources of municipal revenue.
63

Utility Stock Splits: Signaling Motive Versus Liquidity Motive

Miranda, Maria Mercedes 20 May 2005 (has links)
Despite the rich literature on theories of stock splits, prior studies have omitted public utility firms and analyzed only splits by industrial firms when examining managerial motives for splitting stock. I examine the liquidity/marketability hypothesis, which states that stock splits enhance the attractiveness of shares to individual investors and increase trading volume by adjusting prices to an optimal trading range. Changes in the regulatory process resulting from the Energy Policy Act (EPACT) have opened a window of opportunity for studying and comparing the two traditional motives for splitting stock: signaling versus liquidity/marketability. Public electric utility firms provide a clean testing ground for these two non-mutually exclusive theories: the liquidity/marketability hypothesis should dominate before the enactment of EPACT, since the conventional signaling theory of common stock splits should not apply given the low levels of information asymmetry in regulated utility companies, whereas in the post-EPACT period the signaling effect is expected to play a more dominant role. Based on both univariate and multivariate analyses, my results are consistent with this hypothesis. In the pre-EPACT period, the liquidity motive predominates in explaining the abnormal announcement returns of utility stock splits; in the post-EPACT period, the results support the signaling motive as the leading explanation of abnormal returns.
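The abnormal announcement returns mentioned above are conventionally measured against a market-model benchmark. The sketch below shows that standard event-study step with made-up return series, not the dissertation's data or its exact specification.

```python
# Illustrative market-model event study: fit r_stock = alpha + beta*r_market
# by OLS over an estimation window, then take the announcement-day residual
# as the abnormal return.

def market_model(stock, market):
    """OLS slope/intercept of stock returns on market returns."""
    n = len(market)
    mean_m = sum(market) / n
    mean_s = sum(stock) / n
    cov = sum((m - mean_m) * (s - mean_s) for m, s in zip(market, stock))
    var = sum((m - mean_m) ** 2 for m in market)
    beta = cov / var
    alpha = mean_s - beta * mean_m
    return alpha, beta

def abnormal_return(alpha, beta, stock_ret, market_ret):
    """Actual minus market-model-expected return on the event day."""
    return stock_ret - (alpha + beta * market_ret)

# Toy estimation window where the stock moves exactly twice the market:
alpha, beta = market_model([0.02, 0.04, 0.06], [0.01, 0.02, 0.03])
ar = abnormal_return(alpha, beta, 0.05, 0.01)  # event-day residual
```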
64

Utility-oriented internetworking of content delivery networks

Pathan, Al-Mukaddim Khan January 2009 (has links)
Today’s Internet content providers primarily use Content Delivery Networks (CDNs) to deliver content to end-users, with the aim of enhancing their Web access experience. Yet the prevalent commercial CDNs, operating in isolation, often face resource over-provisioning, degraded performance, and Service Level Agreement (SLA) violations, thus incurring high operational costs and limiting the scope and scale of their services.

To move beyond these shortcomings, this thesis sets out to establish the basis for developing advanced and efficient content delivery solutions that are scalable, high-performance, and cost-effective. It introduces techniques to enable coordination and cooperation between multiple content delivery services, termed "CDN peering". In this context, the thesis addresses five key issues: when to peer (triggering circumstances), how to peer (interaction strategies), whom to peer with (resource discovery), how to manage and enforce operational policies (request-redirection and load sharing), and how to demonstrate peering applicability (measurement study and proof-of-concept implementation).

Thesis contributions: To support the thesis that the resource over-provisioning and degraded performance problems of existing CDNs can be overcome, thus improving the Web access experience of Internet end-users, we have:

- identified the key research challenges and core technical issues for CDN peering, along with a systematic understanding of the CDN space covering relevant applications, features, and implementation techniques, captured in a comprehensive taxonomy of CDNs;
- developed a novel architectural framework that provides the basis for CDN peering, formed by a set of autonomous CDNs that cooperate through an interconnection mechanism, providing the infrastructure and facilities to virtualize the services of multiple providers;
- devised Quality-of-Service (QoS)-oriented analytical performance models to demonstrate the effects of CDN peering and predict end-user perceived performance, thus helping a CDN provider make concrete QoS guarantees;
- developed enabling techniques, i.e. resource discovery, server selection, and request-redirection algorithms, for CDN peering to achieve service responsiveness; these techniques are exercised to alleviate imbalanced load conditions while minimizing redirection cost;
- introduced a utility model for CDN peering that measures its content-serving ability by capturing the traffic activities in the system, evaluated through extensive discrete-event simulation; the findings provide incentives for exploiting critical parameters in a better CDN peering system design; and
- demonstrated a proof-of-concept implementation of the utility model and an empirical measurement study on MetaCDN, a global overlay for Cloud-based content delivery, aided with a utility-based redirection scheme to improve the traffic activities in MetaCDN's world-wide distributed network.
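As one concrete illustration of the request-redirection and load-sharing policies the abstract names, a threshold-based redirector might serve requests locally until utilization crosses a limit and then overflow to the least-loaded peer. This is a hypothetical sketch of the general idea, not the thesis's actual algorithm.

```python
# Illustrative load-aware request-redirection for CDN peering: the primary
# CDN serves a request while it has headroom; otherwise the request is
# redirected to the least-loaded peer that is below the load threshold.

def redirect(primary, peers, threshold):
    """Pick the CDN that should serve the next request.

    primary, peers: dicts with 'name' and 'load' (utilization in [0, 1]).
    """
    if primary["load"] < threshold:
        return primary["name"]
    candidates = [p for p in peers if p["load"] < threshold]
    if not candidates:
        return primary["name"]  # no peer has headroom; serve locally anyway
    return min(candidates, key=lambda p: p["load"])["name"]
```

Choosing the least-loaded peer alleviates imbalanced load conditions; a fuller scheme would also weigh redirection cost (e.g. extra latency to the peer), as the thesis's enabling techniques do.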
65

Modeling and Performance Evaluation of a Delay and Marking Based Congestion Controller

Wickramarathna, Thamali Dilusha N. 01 January 2008 (has links)
Achieving high performance in high-capacity data transfers over the Internet has long been a daunting challenge. The current Transmission Control Protocol (TCP) standard, TCP Reno, does not scale efficiently to higher bandwidths. Various congestion controllers have been proposed to alleviate this problem. Most of these controllers use marking/loss and/or delay as distinct feedback signals from the network and employ separate control strategies that react to either marking/loss or delay. While these controllers achieve better performance than the existing TCP standard, they suffer from various shortcomings. Thus, in our previous work, we designed a congestion control scheme that jointly exploits both delay and marking: D+M (Delay Marking) TCP. We demonstrated with ns-2 simulations that D+M TCP can adapt to highly dynamic network conditions and infrastructure. Yet an analytical explanation was needed for why D+M TCP works as observed, and extensive simulations were needed to assess its performance, especially relative to other high-speed protocols. Therefore, we propose a model for D+M TCP based on distributed resource optimization theory. Based on this model, we argue that D+M TCP solves the network resource allocation problem in an optimal manner. Moreover, we analyze the fairness properties of D+M TCP and its coexistence with different queue management algorithms. The resource-optimization interpretation of D+M TCP allows us to derive the controller's steady-state equilibrium values, and we use ns-2 simulations to verify that the protocol indeed attains the analytical equilibria. The dynamics of D+M TCP are also explained in a mathematical framework, and we show that D+M TCP achieves the analytical predictions. Modeling the dynamics gives insight into the stability and convergence properties of D+M TCP, as we outline in the thesis.

Moreover, we demonstrate that D+M TCP achieves excellent performance in a variety of network conditions and infrastructure. D+M TCP achieved performance superior to most existing high-speed TCP versions in terms of link utilization, RTT fairness, goodput, and oscillatory behavior, as confirmed by comparative ns-2 simulations.
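The abstract describes a controller that reacts jointly to delay and marking feedback. The update rule below is an illustrative blend of the two signals, not D+M TCP's actual equations: the window grows while both queueing delay and marking probability are low and shrinks as either signal indicates congestion.

```python
# Illustrative joint delay+marking window update (hypothetical, not the
# thesis's controller): combine the delay signal (queueing delay as a
# fraction of RTT) and the marking signal (ECN marking probability) into a
# single congestion "price" and adjust the window against it.

def window_update(cwnd, rtt, base_rtt, mark_prob, gain=1.0):
    """One congestion-window step from joint delay/marking feedback."""
    queue_delay = max(rtt - base_rtt, 0.0)       # delay signal (seconds)
    congestion = queue_delay / rtt + mark_prob   # combined price
    # grow when the price is below 1, shrink when above; never drop below 1 segment
    return max(cwnd + gain * (1.0 - congestion), 1.0)
```

In the resource-optimization reading the abstract gives, such a combined price is what ties the controller to a distributed solution of a network utility maximization problem; the specific form here is only a sketch.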
66

Towards a Framework For Resource Allocation in Networks

Ranasingha, Maththondage Chamara Sisirawansha 26 May 2009 (has links)
Network resources (such as bandwidth on a link) are not unlimited and must be shared by all networked applications in some manner of fairness. This calls for the development and implementation of effective strategies that enable optimal utilization of these scarce network resources among the applications that share the network. Although several rate controllers have been proposed in the literature to address optimal rate allocation, they do not appear to capture other factors of critical concern. For example, consider a battlefield data fusion application where a fusion center wishes to allocate more bandwidth to incoming flows perceived to be more accurate and important. For such applications, network users should consider the transmission rates of other users when allocating rates; hence, a rate controller should honor application-specific rate coordination directives given by the underlying application. The work reported herein addresses how a rate controller may establish and maintain such directives. We identify three major challenges in meeting this objective. First, the application-specific performance measures must be formulated as rate coordination directives. Second, these directives must be incorporated into a rate controller; of course, the resulting controller must coexist with ordinary rate controllers, such as TCP Reno, in a shared network. Finally, a mechanism must be put in place to identify the flows that require the rate allocation directives. The first challenge is addressed by means of a utility function that allows the performance of the underlying application to be maximized. The second challenge is addressed within the Network Utility Maximization (NUM) framework: the standard utility function (i.e., the utility function of the standard rate controller) is augmented by adding the application-specific utility function as an additive term. The rate allocation problem is then formulated as a constrained optimization problem whose objective is to maximize the aggregate utility of the network. The gradient projection algorithm is used to solve the optimization problem, and the resulting solution is formulated and implemented as a window update function. To address the final challenge we resort to a machine learning algorithm: we demonstrate how data features estimated from only a fraction of a flow can be used as evidential input to a series of Bayesian Networks (BNs), and we account for the uncertainty introduced by partial flow data through the Dempster-Shafer (DS) evidential reasoning framework.
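The NUM formulation with gradient projection can be sketched in its simplest setting: maximize a sum of weighted log-utilities subject to one shared link of capacity C. The closed-form optimum is x_i = w_i * C / sum(w), which the price-based iteration below approaches. The weights, capacity, and step size are illustrative, not the thesis's augmented utility.

```python
# Minimal NUM sketch: maximize sum_i w_i*log(x_i) subject to sum_i x_i <= C,
# solved by dual gradient projection. Each source best-responds to the link
# price; the link raises or lowers its price based on excess demand.

def num_gradient_projection(weights, capacity, steps=5000, lr=0.01):
    """Dual (price-based) gradient iteration; returns the final rate vector."""
    price = 1.0
    rates = [1.0] * len(weights)
    for _ in range(steps):
        # each source maximizes w_i*log(x_i) - price*x_i  =>  x_i = w_i / price
        rates = [w / price for w in weights]
        # link updates its price from excess demand, projected to stay positive
        price = max(price + lr * (sum(rates) - capacity), 1e-6)
    return rates
```

With weights [1, 2, 1] and capacity 8, the iteration converges to rates [2, 4, 2]; adding an application-specific term to the per-source utility, as the abstract describes, changes the best-response step but not the overall price-update structure.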
67

Evolving Paradigms in the Treatment of Hepatitis B

Woo, Gloria 05 September 2012 (has links)
Hepatitis B is a serious global health problem: over 2 billion people have been infected worldwide, and 350 million suffer from chronic hepatitis B (CHB) infection. Infection can lead to chronic hepatitis, cirrhosis, and hepatocellular carcinoma (HCC), accounting for 320,000 deaths per year. Numerous treatments are available, but with a growing number of therapies, each with considerable trade-offs, the optimal treatment strategy is not obvious. This dissertation investigates the relative efficacy of treatments for CHB and estimates the health-related quality of life (HRQOL) and health utilities of mild to advanced CHB patients. A systematic review of published randomized controlled trials comparing surrogate outcomes for the first year of treatment was performed. Bayesian mixed treatment comparison meta-analysis was used to synthesize odds ratios, including 95% credible intervals and predicted probabilities of each outcome, comparing all currently available treatments in HBeAg-positive and/or HBeAg-negative CHB patients. Among HBeAg-positive patients, tenofovir and entecavir were most effective, while in HBeAg-negative patients, tenofovir was the treatment of choice. Health-state utilities and HRQOL for patients with CHB, stratified by disease stage, were elicited from patients attending tertiary care clinics at the University Health Network in Toronto. Respondents completed the standard gamble, EQ-5D, Health Utilities Index Mark 3 (HUI3), Short-Form 36 version 2, and a demographics survey in their preferred language of English, Cantonese, or Mandarin. Patient charts were reviewed to determine disease stage and co-morbidities. The study included 433 patients, of whom 294 had no cirrhosis, 79 had compensated cirrhosis, 7 had decompensated cirrhosis, 23 had HCC, and 30 had received liver transplants. Mean standard gamble utilities were 0.89, 0.87, 0.82, 0.84, and 0.86 for the respective disease stages. HRQOL in CHB patients was impaired only at later stages of disease. Neither chronic infection nor antiviral treatment lowered HRQOL; patients with CHB do not experience the lower HRQOL seen in patients with hepatitis C. The next step in this area of research is to incorporate the estimates synthesized by the current studies into a decision model evaluating the cost-effectiveness of treatment, to provide guidance on the optimal therapy for patients with HBeAg-positive and HBeAg-negative CHB.
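The standard-gamble utilities reported above come from a choice between remaining in the current health state for certain and a gamble offering full health with probability p (and death otherwise); the utility of the state is the p at which the respondent is indifferent. The bisection search below is an illustrative elicitation procedure, not the study's actual instrument.

```python
# Illustrative standard-gamble elicitation: bisect on the gamble's success
# probability p until the respondent's indifference point is bracketed.
# A respondent with state utility u prefers the gamble exactly when p > u.

def standard_gamble_utility(prefers_gamble, lo=0.0, hi=1.0, tol=1e-3):
    """Bisect on p; prefers_gamble(p) -> True if the gamble is chosen at p."""
    while hi - lo > tol:
        p = (lo + hi) / 2
        if prefers_gamble(p):
            hi = p   # gamble attractive at p: indifference point is below p
        else:
            lo = p   # certainty preferred at p: indifference point is above p
    return (lo + hi) / 2
```

For example, a simulated respondent whose true utility is 0.89 (the mean reported for non-cirrhotic patients) yields an elicited value close to 0.89.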
69

Practical impact of predictor reliability for personnel selection decisions

Ock, Jisoo 06 September 2012 (has links)
In personnel selection, employment tests are intended to reduce selection errors and increase mean performance. The current thesis examines the impact of the psychometric properties of predictors on selection accuracy, that is, the consistency between selection on observed scores and selection on true scores. Implications for validity and subsequent levels of job performance (prediction accuracy) are also examined in light of common top-down personnel selection procedures. The results underscore the importance of reliable and valid predictor measures; the work also extends ideas in the area of utility analysis.
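The link between predictor reliability and selection accuracy can be illustrated with a small simulation under classical test theory assumptions: observed scores are true scores attenuated by measurement error, and accuracy is the overlap between the applicants selected top-down on observed scores and those who would be selected on true scores. This is a hedged sketch, not the thesis's actual design.

```python
# Illustrative simulation: how predictor reliability affects the agreement
# between observed-score and true-score top-down selection.

import random

def selection_overlap(n_applicants, selection_ratio, reliability, seed=0):
    """Fraction of the observed-score selectees who are also true-score selectees."""
    rng = random.Random(seed)
    true = [rng.gauss(0, 1) for _ in range(n_applicants)]
    # classical test theory: observed = sqrt(rxx)*true + sqrt(1-rxx)*error
    obs = [reliability ** 0.5 * t + (1 - reliability) ** 0.5 * rng.gauss(0, 1)
           for t in true]
    k = int(n_applicants * selection_ratio)
    top_true = set(sorted(range(n_applicants), key=lambda i: -true[i])[:k])
    top_obs = set(sorted(range(n_applicants), key=lambda i: -obs[i])[:k])
    return len(top_true & top_obs) / k
```

With perfect reliability the two selections coincide exactly; as reliability falls, the overlap shrinks, which is the kind of selection-error cost the thesis quantifies.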
70

The Performance Assessment of Public Electricity Enterprise Under Control - A Case Study of Taipower Company

Yang, Chuan-ming 16 June 2010 (has links)
The international electricity industry has encountered many external changes in recent years, such as the surge of electricity liberalization; moreover, energy and environmental issues have become global concerns. When facing such changes, organizations often adjust their strategies, and a good performance evaluation system is the bridge that links strategy and operations in this process. Performance management is an indispensable part of a long-term development strategy: it can translate executives' vision into strategy and action plans and lead the organization toward its objectives. This research takes into account the differences in nature and operating environment between the electricity industry and private corporations, and discusses the performance evaluation system and indicators for electricity in a changing environment. After reviewing the relevant literature and international practice, this research proposes performance-evaluation dimensions based on the industrial value chain, comprising manufacturing, supplying, sales and service, and social responsibility. This method accounts for the organization's mission and vision and its industrial attributes. Moreover, the consideration of social responsibility is the biggest contribution of this research, as it echoes the rising environmental concern of recent years. In the final simulation, we find that, compared with the actual grades, the method reveals the nature of a public utility and matches its characteristic stability.
