351

Cognitive radios : fundamental limits and applications to cellular and wireless local networks

Chung, Goochul 12 July 2012 (has links)
An ever-increasing number of wirelessly enabled applications places heavy demands on scarce spectral resources. Cognitive radios have the potential to enhance spectral efficiency by improving the usage of channels that are already licensed for a specific purpose. Research on cognitive radios involves answering questions such as: how can a cognitive radio transmit at a high data rate while maintaining the same quality of service for the licensed user? Multiple forms of cognition are studied in the literature, and each of these models must be examined in detail to understand its impact on overall system performance. The information-theoretic capacity of such systems is of particular interest, and practical cognitive radio designs are needed to approach these capacities in real applications. In this dissertation, we formulate problems that relate to the performance of such systems and methods to increase their efficiency. The dissertation discusses, first, the means of "sensing" in cognitive systems; second, optimal resource allocation algorithms for interweave cognitive radio; and finally, the fundamental limits of partially and overly cognitive overlay systems.
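As a rough illustration of the "sensing" step the abstract refers to, the sketch below shows a minimal energy-detection spectrum sensor. It is a generic textbook approach under assumed noise power, threshold factor, and signal values, not the dissertation's own method.

```python
# Minimal energy-detection spectrum sensing sketch (illustrative only; the
# threshold factor, noise power, and test signal are assumed values).
import numpy as np

def energy_detect(samples: np.ndarray, noise_power: float, threshold_factor: float = 2.0) -> bool:
    """Return True if the band is judged occupied by a primary user."""
    test_statistic = np.mean(np.abs(samples) ** 2)  # average received energy
    return bool(test_statistic > threshold_factor * noise_power)

rng = np.random.default_rng(0)
noise = (rng.normal(size=1000) + 1j * rng.normal(size=1000)) / np.sqrt(2)  # unit-power noise
signal = 1.5 * np.exp(1j * 2 * np.pi * 0.1 * np.arange(1000))              # hypothetical primary signal
print(energy_detect(noise, noise_power=1.0))           # expected False: idle channel
print(energy_detect(noise + signal, noise_power=1.0))  # expected True: occupied channel
```

In practice the threshold would be set from a target false-alarm probability rather than a fixed factor.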
352

Workload-aware network processors : improving performance while minimizing power consumption

Iqbal, Muhammad Faisal 06 September 2013 (has links)
Network processors are multicore processors capable of processing network packets at multi-Gbps wire speeds. Due to their high performance and programmability, these processors have become the main computing elements in many demanding pieces of network processing equipment, such as enterprise, edge, and core routers. With the ever-increasing use of the Internet, the processing demands on these routers have grown, and the number and complexity of the cores in network processors have increased accordingly, making it very challenging to manage these cores efficiently. This dissertation addresses two main issues related to the efficient use of a large number of parallel cores in network processors: (1) how to allocate work to the processing cores to optimize performance, and (2) how to meet the desired performance requirements power-efficiently. The dissertation presents the design of a hash-based scheduler that distributes packets to cores. The scheduler exploits multiple dimensions of locality to improve performance while minimizing out-of-order delivery of packets, and it is designed to work seamlessly when the number of cores allocated to a service changes. The design of a resource allocator is also presented, which allocates different numbers of cores to services with changing traffic behavior. To improve power efficiency, a traffic-aware power management scheme is presented that exploits variations in traffic rates to save power. Simulation studies using real and synthetic network traces show that the proposed packet scheduler can improve performance by as much as 40% by improving locality. Traffic variations can also be exploited to save significant power by turning off unused cores or running them at lower frequencies. Improving the performance of individual cores through careful scheduling further reduces power consumption, because the same amount of work can be done with fewer cores. The proposals in this dissertation show promising improvements over previous work: hashing-based schedulers have very low overhead and are well suited to data rates of 100 Gbps and beyond.
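A minimal sketch of the flow-hashing idea described above follows. The 5-tuple field names and the CRC32 hash are assumptions, and this stand-in omits the locality dimensions and reordering guarantees of the dissertation's scheduler.

```python
# Illustrative hash-based packet-to-core scheduler that preserves flow locality
# (a simplified stand-in, not the dissertation's actual design).
import zlib

class HashScheduler:
    def __init__(self, num_cores: int):
        self.num_cores = num_cores

    def assign(self, packet: dict) -> int:
        """Map all packets of a flow to the same core so per-flow state stays in
        that core's cache and packets within a flow are not reordered."""
        flow_key = "{src_ip}:{src_port}->{dst_ip}:{dst_port}/{proto}".format(**packet)
        return zlib.crc32(flow_key.encode()) % self.num_cores

sched = HashScheduler(num_cores=8)
pkt = {"src_ip": "10.0.0.1", "src_port": 5000,
       "dst_ip": "10.0.0.2", "dst_port": 80, "proto": "TCP"}
print(sched.assign(pkt))  # the same flow always lands on the same core
```

Note that a naive modulo mapping like this reshuffles most flows when the core count changes; handling such changes seamlessly (for example via consistent hashing) is one of the design goals noted in the abstract.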
353

Financial resource allocation in Texas : how does money matter

Villarreal, Rosa Maria, active 2010 30 April 2014 (has links)
This study examined school district expenditures in Texas and their correlations with student achievement. The following research question guided the study: Which resource allocations produce statistically significant correlations between resource-allocation variances among school districts and student achievement? The ordinal logistic regression analysis included 1,009 school districts in the State of Texas; 18 of 26 possible finance function codes provided per-pupil dollar amounts, and 9 of 11 possible demographic categories were utilized. The school district was the unit of analysis. The statistical model was used to determine whether the dollar amounts categorized by financial function codes and the percentages of student demographic groups were related to the dependent variable, the Texas Education Agency's accountability rating, over the five-year period from 2004 to 2008. At the national level, there is a long-standing debate over whether the amount of money allocated to education affects student achievement, and the literature review presents both sides of this debate as it concerns district-level accountability ratings. The research revealed that specific school district resource allocations by function code are statistically significant with regard to district-level accountability measures under the Texas Education Agency (TEA) accountability system, although the odds ratios temper the impact of this significance. The research also revealed that demographics are statistically significant in the State of Texas accountability system.
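For readers unfamiliar with the method, the sketch below fits an ordinal logistic regression to synthetic district-level data and reports odds ratios. The column names, rating scale, and generated data are purely illustrative and are not the study's actual TEA function codes or results.

```python
# Hedged sketch of an ordinal logistic regression relating per-pupil spending to
# an ordered accountability rating (synthetic data; names are illustrative).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "instruction_per_pupil": rng.normal(5500, 800, n),
    "admin_per_pupil": rng.normal(600, 150, n),
    "pct_econ_disadv": rng.uniform(0, 100, n),
})
# Synthetic ordered outcome: 0 (lowest rating) through 3 (highest rating)
latent = 0.001 * df["instruction_per_pupil"] - 0.02 * df["pct_econ_disadv"] + rng.normal(0, 1, n)
df["rating"] = pd.cut(latent, bins=4, labels=False)

model = OrderedModel(df["rating"],
                     df[["instruction_per_pupil", "admin_per_pupil", "pct_econ_disadv"]],
                     distr="logit")
result = model.fit(method="bfgs", disp=False)
print(np.exp(result.params[:3]))  # odds ratios for the three predictors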
354

Optimal data dissemination in stochastic and arbitrary wireless networks

Li, Hongxing, 李宏兴 January 2012 (has links)
Data dissemination among wireless devices is an essential application in wireless networks. In contrast to their wired counterparts, which have more stable network settings, wireless networks are subject to network dynamics, such as variable network topology and channel availability and capacity, caused by user mobility, signal collision, random channel fading and scattering, and so on. Network dynamics complicate protocol design for optimal data dissemination. Although the topic has been discussed intensively for many years, existing solutions are still not completely satisfactory, especially for stochastic or arbitrary networks. In this thesis, we address optimal data dissemination in both stochastic and arbitrary wireless networks, using techniques from Lyapunov optimization, graph theory, network coding, multi-resolution coding, and successive interference cancellation. We first discuss the maximization of long-run time-averaged throughput utility for unicast and multirate multicast, respectively, in stochastic wireless networks without probing into the future. For multi-session unicast communications, a utility-maximizing cross-layer design, composed of joint end-to-end rate control, routing, and channel allocation, is proposed for cognitive radio networks with stochastic primary user occupations. We then study optimal multirate multicast to receivers with non-uniform receiving rates, also making dynamic cross-layer decisions, in a general wireless network with both a time-varying topology and random channel capacities, by utilizing random linear network coding and multi-resolution coding. In both solutions, we assume users are selfish and prefer to relay data only for others with strong social ties. Such social selfishness is a new constraint in network protocol design, and its impact on efficient data dissemination in wireless networks is largely unstudied, especially under stochastic settings. Lyapunov optimization is applied in our protocol designs to achieve close-to-optimal utilities. Next, we turn to latency-minimizing data aggregation in wireless sensor networks with arbitrary topologies under the physical interference model. Unlike our work on stochastic networks, where we target time-averaged optimality over the long run, the objective here is to minimize the time span needed to complete one round of aggregation scheduling for all sensors in an arbitrary topology. This problem is NP-hard, involving both aggregation tree construction and collision-free link scheduling. The current literature mostly considers the protocol interference model, which has been shown to be less practical than the physical interference model in characterizing real-world interference relations. A distributed solution under the physical interference model is challenging, since the cumulative interference from all concurrently transmitting devices must be measured accurately. In this thesis, we present a distributed aggregation protocol with an improved approximation ratio compared with previous work. We then discuss the tradeoff between aggregation latency and energy consumption for arbitrary topologies when the successive interference cancellation technique is in use. Another distributed algorithm is introduced with asymptotic optimality in both aggregation latency and the latency-energy tradeoff. Through theoretical analysis and empirical study, we rigorously examine the optimality of our protocols against both the theoretical optima and existing solutions.
Doctor of Philosophy, Computer Science
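The Lyapunov optimization mentioned in the abstract above typically takes a drift-plus-penalty form. The sketch below shows a generic single-queue rate-control step under an assumed log utility and parameter V; it is not the thesis's actual cross-layer policy.

```python
# Illustrative drift-plus-penalty step in the spirit of Lyapunov optimization
# (a generic textbook form; V, the log utility, and the service rate are assumed).
import numpy as np

def drift_plus_penalty_rate(queue_backlog: float, V: float, r_max: float) -> float:
    """Choose an admitted rate r in [0, r_max] maximizing
    V * log(1 + r) - queue_backlog * r (concave in r)."""
    if queue_backlog <= 0:
        return r_max
    r_star = V / queue_backlog - 1.0  # stationary point: V / (1 + r) = queue_backlog
    return float(np.clip(r_star, 0.0, r_max))

Q, V = 5.0, 50.0
r = drift_plus_penalty_rate(Q, V, r_max=10.0)
print(r)                      # admitted rate for this slot
Q = max(Q + r - 8.0, 0.0)     # queue update with an assumed service of 8 units/slot
print(Q)
```

Larger V pushes the decision toward utility at the cost of larger average backlog, which is the usual utility-delay tradeoff in this family of algorithms.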
355

Strategic political resource allocation

Mastronardi, Nick 28 April 2015 (has links)
Economics is the study of the allocation of resources. Since Arrow's Fundamental Welfare Theorems, we know that competitive markets achieve Pareto allocations when governments correct market failures. It has therefore largely been the mission of economists to serve as 'market engineers': to identify and quantify market failures so the government can implement Pareto-improving policy (making everyone better off without making anyone worse off). Do Pareto-improving policies get implemented? How does policy become implemented? Achieving a Pareto-efficient allocation of a nation's resources requires studying the implementation of policy, and therefore the allocation of the political resources that influence policy. Policy implementation begins with the electoral process. In this dissertation, I use auction analysis, econometrics, and game theory to study political resource allocation in the electoral process. The dissertation consists of three research papers: Finance-Augmented Median-Voter Model, Vote Empirics, and Colonel Blotto Strategies. The Finance-Augmented Median-Voter Model postulates that candidates' campaign expenditures are bids in a first-price asymmetric all-pay auction in order to explain campaign expenditure behavior. Vote Empirics empirically analyzes the impacts of campaign expenditures, incumbency status, and district voter registration statistics on observed vote-share results from the 2004 congressional election. Colonel Blotto Strategies postulates that parties' campaign allocations across congressional districts may be a version of the classic Colonel Blotto game from game theory; while some equilibrium strategies and equilibrium payoffs had been identified previously, this paper completely characterizes players' optimal strategies. In total, the dissertation solves for candidates' optimal campaign expenditure strategies when campaign expenditures are bids in an all-pay auction, demonstrates the need to understand exactly how various factors, including strategic expenditures, affect final vote results, uses econometric techniques to identify those effects, and derives the complete characterization of Colonel Blotto strategies. The discussed extensions provide testable predictions for cross-district party contributions. I present this research not as a final statement to the literature, but in the hope that future research will continue its explanation of political resource allocation. An even greater hope is that in time this literature will be used to identify optimal "policy-influencing policies": constitutional election policies that provide for the implementation of Pareto-improving government policies.
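As a toy illustration of the Colonel Blotto setting discussed above, the sketch below computes the payoff of one pair of allocations across hypothetical districts. The dissertation's contribution is an analytical characterization of equilibrium strategies, which this toy computation does not reproduce.

```python
# Toy discrete Colonel Blotto payoff computation (illustrative only; budgets and
# district counts are assumed).
import numpy as np

def blotto_payoff(alloc_a, alloc_b) -> float:
    """Each battlefield (district) is won by the side allocating more resources;
    ties split. Returns the number of battlefields won by player A."""
    a, b = np.asarray(alloc_a, float), np.asarray(alloc_b, float)
    return float(np.sum(a > b) + 0.5 * np.sum(a == b))

# Two parties split the same budget (100) across 5 hypothetical districts.
party_a = [30, 30, 20, 10, 10]
party_b = [20, 20, 20, 20, 20]
print(blotto_payoff(party_a, party_b))  # 2.5: A wins 2 districts, ties 1, loses 2
```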
356

Small cell and D2D offloading in heterogeneous cellular networks

Ye, Qiaoyang 08 September 2015 (has links)
Future wireless networks are evolving to become ever more heterogeneous, including small cells such as picocells and femtocells, and direct device-to-device (D2D) communication that bypasses base stations (BSs) altogether to share stored and personalized content. Conventional user association schemes are unsuitable for heterogeneous networks (HetNets), due to the massive disparities in transmit power and capabilities among different BSs. To make the most of the new low-power infrastructure and D2D communication, it is desirable to facilitate and encourage offloading users from the macro BSs. This dissertation characterizes the gain in network performance (e.g., the rate distribution) from offloading users to small cells and the D2D network, and develops efficient user association, resource allocation, and interference management schemes that aim to achieve this gain. First, we optimize load-aware user association in HetNets with single-antenna BSs, bridging the gap between the optimal solution and a simple small-cell biasing approach. We then develop a low-complexity distributed algorithm that converges to a near-optimal solution with a theoretical performance guarantee. Simulation results show that the biasing approach loses surprisingly little with appropriate bias factors, and that there is a large rate gain for cell-edge users. This framework is then extended to a joint optimization of user association and resource blanking at the macro BSs, similar to the enhanced intercell interference coordination (eICIC) proposed in the global cellular standards of the 3rd Generation Partnership Project (3GPP). Though the joint problem is nominally combinatorial, allowing users to associate with multiple BSs makes it convex. We show both theoretically and through simulation that the optimal solution of the relaxed problem still results in a mostly unique association, and that resource blanking can further improve network performance. Next, the single-antenna framework is extended to HetNets whose BSs are equipped with large antenna arrays and operate in the massive MIMO regime. MIMO techniques enable another interference management option: serving users simultaneously from multiple BSs, termed joint transmission (JT). We formulate a unified utility maximization problem that optimizes user association with JT and resource blanking, for which an efficient dual subgradient-based algorithm approaching the optimal solution is developed, along with a simple scheduling scheme that implements near-optimal solutions. We then change direction slightly to develop a flexible and tractable framework for D2D communication in the context of a cellular network. The model is applied to study both shared and orthogonal resource allocation between D2D and cellular networks. Analytical SINR distributions and average rates are derived and applied to maximize the total throughput, under an assumption of interference randomization via time and/or frequency hopping, which can be viewed as an optimized lower bound to more sophisticated scheduling schemes. Finally, motivated by the benefits of cochannel D2D links, the dissertation investigates interference management for D2D links sharing cellular uplink resources. Because maximizing network throughput while guaranteeing the service of cellular users is shown to be non-convex and hence intractable, a computationally efficient distributed approach with minimal coordination is proposed instead. The key algorithmic idea is a pricing mechanism, whereby BSs optimize and transmit a signal depending on the interference to D2D links, which then play a best response (i.e., act selfishly) to this signal. Numerical results show that our algorithms converge quickly, have low overhead, and achieve a significant throughput gain, while maintaining the quality of cellular links at a predefined service level.
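The small-cell biasing (cell range expansion) that the abstract compares against the optimized association can be sketched in a few lines. The 6 dB bias and the received-power values below are assumptions, and real eICIC operation also coordinates resource blanking, which is omitted here.

```python
# Sketch of biased user association in a HetNet: each user picks the BS with the
# highest biased received power, so small cells attract extra (offloaded) users.
def associate(rx_power_dbm: dict, bias_db: dict) -> str:
    """Return the BS maximizing received power plus its tier bias (in dB)."""
    return max(rx_power_dbm, key=lambda bs: rx_power_dbm[bs] + bias_db.get(bs, 0.0))

rx = {"macro": -75.0, "pico_1": -79.0}   # measured downlink powers (dBm, assumed)
bias = {"macro": 0.0, "pico_1": 6.0}     # assumed 6 dB small-cell bias
print(associate(rx, bias))               # "pico_1": offloaded despite weaker unbiased power
```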
357

A framework for resource assignments in skill-based environments

Otero, Luis Daniel 01 June 2009 (has links)
The development of effective personnel assignment methodologies has been a research focus for academics and practitioners for many years. The common theory among researchers is that improvements in the effectiveness of personnel assignment decisions are directly associated with favorable organizational outcomes. Today, companies continue to struggle to develop high-quality products in a timely fashion, which heightens the need to further explore and improve the decision-making science of personnel assignments. The central goal of this research is to develop a novel framework for human resource assignments in skill-based environments. An extensive literature review identified three areas of the general personnel assignment problem as potential improvement opportunities: determining assignment criteria, properly evaluating personnel capabilities, and effectively assigning resources to tasks. Developing new approaches to improve each of these areas constitutes the objectives of this dissertation. The main contributions of this research are threefold. First, it presents an effective two-stage methodology for determining assignment criteria based on data envelopment analysis (DEA) and Tobit regression. Second, it develops a novel fuzzy expert system for resource capability assessments in skill-based scenarios; the expert system evaluates the capabilities of resources in particular skills as a function of the imprecise relationships that may exist between different skills. Third, it develops an assignment model based on the fuzzy goal programming (FGP) technique, which defines resource capabilities, task requirements, and other important parameters as imprecise (fuzzy) variables. The novelty of this research stems from the fact that it advances the science of personnel assignments by combining concepts from statistics, economics, artificial intelligence, and mathematical programming to develop a solution approach with expected high practical value.
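A minimal illustration of the fuzzy capability-assessment idea is sketched below. The triangular membership functions and breakpoints are assumed, and the dissertation's expert system additionally models imprecise relationships between skills, which this toy example omits.

```python
# Minimal fuzzy-membership sketch of a skill-capability assessment (illustrative
# only; the 0-10 rating scale and breakpoints are assumptions).
def triangular_membership(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership: 0 at a and c, 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def capability_score(skill_rating: float) -> dict:
    """Fuzzify a 0-10 skill rating into low/medium/high memberships."""
    return {
        "low": triangular_membership(skill_rating, -0.1, 0.0, 5.0),
        "medium": triangular_membership(skill_rating, 2.5, 5.0, 7.5),
        "high": triangular_membership(skill_rating, 5.0, 10.0, 10.1),
    }

print(capability_score(6.0))  # partial membership in "medium" and "high"
```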
358

A unified framework for optimal resource allocation in multiuser multicarrier wireless systems

Wong, Ian Chan 28 August 2008 (has links)
Not available
359

Devolution in a Texas school system : redefining the efforts of three central office directors at the school site

Moynihan-McCoy, Toni Marsh, 1945- 13 July 2011 (has links)
Not available
360

A unified framework for optimal resource allocation in multiuser multicarrier wireless systems

Wong, Ian Chan, 1978- 22 August 2011 (has links)
Not available
