1.
Minimizing age of information for semi-periodic arrivals of multiple packets. Chen, Mianlong, 04 December 2019.
Age of information (AoI) captures the freshness of information and has been used broadly for scheduling data transmission in the Internet of Things (IoT). We consider a general scenario where a meaningful piece of information consists of multiple packets and the information is not considered complete until all related packets have been correctly received. This general scenario, seemingly a trivial extension of existing work where information is updated in terms of a single packet, is actually challenging in both scheduling algorithm design and theoretical analysis, because we need to track the history of received packets before a complete piece of information can be updated. We first analyse the necessary condition for optimal scheduling, based on which we present an optimal scheduling method. The optimal solution, however, has high time complexity. To address the problem, we investigate it in the framework of the restless multi-armed bandit (RMAB) and propose an index-based scheduling policy by applying the Whittle index. We also propose a new transmission strategy based on erasure codes to improve the performance of scheduling policies in lossy networks. Performance evaluation results demonstrate that our solution outperforms other baseline policies, such as the greedy policy and a naive Whittle index policy, in both lossless and lossy networks. / Graduate
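The bookkeeping that makes the multi-packet case harder can be seen in a small sketch (ours, not code from the thesis): the destination's AoI only drops when the last packet of an update arrives. We assume a fresh update at time zero and integer time steps; the update format below is our own illustration.

```python
def aoi_trajectory(updates, horizon):
    """Instantaneous AoI at integer times 0..horizon.

    `updates` is a list of (generation_time, delivery_times) pairs, where
    delivery_times holds the reception time of each packet of the update;
    an update only counts once its LAST packet has been received.
    Assumes (for illustration) a fresh update at time 0.
    """
    # Reduce each update to (generation time, completion time of last packet)
    completed = sorted((g, max(ds)) for g, ds in updates)
    aoi = []
    latest_gen = 0.0   # generation time of freshest complete update so far
    i = 0
    for t in range(horizon + 1):
        # absorb every update whose last packet arrived by time t
        while i < len(completed) and completed[i][1] <= t:
            latest_gen = max(latest_gen, completed[i][0])
            i += 1
        aoi.append(t - latest_gen)
    return aoi
```

For instance, an update generated at time 1 whose two packets arrive at times 2 and 4 leaves the age climbing until time 4, even though the first packet arrived at time 2.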
2.
A Reinforcement Learning-based Scheduler for Minimizing Casualties of a Military Drone Swarm. Jin, Heng, 14 July 2022.
In this thesis, we consider a swarm of military drones flying over an unfriendly territory, where a drone can be shot down by an enemy with an age-based risk probability. We study the problem of scheduling surveillance image transmissions among the drones with the objective of minimizing overall casualties. We present Hector, a reinforcement learning-based scheduling algorithm. Specifically, Hector only uses the age of each detected target, a piece of locally available information at each drone, as an input to a neural network to make scheduling decisions. Extensive simulations show that Hector significantly reduces casualties compared to a baseline round-robin algorithm. Further, Hector offers performance comparable to a high-performing greedy scheduler, which assumes complete knowledge of global information. / Master of Science / Drones have been successfully deployed by the military. The advancement of machine learning further empowers drones to automatically identify, recognize, and even eliminate adversary targets on the battlefield. However, to minimize unnecessary casualties to civilians, it is important to introduce additional checks and control from the control center before lethal force is authorized. Thus, the communication between drones and the control center becomes critical.
In this thesis, we study the problem of communication between a military drone swarm and the control center when the drones are flying over unfriendly territory where they can be shot down by enemies. We present Hector, an algorithm based on machine learning, that minimizes the overall casualties of drones by scheduling data transmissions. Extensive simulations show that Hector significantly reduces casualties compared to traditional algorithms.
3.
5G Scheduling for Distributed Control in Microgrids. Iyer, Rahul Rajan, 12 November 2021.
There is an increasing integration of distributed energy resources (DER), controllable loads, and other technologies that are making the grid more robust, reliable, and decentralized. Communication is a major aspect that enables this decentralization and can improve control of important system parameters by allowing different grid components to communicate their states to each other. This information exchange requires a reliable and fast communication infrastructure. Different communication techniques can be used towards this objective, but with recent technological advancements, 5G communication is proving to be a very viable option. 5G is being widely deployed throughout the world due to its high data rates combined with increased reliability compared with its predecessor technologies. This thesis focuses on the application and performance analysis of a 5G network for different power system test cases. These test cases are microgrids and consist of DERs that use distributed control for efficient operation. Under distributed control, the DERs communicate with each other to achieve a fast and improved dynamic response. This work develops a co-simulation platform to analyze the impact that a 5G network has on this distributed control objective, offering key insights into 5G's capability to support critical functions. Different scenarios, including set point changes and transients, are evaluated. Since distributed control is a time-critical application and DERs rely on the availability of up-to-date information, the scheduling aspect of 5G becomes very important and is given more focus. Information freshness, measured using age of information (AoI), is used in this work: it is a measure of how recent and updated the information communicated by DERs is. This thesis compares the performance of AoI-based schedulers against standard schedulers. These different schedulers are then used on test systems employing distributed control.
/ Master of Science / Communication has become an important aspect of modern power systems due to the increased integration of distributed energy resources (DER), controllable loads, and other components that have communication capabilities for improved grid performance. Of the various communication techniques available for power systems, 5G is very promising due to its advantages over its predecessors and other wired communication methods. This work develops a co-simulation framework to implement a 5G network for different microgrid test cases that employ distributed control. Under distributed control, the DERs communicate with each other to achieve a fast and improved dynamic response. Due to the time-critical nature of distributed control, DERs rely on the availability of up-to-date information. Hence, the scheduling aspect of 5G becomes very important and is given more focus in this work. 5G schedulers that account for the availability of up-to-date information, also referred to as information freshness, are compared with standard 5G schedulers, and their performance on distributed control test systems is analyzed.
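As a minimal sketch of what an AoI-based scheduler does, in contrast to round-robin (our illustration, not the scheduler from the thesis; we assume one transmission per slot and that a served source's age resets to one slot):

```python
def max_age_schedule(ages, slots):
    """Greedy AoI-based rule: each slot, serve the source with the
    largest current age.

    `ages` holds each source's initial age in slots. A served source's
    age resets to 1 in the next slot (its fresh update is then one slot
    old); every other source ages by one slot. These conventions are
    illustrative assumptions, not taken from the thesis.
    """
    ages = list(ages)
    order = []
    for _ in range(slots):
        k = max(range(len(ages)), key=lambda i: ages[i])  # stalest source
        order.append(k)
        ages = [1 if i == k else a + 1 for i, a in enumerate(ages)]
    return order, ages
```

Unlike round-robin, this rule naturally prioritizes whichever DER's state information has gone longest without an update.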
4.
Building Distributed Systems for Fresh and Low-latency Data Delivery for Internet of Things. Toutounji Alkallas, Adnan, January 2019.
Internet of Things (IoT) is a system of interrelated computing devices that can transfer data over a network, where the data are collected by applications that rely on fresh information. Freshness can be measured by a metric called Age of Information (AoI): the time elapsed, as measured at the receiving node, since the data were generated at the source. It is an important metric for many IoT applications, such as collecting data from temperature sensors or pollution rates in a specific city. However, a bottleneck occurs at the sensors because they are constrained devices in terms of energy (battery power) and also have limited memory and computational power. They therefore cannot serve many requests at the same time, which decreases information quality, meaning more unnecessary aging. As a solution, we propose a distributed system that takes into account the AoI transmitted by the sensors so that IoT applications receive the expected information quality. This thesis describes three algorithms that can be used to build and test three different topologies. The first algorithm builds a Random graph, while the second and third algorithms shape Clustered and Hybrid graphs, respectively. For testing, we use the Python-based SimPy package, which is a process-based discrete-event simulation framework. Finally, we compare the Random, Clustered, and Hybrid graph results. Overall, the Hybrid graph delivers fresher information than the other graphs.
5.
An exploratory study of environmental risk factors to elderly falls in Hong Kong: a GIS case study of Mong Kok, 2006-2007. Low, Chien-tat, January 2008.
Thesis (M. Phil.)--University of Hong Kong, 2008. / Includes bibliographical references (p. 148-159). Also available in print.
6.
Age of Information in Multi-Hop Status Update Systems: Fundamental Bounds and Scheduling Policy Design. Farazi, Shahab, 03 June 2020.
Freshness of information has become highly important with the emergence of many real-time applications like monitoring systems and communication networks. The main idea behind all of these scenarios is the same: there exists at least one monitor of some process to which the monitor does not have direct access. Rather, the monitor indirectly receives updates over time from a source that can observe the process directly. The common main goal in these scenarios is to guarantee that the updates at the monitor side are as fresh as possible. However, due to the contention among the nodes in the network over limited channel resources, it takes some random time for the updates to be received by the monitor. These applications have motivated a line of research studying the Age of Information (AoI) as a new performance metric that captures the timeliness of information. The first part of this dissertation focuses on the AoI problem in general multi-source multi-hop status update networks with slotted transmissions. Fundamental lower bounds on the instantaneous peak and average AoI are derived under general interference constraints. Explicit algorithms are developed that generate scheduling policies for status update dissemination throughout the network for the class of minimum-length periodic schedules under global interference constraints. Next, we study AoI in multi-access channels, where a number of sources share the same server, with exponentially distributed service times, to communicate to a monitor. Two cases are considered depending on the status update arrival rates at the sources: (i) random arrivals based on a Poisson point process, and (ii) active arrivals, where each source can generate an update at any point in time. For each case, closed-form expressions are derived for the average AoI as a function of the system parameters.
Next, the effect of energy harvesting on the age is considered in a single-source single-monitor status update system that has a server with a finite battery capacity. Depending on the server's ability to harvest energy while a packet is in service, and allowing or blocking the newly-arriving packets to preempt a packet in service, average AoI expressions are derived. The results show that preemption of the packets in service is sub-optimal when the energy arrival rate is lower than the status update arrival rate. Finally, the age of channel state information (CSI) is studied in fully-connected wireless networks with time-slotted transmissions and time-varying channels. A framework is developed that accounts for the amount of data and overhead in each packet and the CSI disseminated in the packet. Lower bounds on the peak and average AoI are derived, and a greedy protocol that schedules the status updates based on minimizing the instantaneous average AoI is developed. The achievable average AoI is derived for the class of randomized CSI dissemination schedules.
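The peak and average AoI quantities studied above come from a sawtooth age process; as an illustration (ours, assuming the monitor starts with age zero at time zero), both can be computed from the generation and reception times of the received updates:

```python
def peak_and_average_aoi(events, horizon):
    """Peak and time-average AoI of a sawtooth age process.

    `events` is a time-sorted list of (gen_time, recv_time) pairs for
    updates received by the monitor. The age grows at unit rate and,
    on each reception, resets to (recv_time - gen_time), the system
    time of that update. Assumes (for illustration) age 0 at time 0.
    Returns (list of peak ages just before each reception,
             time-average AoI over [0, horizon]).
    """
    area = 0.0
    peaks = []
    t_prev, age = 0.0, 0.0
    for g, r in events:
        dt = r - t_prev
        peaks.append(age + dt)          # age just before reception
        area += age * dt + dt * dt / 2  # trapezoid under the ramp
        t_prev, age = r, r - g          # reset to the update's system time
    dt = horizon - t_prev               # final ramp after the last update
    area += age * dt + dt * dt / 2
    return peaks, area / horizon
```

For updates generated at times 1 and 4 and received at times 3 and 6, over a horizon of 8, this gives peak ages [3, 5] and a time-average AoI of 2.625.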
7.
Optimizing Information Freshness in Wireless Networks. Li, Chengzhang, 18 January 2023.
Age of Information (AoI) is a performance metric that can be used to measure the freshness of information. Since its inception, it has captured the attention of the research community and is now an area of active research. By its definition, AoI measures the elapsed time period between the present time and the generation time of the information. AoI is fundamentally different from traditional metrics such as delay or latency as the latter only considers the transit time for a packet to traverse the network.
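Written out, with u(t) denoting the generation time of the freshest update received by time t, the instantaneous AoI and its time average over an interval [0, T] are (this is the standard definition in the AoI literature):

```latex
\Delta(t) = t - u(t), \qquad
\bar{\Delta} = \frac{1}{T}\int_{0}^{T} \Delta(t)\,\mathrm{d}t .
```

This makes the contrast with delay concrete: a packet's transit time only sets the value that Δ(t) resets to on reception, while Δ(t) also keeps growing between receptions.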
Among the state-of-the-art in the literature, we identify two limitations that deserve further investigation. First, many existing efforts on AoI have been limited to information-theoretic exploration, considering extremely simple models and unrealistic assumptions that are far from real-world communication systems. Second, among most existing work on scheduling algorithms to optimize AoI, there is a lack of research on guaranteeing AoI deadlines. The goal of this dissertation is to address these two limitations in the state-of-the-art. First, we design schedulers to minimize AoI under more practical settings, including varying sampling periods, varying sample sizes, cellular transmission models, and dynamic channel conditions. Second, we design schedulers to guarantee hard or soft AoI deadlines for each information source. More importantly, inspired by our results from guaranteeing AoI deadlines, we develop a general design framework that can be applied to construct high-performance schedulers for AoI-related problems.
This dissertation is organized into three parts. In the first part, we study two problems on AoI minimization under general settings. (i) We consider general and heterogeneous sampling behaviors among source nodes, varying sample size, and a cellular-based transmission model.
We develop a near-optimal low-complexity scheduler---code-named Juventas---to minimize AoI. (ii) We study the AoI minimization problem under a 5G network with dynamic channels. To meet the stringent real-time requirement for 5G, we develop a GPU-based near-optimal algorithm---code-named Kronos---and implement it on commercial off-the-shelf (COTS) GPUs.
In the second part, we investigate three problems on guaranteeing AoI deadlines. (i) We study the problem to guarantee a hard AoI deadline for information from each source. We present a novel low-complexity procedure, called Fictitious Polynomial Mapping (FPM), and prove that FPM can find a feasible scheduler for any hard deadline vector when the system load is under ln 2. (ii) For soft AoI deadlines, i.e., occasional violations can be tolerated, we present a novel procedure called Unstable Tolerant Scheduler (UTS). UTS hinges upon the notions of Almost Uniform Schedulers (AUSs) and step-down rate vectors. We show that UTS has strong performance guarantees under different settings. (iii) We investigate a 5G scheduling problem to minimize the proportion of time when the AoI exceeds a soft deadline. We derive a property called uniform fairness and use it as a guideline to develop a 5G scheduler---Aequitas. To meet the real-time requirement in 5G, we implement Aequitas on a COTS GPU.
In the third part, we present Eywa---a general design framework that can be applied to construct high-performance schedulers for AoI-related optimization and decision problems. The design of Eywa is inspired by the notions of AUS schedulers and step-down rate vectors when we develop UTS in the second part. To validate the efficacy of the proposed Eywa framework, we apply it to solve a number of problems, such as minimizing the sum of AoIs, minimizing bandwidth requirement under AoI constraints, and determining the existence of feasible schedulers to satisfy AoI constraints. We find that for each problem, Eywa can either offer a stronger performance guarantee than the state-of-the-art algorithms, or provide new/general results that are not available in the literature. / Doctor of Philosophy / Age of Information (AoI) is a performance metric that can be used to measure the freshness of information. It measures the elapsed time period between the present time and the generation time of the information. Through a literature review, we have identified two limitations: (i) many existing efforts on AoI have employed extremely simple models and unrealistic assumptions, and (ii) most existing work focuses on optimizing AoI, while overlooking AoI deadline requirements in some applications.
The goal of this dissertation is to address these two limitations. For the first limitation, we study the problem to minimize the average AoI in general and practical settings, such as dynamic channels and 5G NR networks. For the second limitation, we design schedulers to guarantee hard or soft AoI deadlines for information from each source. Finally, we develop a general design framework that can be applied to construct high-performance schedulers for AoI-related problems.
8.
Practical Algorithms and Analysis for Next-Generation Decentralized Vehicular Networks. Dayal, Avik, 19 November 2021.
The development of autonomous ground and aerial vehicles has driven the requirement for radio access technologies (RATs) to support low latency applications. While onboard sensors such as Light Detection and Ranging (LIDAR), Radio Detection and Ranging (RADAR), and cameras can sense and assess the immediate space around the vehicle, RATs are crucial for the exchange of information on critical events, such as accidents and changes in trajectory, with other vehicles and surrounding infrastructure in a timely manner. Simulations and analytical models are critical in modelling and designing efficient networks.
In this dissertation, we focus on (a) proposing and developing algorithms to improve the performance of decentralized vehicular communications in safety critical situations and (b) supporting these proposals with simulation and analysis of the two most popular RAT standards, the Dedicated Short Range Communications (DSRC) standard, and the Cellular vehicle-to-everything (C-V2X) standard.
In our first contribution, we propose a risk-based protocol for vehicles using the DSRC standard. The protocol allows a higher beacon transmission rate for vehicles that are at a higher risk of collision. We verify the benefits of the risk-based protocol over conventional DSRC using ns-3 simulations. Two risk-based beacon rate protocols are evaluated in our ns-3 simulator, one that adapts the beacon rate between 1 and 10 Hz, and another between 1 and 20 Hz. Our results show that the adaptive protocols improve packet delivery ratio (PDR) performance in congested environments by up to 45% with the 1-10 Hz scheme and by 38% with the 1-20 Hz scheme. Simulation results for the two adaptive beacon rate protocols also show that the likelihood of a vehicle collision due to missed packets decreases by up to 41% and 77%, respectively, in a three-lane dense highway scenario with 160 vehicles operating at different speeds.
In our second contribution, we study the performance of a distance-based transmission protocol for vehicular ad hoc networks (VANETs) using tools from stochastic geometry. We consider a risk-based transmission protocol where vehicles transmit more frequently depending on the distance to adjacent vehicles. We evaluate two transmission policies: a listen more policy, in which the transmission rate of vehicles decreases as the inter-vehicular distance decreases, and a talk more policy, in which the transmission rate of vehicles increases as the distance to the vehicle ahead decreases. We model the layout of a highway using a 1-D Poisson point process (PPP) and analyze the performance of a typical receiver in this highway setting. We characterize the success probability of a typical link assuming slotted ALOHA as the channel access scheme, and study the trends in success probability as a function of system parameters.
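The listen more and talk more policies above can be made concrete with a simple rate map. The sketch below is purely illustrative: the linear form and the r_min, r_max, and d_ref parameters are our assumptions, not values from the dissertation.

```python
def tx_rate(distance, policy, r_min=1.0, r_max=10.0, d_ref=100.0):
    """Map the gap to the adjacent vehicle (metres) to a beacon rate (Hz).

    'talk_more':   smaller gap -> higher rate (warn close vehicles often);
    'listen_more': smaller gap -> lower rate (leave the shared channel free
                   when vehicles are densely packed).
    Linear interpolation, clamped to [r_min, r_max]; all parameters here
    are illustrative placeholders.
    """
    frac = min(max(distance / d_ref, 0.0), 1.0)  # 0 at zero gap, 1 at d_ref
    if policy == "talk_more":
        return r_max - (r_max - r_min) * frac
    return r_min + (r_max - r_min) * frac
```

The two policies pull in opposite directions, which is exactly the trade-off the stochastic-geometry analysis quantifies: talking more when close improves awareness of nearby risk, while listening more reduces contention on the shared channel.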
Our third contribution includes improvements to the 3rd Generation Partnership Project (3GPP) Release 14 C-V2X standard, evaluated using a modified collision framework. In C-V2X, basic safety messages (BSMs) are transmitted through Mode-4 communications, introduced in Release 14. Mode-4 communications operate under the principle of sensing-based semi-persistent scheduling (SPS), where vehicles sense and schedule transmissions without a base station present. We propose an improved adaptive semi-persistent scheduling scheme, termed Ch-RRI SPS, for Mode-4 C-V2X networks. Specifically, Ch-RRI SPS allows each vehicle to dynamically adjust in real-time the BSM rate, referred to in the LTE standard as the resource reservation interval (RRI). Our study based on system level simulations demonstrates that Ch-RRI SPS greatly outperforms SPS in terms of both on-road safety performance, measured as collision risk, and network performance, measured as packet delivery ratio, in all considered C-V2X scenarios. In high density scenarios, e.g., 80 vehicles/km, Ch-RRI SPS shows a collision risk reduction of 51.27%, 51.20% and 75.41% when compared with SPS with 20 ms, 50 ms, and 100 ms RRI, respectively.
In our fourth and final contribution, we look at the tracking error and age-of-information (AoI) of the latest 3GPP Release 16 NR-V2X standard, which includes enhancements to the 3GPP Release 14 C-V2X standard. The successor to Mode-4 C-V2X, known as Mode-2a NR-V2X, makes slight changes to sensing-based semi-persistent scheduling (SPS), though vehicles can still sense and schedule transmissions without a base station present. We use AoI and tracking error, which are the freshness of the information at the receiver and the difference between the estimated and actual location of a transmitting vehicle, respectively, to measure the impact of lost and outdated BSMs on a vehicle's ability to localize neighboring vehicles. In this work, we again show that such BSM scheduling (with a fixed RRI) suffers from severe under- and over-utilization of radio resources, which severely compromises timely dissemination of BSMs and increases the system AoI and tracking error. To address this, we propose an RRI selection algorithm that measures the age or freshness of messages from neighboring vehicles to select an RRI, termed Age of Information (AoI)-aware RRI (AoI-RRI) selection. Specifically, AoI-aware SPS (i) measures the neighborhood AoI (as opposed to channel availability) to select an age-optimal RRI and (ii) uses a modified SPS procedure with the chosen RRI to select BSM transmission opportunities that minimize the overall system AoI. We compare AoI-RRI SPS to Ch-RRI SPS and fixed RRI SPS for NR-V2X. Our experiments based on the Mode-2a NR-V2X standard, implemented using system level simulations, show that both Ch-RRI SPS and AoI-RRI SPS outperform SPS in high density scenarios in terms of tracking error and age-of-information. / Doctor of Philosophy / An increasing number of vehicles are equipped with a large set of on-board sensors that enable and support autonomous capabilities.
Such sensors, which include Light Detection and Ranging (LIDAR), Radio Detection and Ranging (RADAR), and cameras, are meant to increase passenger and driver safety. However, similar to humans, these sensors are limited to line-of-sight (LOS) visibility, meaning they cannot see beyond other vehicles, corners, and buildings. For this reason, efficient vehicular communications are essential to the next generation of vehicles and could significantly improve road safety. In addition, vehicular communications enable the timely exchange of critical information with other vehicles, cellular and roadside infrastructure, and pedestrians. However, unlike typical wireless and cellular networks, vehicular networks are expected to operate in a distributed manner, as there is no guarantee of the presence of cellular infrastructure.
Accurate simulations and analytical models are critical in improving and guaranteeing the performance of the next generation of vehicular networks. In this dissertation, we propose and develop novel and practical distributed algorithms to enhance the performance of decentralized vehicular communications. We support these algorithms with computer simulations and analytical tools from the field of stochastic geometry.
9.
Information Freshness: How To Achieve It and Its Impact On Low-Latency Autonomous Systems. Choudhury, Biplav, 03 June 2022.
In the context of wireless communications, low-latency autonomous systems continue to grow in importance. Some applications of autonomous systems where low-latency communication is essential are: (i) vehicular networks, whose safety performance depends on how recently the vehicles are updated on their neighboring vehicles' locations; (ii) IoT networks, where updates from IoT devices need to be aggregated appropriately at the monitoring station, before the information gets stale, to extract temporal and spatial information; and (iii) smart grids, where sensors and controllers need to track the most recent state of the system to tune system parameters dynamically. Each of the above-mentioned applications differs based on the connectivity between the source and the destination. First, vehicular networks involve a broadcast network where each vehicle broadcasts its packets to all the other vehicles. Second, in the case of UAV-assisted IoT networks, packets generated at multiple IoT devices are transmitted to a final destination via relays. Finally, for the smart grid, and generally for distributed systems, each source can have varying and unique destinations. Therefore, in terms of connectivity, they can be categorized into one-to-all, all-to-one, and a variable relationship between the number of sources and destinations. Additionally, some of the other major differences between the applications are the impact of mobility, the importance of a reduced AoI, and whether AoI is measured in a centralized or distributed manner. The wide variety of application requirements thus makes it challenging to develop scheduling schemes that universally address minimizing the AoI.
All these applications involve generating time-stamped status updates at a source which are then transmitted to their destination over a wireless medium. The timely reception of these updates at the destination decides the operating state of the system. This is because the fresher the information at the destination, the better its awareness of the system state for making better control decisions. This freshness of information is not the same as maximizing the throughput or minimizing the delay. While ideally throughput can be maximized by sending data as fast as possible, this may saturate the receiver resulting in queuing, contention, and other delays. On the other hand, these delays can be minimized by sending updates slowly, but this may cause high inter-arrival times. Therefore, a new metric called the Age of Information (AoI) has been proposed to measure the freshness of information that can account for many facets that influence data availability. In simple terms, AoI is measured at the destination as the time elapsed since the generation time of the most recently received update. Therefore AoI is able to incorporate both the delay and the inter-packet arrival time. This makes it a much better metric to measure end-to-end latency, and hence characterize the performance of such time-sensitive systems. These basic characteristics of AoI are explained in detail in Chapter 1. Overall, the main contribution of this dissertation is developing scheduling and resource allocation schemes targeted at improving the AoI of various autonomous systems having different types of connectivity, namely vehicular networks, UAV-assisted IoT networks, and smart grids, and then characterizing and quantifying the benefits of a reduced AoI from the application perspective.
In the first contribution, we look into minimizing AoI for the case of broadcast networks having one-to-all connectivity between the source and destination devices by considering the case of vehicular networks. While vehicular networks have been studied in terms of AoI minimization, the impact of mobility and the benefit of a reduced AoI from the application perspective have not been investigated. The mobility of the vehicles is realistically modeled using the Simulation of Urban Mobility (SUMO) software to account for overtaking, lane changes, etc. We propose a safety metric that indicates the collision risk of a vehicle and conduct a simulation-based study on the ns-3 simulator to study its relation to AoI. We see that the broadcast rate in a Dedicated Short Range Communications (DSRC) network that minimizes the system AoI also has the least collision risk, signifying that reducing AoI improves the on-road safety of the vehicles. However, we also show that this relationship is not universally true, and the mobility of the vehicles becomes a crucial aspect. Therefore, we propose a new metric called the Trackability-aware AoI (TAoI), which ensures that vehicles with unpredictable mobility broadcast at a faster rate while vehicles that are predictable broadcast at a reduced rate. The results obtained show that minimizing TAoI provides much better on-road safety than minimizing plain AoI, which points to the importance of mobility in such applications.
In the second contribution, we focus on networks with all-to-one connectivity where packets from multiple sources are transmitted to a single destination by taking an example of IoT networks. Here multiple IoT devices measure a physical phenomenon and transmit these measurements to a central base station (BS). However, under certain scenarios, the BS and IoT devices are unable to communicate directly and this necessitates the use of UAVs as relays. This creates a two-hop scenario that has not been studied for AoI minimization in UAV networks. In the first hop, the packets have to be sampled from the IoT devices to the UAV and then updated from the UAVs to the BS in the second hop. Such networks are called UAV-assisted IoT networks. We show that under ideal conditions with a generate-at-will traffic generation model and lossless wireless channels, the Maximal Age Difference (MAD) scheduler is the optimal AoI minimizing scheduler. When the ideal conditions are not applicable and more practical conditions are considered, a reinforcement learning (RL) based scheduler is desirable that can account for packet generation patterns and channel qualities. Therefore we propose to use a Deep-Q-Network (DQN)-based scheduler and it outperforms MAD and all other schedulers under general conditions. However, the DQN-based scheduler suffers from scalability issues in large networks. Therefore, another type of RL algorithm called Proximal Policy Optimization (PPO) is proposed to be used for larger networks. Additionally, the PPO-based scheduler can account for changes in the network conditions which the DQN-based scheduler was not able to do. This ensures the trained model can be deployed in environments that might be different than the trained environment.
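The Maximal Age Difference (MAD) scheduler mentioned above admits a compact statement. The sketch below is our paraphrase of the general idea for the two-hop UAV-relay setting, not code from the dissertation, and the exact bookkeeping in the dissertation may differ:

```python
def mad_pick(age_at_bs, age_at_uav):
    """Maximal Age Difference (MAD) rule, sketched for a two-hop
    UAV-assisted IoT network: serve the device whose buffered sample,
    if delivered to the base station (BS) now, would reduce the BS's
    AoI the most, i.e. the device maximizing
    (current age of its information at the BS) minus
    (age of its freshest sample held at the UAV relay).
    Both inputs are lists indexed by device."""
    return max(range(len(age_at_bs)),
               key=lambda i: age_at_bs[i] - age_at_uav[i])
```

For example, with BS-side ages [10, 7, 8] and UAV-side sample ages [3, 1, 6], device 0 offers the largest age reduction, so it is served first.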
In the final contribution, AoI is studied in networks with varying connectivity between the source and destination devices. A typical example of such a distributed network is the smart grid, where multiple devices exchange state information to keep the grid operating in a stable state. To investigate AoI minimization and its impact on the smart grid, a co-simulation platform is designed in which the 5G network is modeled in Python and the smart grid is modeled in PSCAD/MATLAB. In the first part of the study, the suitability of 5G for supporting smart grid operations is investigated. Based on the encouraging result that 5G can support a smart grid, we then focus on the schedulers at the 5G RAN to minimize the AoI. We observe that AoI-based schedulers provide much better stability than traditional 5G schedulers such as proportional fairness and round-robin. However, the MAD scheduler, which has been shown to be optimal in a variety of scenarios, is no longer optimal here because it cannot account for the connectivity among the devices. Additionally, distributed networks with heterogeneous sources will, in addition to the varying connectivity, have different packet sizes requiring different numbers of resource blocks (RBs) to transmit, as well as differing packet generation patterns, channel conditions, etc. This motivates an RL-based approach. Hence, we propose a DQN-based scheduler that takes these factors into account, and the results show that it outperforms all other schedulers under all considered conditions. / Doctor of Philosophy / Age of information (AoI) is an exciting new metric as it is able to characterize the freshness of information, where freshness means how representative the information is of the current system state. It is therefore being actively investigated for a variety of autonomous systems that rely on having the most up-to-date information on the current state. Some examples are vehicular networks, UAV networks, and smart grids. 
Vehicular networks need the real-time locations of neighboring vehicles to make maneuvering decisions, UAVs have to collect the most recent information from IoT devices for monitoring purposes, and devices in a smart grid need the most recent information on the desired system state. From a communication point of view, each of these scenarios presents a different type of connectivity between the source and the destination. First, the vehicular network is a broadcast network in which each vehicle broadcasts its packets to every other vehicle. Second, in the UAV network, multiple devices transmit their packets to a single destination via a relay. Finally, in the smart grid and distributed networks in general, every source can have different and unique destinations. In these applications, AoI becomes a natural choice for measuring system performance: the fresher the information at the destination, the better its awareness of the system state, which allows it to take better control decisions to reach the desired objective.
Therefore, in this dissertation, we use mathematical analysis and simulation-based approaches to investigate different scheduling and resource allocation policies that improve the AoI in the above-mentioned scenarios. We also show that a reduced AoI improves system performance, i.e., better on-road safety for vehicular networks and better stability for smart grid applications. The results obtained in this dissertation show that communication and networking protocols designed for time-sensitive applications requiring low latency have to be optimized to improve AoI. This is in contrast to most modern-day communication protocols, which target improving throughput or minimizing delay.
|
10 |
Information Freshness Optimization in Real-time Network Applications
Liu, Zhongdong 12 June 2024 (has links)
In recent years, the remarkable development of ubiquitous communication networks and smart portable devices has spawned a wide variety of real-time applications that require timely information updates (e.g., autonomous vehicular systems, industrial automation systems, and live streaming services). These real-time applications all have one thing in common: they desire their knowledge of the information source to be as fresh as possible. To measure the freshness of information, a new metric called the Age-of-Information (AoI) has been proposed. AoI is defined as the time elapsed since the generation time of the freshest delivered update. This metric is influenced by both the inter-arrival time and the delay of the updates. As a result of these dependencies, the AoI metric exhibits distinct characteristics compared to traditional delay and throughput metrics.
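The sawtooth behavior of AoI follows directly from this definition; a minimal sketch (hypothetical helper, not from the dissertation):

```python
def aoi(t, deliveries):
    """Age-of-Information at time t: t minus the generation time of the
    freshest update delivered by time t.  `deliveries` is a list of
    (generation_time, delivery_time) pairs."""
    delivered = [g for g, d in deliveries if d <= t]
    # Before the first delivery, the age is measured from t = 0.
    return t - max(delivered) if delivered else t

# An update generated at t=2 and delivered at t=5: just before delivery
# the age has grown to 5; right after, it resets to 5 - 2 = 3.
```

Between deliveries the age grows linearly with t, and each delivery resets it to that update's delay, which is why both the inter-arrival times and the delays of updates shape the metric.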
In this dissertation, our goal is to optimize AoI in various real-time network applications. Firstly, we investigate the fundamental problem of how exactly various scheduling policies impact AoI performance. Although there is a large body of work studying AoI performance under different scheduling policies, the use of update-size information, and its combination with other information (such as arrival-time information and service preemption), to reduce AoI has not yet been explored. Secondly, as AoI is a recently introduced measure of freshness, its relationship to other performance metrics remains largely unclear. We analyze the tradeoffs between AoI and other performance metrics, including service performance and update cost, within real-world applications.
This dissertation is organized into three parts. In the first part, we realize that scheduling policies leveraging the update-size information can substantially reduce the delay, one of the key components of AoI. However, it remains largely unknown how exactly scheduling policies (especially those making use of update-size information) impact the AoI performance. To this end, we conduct a systematic and comparative study to investigate the impact of scheduling policies on the AoI performance in single-server queues and provide useful guidelines for the design of AoI-efficient scheduling policies.
In the second part, we analyze the tradeoffs between AoI and other performance metrics in real-world systems. Specifically, we focus on the following two important tradeoffs. (i) The tradeoff between service performance and AoI that arises in data-driven real-time applications (e.g., Google Maps and stock trading applications). In these applications, the computing resource is often shared for processing both updates from information sources and queries from end users. Hence there is a natural tradeoff between service performance (e.g., response time to queries) and AoI (i.e., the freshness of data in response to user queries). To address this tradeoff, we begin by introducing a simple single-server two-queue model that captures the coupled scheduling between updates and queries. Subsequently, we design threshold-based scheduling policies to prioritize either updates or queries. Finally, we conduct a rigorous analysis of the performance of these threshold-based scheduling policies. (ii) The tradeoff between update cost and AoI that appears in crowdsensing-based applications (e.g., Google Waze and GasBuddy). On the one hand, users are not satisfied if the responses to their requests are stale; on the other hand, there is a cost for the applications to update their information regarding certain points of interest, since they typically need to make monetary payments to incentivize users. To capture this tradeoff, we first formulate an optimization problem with the objective of minimizing the sum of the staleness cost (which is a function of the AoI) and the update cost, and then we obtain a closed-form optimal threshold-based policy by reformulating the problem as a Markov decision process (MDP).
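The flavor of such a threshold policy can be seen in a small simulation. The cost model below (linear staleness, instantaneous updates, names hypothetical) is a toy simplification of the MDP in the text, not the dissertation's model:

```python
def average_cost(tau, update_cost, staleness, horizon=10_000):
    """Simulate the threshold rule `update whenever the AoI reaches tau`
    in discrete time with instantaneous updates; staleness(a) is the
    per-slot cost of having age a.  Returns the time-averaged total cost."""
    age, total = 0, 0.0
    for _ in range(horizon):
        age += 1
        if age >= tau:
            total += update_cost   # pay to refresh ...
            age = 1                # ... and the new sample has age 1
        total += staleness(age)
    return total / horizon


# Sweeping tau exposes the tradeoff: a small threshold pays the update
# cost often, a large one lets staleness accumulate.
best = min(range(1, 20), key=lambda t: average_cost(t, 5.0, lambda a: a))
```

For linear staleness and an update cost of 5, intermediate thresholds beat both extremes, mirroring the optimal threshold structure the MDP analysis derives in closed form.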
In the third part, we study the joint minimization of data staleness and transmission costs (e.g., energy cost) under an (arbitrary) time-varying wireless channel, both without and with machine learning (ML) advice. We consider a discrete-time system where a resource-constrained source transmits time-sensitive data to a destination over a time-varying wireless channel. Each transmission incurs a fixed cost, while not transmitting results in a staleness cost measured by the AoI. The source needs to balance the tradeoff between these transmission and staleness costs. To tackle this challenge, we develop a robust online algorithm aimed at minimizing the sum of transmission and staleness costs, ensuring a worst-case performance guarantee. While online algorithms are robust, they tend to be overly conservative and may perform poorly on average in typical scenarios. In contrast, ML algorithms, which leverage historical data and prediction models, generally perform well on average but lack worst-case performance guarantees. To harness the advantages of both approaches, we design a learning-augmented online algorithm that achieves two key properties: (i) consistency: closely approximating the optimal offline algorithm when the ML prediction is accurate and trusted; (ii) robustness: providing a worst-case performance guarantee even when ML predictions are inaccurate. / Doctor of Philosophy / In recent years, the rapid growth of communication networks and smart devices has spurred the emergence of real-time applications like autonomous vehicles and industrial automation systems. These applications share a common need for timely information. The freshness of information can be measured using a new metric called Age-of-Information (AoI). This dissertation aims to optimize AoI across various real-time network applications, organized into three parts. In the first part, we explore how scheduling policies (particularly those considering update size) impact the AoI performance. 
Through a systematic and comparative study in single-server queues, we provide useful guidelines for the design of AoI-efficient scheduling policies. The second part explores the tradeoff between update cost and AoI in crowdsensing applications like Google Waze and GasBuddy, where users demand fresh responses to their requests, while updating information incurs update costs for the applications. We aim to minimize the sum of the staleness cost (a function of AoI) and the update cost. By reformulating the problem as a Markov decision process (MDP), we design a simple threshold-based policy and prove its optimality. In the third part, we study the joint minimization of data staleness and transmission costs (e.g., energy cost) under a time-varying wireless channel. We first develop a robust online algorithm that achieves a competitive ratio of 3, ensuring a worst-case performance guarantee. Furthermore, when advice is available, e.g., predictions from machine learning (ML) models, we design a learning-augmented online algorithm that exhibits two desired properties: (i) consistency: closely approximating the optimal offline algorithm when the ML prediction is accurate and trusted; (ii) robustness: guaranteeing worst-case performance even with inaccurate ML predictions. While this dissertation marks a significant advancement in AoI research, numerous open problems remain. For instance, our learning-augmented online algorithm treats ML predictions as external inputs. Exploring the co-design and training of ML and online algorithms to improve performance could yield interesting insights. Additionally, while AoI typically assesses update importance based solely on timestamps, the content of updates also holds significance. Incorporating considerations of both the age and the semantics of information is imperative in future research.
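One classical way to obtain a constant-factor worst-case guarantee in this kind of setting is a break-even rule: transmit (when the channel allows it) once the staleness accrued since the last transmission matches the transmission cost. The sketch below only illustrates this robust-online idea under assumed slot semantics; it is not claimed to be the dissertation's algorithm:

```python
def break_even_schedule(channel, tx_cost):
    """Break-even online rule over a time-varying channel: transmit, when
    the channel is up, once the staleness cost accumulated since the last
    transmission reaches the fixed transmission cost.  `channel` is a list
    of 0/1 availability flags; returns (transmit slots, total cost)."""
    age, accrued, total, slots = 0, 0.0, 0.0, []
    for t, up in enumerate(channel):
        age += 1
        accrued += age          # per-slot staleness grows with the AoI
        total += age
        if up and accrued >= tx_cost:
            total += tx_cost
            slots.append(t)
            age, accrued = 0, 0.0
    return slots, total
```

The balance between the two cost terms is what bounds the competitive ratio in ski-rental-style arguments; when the channel is down, staleness keeps accruing and the rule transmits at the first opportunity afterward.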
|