Browsing by Author "Shah, Vijay K."
- 5G Scheduling for Distributed Control in Microgrids. Iyer, Rahul Rajan (Virginia Tech, 2021-11-12).
  There is an increasing integration of distributed energy resources (DERs), controllable loads, and other technologies that are making the grid more robust, reliable, and decentralized. Communication is a major enabler of this decentralization: it can improve control of important system parameters by allowing different grid components to exchange state information. This information exchange requires a reliable and fast communication infrastructure. Different communication techniques can serve this objective, but with recent technological advancements, 5G is proving to be a very viable option. 5G is being widely deployed throughout the world due to its high data rates and improved reliability compared with its predecessor technologies. This thesis focuses on the application and performance analysis of a 5G network for different power system test cases. These test cases are microgrids consisting of DERs that use distributed control for efficient operation. Under distributed control, the DERs communicate with each other to achieve a fast and improved dynamic response. This work develops a co-simulation platform to analyze the impact of a 5G network on this distributed control objective, offering key insights into 5G's capability to support critical functions. Different scenarios, including set-point changes and transients, are evaluated. Since distributed control is a time-critical application and DERs rely on the availability of up-to-date information, the scheduling aspect of 5G becomes very important and is given particular focus. This work uses information freshness, measured via the Age of Information (AoI): a measure of how recent the information communicated by the DERs is. The thesis compares the performance of AoI-based schedulers against standard schedulers; these schedulers are then applied to test systems employing distributed control (a toy AoI-based scheduling loop is sketched below).
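  As a minimal sketch of how an AoI-based scheduler differs from a standard one, the toy loop below grants the single uplink resource each slot to whichever DER's state information at the controller is stalest. The DER count, slot length, and delivery probability are assumptions of this sketch, not values from the thesis:

  ```python
  import random

  def max_aoi_scheduler(aoi):
      """Grant the next slot to the DER whose information at the
      controller is stalest (largest AoI)."""
      return max(aoi, key=aoi.get)

  # Illustrative setup: 4 DERs share one uplink grant per slot.
  aoi = {f"DER-{i}": 0.0 for i in range(4)}
  SLOT = 0.001  # assumed 1 ms 5G slot

  for _ in range(1000):
      served = max_aoi_scheduler(aoi)
      for der in aoi:
          if der == served and random.random() < 0.9:  # assumed 90% delivery
              aoi[der] = SLOT   # fresh update received this slot
          else:
              aoi[der] += SLOT  # information keeps aging

  print({der: round(age * 1e3, 1) for der, age in aoi.items()})  # AoI in ms
  ```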
- Deep Learning Empowered Unsupervised Contextual Information Extraction and its applications in Communication Systems. Gusain, Kunal (Virginia Tech, 2023-01-16).
- Identifying and Prioritizing Critical Information in Military IoT: Video Game Demonstration. Avverahalli Ravi, Darshan (Virginia Tech, 2021-06-29).
  Current communication and network systems are not built for delay-sensitive applications. The most obvious fact is that the communication capacity is only achievable in theory with infinitely long codes, which means infinitely long delays. One remedy is to use shorter codes. Conceptually, there is a deeper reason for the difficulties in such solutions: in Shannon's original 1948 paper, he started out by stating that the "semantic aspects" of information are "irrelevant" to communications. Hence, in Shannon's communication system, as well as in every network built after him, we put all information into a uniform bit-stream, regardless of what meanings the bits carry, and we transmit them over the network as a single type of commodity. Consequently, the network can only provide a uniform level of error protection and latency control to all these bits. We argue that such a single measure of latency, or Age of Information (AoI), is insufficient for military Internet of Things (IoT) applications that inherently connect the communication network with a cyber-physical system. For example, a self-driving military vehicle might send the controller a front-view image. Clearly, not everything in the image is equally important for the purpose of steering the vehicle: an approaching vehicle is a much more urgent piece of information than a tree in the background. Similar examples can be seen for other military IoT devices, such as drones and sensors. In this work, we present a new approach that extracts the most critical information in a military battlefield IoT scenario by using a metric called the H-Score. This ensures the neural network concentrates only on the most important information and ignores all background information. We then carry out an extensive evaluation of this approach by testing it against various inputs, ranging from a vector of numbers to a 1000x1000-pixel image. Next, we introduce the concept of Manual Marginalization, which helps us make independent decisions for each object in the image. We also develop a video game that captures the essence of a military battlefield scenario and test our algorithm on it. Finally, we apply our approach to a simple Atari Space Invaders video game to shoot down enemies before they fire at us.
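  As a purely hypothetical illustration of the prioritization idea (the scoring rule below is invented for this sketch; the thesis's H-Score is a learned neural-network metric, not this hand-crafted formula), objects detected in a front-view image can be ranked so the most urgent ones are transmitted first:

  ```python
  from dataclasses import dataclass

  @dataclass
  class DetectedObject:
      label: str
      distance_m: float         # range to the object
      closing_speed_mps: float  # positive if approaching

  def urgency(obj: DetectedObject) -> float:
      """Toy criticality score: nearby, fast-approaching objects rank
      highest. Illustrative only; not the thesis's learned H-Score."""
      return max(obj.closing_speed_mps, 0.0) / max(obj.distance_m, 1.0)

  scene = [
      DetectedObject("tree", 40.0, 0.0),
      DetectedObject("vehicle", 25.0, 12.0),
      DetectedObject("drone", 60.0, 20.0),
  ]

  # Send the most critical objects first under a tight link budget.
  for obj in sorted(scene, key=urgency, reverse=True):
      print(f"{obj.label}: urgency={urgency(obj):.3f}")
  ```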
- Information Freshness: How To Achieve It and Its Impact On Low-Latency Autonomous Systems. Choudhury, Biplav (Virginia Tech, 2022-06-03).
  In the context of wireless communications, low-latency autonomous systems continue to grow in importance. Some applications of autonomous systems where low-latency communication is essential are: (i) a vehicular network's safety performance depends on how recently each vehicle is updated on its neighboring vehicles' locations; (ii) updates from IoT devices need to be aggregated appropriately at the monitoring station, before the information gets stale, to extract temporal and spatial information from them; and (iii) sensors and controllers in a smart grid need to track the most recent state of the system to tune system parameters dynamically. Each of these applications differs in the connectivity between the source and the destination. First, vehicular networks involve a broadcast network in which each vehicle broadcasts its packets to all other vehicles. Second, in the case of UAV-assisted IoT networks, packets generated at multiple IoT devices are transmitted to a final destination via relays. Finally, for the smart grid, and generally for distributed systems, each source can have varying and unique destinations. In terms of connectivity, they can therefore be categorized as one-to-all, all-to-one, and a variable relationship between the number of sources and destinations. Other major differences between the applications include the impact of mobility, the importance of a reduced AoI, and whether AoI is measured in a centralized or distributed manner. This wide variety of application requirements makes it challenging to develop scheduling schemes that universally address minimizing the AoI.
  All these applications involve generating time-stamped status updates at a source, which are then transmitted to their destination over a wireless medium. The timely reception of these updates at the destination determines the operating state of the system: the fresher the information at the destination, the better its awareness of the system state, and the better its control decisions. This freshness of information is not the same as maximizing throughput or minimizing delay. While throughput can ideally be maximized by sending data as fast as possible, this may saturate the receiver, resulting in queuing, contention, and other delays. On the other hand, these delays can be minimized by sending updates slowly, but this causes high inter-arrival times. Therefore, a new metric called the Age of Information (AoI) has been proposed to measure the freshness of information, accounting for many facets that influence data availability. In simple terms, AoI is measured at the destination as the time elapsed since the generation time of the most recently received update (formalized in the expression below). AoI thus incorporates both the delay and the inter-packet arrival time, making it a much better metric for measuring end-to-end latency and hence for characterizing the performance of such time-sensitive systems. These basic characteristics of AoI are explained in detail in Chapter 1.
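  A compact way to state this definition: if $u(t)$ denotes the generation time of the most recently received update, the instantaneous AoI at the destination and its time average over a horizon $T$ are

  ```latex
  \Delta(t) = t - u(t), \qquad
  \bar{\Delta} = \frac{1}{T} \int_{0}^{T} \Delta(t)\, dt .
  ```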
  Overall, the main contribution of this dissertation is developing scheduling and resource allocation schemes targeted at improving the AoI of various autonomous systems having different types of connectivity, namely vehicular networks, UAV-assisted IoT networks, and smart grids, and then characterizing and quantifying the benefits of a reduced AoI from the application perspective.
  In the first contribution, we look into minimizing AoI for broadcast networks having one-to-all connectivity between the source and destination devices by considering vehicular networks. While vehicular networks have been studied in terms of AoI minimization, the impact of mobility and the benefit of a reduced AoI from the application perspective have not been investigated. The mobility of the vehicles is realistically modeled using the Simulation of Urban Mobility (SUMO) software to account for overtaking, lane changes, etc. We propose a safety metric that indicates the collision risk of a vehicle and conduct a simulation-based study in the ns-3 simulator to examine its relation to AoI. We find that the broadcast rate in a Dedicated Short Range Communications (DSRC) network that minimizes the system AoI also yields the least collision risk, signifying that reducing AoI improves the on-road safety of the vehicles. However, we also show that this relationship is not universally true, and the mobility of the vehicles becomes a crucial aspect. Therefore, we propose a new metric called the Trackability-aware AoI (TAoI), which ensures that vehicles with unpredictable mobility broadcast at a faster rate while vehicles that are predictable broadcast at a reduced rate. The results show that minimizing TAoI provides much better on-road safety than plain AoI minimization, which points to the importance of mobility in such applications.
  In the second contribution, we focus on networks with all-to-one connectivity, where packets from multiple sources are transmitted to a single destination, taking IoT networks as an example. Here, multiple IoT devices measure a physical phenomenon and transmit these measurements to a central base station (BS). However, under certain scenarios the BS and IoT devices are unable to communicate directly, which necessitates the use of UAVs as relays. This creates a two-hop scenario that has not been studied for AoI minimization in UAV networks: in the first hop, packets are sampled from the IoT devices by the UAVs, and in the second hop the UAVs forward the updates to the BS. Such networks are called UAV-assisted IoT networks. We show that under ideal conditions, with a generate-at-will traffic generation model and lossless wireless channels, the Maximal Age Difference (MAD) scheduler is the optimal AoI-minimizing scheduler (a minimal sketch of the MAD rule follows this abstract). When these ideal conditions do not hold and more practical conditions are considered, a reinforcement learning (RL) based scheduler that can account for packet generation patterns and channel qualities is desirable. We therefore propose a Deep Q-Network (DQN)-based scheduler, and it outperforms MAD and all other schedulers under general conditions. However, the DQN-based scheduler suffers from scalability issues in large networks, so another RL algorithm, Proximal Policy Optimization (PPO), is proposed for larger networks. Additionally, the PPO-based scheduler can account for changes in the network conditions, which the DQN-based scheduler was not able to do.
  This ensures the trained model can be deployed in environments that differ from the training environment.
  In the final contribution, AoI is studied in networks with varying connectivity between the source and destination devices. A typical example of such a distributed network is the smart grid, where multiple devices exchange state information to ensure the grid operates in a stable state. To investigate AoI minimization and its impact on the smart grid, a co-simulation platform is designed in which the 5G network is modeled in Python and the smart grid is modeled in PSCAD/MATLAB. In the first part of the study, the suitability of 5G for supporting smart grid operations is investigated. Based on the encouraging results that 5G can support a smart grid, we focus on the schedulers at the 5G RAN to minimize the AoI. The AoI-based schedulers are seen to provide much better stability than traditional 5G schedulers such as proportional fairness and round-robin. However, the MAD scheduler, which has been shown to be optimal for a variety of scenarios, is no longer optimal here because it cannot account for the connectivity among the devices. Additionally, distributed networks with heterogeneous sources will, in addition to the varying connectivity, have differently sized packets requiring different numbers of resource blocks (RBs) to transmit, different packet generation patterns, channel conditions, etc. This motivates an RL-based approach. Hence, we propose a DQN-based scheduler that takes these factors into account, and results show that it outperforms all other schedulers under all considered conditions.
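  As referenced above, a minimal sketch of the MAD rule in the two-hop UAV-relay setting, under the ideal assumptions stated in the abstract (generate-at-will sources, lossless channels); the device names, single-relay layout, and age values are assumptions of this sketch:

  ```python
  def mad_schedule(aoi_bs, age_uav):
      """Serve the source maximizing the age difference
      AoI_BS(i) - age_at_UAV(i), i.e. the one whose relayed packet
      would reduce the age at the base station the most."""
      return max(aoi_bs, key=lambda i: aoi_bs[i] - age_uav[i])

  # Three IoT devices relayed by one UAV (ages in slots, assumed values).
  aoi_bs  = {"iot-0": 7, "iot-1": 4, "iot-2": 9}  # age of BS's latest copy
  age_uav = {"iot-0": 2, "iot-1": 3, "iot-2": 8}  # age of UAV's latest copy

  print(mad_schedule(aoi_bs, age_uav))
  # -> iot-0: delivering its buffered packet cuts the BS age from 7 to 2
  #    (a gain of 5), versus gains of only 1 for the other two devices.
  ```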
- Practical Algorithms and Analysis for Next-Generation Decentralized Vehicular Networks. Dayal, Avik (Virginia Tech, 2021-11-19).
  The development of autonomous ground and aerial vehicles has driven the requirement for radio access technologies (RATs) to support low-latency applications. While onboard sensors such as Light Detection and Ranging (LIDAR), Radio Detection and Ranging (RADAR), and cameras can sense and assess the immediate space around the vehicle, RATs are crucial for exchanging information on critical events, such as accidents and changes in trajectory, with other vehicles and surrounding infrastructure in a timely manner. Simulations and analytical models are critical for modeling and designing efficient networks. In this dissertation, we focus on (a) proposing and developing algorithms to improve the performance of decentralized vehicular communications in safety-critical situations and (b) supporting these proposals with simulation and analysis of the two most popular RAT standards: the Dedicated Short Range Communications (DSRC) standard and the Cellular vehicle-to-everything (C-V2X) standard.
  In our first contribution, we propose a risk-based protocol for vehicles using the DSRC standard. The protocol allows a higher beacon transmission rate for vehicles that are at a higher risk of collision. We verify the benefits of the risk-based protocol over conventional DSRC using ns-3 simulations. Two risk-based beacon rate protocols are evaluated in our ns-3 simulator: one that adapts the beacon rate between 1 and 10 Hz, and another between 1 and 20 Hz. Our results show that the adaptive protocols improve packet delivery ratio (PDR) performance in congested environments by up to 45% for the 1-10 Hz scheme and by 38% for the 1-20 Hz scheme. The simulations also show that the likelihood of a vehicle collision due to missed packets decreases by up to 41% and 77%, respectively, in a three-lane dense highway scenario with 160 vehicles operating at different speeds.
  In our second contribution, we study the performance of a distance-based transmission protocol for vehicular ad hoc networks (VANETs) using tools from stochastic geometry. We consider a risk-based transmission protocol in which vehicles transmit more frequently depending on the distance to adjacent vehicles. We evaluate two transmission policies: a listen-more policy, in which the transmission rate of a vehicle decreases as the inter-vehicular distance decreases, and a talk-more policy, in which the transmission rate of a vehicle increases as the distance to the vehicle ahead decreases. We model the layout of a highway using a 1-D Poisson Point Process (PPP) and analyze the performance of a typical receiver in this highway setting. We characterize the success probability of a typical link assuming slotted ALOHA as the channel access scheme, and study how the success probability trends with the system parameters (a Monte Carlo sketch of this setup follows).
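  As referenced above, a minimal Monte Carlo sketch of this stochastic-geometry setup: interfering vehicles form a 1-D PPP along the highway, access the channel via slotted ALOHA, and a typical receiver decodes its transmitter at a fixed distance. The Rayleigh-fading model, the near-field distance cap, and every numeric parameter are assumptions of this sketch, not values from the dissertation:

  ```python
  import numpy as np

  rng = np.random.default_rng(0)

  def success_probability(lam=0.05, p=0.1, d=20.0, theta_db=3.0,
                          alpha=3.0, span=2000.0, trials=20000):
      """Estimate P(SIR > theta) for the typical link: interferers are a
      1-D PPP of density lam (vehicles/m) on [-span/2, span/2], each
      transmitting with ALOHA probability p; Rayleigh fading and
      path-loss exponent alpha; own transmitter at distance d."""
      theta = 10.0 ** (theta_db / 10.0)
      successes = 0
      for _ in range(trials):
          signal = rng.exponential() * d ** (-alpha)     # Rayleigh fading
          n = rng.poisson(lam * span)                    # vehicles on road
          xs = rng.uniform(-span / 2, span / 2, size=n)  # their positions
          active = xs[rng.random(n) < p]                 # ALOHA: who transmits
          dists = np.maximum(np.abs(active), 1.0)        # cap near-field loss
          interference = np.sum(rng.exponential(size=dists.size)
                                * dists ** (-alpha))
          if interference == 0 or signal / interference > theta:
              successes += 1
      return successes / trials

  print(success_probability())  # sweep lam, p, or d to study the trends
  ```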
  Our third contribution includes improvements to the 3rd Generation Partnership Project (3GPP) Release 14 C-V2X standard, evaluated using a modified collision framework. In C-V2X, basic safety messages (BSMs) are transmitted through Mode-4 communications, introduced in Release 14. Mode-4 communications operate under the principle of sensing-based semi-persistent scheduling (SPS), in which vehicles sense and schedule transmissions without a base station present. We propose an improved adaptive semi-persistent scheduling scheme, termed Ch-RRI SPS, for Mode-4 C-V2X networks. Specifically, Ch-RRI SPS allows each vehicle to dynamically adjust the BSM rate in real time, referred to in the LTE standard as the resource reservation interval (RRI). Our study, based on system-level simulations, demonstrates that Ch-RRI SPS greatly outperforms SPS in terms of both on-road safety performance, measured as collision risk, and network performance, measured as packet delivery ratio, in all considered C-V2X scenarios. In high-density scenarios, e.g., 80 vehicles/km, Ch-RRI SPS shows a collision risk reduction of 51.27%, 51.20%, and 75.41% when compared with SPS with 20 ms, 50 ms, and 100 ms RRIs, respectively.
  In our fourth and final contribution, we look at the tracking error and Age of Information (AoI) of the latest 3GPP Release 16 NR-V2X standard, which includes enhancements to the 3GPP Release 14 C-V2X standard. The successor to Mode-4 C-V2X, known as Mode-2a NR-V2X, makes slight changes to sensing-based SPS, though vehicles can still sense and schedule transmissions without a base station present. We use AoI and tracking error, respectively the freshness of the information at the receiver and the difference between the estimated and actual location of a transmitting vehicle, to measure the impact of lost and outdated BSMs on a vehicle's ability to localize neighboring vehicles. In this work, we again show that such BSM scheduling with a fixed RRI suffers from severe under- and over-utilization of radio resources, which compromises timely dissemination of BSMs and increases the system AoI and tracking error. To address this, we propose an RRI selection algorithm, termed Age of Information-aware RRI (AoI-RRI) selection, that measures the age, or freshness, of messages from neighboring vehicles to select an RRI. Specifically, AoI-aware SPS (i) measures the neighborhood AoI (as opposed to channel availability) to select an age-optimal RRI and (ii) uses a modified SPS procedure with the chosen RRI to select BSM transmission opportunities that minimize the overall system AoI (a toy RRI-selection rule is sketched below). We compare AoI-RRI SPS to Ch-RRI SPS and fixed-RRI SPS for NR-V2X. Our experiments, based on the Mode-2a NR-V2X standard implemented in system-level simulations, show that both Ch-RRI SPS and AoI-RRI SPS outperform SPS in high-density scenarios in terms of tracking error and AoI.
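  As referenced above, a toy sketch of the flavor of AoI-aware RRI selection: the staler the neighborhood looks, the shorter the chosen reservation interval (i.e., the faster the BSM rate). The thresholds and candidate RRIs below are invented for illustration and are not the thesis's actual selection procedure:

  ```python
  def select_rri(neighbor_aoi_ms, candidates=(20, 50, 100)):
      """Map the mean AoI observed over neighboring vehicles to one of the
      candidate RRIs (ms). Thresholds are illustrative assumptions."""
      mean_aoi = sum(neighbor_aoi_ms) / len(neighbor_aoi_ms)
      if mean_aoi > 150:
          return min(candidates)                          # stale: talk more
      if mean_aoi > 75:
          return sorted(candidates)[len(candidates) // 2]
      return max(candidates)                              # fresh: save resources

  print(select_rri([60, 90, 200]))  # mean ~117 ms -> 50 ms RRI
  ```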
- Spectrum Sharing of the 12 GHz Band with Two-Way Terrestrial 5G Mobile Services: Motivations, Challenges, and Research Road Map. Hassan, Zoheb; Heeren-Moon, Erika; Sabzehali, Javad; Shah, Vijay K.; Dietrich, Carl; Reed, Jeffrey H.; Burger, Eric W. (IEEE, 2023-07).
  Telecommunication industries and spectrum regulation authorities are increasingly interested in unlocking the 12 GHz band for two-way 5G terrestrial services. The 12 GHz band has a much larger bandwidth than the current sub-6 GHz band and better propagation characteristics than the millimeter-wave (mmWave) band; it therefore offers great potential for improving the coverage and capacity of terrestrial 5G networks. However, interference between incumbent receivers and 5G radio links presents a major challenge in the 12 GHz band. If the dynamic contexts inherent to the 12 GHz band could be exploited, spectrum sharing policy could be reformed to create spectrum access opportunities for 5G mobile services. This article makes three contributions. First, it presents the characteristics and challenges of the 12 GHz band. Second, it explains the characteristics and requirements of spectrum sharing at a variety of levels to resolve those interference issues. Lastly, it presents several research opportunities to enable the harmonious coexistence of incumbent licensees and 5G networks within the 12 GHz band.