1 |
Communication-efficient Distributed Inference: Distributions, Approximation, and Improvement. Yin, Ziyan. January 2022.
In modern data science, large-scale data are commonly stored and processed in parallel across a great number of locations. For reasons including confidentiality, only limited summary information from each parallel center may be transferred. To handle such settings efficiently, a family of communication-efficient methods is under active development. The first part of our investigation concerns the distributions of distributed M-estimators that require a one-step update combining information collected from all parallel centers. We reveal that the number of centers plays a critical role: when it is not small relative to the total sample size, it has a non-negligible impact on the limiting distributions, which turn out to be mixtures involving products of normal random variables. Based on this analysis, we propose a multiplier-bootstrap method for approximating the distributions of these one-step updated estimators.
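A one-step updated distributed M-estimator of the kind discussed above can be sketched as follows. This is an illustrative toy, not the author's exact algorithm: the least-squares loss, the simulated data, and all dimensions are hypothetical stand-ins. Each center ships its local estimator; the aggregator averages them; then gradients and Hessians evaluated at the average are pooled for a single Newton refinement.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_stats(X, y, theta):
    # Gradient and Hessian of the least-squares loss at theta for one center.
    r = X @ theta - y
    return X.T @ r / len(y), X.T @ X / len(y)

# Simulate K centers, each holding n local observations (hypothetical setup).
K, n, p = 20, 200, 3
theta_true = np.array([1.0, -2.0, 0.5])
data = []
for _ in range(K):
    X = rng.normal(size=(n, p))
    y = X @ theta_true + rng.normal(size=n)
    data.append((X, y))

# Step 1: each center communicates its local M-estimator (here, OLS).
local_thetas = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in data]
theta_bar = np.mean(local_thetas, axis=0)

# Step 2: centers communicate gradients and Hessians at theta_bar;
# the aggregator performs a single Newton update.
grads, hesses = zip(*(local_stats(X, y, theta_bar) for X, y in data))
theta_onestep = theta_bar - np.linalg.solve(np.mean(hesses, axis=0),
                                            np.mean(grads, axis=0))
print(theta_onestep)
```

Only the p-dimensional local estimators, gradients, and Hessians cross the network, never the raw data, which is what makes the scheme communication-efficient.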
Our second contribution is two communication-efficient Newton-type algorithms that combine the M-estimator and the gradient collected from each data center. They are built by constructing two global Fisher-information estimators from these communication-efficient statistics. Enjoying a higher rate of convergence, this framework improves upon existing Newton-like methods. Moreover, we present two bias-adjusted one-step distributed estimators: when the square of the center-wise sample size is of greater magnitude than the total number of centers, they are asymptotically as efficient as the global M-estimator. The advantages of our methods are illustrated by extensive theoretical and empirical evidence. / Statistics
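The multiplier-bootstrap idea mentioned in the abstract can be sketched in a few lines. This is a generic illustration under assumptions of my own, not the thesis's procedure: the center-wise gradients are random stand-ins and the Fisher-information estimate is taken to be the identity. Each center's contribution to the update direction is reweighted with i.i.d. mean-one, variance-one multipliers, and quantiles of the perturbed updates approximate the estimator's sampling distribution without touching the raw data again.

```python
import numpy as np

rng = np.random.default_rng(1)

# Center-wise gradients at the one-step estimator, one row per center
# (hypothetical values standing in for the communicated statistics).
K, p = 50, 2
center_grads = rng.normal(scale=0.1, size=(K, p))
hess_inv = np.eye(p)  # stand-in for the inverse aggregated Fisher information

# Multiplier bootstrap: reweight each center's contribution with i.i.d.
# mean-one, variance-one multipliers and recompute the update direction.
B = 2000
w = rng.normal(loc=1.0, scale=1.0, size=(B, K))
boot_updates = (w @ center_grads) / K @ hess_inv.T  # shape (B, p)

# Bootstrap quantiles approximate the distribution of the one-step
# estimator around its limit, e.g. a 95% interval for coordinate 0.
ci_low, ci_high = np.quantile(boot_updates[:, 0], [0.025, 0.975])
print(ci_low, ci_high)
```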
|
2 |
Distributed Inference over Multiple-Access Channels with Wireless Sensor Networks. January 2010.
Distributed inference has applications in fields as varied as source localization, evaluation of network quality, and remote monitoring of wildlife habitats. In this dissertation, distributed inference algorithms over multiple-access channels are considered, and the effects of wireless communication channels on their performance are studied. In the first class of problems, distributed inference over fading Gaussian multiple-access channels with amplify-and-forward is considered. Sensors observe a phenomenon and transmit their observations, using the amplify-and-forward scheme, to a fusion center (FC). Distributed estimation is considered with a single antenna at the FC, where performance is evaluated using the asymptotic variance of the estimator, and the loss in performance due to varying assumptions about the limited channel information at the sensors is quantified. With multiple antennas at the FC, a distributed detection problem is also considered, with the error exponent as the performance measure. It is shown that for zero-mean channels between the sensors and the FC, when there is no channel information at the sensors, arbitrarily large gains in the error exponent can be obtained by sufficiently increasing the number of antennas at the FC. In stark contrast, when channel information is available at the sensors, the gain in error exponent due to multiple antennas at the FC is shown to be no more than a factor of 8/π for Rayleigh fading channels, independent of the number of antennas at the FC or the correlation among noise samples across sensors. In the second class of problems, sensor observations are transmitted to the FC using constant-modulus phase modulation over Gaussian multiple-access channels. The phase-modulation scheme allows for constant transmit power and for estimating moments other than the mean with a single transmission from the sensors.
Estimators are developed for the mean, variance and signal-to-noise ratio (SNR) of the sensor observations. The performance of these estimators is studied for different distributions of the observations. It is proved that the estimator of the mean is asymptotically efficient if and only if the distribution of the sensor observations is Gaussian. / Dissertation/Thesis / Ph.D. Electrical Engineering 2010
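The constant-modulus phase-modulation scheme can be sketched as follows; the modulation frequency `omega`, the noise level, and the Gaussian observation model are assumptions chosen for illustration, not parameters from the dissertation. Each sensor transmits exp(j·omega·x_i), the multiple-access channel sums the signals, and the FC treats the scaled sum as an empirical characteristic function: the mean is read off its phase and the standard deviation off its magnitude, matching the claim that moments beyond the mean are recoverable from a single transmission.

```python
import numpy as np

rng = np.random.default_rng(2)

# N sensors each observe x_i and transmit the constant-modulus signal
# exp(j * omega * x_i); the Gaussian MAC superimposes them at the FC.
N = 1000
omega = 0.5               # modulation frequency (an assumed design choice)
mu, sigma = 2.0, 1.0      # true mean and std of the sensor observations
x = rng.normal(mu, sigma, size=N)

tx = np.exp(1j * omega * x)                        # constant transmit power
noise = 0.1 * (rng.normal() + 1j * rng.normal())   # receiver noise at the FC
y = tx.sum() + noise                               # MAC output

# y / N estimates the characteristic function of x at frequency omega:
# for Gaussian x, phi(omega) = exp(1j*omega*mu - omega**2 * sigma**2 / 2).
phi_hat = y / N
mu_hat = np.angle(phi_hat) / omega                      # mean from the phase
sigma_hat = np.sqrt(-2 * np.log(abs(phi_hat))) / omega  # std from the magnitude
print(mu_hat, sigma_hat)
```

The phase-based mean estimate requires omega·mu to stay within (−π, π], which is why `omega` is a design parameter rather than a free choice.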
|
3 |
Hardware-Aware Distributed Pipelined Neural Network Models Inference. Alshams, Mojtaba. 07 1900.
Neural network models have attracted the attention of the scientific community for their increasing prediction accuracy and their ability to emulate some human tasks. This has driven extensive architectural enhancements, producing models with fast-growing memory and computation requirements. Because of hardware constraints such as limited memory and computing capability, the inference of a large neural network model can be distributed across multiple devices by a partitioning algorithm. The proposed framework finds the optimal model splits and chooses which device shall compute each split so as to minimize inference time and energy. The framework is based on the PipeEdge algorithm and extends it by not only increasing inference throughput but also simultaneously minimizing inference energy consumption. Another contribution of this thesis is the integration of emerging compute-in-memory (CIM) devices into the system. To the best of my knowledge, no prior work has studied the effect of including CIM devices, modeled with the DNN+NeuroSim simulator, in distributed inference. The proposed framework partitioned VGG8 and ResNet152 on ImageNet and achieved a comparable trade-off between the slowdown of the slowest pipeline stage and the energy reduction, both when it aimed to decrease inference energy (e.g., 19% energy reduction with 34% time increase) and when CIM devices were added to the system (e.g., 34% energy reduction with 45% time increase).
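A toy version of the partitioning problem can be sketched as follows. The per-layer costs, the two devices, and the weighted time-plus-energy objective are all hypothetical, and PipeEdge's actual algorithm is more elaborate; the sketch only shows the core idea: split a chain of layers into contiguous stages, one per device, and search for the split that best trades off the slowest-stage latency (which bounds pipeline throughput) against total energy.

```python
from itertools import combinations

# Hypothetical per-layer costs: time_cost[d][l] and energy_cost[d][l]
# for device d running layer l (arbitrary units, invented for illustration).
time_cost = [
    [4, 3, 5, 2, 6],   # device 0: fast but energy-hungry
    [6, 5, 8, 3, 9],   # device 1: slower, cheaper energy (CIM-like)
]
energy_cost = [
    [8, 6, 9, 4, 10],
    [3, 2, 4, 2, 5],
]
L, D = 5, 2
alpha, beta = 1.0, 0.5  # weights trading off latency vs. energy (assumed)

def cost(splits):
    # splits: layer indices where a new stage begins; stage d runs on device d.
    bounds = [0, *splits, L]
    stage_times, total_energy = [], 0.0
    for d in range(D):
        layers = range(bounds[d], bounds[d + 1])
        stage_times.append(sum(time_cost[d][l] for l in layers))
        total_energy += sum(energy_cost[d][l] for l in layers)
    # Pipeline throughput is limited by the slowest stage.
    return alpha * max(stage_times) + beta * total_energy, stage_times

# Brute-force search over all contiguous partitions of the layer chain.
(best_cost, stages), best_splits = min(
    (cost(s), s) for s in combinations(range(1, L), D - 1))
print(best_splits, stages, best_cost)  # splits at layer 3, stages [12, 12]
```

Brute force is fine for a five-layer toy; for deep models the same recurrence is typically solved with dynamic programming over (layer, device) states.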
|