Distributed Least-Squares Optimization Solvers with Differential Privacy
Abstract
This paper studies the distributed least-squares optimization problem with differential privacy requirements on the local cost functions, for which two differentially private distributed solvers are proposed. The first is built on the distributed gradient tracking algorithm, by appropriately perturbing the initial values and the parameters that contain the privacy-sensitive data with Gaussian and truncated Laplacian noises, respectively. Rigorous proofs are established to show the achievable trade-off between the $(\epsilon, \delta)$-differential privacy and the computation accuracy. The second solver is built on the combination of the distributed shuffling mechanism and the average consensus algorithm, which enables each agent to obtain a noisy version of the parameters characterizing the global gradient. As a result, the least-squares optimization problem can be solved by each agent locally, in such a way that any given $(\epsilon, \delta)$-differential privacy requirement can be preserved while the solution is computed with an accuracy independent of the network size, which makes the latter solver more suitable for large-scale distributed least-squares problems. Numerical simulations are presented to show the effectiveness of both solvers.
I Introduction
Distributed optimization, in which networked agents seek to minimize a global cost function in a distributed and cooperative fashion, has gained significant attention in diverse fields such as machine learning [1], unmanned aerial vehicles (UAVs) [2], and sensor networks [3]. For such a problem, since the seminal work on the distributed gradient-descent algorithm [4], numerous algorithms have been developed that differ in their linear or sub-linear convergence rates [5, 6, 7, 8], deterministic or stochastic communication graphs [9, 10], and equality or inequality constraints [11, 12], etc. In particular, the gradient-tracking algorithm [13, 5] has gained increasing attention due to its fast convergence rate, yielding several variants such as [14, 15, 16, 17, 18]. As a special case, distributed least-squares optimization considers computation problems where the cost functions are quadratic, with wide applications across many fields (e.g., versatile LiDAR SLAM [19]). See [20, 21] for a thorough review of recent advances in distributed optimization.
In distributed optimization, each network agent holds a local dataset (which defines a local cost function) that may contain information that is privacy-sensitive for the agent. For example, in digital-contact-tracing-based computational epidemiology, users' privacy-sensitive data such as location are encoded in the corresponding optimization problem [22]. In existing distributed optimization algorithms, during the computing process each agent needs to communicate to neighboring agents its local states or some intermediate variables, which may directly contain the local sensitive data or can be used to infer it. In view of this, new risks of privacy breach arise when eavesdroppers have access to the network communication messages.
To address privacy concerns in distributed optimization, a natural idea is to apply homomorphic encryption to transmitted messages [10, 23]. However, this method may lead to a heavy computation and communication burden. Besides, other approaches have been proposed that inject uncertainties into the exchanged states [24, 25], or into asynchronous heterogeneous stepsizes [26]. To formalize a rigorous privacy-preservation notion, differential privacy was proposed in [27], and has been widely applied in various fields such as signal processing [28] and distributed computation [25, 29, 30, 31, 32, 33, 34, 35], due to its resilience to side information and post-processing [36]. For a distributed optimization problem with a trusted cloud, a differentially private computation framework is proposed in [35], where the cloud sends a perturbed global gradient to the agents. Further, for the scenario without a trusted cloud, [31] studies distributed optimization problems with bounded gradients and sensitive constraints by injecting noises into communication messages, for which the desired privacy is guaranteed by splitting the privacy budget across the iteration rounds. In [33], to preserve differential privacy under arbitrary iteration rounds, the idea of perturbing the objective functions is introduced for a class of distributed convex optimization problems with square-integrable cost functions. Note that, as shown in [33], in these results the introduction of randomness for privacy purposes inevitably leads to a certain computation error, i.e., there is an accuracy-privacy trade-off. When an exact optimal solution is sought, [25, 32] instead slightly modify the adjacency notion for differential privacy, and develop novel algorithms by perturbing communication messages with noises of increasing variances and employing an appropriately designed vanishing stepsize.
In this paper, we focus on the problem of distributed least-squares optimization with differential privacy requirements on the local cost functions under a common data adjacency notion. For such a problem, a common approach is to perturb communication messages, but as in [31] a change of the iteration number then leads to a different privacy budget, i.e., an iteration-number-dependent differential privacy. Besides, we also note that the idea of functional perturbation [33] is not applicable here since the cost functions are not square-integrable, though it would remove the dependence on the iteration number.
In view of this analysis, inspired by the truncated Laplace mechanism [37] and the Gaussian mechanism [38], we first propose a Differentially Private Gradient Tracking-based (DP-GT-based) solver, which perturbs the gradient tracking algorithm by appropriately adding Gaussian and truncated Laplacian noises to the initial values and the parameters that contain the sensitive data, respectively. Moreover, a trade-off between the achievable $(\epsilon, \delta)$-differential privacy and the computation accuracy is rigorously proved, and the iteration round is allowed to be arbitrary with no influence on the privacy guarantee. For this method, it is noted that the intrinsic property of the truncated Laplacian mechanism [37] limits the achievable differential-privacy level, and the computation accuracy depends linearly on the network size, which makes it inapplicable to large-scale distributed least-squares problems. In view of this, we further propose a Differentially Private DiShuf Average Consensus-based (DP-DiShuf-AC-based) solver, where agents run the Distributed Shuffling (DiShuf)-based differentially private average consensus algorithm [39] to obtain a noisy estimate of the global gradient function, enabling each agent to solve for the optimal solution independently with differential privacy guarantees. Benefiting from the distributed shuffling mechanism, the least-squares optimization problem can then be solved under any desired differential privacy level, and with an accuracy independent of the network size, which makes this solver more suitable for large-scale least-squares optimization problems. Our contribution mainly lies in proposing two differentially private least-squares optimization solvers: the DP-GT-based solver and the DP-DiShuf-AC-based solver. In particular, the latter achieves a privacy-accuracy trade-off that is independent of the network size, as shown by numerical simulations.
The remainder of this paper is organized as follows. In Section II the differentially private distributed least-squares problem is formulated, for which two solvers are presented in Sections III and IV, respectively. Simulations are then conducted in Section V to verify the effectiveness of the proposed solvers. Finally, Section VI concludes the paper.
Notation. Denote by $\mathbb{R}$ the set of real numbers and by $\mathbb{R}^n$ the real space of dimension $n$ for any positive integer $n$. Denote by $\mathbf{1}_n$ the vector with each entry being $1$ and by $\mathbf{0}_n$ the vector with each entry being $0$. For a vector $x$, denote by $\|x\|_0$, $\|x\|_1$, and $\|x\|_2$ the $0$-, $1$-, and $2$-norms of $x$, respectively. For a matrix $A$, denote by $[A]_{ij}$ its $i$-th row, $j$-th column element, by $\|A\|_2$ its $2$-norm, by $\|A\|_F$ its Frobenius norm, and by $\lambda_i(A)$ its $i$-th eigenvalue. Write $x \sim \mathcal{N}(\mu, \sigma^2 I_n)$ if each entry of $x \in \mathbb{R}^n$ is i.i.d. and drawn from the Gaussian distribution with mean $\mu$ and variance $\sigma^2$. Two auxiliary scalar functions and the inverse of one of them are also defined for later use; it can be verified that both functions are strictly increasing.
II Problem Statement
Consider a multi-agent system with an undirected and connected communication network $\mathcal{G} = (\mathcal{V}, \mathcal{E})$, with the agent set $\mathcal{V} = \{1, \dots, n\}$ and the edge set $\mathcal{E} \subseteq \mathcal{V} \times \mathcal{V}$, where each agent $i \in \mathcal{V}$ holds a local and privacy-sensitive cost function of the form
$f_i(x) = \tfrac{1}{2}\, x^\top Q_i x + b_i^\top x + c_i \qquad (1)$
where $Q_i \in \mathbb{R}^{m \times m}$ is symmetric, $b_i \in \mathbb{R}^m$, and $c_i \in \mathbb{R}$ is a scalar. The networked system aims to solve the following least-squares optimization problem

$\min_{x \in \mathbb{R}^m} f(x) := \sum_{i \in \mathcal{V}} f_i(x) \qquad (2)$
in a distributed and privacy-preserving manner. It is well known that the value of $c_i$ does not affect the solution of the optimization problem and is generally not used. Thus, in the sequel we only consider the privacy of the matrix pair $(Q_i, b_i)$ while developing algorithms to solve problem (2), though the privacy of $f_i$ is encoded in the triplet $(Q_i, b_i, c_i)$. Denote by $\pi$ the mapping that rearranges all elements of the pair $(Q_i, b_i)$ into a vector $d_i$, i.e., $d_i = \pi(Q_i, b_i)$, by stacking the entries in a fixed order. Note that since $Q_i$ is symmetric, we only put the upper-triangular elements of $Q_i$ into the mapping $\pi$. On the other hand, it is clear that $\pi$ is invertible. For convenience, we denote by $\pi^{-1}$ its inverse, i.e., $(Q_i, b_i) = \pi^{-1}(d_i)$, and call $d_i$ the sensitive-data vector.
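For concreteness, the following is a minimal NumPy sketch of one possible instantiation of the stacking map and its inverse; the names vec_pair and unvec_pair, and the particular entry ordering, are illustrative choices of ours rather than notation from the paper.

```python
import numpy as np

def vec_pair(Q, b):
    """pi: stack the upper-triangular entries of symmetric Q, then the entries of b."""
    iu = np.triu_indices(Q.shape[0])
    return np.concatenate([Q[iu], b])

def unvec_pair(d, m):
    """pi^{-1}: rebuild (Q, b) from the stacked vector d."""
    iu = np.triu_indices(m)
    k = iu[0].size                          # number of upper-triangular entries
    Q = np.zeros((m, m))
    Q[iu] = d[:k]
    Q = Q + Q.T - np.diag(np.diag(Q))       # symmetrize: mirror the strict upper part
    return Q, d[k:]

# Round-trip check on a small symmetric example.
Q = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([0.5, -1.0])
Q2, b2 = unvec_pair(vec_pair(Q, b), 2)
assert np.allclose(Q, Q2) and np.allclose(b, b2)
```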
To ensure the solvability of the problem (2), we make the following assumption throughout the paper.
Assumption 1.
The matrix $\sum_{i \in \mathcal{V}} Q_i$ is positive definite, i.e., there exists a $\kappa > 0$ such that $\sum_{i \in \mathcal{V}} Q_i \succeq \kappa I_m$.
The above assumption guarantees strict convexity of the global cost function $f$, admitting the unique least-squares solution
$x^\star = -Q^{-1} b, \qquad (3)$
where we define $Q := \sum_{i \in \mathcal{V}} Q_i$ and $b := \sum_{i \in \mathcal{V}} b_i$.
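As a quick numerical check of (3), the following sketch, assuming the quadratic convention of (1) reconstructed above, generates random local data satisfying Assumption 1 and verifies that the global gradient vanishes at the closed-form minimizer; all sizes and distributions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                                  # network size and decision dimension (illustrative)

# Random local data: symmetric positive definite Q_i, so that Q = sum_i Q_i
# satisfies Assumption 1, and arbitrary b_i.
Qs, bs = [], []
for _ in range(n):
    M = rng.standard_normal((m, m))
    Qs.append(M @ M.T + np.eye(m))
    bs.append(rng.standard_normal(m))

Q, b = sum(Qs), sum(bs)                      # global parameters Q and b
x_star = -np.linalg.solve(Q, b)              # unique minimizer (3) of f(x) = sum_i f_i(x)

# Sanity check: the global gradient Q x + b vanishes at x_star.
assert np.allclose(Q @ x_star + b, 0.0)
```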
In solving the distributed least-squares optimization problem (2) over the network $\mathcal{G}$, the agents necessarily communicate local information with neighboring agents, which may contain the sensitive data (i.e., the vector $d_i$) directly or indirectly. If there exist adversaries with access to the communication messages, then the sensitive data may be inferred, leading to privacy leakage risks. This motivates the development of distributed algorithms that solve problem (2) with privacy guarantees on the sensitive data against adversaries eavesdropping on all communication messages.
For the privacy concern, this paper focuses on the notion of differential privacy, which originated in [27] and is widely used in the field of distributed computation. Differential privacy describes rigorous privacy preservation in the sense that eavesdroppers cannot infer the sensitive data by way of "differentiating". Any pair of data $d, d'$ is said to be $\delta_0$-adjacent with $\delta_0 > 0$, denoted by $\mathrm{Adj}(d, d')$, if $\|d - d'\|_0 \le 1$ and $\|d - d'\|_1 \le \delta_0$. We denote by $\mathcal{M}$ the mapping (or mechanism) from the sensitive data to the eavesdropped communication messages, and give the following definition of $(\epsilon, \delta)$-differential privacy [27].
Definition 1.
(Differential Privacy). Given $\epsilon, \delta \ge 0$, the distributed solver $\mathcal{M}$ preserves $(\epsilon, \delta)$-differential privacy of $d$ under $\delta_0$-adjacency if, for all $\delta_0$-adjacent pairs $(d, d')$, there holds

$\mathbb{P}[\mathcal{M}(d) \in \mathcal{O}] \le e^{\epsilon}\, \mathbb{P}[\mathcal{M}(d') \in \mathcal{O}] + \delta \qquad (4)$

for any set of observations $\mathcal{O}$.
Before the close of this section, we make the following claims on the communication network. Denote by $\mathcal{N}_i$ the neighbor set of agent $i$, and by $w_{ij}$ the weight of edge $(i, j)$, satisfying $w_{ij} > 0$ for $(i, j) \in \mathcal{E}$ and $w_{ij} = 0$ for $(i, j) \notin \mathcal{E}$. Moreover, denote by $L$ the Laplacian matrix of the graph $\mathcal{G}$, satisfying $[L]_{ij} = -w_{ij}$ for $i \ne j$ and $[L]_{ii} = \sum_{j \ne i} w_{ij}$. Since the graph is assumed to be connected, by [40, Theorem 2.8] and the Gershgorin circle theorem [41] there holds $0 = \lambda_1(L) < \lambda_2(L) \le \dots \le \lambda_n(L) \le 2 \max_i [L]_{ii}$.
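As a sanity check of these spectral facts, the sketch below builds the Laplacian of a weighted cycle graph (the topology later used in Section V) and verifies that $0 = \lambda_1(L) < \lambda_2(L)$ together with the Gershgorin-type bound; the weight value is illustrative.

```python
import numpy as np

# Laplacian of a cycle graph on n nodes with uniform edge weight w (illustrative values).
n, w = 6, 1.0 / 3.0
Lap = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        Lap[i, j] = -w
        Lap[i, i] += w

eig = np.sort(np.linalg.eigvalsh(Lap))
assert abs(eig[0]) < 1e-12 and eig[1] > 0            # 0 = lambda_1 < lambda_2: graph connected
assert eig[-1] <= 2 * Lap.diagonal().max() + 1e-12   # Gershgorin bound on the largest eigenvalue
```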
III Gradient-Tracking-based Solver with Differential Privacy
In this section, we modify the distributed gradient tracking algorithm [5] by injecting random noises into the sensitive data, for the purpose of solving the distributed least-squares problem (2) while preserving differential privacy of $d_i$, $i \in \mathcal{V}$.
Let $d_i = \pi(Q_i, b_i)$ for $i \in \mathcal{V}$. For any privacy budgets $(\epsilon_i, \delta_i)$, $i \in \mathcal{V}$, we let the noise parameters (the truncation levels, Laplacian scales, and Gaussian variances) satisfy

(5)

for $i \in \mathcal{V}$, and choose a stepsize $\gamma > 0$. With these parameters, we propose the Differentially Private Gradient Tracking-based (DP-GT-based) solver in Algorithm 1.
Algorithm 1 Differentially Private Gradient Tracking-based solver (DP-GT-based solver)
Input: Data $(Q_i, b_i)$, privacy budgets $(\epsilon_i, \delta_i)$, parameters satisfying (5).

1. Each agent $i \in \mathcal{V}$ independently generates a vector $\eta_i$ of i.i.d. truncated Laplacian noises, with each entry independently generated according to the probability density function

$p(x) = \begin{cases} \frac{1}{B_i} e^{-|x|/\lambda_i}, & x \in [-A_i, A_i], \\ 0, & \text{otherwise}, \end{cases} \qquad B_i = 2\lambda_i \big(1 - e^{-A_i/\lambda_i}\big), \qquad (6)$

and i.i.d. Gaussian noises $\theta_i \sim \mathcal{N}(0, \sigma_i^2 I_m)$.

2. Each agent $i$ computes $\tilde{d}_i = d_i + \eta_i$ and $(\tilde{Q}_i, \tilde{b}_i) = \pi^{-1}(\tilde{d}_i)$, and initializes the local states as

$x_i^0 \in \mathbb{R}^m, \qquad y_i^0 = \tilde{Q}_i x_i^0 + \tilde{b}_i + \theta_i. \qquad (7)$

3. For $k = 0, 1, \dots$, each agent $i$ sends $(x_i^k, y_i^k)$ to its neighboring agents and iterates the local states by following

$x_i^{k+1} = x_i^k - \sum_{j \in \mathcal{N}_i} w_{ij} (x_i^k - x_j^k) - \gamma y_i^k,$
$y_i^{k+1} = y_i^k - \sum_{j \in \mathcal{N}_i} w_{ij} (y_i^k - y_j^k) + \big(\tilde{Q}_i x_i^{k+1} + \tilde{b}_i\big) - \big(\tilde{Q}_i x_i^k + \tilde{b}_i\big). \qquad (8)$
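To make the procedure concrete, here is a minimal NumPy sketch of one possible instantiation of Algorithm 1, assuming the standard gradient-tracking form used in our statement of (8); the noise parameters and the stepsize are illustrative and are not the calibrated choices of (5).

```python
import numpy as np

def trunc_laplace(rng, lam, A, size):
    """Sample i.i.d. truncated Laplacian noise, cf. (6), by rejection:
    draw Laplace(scale=lam) and redraw any entry falling outside [-A, A]."""
    out = rng.laplace(scale=lam, size=size)
    bad = np.abs(out) > A
    while bad.any():
        out[bad] = rng.laplace(scale=lam, size=int(bad.sum()))
        bad = np.abs(out) > A
    return out

def sym_noise(rng, lam, A, m):
    """Symmetric noise matrix: perturb only the upper triangle, then mirror it."""
    N = np.zeros((m, m))
    iu = np.triu_indices(m)
    N[iu] = trunc_laplace(rng, lam, A, iu[0].size)
    return N + N.T - np.diag(np.diag(N))

rng = np.random.default_rng(1)
n, m, K, gamma = 6, 2, 3000, 0.02
lam, A, sigma = 0.05, 0.2, 0.1               # illustrative noise parameters, not (5)

# Cycle graph with uniform edge weight 1/3: doubly stochastic mixing matrix W = I - L.
W = np.eye(n)
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        W[i, j] = 1.0 / 3.0
        W[i, i] -= 1.0 / 3.0

Qs = [np.diag(rng.uniform(1.0, 2.0, m)) for _ in range(n)]
bs = [rng.standard_normal(m) for _ in range(n)]

# Steps 1-2: perturb the sensitive parameters (truncated Laplacian) and the
# tracker initialization (Gaussian); then run the gradient-tracking loop (8).
Qt = [Qs[i] + sym_noise(rng, lam, A, m) for i in range(n)]
bt = [bs[i] + trunc_laplace(rng, lam, A, m) for i in range(n)]
x = np.zeros((n, m))
y = np.array([Qt[i] @ x[i] + bt[i] + rng.normal(0.0, sigma, m) for i in range(n)])

for _ in range(K):
    g_old = np.array([Qt[i] @ x[i] + bt[i] for i in range(n)])
    x_new = W @ x - gamma * y
    g_new = np.array([Qt[i] @ x_new[i] + bt[i] for i in range(n)])
    y = W @ y + g_new - g_old
    x = x_new

x_star = -np.linalg.solve(sum(Qs), sum(bs))  # noise-free solution for reference
print("error vs exact solution:", np.linalg.norm(x[0] - x_star))
```

Because the iterates converge to the minimizer of the perturbed problem, the printed error is small but nonzero; this residual is exactly the accuracy-privacy trade-off quantified in Theorem 1 below.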
In Algorithm 1, two types of random noises (i.e., the truncated Laplacian noises $\eta_i$ and the Gaussian noises $\theta_i$) are, respectively, added to the gradient tracking algorithm by directly perturbing the parameters $d_i$ (i.e., $(Q_i, b_i)$) and the initial values, for the purpose of preserving $(\epsilon_i, \delta_i)$-differential privacy of these parameters. In particular, the use of the truncated Laplacian distribution (6) is motivated by [37]. It can be easily verified that a noise generated according to the truncated Laplacian distribution (6) has zero mean and variance satisfying

$\mathrm{Var}(\eta) \le 2 \lambda_i^2. \qquad (9)$
Denote $\tilde{Q} := \sum_{i \in \mathcal{V}} \tilde{Q}_i$ and $\tilde{b} := \sum_{i \in \mathcal{V}} (\tilde{b}_i + \theta_i)$. The following result demonstrates the differential privacy and computation accuracy that can be achieved by the above algorithm.
Theorem 1.
Given any privacy budgets $(\epsilon_i, \delta_i)$ satisfying (5), by choosing the noise parameters accordingly, there exists a $\bar{\gamma} > 0$ such that for all $\gamma \in (0, \bar{\gamma})$, Algorithm 1 achieves the following properties.

i). (Privacy) The $(\epsilon_i, \delta_i)$-differential privacy of $d_i$, $i \in \mathcal{V}$, is preserved under $\delta_0$-adjacency.

ii). (Convergence) There exists $\tilde{x}^\star$ such that $x_i^k \to \tilde{x}^\star$ for all $i \in \mathcal{V}$, with $\tilde{x}^\star = -\tilde{Q}^{-1} \tilde{b}$.

iii). (Accuracy) The mean-square error between $\tilde{x}^\star$ and $x^\star$ satisfies $\mathbb{E}\|\tilde{x}^\star - x^\star\|^2 = O(n)$, i.e., a bound linear in the network size.
Remark 1.
Algorithm (8) indeed solves the perturbed least-squares problem $\min_x \sum_{i \in \mathcal{V}} \tilde{f}_i(x)$. To preserve the convexity of the perturbed cost while achieving differential privacy, the truncated Laplacian noises are injected with a bounded truncation level. However, due to this constraint on the truncation level, it can be seen from (5) that the achievable privacy budgets cannot be arbitrarily prescribed. Moreover, one can see from statement iii) that the mean-square error is a linear function of the network size $n$, indicating a worse computation accuracy for a larger network. These issues motivate us to develop a new solver that can guarantee arbitrarily given privacy budgets while allowing a better computation accuracy when a large-scale networked least-squares problem is investigated.
IV Average-Consensus-Based Solver with Differential Privacy
In this section, a new distributed solver with differential privacy is proposed to overcome the issues identified in Remark 1.
We first observe that, in view of the closed-form solution (3), the least-squares problem (2) is solved once each agent obtains the global information $Q = \sum_{i \in \mathcal{V}} Q_i$ and $b = \sum_{i \in \mathcal{V}} b_i$. This intuition immediately motivates us to employ differentially private average consensus algorithms (e.g., [29, 42, 39]) to compute this global information in a distributed manner while preserving the desired differential privacy. Further, we note that the differentially private average consensus algorithm proposed in [42, 39] yields a computation accuracy that almost recovers that of the centralized averaging algorithm, where the accuracy of computing the sum is independent of the network size.
In view of the previous analysis, we propose a new solver by employing the differentially private average consensus algorithm of [42, 39], whose implementation relies on the following distributed shuffling (DiShuf) mechanism, where the Paillier cryptosystem [43] is adopted, with $\mathrm{Enc}(\cdot)$ and $\mathrm{Dec}(\cdot)$ as the encryption and decryption operations, respectively, and $(k_i^p, k_i^s)$ as the public and private key pair of agent $i$.
Algorithm 2 Distributed Shuffling (DiShuf) Mechanism
Input: Data $x_i$, variance $\sigma^2$, key pairs $(k_i^p, k_i^s)$, a large positive integer.

1. Each agent $i$ generates a vector $\zeta_i$ of i.i.d. Gaussian noise with $\zeta_i \sim \mathcal{N}(0, \sigma^2 I)$ and adds it to $x_i$, i.e., $\tilde{x}_i = x_i + \zeta_i$.

2. Each agent $i$ encrypts each entry of $\zeta_i$ with its local public key $k_i^p$, yielding a ciphertext vector, and then sends this ciphertext vector together with the public key $k_i^p$ to its neighboring agents $j \in \mathcal{N}_i$.

3. Each agent $j$ encrypts each entry of $\zeta_j$ with the received neighboring public keys $k_i^p$, yielding a ciphertext vector, and computes the entry-wise product with the received ciphertext vector (which, by the additive homomorphism of the Paillier cryptosystem, encrypts the entry-wise sum of the underlying plaintexts), obtaining a vector for each $i \in \mathcal{N}_j$.

4. Each agent $j$ generates a positive integer and computes the entry-wise power of the resulting ciphertext vector (which encrypts the correspondingly scaled plaintext), for each $i \in \mathcal{N}_j$.

5. Each agent $j$ sends the result back to agent $i$, who decrypts the received ciphertext vector with the local private key $k_i^s$, yielding a plaintext vector for each $j \in \mathcal{N}_i$.

6. Each agent $i$ multiplies each decrypted vector by an appropriate scaling factor and computes the sum

(10)

Output: $y_i$.
With some lengthy but straightforward computations, it can be seen that the outputs of the DiShuf mechanism (see Algorithm 2) satisfy

$\sum_{i \in \mathcal{V}} y_i = \sum_{i \in \mathcal{V}} x_i,$

which implies that the injected noises cancel out in the network-wide sum.
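The following arithmetic-only sketch is not the exact DiShuf construction of [39]: it merely reproduces the sum-preservation property above with antisymmetric pairwise noise terms over the edges of a cycle, and omits the Paillier layer that hides the noises during the exchange.

```python
import numpy as np

rng = np.random.default_rng(2)
n, dim, sigma = 6, 4, 1.0
x = rng.standard_normal((n, dim))            # local sensitive vectors
zeta = rng.normal(0.0, sigma, (n, dim))      # the DiShuf Gaussian noises

# Antisymmetric pairwise terms over the cycle edges: whatever agent i adds,
# agent j subtracts, so every noise contribution cancels in the network-wide
# sum while each individual output stays noisy.
y = x.copy()
for i in range(n):
    j = (i + 1) % n                          # edge (i, j) of the cycle
    offset = zeta[j] - zeta[i]
    y[i] += offset
    y[j] -= offset

assert np.allclose(y.sum(axis=0), x.sum(axis=0))   # sum preserved exactly
assert not np.allclose(y, x)                       # but each y_i is perturbed
```

We then introduce the differentially private DiShuf-based average consensus algorithm below.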
Algorithm 3 Differentially Private DiShuf Average Consensus-based solver (DP-DiShuf-AC-based solver)
Input: Data $x_i$, variances $\sigma^2$ and $\sigma_0^2$.

1. Each agent $i$ runs the DiShuf mechanism with input data $x_i$ and obtains the output $y_i$, then generates a vector of i.i.d. Gaussian noises $\tau_i \sim \mathcal{N}(0, \sigma_0^2 I)$, and initializes its state as

$z_i^0 = y_i + \tau_i. \qquad (11)$

2. For $k = 0, 1, \dots$, each agent $i$ sends its state $z_i^k$ to its neighboring agents and updates its state following

$z_i^{k+1} = z_i^k - \sum_{j \in \mathcal{N}_i} w_{ij} \big( z_i^k - z_j^k \big). \qquad (12)$
As $\sum_{i \in \mathcal{V}} y_i = \sum_{i \in \mathcal{V}} x_i$, it can be easily verified that

$\frac{1}{n} \sum_{i \in \mathcal{V}} z_i^0 = \frac{1}{n} \sum_{i \in \mathcal{V}} (x_i + \tau_i),$

implying that the noises $\zeta_i$ do not affect the average of $z_i^0$, $i \in \mathcal{V}$. Thus, there are two design freedoms in the noise variances (i.e., $\sigma^2$ and $\sigma_0^2$), with the former affecting only the achievable differential privacy while having no influence on the average. As a result, by appropriately designing $\sigma^2$ and $\sigma_0^2$, a better trade-off between privacy and accuracy can be achieved, as the sketch below illustrates.
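Here is a minimal NumPy sketch of the consensus stage, assuming the Laplacian update form used in our statement of (12); the stand-in DiShuf outputs and the variances are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, dim, sigma0 = 6, 4, 0.05
# Laplacian of the cycle graph with uniform edge weight 1/3.
L = np.zeros((n, n))
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n):
        L[i, j] = -1.0 / 3.0
        L[i, i] += 1.0 / 3.0

y = rng.standard_normal((n, dim))            # stand-ins for the DiShuf outputs y_i
tau = rng.normal(0.0, sigma0, (n, dim))      # accuracy-governing noises tau_i
z = y + tau                                  # initialization (11)

for _ in range(500):                         # update (12) in matrix form: z <- (I - L) z
    z = z - L @ z

# All states agree on the average of the initial states, i.e. the average of y_i + tau_i.
assert np.allclose(z, z[0]) and np.allclose(z[0], (y + tau).mean(axis=0))
```

The following result from [39] demonstrates the corresponding privacy and computation properties.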
Lemma 1.
[39, Theorem 3] Suppose Assumption 1 holds. Given any privacy budgets $(\epsilon_i, \delta_i)$, let the noise variances $\sigma^2$ and $\sigma_0^2$ be chosen as in

(13)

Then Algorithm 3 achieves the following properties.

i). (Privacy) The $(\epsilon_i, \delta_i)$-differential privacy of the input data $x_i$, $i \in \mathcal{V}$, is preserved under $\delta_0$-adjacency.

ii). (Convergence) There holds $z_i^k \to \bar{z}$ for all $i \in \mathcal{V}$, with $\bar{z} = \frac{1}{n} \sum_{i \in \mathcal{V}} (x_i + \tau_i)$, and the convergence is linear.

iii). (Accuracy) The mean-square error between $\bar{z}$ and the exact average $\frac{1}{n} \sum_{i \in \mathcal{V}} x_i$ satisfies a bound determined by the noise variance $\sigma_0^2$.
Towards this end, by employing Algorithm 3 to compute the average of $d_i$, $i \in \mathcal{V}$, in a distributed and differentially private manner, each agent can eventually obtain $\bar{z}$ and thus an estimate of the global information as $(\hat{Q}, \hat{b}) = \pi^{-1}(n \bar{z})$, where the first entries of $n \bar{z}$ form the upper-triangular part of $\hat{Q}$ and the last entries form $\hat{b}$. Therefore, each agent can compute the solution of the least-squares problem (2) by computing $\hat{x}$ from

$\hat{x} = -\hat{Q}^{-1} \hat{b}. \qquad (14)$
It is noted that with $Q$ invertible, its noisy version $\hat{Q}$ is invertible in a generic sense. Besides, if the resulting $\hat{Q}$ is singular, one can implement Algorithm 3 again. It is also noted that the accuracy stated in Lemma 1 iii) is independent of the network size $n$. A natural conjecture is that the mean-square error between the computed $\hat{x}$ and the actual solution $x^\star$ is also independent of $n$, though deriving an explicit expression of $\mathbb{E}\|\hat{x} - x^\star\|^2$ remains open due to the inverse operation on the random matrix $\hat{Q}$.
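A minimal sketch of this local recovery step, including the re-run safeguard for the non-generic singular case, is given below; unvec_pair repeats the illustrative inverse stacking map from Section II, and the rerun callback standing in for a fresh execution of Algorithm 3 is our own device.

```python
import numpy as np

def unvec_pair(d, m):
    """Illustrative pi^{-1} from Section II: rebuild (Q, b) from the stacked vector d."""
    iu = np.triu_indices(m)
    k = iu[0].size
    Q = np.zeros((m, m))
    Q[iu] = d[:k]
    return Q + Q.T - np.diag(np.diag(Q)), d[k:]

def solve_from_consensus(z_bar, n, m, rerun=None, max_retries=3):
    """Local recovery (14): form (Q_hat, b_hat) = pi^{-1}(n * z_bar) and solve.
    The rerun callback, standing in for a fresh run of Algorithm 3, is only
    invoked in the non-generic event that Q_hat is singular."""
    for _ in range(max_retries):
        Q_hat, b_hat = unvec_pair(n * np.asarray(z_bar), m)
        try:
            return -np.linalg.solve(Q_hat, b_hat)
        except np.linalg.LinAlgError:
            if rerun is None:
                raise
            z_bar = rerun()
    raise RuntimeError("Q_hat remained singular after retries")
```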
V Case Studies
In this section, we implement numerical simulations to show the effectiveness of the proposed Algorithms 1 and 3 (i.e., the DP-GT-based and DP-DiShuf-AC-based solvers, respectively) and compare their computation performance. We consider a cycle communication graph with each edge having the same weight. In this section, $Q_i$ and $b_i$ are randomly generated.
We first fix the network size $n$ and the stepsize $\gamma$. Given the privacy budgets $(\epsilon_i, \delta_i)$, we choose the noise parameters according to (5). It is shown in Figure 1 that the mean-square error decreases and converges exponentially as the iteration number increases, demonstrating the effectiveness of Algorithm 1.
[Figure 1: Mean-square error of Algorithm 1 versus the iteration number.]
Then we show the effectiveness of the DP-DiShuf-AC-based solver. Taking the same network and data, we vary the privacy budget and the correspondingly computed noise variances $\sigma^2$ and $\sigma_0^2$. Figure 2 shows the distribution of the computation error under different privacy budgets with 100 samples. It is observed that our Algorithm 3 is resilient to a higher privacy-preservation level, i.e., a smaller privacy budget, in the sense of computation accuracy.
[Figure 2: Distribution of the computation error of Algorithm 3 under different privacy budgets (100 samples).]
Further, we study the computation errors of our DP-DiShuf-AC-based solver (Algorithm 3) and DP-GT-based solver (Algorithm 1) under varying network size $n$. For comparison, we consider a differentially private average consensus based solver (DP-AC-based solver), obtained by removing the DiShuf mechanism from Algorithm 3, in which Gaussian noises are directly added to the privacy-sensitive data to preserve differential privacy [38]. It can be observed in Figure 3 that our Algorithm 3 (i.e., the DP-DiShuf-AC-based solver) shows the best computation accuracy across networks of different scales.
[Figure 3: Computation errors of the DP-GT-based, DP-AC-based, and DP-DiShuf-AC-based solvers under varying network size.]
VI Conclusion
In this paper, two solvers were proposed for the distributed least-squares optimization problem with the requirement of preserving differential privacy of the local cost functions. The first solver was built on the distributed gradient-tracking algorithm, by perturbing the parameters encoding the sensitive data with truncated Laplacian noises and the initial values with Gaussian noises. It was shown that this method leads to a limited differential-privacy-protection ability and a computation accuracy that degrades linearly with the network size. To overcome these issues, the second solver was developed, in which the differentially private DiShuf-based average consensus algorithm was employed so that each agent obtains a noisy estimate of the global gradient function and thus computes the solution directly. As a result, one can achieve arbitrary differential privacy requirements and a computation accuracy independent of the network size. By simulations, it was shown that the computation error of the former solver strictly increases as the network size increases, while no such trend was observed for the latter.
Appendix A: Proof of Theorem 1
i). For each agent $i \in \mathcal{V}$, consider the following two mechanisms:

$\mathcal{M}_i^1(d_i) = d_i + \eta_i, \qquad \mathcal{M}_i^2(d_i) = \tilde{Q}_i x_i^0 + \tilde{b}_i + \theta_i. \qquad (15)$

By [37, Theorem 1], the truncated Laplacian mechanism $\mathcal{M}_i^1$ can be easily verified to be differentially private under $\delta_0$-adjacency with budgets satisfying (5), with each entry of $\eta_i$ generated according to (6). Similarly, by [38, Theorem 8], the Gaussian mechanism $\mathcal{M}_i^2$ can be found to be differentially private under $\delta_0$-adjacency with the variance $\sigma_i^2$ chosen as in (5). With these properties in mind, we then examine all communication messages that are eavesdropped during the computation, and notice that they are in fact deterministic mappings of the mechanisms $\mathcal{M}_i^1$, $\mathcal{M}_i^2$, $i \in \mathcal{V}$. Thus, by employing the parallel composition [44, Theorem 4] and post-processing [36, Proposition 2.1] properties of differential privacy, we conclude that Algorithm 1 preserves the $(\epsilon_i, \delta_i)$-differential privacy of $d_i$, $i \in \mathcal{V}$.
ii). It is noted from [5] that Algorithm 1 indeed solves a least-squares optimization problem of the form

$\min_{x \in \mathbb{R}^m} \sum_{i \in \mathcal{V}} \tilde{f}_i(x), \qquad \tilde{f}_i(x) = \tfrac{1}{2}\, x^\top \tilde{Q}_i x + (\tilde{b}_i + \theta_i)^\top x.$

More explicitly, by [5, Theorem 1] there exists $\bar{\gamma} > 0$ such that for all $\gamma \in (0, \bar{\gamma})$ each state $x_i^k$ converges to the unique solution, if it exists, with some linear convergence rate. With this in mind, we recall $\tilde{Q} = \sum_{i \in \mathcal{V}} \tilde{Q}_i$ and $\tilde{b} = \sum_{i \in \mathcal{V}} (\tilde{b}_i + \theta_i)$. By noting that each element of $\tilde{Q} - Q$ is a truncated Laplacian noise bounded by the truncation level according to (6), it can be seen that $\|\tilde{Q} - Q\|_2$ is bounded accordingly. Then, with Assumption 1 and the choice of the truncation levels, there holds $\tilde{Q} \succ 0$, implying that the perturbed cost is strongly convex and admits the unique solution $\tilde{x}^\star = -\tilde{Q}^{-1} \tilde{b}$. The statement ii) is thus proved.
iii). We note that

$\tilde{x}^\star - x^\star = -\tilde{Q}^{-1} \tilde{b} + Q^{-1} b = -\tilde{Q}^{-1} (\tilde{b} - b) - (\tilde{Q}^{-1} - Q^{-1}) b. \qquad (16)$

By recalling that $\tilde{Q} \succ 0$ with a uniform lower bound, we have that $\|\tilde{Q}^{-1}\|_2$ is bounded. This yields

$\mathbb{E}\|\tilde{x}^\star - x^\star\|^2 = O\big( \mathbb{E}\|\tilde{b} - b\|^2 + \mathbb{E}\|\tilde{Q} - Q\|_F^2 \big).$

To this end, we proceed to study the bounds of $\mathbb{E}\|\tilde{Q} - Q\|_F^2$ and $\mathbb{E}\|\tilde{b} - b\|^2$, and have the following observations.

- Since $\tilde{Q} - Q = \sum_{i \in \mathcal{V}} (\tilde{Q}_i - Q_i)$, each element of the matrix $\tilde{Q} - Q$ is a sum of $n$ i.i.d. truncated Laplacian noises with zero mean and variance bounded as in (9). Thus, $\mathbb{E}\|\tilde{Q} - Q\|_F^2 = O(n)$.

- Similarly, each element of $\tilde{b} - b$ is a sum of $n$ i.i.d. noises subject to (6) and the Gaussian distributions. Then $\mathbb{E}\|\tilde{b} - b\|^2 = O(n)$.

Thus, we have $\mathbb{E}\|\tilde{x}^\star - x^\star\|^2 = O(n)$, which is linear in the network size. The proof is thus completed.
References
- [1] A. Nedic, “Distributed gradient methods for convex machine learning problems in networks: Distributed optimization,” IEEE Signal Processing Magazine, vol. 37, no. 3, pp. 92–101, 2020.
- [2] L. Yan, J. L. Webber, A. Mehbodniya, B. Moorthy, S. Sivamani, S. Nazir, and M. Shabaz, "Distributed optimization of heterogeneous UAV cluster PID controller based on machine learning," Computers and Electrical Engineering, vol. 101, p. 108059, 2022.
- [3] P. Wan and M. D. Lemmon, “Event-triggered distributed optimization in sensor networks,” in 2009 International Conference on Information Processing in Sensor Networks, 2009, pp. 49–60.
- [4] I. Lobel and A. Ozdaglar, “Distributed subgradient methods for convex optimization over random networks,” IEEE Transactions on Automatic Control, vol. 56, no. 6, pp. 1291–1306, 2011.
- [5] I. Notarnicola, M. Bin, L. Marconi, and G. Notarstefano, “The gradient tracking is a distributed integral action,” IEEE Transactions on Automatic Control, pp. 1–8, 2023.
- [6] D. Jakovetić, J. Xavier, and J. M. F. Moura, “Fast distributed gradient methods,” IEEE Transactions on Automatic Control, vol. 59, no. 5, pp. 1131–1146, 2014.
- [7] A. Nedić and A. Olshevsky, “Distributed optimization over time-varying directed graphs,” IEEE Transactions on Automatic Control, vol. 60, no. 3, pp. 601–615, 2015.
- [8] S. Yang, Q. Liu, and J. Wang, “Distributed optimization based on a multiagent system in the presence of communication delays,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 47, no. 5, pp. 717–728, 2017.
- [9] K. Srivastava and A. Nedic, “Distributed asynchronous constrained stochastic optimization,” IEEE Journal of Selected Topics in Signal Processing, vol. 5, no. 4, pp. 772–790, 2011.
- [10] C. Zhang, M. Ahmad, and Y. Wang, "ADMM based privacy-preserving decentralized optimization," IEEE Transactions on Information Forensics and Security, vol. 14, no. 3, pp. 565–580, 2019.
- [11] A. Nedic, A. Ozdaglar, and P. A. Parrilo, “Constrained consensus and optimization in multi-agent networks,” IEEE Transactions on Automatic Control, vol. 55, no. 4, pp. 922–938, 2010.
- [12] D. Jakovetić, “A unification and generalization of exact distributed first-order methods,” IEEE Transactions on Signal and Information Processing over Networks, vol. 5, no. 1, pp. 31–46, 2019.
- [13] D. Varagnolo, F. Zanella, A. Cenedese, G. Pillonetto, and L. Schenato, “Newton-raphson consensus for distributed convex optimization,” IEEE Transactions on Automatic Control, vol. 61, no. 4, pp. 994–1009, 2015.
- [14] A. Nedic, A. Olshevsky, and W. Shi, “Achieving geometric convergence for distributed optimization over time-varying graphs,” SIAM Journal on Optimization, vol. 27, no. 4, pp. 2597–2633, 2017.
- [15] J. Xu, S. Zhu, Y. C. Soh, and L. Xie, “Convergence of asynchronous distributed gradient methods over stochastic networks,” IEEE Transactions on Automatic Control, vol. 63, no. 2, pp. 434–448, 2017.
- [16] G. Scutari and Y. Sun, “Distributed nonconvex constrained optimization over time-varying digraphs,” Mathematical Programming, vol. 176, pp. 497–544, 2019.
- [17] S. Pu, W. Shi, J. Xu, and A. Nedić, “Push–pull gradient methods for distributed optimization in networks,” IEEE Transactions on Automatic Control, vol. 66, no. 1, pp. 1–16, 2020.
- [18] S. Pu and A. Nedić, “Distributed stochastic gradient tracking methods,” Mathematical Programming, vol. 187, pp. 409–457, 2021.
- [19] Y. Pan, P. Xiao, Y. He, Z. Shao, and Z. Li, "MULLS: Versatile LiDAR SLAM via multi-metric linear least square," in 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 11633–11640.
- [20] Y. Zheng and Q. Liu, “A review of distributed optimization: Problems, models and algorithms,” Neurocomputing, vol. 483, pp. 446–459, 2022.
- [21] T. Yang, X. Yi, J. Wu, Y. Yuan, D. Wu, Z. Meng, Y. Hong, H. Wang, Z. Lin, and K. H. Johansson, “A survey of distributed optimization,” Annual Reviews in Control, vol. 47, pp. 278–305, 2019.
- [22] C.-N. Hang, Y.-Z. Tsai, P.-D. Yu, J. Chen, and C.-W. Tan, “Privacy-enhancing digital contact tracing with machine learning for pandemic response: A comprehensive review,” Big Data and Cognitive Computing, vol. 7, no. 2, 2023.
- [23] Y. Lu and M. Zhu, “Privacy preserving distributed optimization using homomorphic encryption,” Automatica, vol. 96, pp. 314–325, 2018.
- [24] S. Mao, Y. Tang, Z. Dong, K. Meng, Z. Y. Dong, and F. Qian, “A privacy preserving distributed optimization algorithm for economic dispatch over time-varying directed networks,” IEEE Transactions on Industrial Informatics, vol. 17, no. 3, pp. 1689–1701, 2021.
- [25] Y. Wang and A. Nedić, “Tailoring gradient methods for differentially-private distributed optimization,” IEEE Transactions on Automatic Control, pp. 1–16, 2023.
- [26] Y. Lou, L. Yu, S. Wang, and P. Yi, “Privacy preservation in distributed subgradient optimization algorithms,” IEEE Transactions on Cybernetics, vol. 48, no. 7, pp. 2154–2165, 2018.
- [27] C. Dwork, F. McSherry, K. Nissim, and A. Smith, “Calibrating noise to sensitivity in private data analysis,” in Theory of Cryptography, S. Halevi and T. Rabin, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006, pp. 265–284.
- [28] J. Le Ny and G. J. Pappas, “Differentially private filtering,” IEEE Transactions on Automatic Control, vol. 59, no. 2, pp. 341–354, 2013.
- [29] E. Nozari, P. Tallapragada, and J. Cortés, “Differentially private average consensus: Obstructions, trade-offs, and optimal algorithm design,” Automatica, vol. 81, pp. 221–231, 2017.
- [30] M. Ruan, H. Gao, and Y. Wang, “Secure and privacy-preserving consensus,” IEEE Transactions on Automatic Control, vol. 64, no. 10, pp. 4035–4049, 2019.
- [31] Z. Huang, S. Mitra, and N. Vaidya, “Differentially private distributed optimization,” in Proceedings of the 16th International Conference on Distributed Computing and Networking, ser. ICDCN ’15. New York, NY, USA: Association for Computing Machinery, 2015.
- [32] Y. Wang and T. Başar, “Decentralized nonconvex optimization with guaranteed privacy and accuracy,” Automatica, vol. 150, p. 110858, 2023.
- [33] E. Nozari, P. Tallapragada, and J. Cortés, “Differentially private distributed convex optimization via functional perturbation,” IEEE Transactions on Control of Network Systems, vol. 5, no. 1, pp. 395–408, 2018.
- [34] S. Han, U. Topcu, and G. J. Pappas, “Differentially private distributed constrained optimization,” IEEE Transactions on Automatic Control, vol. 62, no. 1, pp. 50–64, 2017.
- [35] M. T. Hale and M. Egerstedt, “Cloud-enabled differentially private multiagent optimization with constraints,” IEEE Transactions on Control of Network Systems, vol. 5, no. 4, pp. 1693–1706, 2018.
- [36] C. Dwork, A. Roth et al., “The algorithmic foundations of differential privacy.” Found. Trends Theor. Comput. Sci., vol. 9, no. 3-4, pp. 211–407, 2014.
- [37] Q. Geng, W. Ding, R. Guo, and S. Kumar, “Tight analysis of privacy and utility tradeoff in approximate differential privacy,” in Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, ser. Proceedings of Machine Learning Research, S. Chiappa and R. Calandra, Eds., vol. 108. PMLR, 26–28 Aug 2020, pp. 89–99.
- [38] B. Balle and Y.-X. Wang, “Improving the gaussian mechanism for differential privacy: Analytical calibration and optimal denoising,” in International Conference on Machine Learning. PMLR, 2018, pp. 394–403.
- [39] L. Wang, W. Liu, F. Guo, Z. Qiao, and Z. Wu, “Differentially private average consensus with improved accuracy-privacy trade-off,” arXiv:2309.08464, 2023.
- [40] M. Mesbahi and M. Egerstedt, Graph theoretic methods in multiagent networks. Princeton University Press, 2010.
- [41] R. A. Horn and C. R. Johnson, Matrix Analysis. Cambridge University Press, 1985.
- [42] Z. Qiao, F. Guo, X. Pan, Y. Sun, and L. Wang, “Distributed load shedding via differentially private average consensus algorithm,” in 2022 34th Chinese Control and Decision Conference (CCDC), 2022, pp. 1503–1508.
- [43] P. Paillier, “Public-key cryptosystems based on composite degree residuosity classes,” in International conference on the theory and applications of cryptographic techniques. Springer, 1999, pp. 223–238.
- [44] F. McSherry, "Privacy integrated queries: An extensible platform for privacy-preserving data analysis," Commun. ACM, vol. 53, no. 9, pp. 89–97, Sep. 2010.