
The Particle Swarm Optimization Algorithm and
its Applications in Wireless Communications

Presenter: Zaid

SigCom research group - SnT centre,
University of Luxembourg.

SigCom Seminars
May 2023

Zaid SigCom Seminars May 2023 1 / 12


Overview

1 Introduction to Evolutionary Algorithms

2 Particle Swarm Optimization (PSO)

3 Key Insights

4 Case Study

5 Concluding Remarks



Introduction to Evolutionary Algorithms

Particle swarm optimization (PSO) belongs to the class of evolutionary
computation algorithms (ECAs).
ECAs evolve candidate solutions through operations inspired by biological
behaviour and natural selection.
Examples include genetic algorithms, artificial bee colony, and ant colony
optimization, among many others.
ECAs often perform well on highly non-linear problems for two reasons: i) they
make no assumptions about the underlying problem, and ii) they assess the
quality of candidate solutions directly, without computing any function
derivatives.
ECAs have wide applications in many areas, including finance, planning, and
engineering.



Particle Swarm Optimization

PSO was developed by Kennedy and Eberhart in 1995. It is simple in concept,
easy to implement, computationally efficient, and effective on a variety of
problems [1].
The swarm in PSO refers to a population of interacting individuals that
collectively adapt to the environment; the metaphor relates to bird flocking
and fish schooling.
Individuals in the population influence one another, and in doing so they
become more similar as they evolve.
Because of this collective adaptation of particles, PSO belongs to the class
of "swarm intelligence" paradigms.



PSO Process

(1) Initialize a population X of N particles (i.e. initial candidate
solutions). Each particle is an M-dimensional vector, where M is the number
of optimization variables. Hence, X = [x_1, x_2, ..., x_N]^T has dimensions
N × M.
(2) Fix or adjust each particle in the population.
(3) Evaluate the fitness of each particle.
(4) Update the velocity matrix V (which has the same dimensions as X) based
on the (personal, local, and/or global) best solutions according to Eq. (1).
(5) Update the population X according to Eq. (2).
(6) Repeat steps (2) to (5) until convergence or until reaching a maximum
number of iterations.
(7) Select the particle with the highest fitness across all iterations as the
final solution.



PSO Process (continued)
The general PSO formula for the velocity update is:

V_{n,m}^{(t+1)} = µ_n V_{n,m}^{(t)} + c_1 r_1^{(t)} ( [l]_m − X_{n,m}^{(t)} ) + c_2 r_2^{(t)} ( [g]_m − X_{n,m}^{(t)} ),    (1)

where:
n ∈ {1, 2, ..., N} and m ∈ {1, 2, ..., M}.
t is the current PSO iteration.
c_1 and c_2 are positive constants (a good practice is to set c_1 = c_2 = 2).
r_1^{(t)} and r_2^{(t)} are random numbers drawn from a uniform distribution
between 0 and 1.
l is the local best solution, while g is the global best solution (both are
particles within the population).
µ_n is the inertia weight of the nth particle (a good practice is to reduce
the value of µ_n over time).
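As a concrete illustration, the velocity update of Eq. (1) and the population update of Eq. (2) vectorize naturally in NumPy. This is only a sketch: the population size, bounds, inertia value, and the placeholder local best `l` and global best `g` are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 50, 8                              # N particles, M optimization variables
X = rng.uniform(-1.0, 1.0, size=(N, M))   # population X, dimensions N x M
V = np.zeros((N, M))                      # velocity matrix, same dimensions as X
mu = np.full(N, 0.9)                      # per-particle inertia weights mu_n
c1 = c2 = 2.0                             # positive acceleration constants

l = X.copy()                              # placeholder local best solutions
g = X[0].copy()                           # placeholder global best (an M-vector)

# Eq. (1): fresh uniform r1, r2 in [0, 1) are drawn at every iteration
r1 = rng.uniform(size=(N, M))
r2 = rng.uniform(size=(N, M))
V = mu[:, None] * V + c1 * r1 * (l - X) + c2 * r2 * (g - X)

# Eq. (2): move every particle by its velocity
X = X + V
```

Broadcasting `g` (an M-vector) against the N × M population applies the same global best to every particle, exactly as Eq. (1) prescribes.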

PSO Process (continued)
The population is then updated according to:

X^{(t+1)} = X^{(t)} + V^{(t)}.    (2)

Key insights into the PSO operation

The need to adjust or fix the population stems from the population update in
Eq. (2), which is likely to violate the optimization constraints.
Each particle might require more than a single adjustment [2].
Higher values of µ mean faster exploration of the search space, while smaller
values allow for better convergence.
The velocity matrix V can be initialized with zeros or randomly.
While PSO deals mainly with real-valued functions, discrete (combinatorial)
optimization is also feasible [3].
The per-iteration complexity of PSO is O(MN).
PSO does NOT guarantee optimal solutions.
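Putting steps (1)–(7) and the insights above together, a minimal global-best PSO loop might look as follows. The toy objective, bounds, swarm size, and the linear inertia-decay schedule are illustrative assumptions; the clipping step stands in for the generic "fix or adjust" operation of step (2).

```python
import numpy as np

def pso_maximize(fitness, M, N=30, iters=200, bounds=(-5.0, 5.0),
                 c1=2.0, c2=2.0, mu_start=0.9, mu_end=0.4, seed=0):
    """Sketch of a global-best PSO following steps (1)-(7)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(N, M))          # (1) initial population
    V = np.zeros((N, M))                          # velocity initialized with zeros
    pbest = X.copy()                              # personal best positions
    pfit = np.array([fitness(x) for x in X])      # (3) initial fitness
    g = pbest[np.argmax(pfit)].copy()             # global best particle
    gfit = pfit.max()
    for t in range(iters):
        # decaying inertia: explore early, converge later
        mu = mu_start + (mu_end - mu_start) * t / max(iters - 1, 1)
        r1 = rng.uniform(size=(N, M))
        r2 = rng.uniform(size=(N, M))
        V = mu * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)  # Eq. (1)
        X = np.clip(X + V, lo, hi)                # Eq. (2) + (2) fix/adjust
        fit = np.array([fitness(x) for x in X])   # (3) evaluate fitness
        improved = fit > pfit                     # update personal bests
        pbest[improved] = X[improved]
        pfit[improved] = fit[improved]
        if pfit.max() > gfit:                     # (7) best across all iterations
            gfit = pfit.max()
            g = pbest[np.argmax(pfit)].copy()
    return g, gfit

# toy problem: maximize -||x||^2, whose optimum is 0 at x = 0
g, gfit = pso_maximize(lambda x: -np.dot(x, x), M=4)
```

Note that `gfit` is monotone non-decreasing over iterations, which is why step (7) reduces to simply keeping the running best.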



Case Study: Passive Beamforming Design

Consider two communication links with 2 transmitters and 2 receivers, and
assume there is a programmable reflective array of size M in the area.
The goal is to optimize the array phase response θ = exp(ȷψ) to maximize the
minimum SINR between γ_1(θ) and γ_2(θ), where

γ_i(θ) = |a_i^T θ|^2 / ( |a_k^T θ|^2 + σ^2 ),    {i, k} ∈ {1, 2}, i ≠ k.    (3)

Therefore, we generate a population X with N particles.
Each particle is a vector of size M with entries in [−π, π] that represents a
candidate solution for ψ.
The fitness function of the nth particle is:

λ_n = min{ γ_1(exp(ȷx_n)), γ_2(exp(ȷx_n)) }.    (4)

For the remaining details, please refer to [4].
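For intuition, the fitness of Eqs. (3)–(4) can be prototyped in a few lines. The channel vectors a_1, a_2, the array size M, and the noise power σ² below are randomly generated placeholders, not the setup of [4].

```python
import numpy as np

rng = np.random.default_rng(1)
M, sigma2 = 16, 1e-2

# hypothetical cascaded channel vectors a_1, a_2 (complex, length M)
a = rng.normal(size=(2, M)) + 1j * rng.normal(size=(2, M))

def sinr(theta, i, k):
    """Eq. (3): SINR of link i under interference from link k."""
    return np.abs(a[i] @ theta) ** 2 / (np.abs(a[k] @ theta) ** 2 + sigma2)

def fitness(psi):
    """Eq. (4): lambda_n = min(gamma_1, gamma_2) for a phase vector psi."""
    theta = np.exp(1j * psi)          # unit-modulus array phase response
    return min(sinr(theta, 0, 1), sinr(theta, 1, 0))

# one particle x_n: M phases drawn uniformly from [-pi, pi]
psi = rng.uniform(-np.pi, np.pi, size=M)
val = fitness(psi)
print(val >= 0)   # prints True: both SINRs are ratios of non-negative powers
```

A PSO run would simply evaluate this `fitness` on every particle at step (3), with clipping of the phases to [−π, π] as the fix/adjust step.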


Case Study (continued)

[Convergence curve over 400 PSO iterations; plot not reproduced here.]

Figure 1: Convergence behavior of the proposed PSO when N = 50.



Conclusions

The PSO algorithm is an efficient way to tackle non-linear optimization
problems.
While PSO is mainly applied to real-valued functions, discrete optimization
is also feasible.
Individual particles influence one another, and they collectively adapt to
the environment.
A case study on passive beamforming was provided, demonstrating that PSO can
attain near-optimal solutions.


References

[1] R. Eberhart and J. Kennedy, "Particle swarm optimization," in Proceedings
of the IEEE International Conference on Neural Networks, vol. 4, 1995, pp.
1942–1948.

[2] Z. Abdullah, S. Kisseleff, E. Lagunas, V. N. Ha, F. Zeppenfeldt, and
S. Chatzinotas, "Integrated access and backhaul via satellites," arXiv
preprint arXiv:2304.01304, 2023.

[3] W.-N. Chen et al., "A novel set-based particle swarm optimization method
for discrete optimization problems," IEEE Transactions on Evolutionary
Computation, vol. 14, no. 2, pp. 278–300, 2009.

[4] Z. Abdullah, S. Kisseleff, K. Ntontin, W. A. Martins, S. Chatzinotas, and
B. Ottersten, "Successive decode-and-forward relaying with reconfigurable
intelligent surfaces," in IEEE International Conference on Communications
(ICC), Seoul, South Korea, 2022, pp. 2633–2638.


The End

That’s it for today!


Thank you

