Mutual information can also be expressed in terms of entropy:

Formula:

$$I(X;Y) = H(X) - H(X|Y)$$

The diagram below illustrates this relationship:

Mutual information $I(X;Y)$ can be expanded to show its relationship with the entropy $H(X)$:

Initial expression:

$$I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x,y)}{p(x)\,p(y)}$$

Expansion:

$$I(X;Y) = \sum_{x,y} p(x,y)\,\log \frac{p(x|y)}{p(x)}$$

Simplification:

$$I(X;Y) = -\sum_{x,y} p(x,y)\,\log p(x) + \sum_{x,y} p(x,y)\,\log p(x|y)$$

Final form:

$$I(X;Y) = H(X) - H(X|Y)$$
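As a quick numerical check of this identity, the sketch below (a minimal example; the joint distribution `p_xy` is made up for illustration) computes $I(X;Y)$ both from the direct definition and as $H(X) - H(X|Y)$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A small made-up joint distribution p(x, y): rows index x, columns index y.
p_xy = np.array([[0.30, 0.10],
                 [0.05, 0.55]])
p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

# Direct definition: I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
mi_direct = sum(p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
                for i in range(2) for j in range(2) if p_xy[i, j] > 0)

# Entropy form: I(X;Y) = H(X) - H(X|Y), with H(X|Y) = sum_y p(y) H(X | Y=y)
h_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j]) for j in range(2))
mi_entropy = entropy(p_x) - h_x_given_y

print(mi_direct, mi_entropy)   # the two values agree (about 0.36 bits here)
```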
Gaussian channels are a class of channels widely used in information theory. The noise is additive and Gaussian distributed: the output is $Y = X + Z$ with $Z \sim \mathcal{N}(0, N)$. The channel capacity is the maximum mutual information between the input and output of the channel, $C = \max_{p(x)} I(X;Y)$, which for an input power constraint $P$ evaluates to $C = \frac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right)$.
The water-filling algorithm finds the optimal power allocation over a set of parallel Gaussian channels; it maximizes the total mutual information between the input and output of the channel. The optimal power allocation is given by the following optimization problem:

$$\max_{P_1,\dots,P_k} \sum_{i=1}^{k} \frac{1}{2}\log_2\!\left(1 + \frac{P_i}{N_i}\right) \quad \text{subject to} \quad \sum_{i=1}^{k} P_i = P,\;\; P_i \ge 0$$

Its solution is $P_i = \max(\nu - N_i,\, 0)$, where the "water level" $\nu$ is chosen so that the allocated powers sum to $P$.
This metaphorical "water filling" ensures that channels with lower noise levels receive more power, because they can transmit information more effectively; channels with higher noise levels receive less power, as they contribute less to the overall capacity. The total channel capacity is the sum of the capacities of the individual channels:

$$C = \sum_{i=1}^{k} \frac{1}{2}\log_2\!\left(1 + \frac{P_i}{N_i}\right)$$
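A minimal water-filling sketch under these definitions (the noise levels and power budget below are made-up example values); the water level $\nu$ is found by bisection:

```python
import numpy as np

def water_filling(noise, total_power, iters=100):
    """Allocate total_power across parallel Gaussian channels with the given
    noise levels: P_i = max(nu - N_i, 0), with nu found by bisection."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power
    for _ in range(iters):
        nu = 0.5 * (lo + hi)
        power = np.maximum(nu - noise, 0.0)
        if power.sum() > total_power:
            hi = nu          # water level too high, lower it
        else:
            lo = nu          # water level too low, raise it
    return power

# Three sub-channels with made-up noise levels and a total power budget of 2.0.
noise = np.array([0.5, 1.0, 2.0])
p = water_filling(noise, 2.0)
capacity = np.sum(0.5 * np.log2(1.0 + p / noise))
print(p)          # quieter channels receive more power: [1.25, 0.75, 0.0]
print(capacity)   # total capacity is the sum of per-channel capacities
```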
Linear response approximation is a powerful concept in control theory. It allows us to approximate the dynamics of a system linearly around a set point, typically a null action or an equilibrium. This not only simplifies analysis of the trajectory but also lets us quantify how small changes in the control signal influence the system's evolution.
The state of the system can be expressed as:

$$x_{t+1} = f(x_t, u_t)$$

Where:

$x_t$ - the state vector at time $t$
$u_t$ - the control signal at time $t$

The recursive mapping from a control sequence $u_{1:T}$ to the final state $x_T$ is obtained by composing $f$ over $T$ steps. Linearizing this mapping around a base control sequence (e.g. the null action) gives:

$$\Delta x \approx F\,\Delta u + \eta$$

Sensitivity of the final state with respect to the control sequence:

$$F = \frac{\partial x_T}{\partial u_{1:T}}$$

Where $\Delta u$ is a small change in the control sequence, $\Delta x$ is the resulting change in the final state, and $\eta$ is additive Gaussian noise.

Sensitivity is a measure of how much the state of the system changes in response to a change in the control signal; a sketch of estimating it by finite differences follows below.
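A minimal sketch of estimating the sensitivity matrix by finite differences; the one-step dynamics function `step` and the linear system used in the demo are illustrative placeholders, not a specific system from this study:

```python
import numpy as np

def rollout(x0, u_seq, step):
    """Apply the control sequence u_seq from state x0 using one-step dynamics."""
    x = np.array(x0, dtype=float)
    for u in u_seq:
        x = step(x, u)
    return x

def sensitivity_matrix(x0, u_base, step, eps=1e-5):
    """Finite-difference estimate of F = d x_T / d u_{1:T} around u_base."""
    x_ref = rollout(x0, u_base, step)
    F = np.zeros((x_ref.size, u_base.size))
    for k in range(u_base.size):
        u = u_base.copy()
        u[k] += eps                      # perturb one control input at a time
        F[:, k] = (rollout(x0, u, step) - x_ref) / eps
    return F

# Demo on a made-up linear system x' = A x + B u.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([0.0, 0.1])
step = lambda x, u: A @ x + B * u
F = sensitivity_matrix(np.zeros(2), np.zeros(5), step)
print(F.shape)   # (2, 5): column k is the effect of u_k on the final state
```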
The dynamics equations for the inverted pendulum can be defined by:

$$\ddot{\theta} = \frac{g}{l}\sin\theta + \frac{u}{m l^2}$$

Where $\theta$ is the angle measured from the upright position, $m$ is the mass, $l$ is the length of the pendulum, $g$ is the gravitational acceleration, and $u$ is the applied torque.

For a duration $T$, a sequence of controls $u_{1:T}$ is applied and the resulting final state is treated as the output of a Gaussian channel. The channel capacity is calculated using the following equation:

$$C = \max_{\sum_i p_i = P} \sum_i \frac{1}{2}\log_2\!\left(1 + \frac{p_i\,\sigma_i^2}{\sigma_N^2}\right)$$

Where:

$\sigma_i$ - the singular values of the sensitivity matrix $F$
$\sigma_N^2$ - the variance of the additive Gaussian noise
$p_i$ - the power allocated to the $i$-th channel, found via water filling

We apply an empowerment-based control algorithm to the pendulum: at each step, the controller greedily picks the action whose successor state has the highest empowerment,

$$u_t^{*} = \arg\max_{u}\; C\big(f(x_t, u)\big)$$
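A condensed sketch of this control loop under the assumptions above. The pendulum parameters, noise variance, horizon, and candidate action set are all illustrative, and uniform power allocation stands in for the full water-filling step:

```python
import numpy as np

G, L, M, DT = 9.81, 1.0, 1.0, 0.05   # illustrative pendulum parameters
SIGMA_N2 = 1e-4                      # assumed noise variance
HORIZON = 10                         # lookahead steps for empowerment

def step(x, u):
    """One Euler step: x = (theta, omega), theta measured from upright."""
    theta, omega = x
    alpha = (G / L) * np.sin(theta) + u / (M * L**2)
    return np.array([theta + DT * omega, omega + DT * alpha])

def empowerment(x0, eps=1e-4, power=1.0):
    """Gaussian-channel empowerment estimate via the linear response matrix F."""
    def rollout(u_seq):
        x = x0
        for u in u_seq:
            x = step(x, u)
        return x
    u0 = np.zeros(HORIZON)
    x_ref = rollout(u0)
    F = np.zeros((2, HORIZON))
    for k in range(HORIZON):
        u = u0.copy()
        u[k] += eps
        F[:, k] = (rollout(u) - x_ref) / eps
    sv = np.linalg.svd(F, compute_uv=False)
    p = power / sv.size              # uniform allocation (water filling omitted)
    return np.sum(0.5 * np.log2(1.0 + p * sv**2 / SIGMA_N2))

# Greedy empowerment-maximizing control, starting from hanging straight down.
x = np.array([np.pi, 0.0])
actions = np.linspace(-2.0, 2.0, 9)
for t in range(200):
    u = max(actions, key=lambda a: empowerment(step(x, a)))
    x = step(x, u)
print(x)   # empowerment tends to drive the pendulum toward the upright state
```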
This study highlights empowerment as a promising direction for advancing robotics and autonomous systems.
For any probability distribution, we can define a quantity called the entropy, which has many properties that agree with the intuitive notion of what a measure of information should be:

$$H(X) = -\sum_{x} P(x)\,\log_2 P(x)$$

Where $H$ is the entropy of the random variable $X$ and $P(x)$ is the probability of the event $x$. This form of entropy is known as Shannon entropy.

Information is measured in bits and, as the use of the log function suggests, it is a logarithmic measure: common events are not very informative, while rare events are very informative.

Mutual information is the reduction in uncertainty of $X$ due to the knowledge of $Y$. It is the difference between the entropy of $X$ and the conditional entropy of $X$ given $Y$:

$$I(X;Y) = H(X) - H(X|Y)$$
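A tiny numerical illustration of these notions (the probabilities are made-up examples): self-information $-\log_2 P(x)$ is small for common events and large for rare ones, and entropy peaks for a fair coin:

```python
import math

def self_information(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

def shannon_entropy(dist):
    """Shannon entropy H(X) = -sum_x P(x) log2 P(x) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

print(self_information(0.99))        # ~0.01 bits: a common event says little
print(self_information(0.01))        # ~6.64 bits: a rare event says a lot
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: fair coin, maximal uncertainty
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: biased coin, less uncertainty
```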
$\Delta x$ - the change in the state vector between two points
$F$ - the sensitivity matrix: how a change in the control parameters affects the change in the state $x$

A change in the control vector is applied to the system, and the resulting change in state is observed in the presence of noise. The sensitivity (Jacobian) matrix is then decomposed using the SVD; its singular values give the gains of the independent Gaussian channels used in the capacity computation.
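A short sketch of this step (the sensitivity matrix below is a made-up placeholder): the SVD yields the per-channel gains that feed the water-filling capacity computation described above:

```python
import numpy as np

# Made-up sensitivity (Jacobian) matrix mapping control changes to state changes.
F = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.5, 0.3]])

# The SVD decomposes the linearized system into independent Gaussian channels;
# each singular value is the gain of one channel.
sigma = np.linalg.svd(F, compute_uv=False)

noise_var = 1e-2   # assumed noise variance
print(sigma)                                        # channel gains
print(0.5 * np.log2(1.0 + sigma**2 / noise_var))    # per-channel capacity at unit power
```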