Cognitoware.Robotics.dll

Class ProbabilityUtil

System.Object

Cognitoware.Mathematics.Probability.ProbabilityUtil

A set of static functions used in statistical and probabilistic calculations.

Covariance(RandomDistribution<Double>, IEnumerable<Double>, Double)

Estimates the covariance of the distribution as if it were a Gaussian.

Entropy(RandomDistribution<Double>, IEnumerable<Double>, Double)

Estimates the entropy of a distribution.

Equals(Object)

Inherited from System.Object

Expectation(RandomDistribution<Double>, IEnumerable<Double>, Double)

Calculates the expected value of a distribution by approximate integration.

Finalize()

Inherited from System.Object

GetHashCode()

Inherited from System.Object

GetType()

Inherited from System.Object

IndependentProbabilityOf(T1, T2, RandomDistribution<T1>, RandomDistribution<T2>)

Calculates the joint probability of two independent events.

MemberwiseClone()

Inherited from System.Object

ToString()

Inherited from System.Object

public static Double Covariance(RandomDistribution<Double> p, IEnumerable<Double> pts, Double dx)

Calculates the covariance of a distribution using a set of uniformly spaced samples provided by the user.
The covariance provides a measure of how "spread out" the distribution is.
Combined with Expectation, this method can be used to find a Gaussian that approximates the target distribution.
Covariance may not converge for all distributions as the number of samples increases.
Distributions with long tails (such as Cauchy distributions) may not have a covariance that stabilizes as the sample size increases.

`p`

- The distribution whose covariance is calculated.

`pts`

- A set of uniformly spaced points used to estimate the covariance.

`dx`

- The distance between each point.

Returns: The covariance of the distribution.
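The sample-based estimate described above can be sketched as a Riemann sum. This is an illustrative Python approximation, not the library's C# implementation; `pdf(x)` stands in for the density that `RandomDistribution<Double>` would supply:

```python
import math

def covariance(pdf, pts, dx):
    # Riemann-sum estimate of the variance of a 1-D distribution:
    # first the mean E[X] = sum x*p(x)*dx, then the second central
    # moment E[(X - mean)^2] over the same uniformly spaced grid.
    mean = sum(x * pdf(x) * dx for x in pts)
    return sum((x - mean) ** 2 * pdf(x) * dx for x in pts)

# Sanity check against a Gaussian with sigma = 1.5 (variance 2.25).
sigma = 1.5
pdf = lambda x: math.exp(-x * x / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
pts = [-9.0 + 0.01 * i for i in range(1801)]
print(covariance(pdf, pts, 0.01))  # close to 2.25
```

The two-pass structure (mean first, then the second central moment) mirrors the pairing with Expectation noted above: together the two estimates parameterize an approximating Gaussian.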

public static Double Entropy(RandomDistribution<Double> p, IEnumerable<Double> pts, Double dx)

Entropy provides the number of bits of uncertainty in the distribution.
A distribution with a single known value has zero uncertainty and zero entropy.
A distribution with two equally probable values has 1 bit of uncertainty.
Continuous distributions can potentially have infinite entropy.
The user must provide a set of uniformly spaced samples and the space between each sample to approximate the distribution.

`p`

- The distribution whose entropy is calculated.

`pts`

- A set of uniformly spaced points used to estimate the entropy.

`dx`

- The distance between each point.

Returns: The entropy of the distribution.
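A minimal Python sketch of the approximation this method describes, assuming `pdf(x)` stands in for the distribution's density (the actual C# signature takes a `RandomDistribution<Double>`):

```python
import math

def entropy(pdf, pts, dx):
    # Approximates the differential entropy in bits:
    # H = -sum p(x) * log2(p(x)) * dx over the sample grid.
    # Points where the density is zero contribute nothing.
    h = 0.0
    for x in pts:
        p = pdf(x)
        if p > 0.0:
            h -= p * math.log2(p) * dx
    return h

# A unit Gaussian has differential entropy 0.5 * log2(2*pi*e) ≈ 2.047 bits.
pdf = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
pts = [-8.0 + 0.01 * i for i in range(1601)]
print(entropy(pdf, pts, 0.01))  # close to 2.047
```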

public static Double Expectation(RandomDistribution<Double> p, IEnumerable<Double> pts, Double dx)

Estimates a weighted average of all values in the distribution.
The user must supply a set of uniformly spaced points in the distribution
as well as the distance between each point.

`p`

- The distribution whose expectation is calculated.

`pts`

- A set of uniformly spaced points used to estimate the expectation.

`dx`

- The distance between each point.

Returns: The expected value of the distribution.
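The approximate integration above reduces to a single weighted sum. A hedged Python sketch, with `pdf(x)` standing in for the density the `RandomDistribution<Double>` would provide:

```python
import math

def expectation(pdf, pts, dx):
    # Approximate integration: E[X] ~ sum x * p(x) * dx
    # over the user-supplied uniformly spaced points.
    return sum(x * pdf(x) * dx for x in pts)

# For a unit-variance Gaussian centered at 2, the expectation is close to 2.
mu = 2.0
pdf = lambda x: math.exp(-((x - mu) ** 2) / 2) / math.sqrt(2 * math.pi)
pts = [-6.0 + 0.01 * i for i in range(1601)]
print(expectation(pdf, pts, 0.01))  # close to 2.0
```

As with Covariance and Entropy, the accuracy of the estimate depends on the grid covering the bulk of the distribution's mass and on `dx` matching the actual spacing of `pts`.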

public static Double IndependentProbabilityOf(T1 x, T2 y, RandomDistribution<T1> px, RandomDistribution<T2> py)

From P(X) and P(Y), calculates P(X, Y).
X and Y must be independent so that P(X, Y) = P(X) * P(Y).

`x`

- The value in X.

`y`

- The value in Y.

`px`

- The distribution across X.

`py`

- The distribution across Y.

Returns: The joint probability of x and y.
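The factorization is a one-liner. In this illustrative Python sketch, plain dict pmfs stand in for the `RandomDistribution<T1>` and `RandomDistribution<T2>` arguments of the real method:

```python
def independent_probability_of(x, y, px, py):
    # For independent X and Y, the joint probability factors:
    # P(X = x, Y = y) = P(X = x) * P(Y = y).
    return px[x] * py[y]

# A fair coin flip and a fair die roll are independent events:
coin = {"H": 0.5, "T": 0.5}
die = {face: 1 / 6 for face in range(1, 7)}
print(independent_probability_of("H", 3, coin, die))  # 1/12, about 0.0833
```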