## Probability Theory and Stochastic Processes

In probability theory and related fields, a stochastic process is a mathematical object, generally defined as a family of random variables. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to change randomly. Examples include the growth of bacterial populations, electrical currents that fluctuate due to thermal noise, and the movement of gas molecules. Random processes have applications in many disciplines, such as biology, chemistry, ecology, neuroscience, physics, image processing, signal processing, control theory, information theory, computer science, cryptography, and telecommunications. Furthermore, seemingly random changes in financial markets have led to the widespread use of stochastic processes in finance.

### Introduction

Remarkably, a science that began with the consideration of games of chance has become one of the most important objects of human knowledge.

### A brief history

Probability has a remarkable history. The great French mathematician Blaise Pascal (1623-1662) was inspired by a practical gaming dilemma posed by the French aristocrat Chevalier de Méré. Pascal’s correspondence with another French mathematician, Pierre de Fermat (1601-1665), in the form of seven letters in 1654, is considered the origin of probability theory. Jacob Bernoulli (1654-1705), Abraham de Moivre (1667-1754), Thomas Bayes (1702-1761), and Pierre Simon de Laplace (1749-1827) were early mathematicians who contributed to the development of probability. Laplace’s Théorie Analytique des Probabilités provided extensive tools for calculating probabilities based on permutations and combinations. Laplace also said,

“Probability theory is nothing but common sense reduced to calculation.”

Chebyshev (1821-1894), Markov (1856-1922), von Mises (1883-1953), Norbert Wiener (1894-1964), and Kolmogorov (1903-1987) were later mathematicians who contributed new advances. Probability has evolved over nearly four centuries to become one of the most important mathematical tools used in disciplines as diverse as economics, business, the physical sciences, the biological sciences, and engineering. It is very useful in tackling real electrical engineering problems in communication, signal processing, and computing. Despite these advances, mathematicians struggled for centuries to come up with a clear definition of probability. Kolmogorov resolved the problem in 1933 by giving the axiomatic definition of probability.

Randomness arises because of

• the random nature of the generation mechanism
• limited understanding of the signal dynamics
• inherent imprecision in measurement, observation, etc.

For example, the thermal noise appearing in an electronic device is generated by the random motion of electrons. We have deterministic models for weather prediction that take into account the factors affecting the weather, and we can locally predict the temperature or the rainfall of a place based on previous data. Probabilistic models, by contrast, are built from observations of a random phenomenon. While probability theory is concerned with the analysis of random phenomena, statistics helps in building such models from data.

### Deterministic versus probabilistic models

A deterministic model can be used for a physical quantity if enough information is available about the initial state and the dynamics of the process generating it. For example,

• We can use Newton’s laws of motion to determine the position of a particle moving under a constant force if we know the particle’s initial position and velocity as well as the magnitude and direction of the force.
• Similarly, we can use Kirchhoff’s laws to determine the current in a circuit consisting of resistance, inductance, and capacitance for a known voltage source.
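As a sketch of the first example (the function name and the numerical values here are illustrative, not from the source), a deterministic model gives the particle's position exactly once the initial state and the force are known, via x(t) = x0 + v0·t + ½(F/m)·t²:

```python
def position(x0, v0, force, mass, t):
    """Position of a particle under a constant force (Newton's second law):
    x(t) = x0 + v0*t + 0.5*(force/mass)*t**2
    """
    accel = force / mass  # constant acceleration from F = m*a
    return x0 + v0 * t + 0.5 * accel * t ** 2

# A 1 kg particle starting at rest under a 2 N force, after 3 s:
print(position(x0=0.0, v0=0.0, force=2.0, mass=1.0, t=3.0))  # -> 9.0
```

Given the same inputs, the model always produces the same output; there is no randomness to describe.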

Many physical quantities are random in the sense that they cannot be anticipated with confidence and can only be explained using probabilistic models. For example,

• The outcome of a coin toss cannot be anticipated with confidence; it is random.
• The number of ones and zeros in a binary data packet coming over a communication link cannot be accurately anticipated and is random.
• The ubiquitous noise that corrupts signals during capture, storage, and transmission can be described only statistically.
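The coin-toss example above can be sketched with a minimal simulation (using Python's standard `random` module; the seed and sample sizes are arbitrary choices). No individual toss can be predicted, but the relative frequency of heads settles near 0.5 as the number of tosses grows:

```python
import random

def heads_frequency(n_flips, seed=0):
    """Relative frequency of heads in n_flips simulated fair-coin tosses."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# Individual tosses are unpredictable, but the frequency stabilizes:
for n in (10, 1_000, 100_000):
    print(n, heads_frequency(n))
```

This is the sense in which a probabilistic model "explains" the coin: not by predicting each outcome, but by describing the long-run behavior.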

#### How to Interpret Probability

Mathematically, the probability that an event will occur is expressed as a number between 0 and 1. Notationally, the probability of event A is written P(A).

• If P(A) equals zero, event A will almost surely not occur.
• If P(A) is close to zero, there is only a small chance that event A will occur.
• If P(A) equals 0.5, there is a 50-50 chance that event A will occur.
• If P(A) is close to one, there is a strong chance that event A will occur.
• If P(A) equals one, event A will almost surely occur.
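The scale above can be mirrored in a small helper function (the name, wording, and the 0.5 threshold used for "small" versus "strong" are illustrative choices, not standard definitions):

```python
def describe_probability(p):
    """Map a probability in [0, 1] to a qualitative reading of the event."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("a probability must lie between 0 and 1")
    if p == 0.0:
        return "almost surely will not occur"
    if p == 1.0:
        return "almost surely will occur"
    if p == 0.5:
        return "50-50 chance"
    # Anything else is read relative to 0.5:
    return "small chance of occurring" if p < 0.5 else "strong chance of occurring"

print(describe_probability(0.95))  # -> strong chance of occurring
```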

In a statistical experiment, the sum of probabilities for all possible outcomes is equal to one. This means, for example, that if an experiment can have three possible outcomes (A, B, and C), then

P(A) + P(B) + P(C) = 1.
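This normalization can be checked empirically for a hypothetical three-outcome experiment (here the outcomes A, B, and C are drawn uniformly at random; the sampling scheme is purely illustrative). Because every trial produces exactly one of the outcomes, the empirical probabilities always sum to 1:

```python
import random
from collections import Counter

rng = random.Random(42)  # arbitrary seed for reproducibility
trials = [rng.choice(["A", "B", "C"]) for _ in range(10_000)]

counts = Counter(trials)
probs = {outcome: count / len(trials) for outcome, count in counts.items()}

print(probs)
print(sum(probs.values()))  # sums to 1, up to floating-point rounding
```

The same identity holds for the true (non-empirical) probabilities: the outcomes are mutually exclusive and exhaustive, so their probabilities must add to one.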