

What is Probability?

  • Probability is the branch of mathematics in which we measure how likely a result is.

  • If an experiment has a sample space S of possible outcomes/samples, of which the outcomes in A are favourable to (required by) a particular event E, then the probability of the event E is defined as P(E) = n(A)/n(S), where
    P(E) = probability of the event
    n(A) = number of favourable outcomes/samples of the event
    n(S) = total number of possible outcomes/samples

  • Examples:
    1. A pot contains 10 copper coins, 5 silver coins and 6 gold coins. If three coins are picked at random from the pot, what is the probability that all three coins are silver?
  • Solution:
  • Total coins: 10 + 5 + 6 = 21
    Silver coins: 5
    P(E) = 5C3 / 21C3 = 10/1330 = 1/133, where nCr denotes the number of combinations of r items chosen from n.
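The counting in this solution can be checked with a short Python sketch using the standard-library `math.comb` for combinations:

```python
from math import comb

# Total coins: 10 copper + 5 silver + 6 gold = 21
total_ways = comb(21, 3)    # ways to pick any 3 coins from 21 -> 1330
silver_ways = comb(5, 3)    # ways to pick 3 silver coins from 5 -> 10

probability = silver_ways / total_ways
print(probability)          # 10/1330 = 1/133 ≈ 0.00752
```

The ratio of favourable to total combinations gives the same 1/133 as the hand calculation.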

    2. Two dice are thrown. Find the probability that:
      a) the first die shows 4, 5 or 6 and the second die shows 1, 2, 3 or 4;
      b) the sum of the numbers is 12.
  • Solution:

  • NOTE: When two dice are thrown, many combinations of outcomes are possible, and the probability of an event is found from the favourable ones. The set of all possible outcomes is nothing but the sample space.
  • So if two dice are thrown, the sample space is:

  • S = {(1, 1), (1, 2), (1, 3)...(1, 6)
    (2, 1), (2, 2), (2, 3)...(2, 6)
    (3, 1), (3, 2), (3, 3)...(3, 6)
    (4, 1), (4, 2), (4, 3)...(4, 6)
    (5, 1), (5, 2), (5, 3)...(5, 6)
    (6, 1), (6, 2), (6, 3)...(6, 6)}

    Therefore, n(S) = 36
    a) n(E1) = 12
    (Note: 3 possible values for the first die × 4 possible values for the second give 12 favourable combinations in the sample space.) Now,
    P(E1) = n(E1)/n(S) = 12/36 = 1/3

    b) E2 = {(6, 6)}
    n(E2) = 1
    P(E2) = n(E2)/n(S) = 1/36
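Both dice events can be verified by enumerating the full 36-outcome sample space in Python:

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs from two six-sided dice
S = list(product(range(1, 7), repeat=2))
assert len(S) == 36

# Event a) first die in {4,5,6} AND second die in {1,2,3,4}
E1 = [(d1, d2) for d1, d2 in S if d1 in {4, 5, 6} and d2 in {1, 2, 3, 4}]
print(Fraction(len(E1), len(S)))   # 1/3  (12 favourable outcomes)

# Event b) sum of the two dice equals 12
E2 = [(d1, d2) for d1, d2 in S if d1 + d2 == 12]
print(Fraction(len(E2), len(S)))   # 1/36 (only (6, 6))
```

Counting favourable outcomes over the sample space reproduces 1/3 and 1/36 exactly.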

    Applications of probability

  • Robotics and Artificial Intelligence

  • Communication Networks

  • Signal Processing

  • Games of chance such as poker, etc.

  • Applications of probability in Robotics

  • A robot is designed to operate in a human environment and moves with the help of sensors driven by different algorithms.
  • Probability is therefore used to reason about the laws of physics, about position, orientation and movement in three-dimensional space, and sometimes about the symbolic notation used to name objects and concepts in robotics.

  • Programmed Robot

  • In this model, the programmer controls the robot using simple conditions (parameters). Such an industrial robot functions in its environment as a controlled machine, without any intelligence of its own.

  • But is this kind of robot really what we need in the modern world?

  • No, because our goal should be to make smart robots by embedding Artificial Intelligence in them. Embedding intelligence helps bridge the gap between the raw data acquired from sensors and the abstract models used in programs. So the basic idea is to build a smart robot on top of a programmed robot dedicated to a routine task: the robot should build its own algorithms using the mathematical tools provided by the programmer, maintain its own 'internal representations' of the environment, and propose models using those basic programming tools.

  • Smart Robot

  • A common problem for smart robots is localization: a robot placed in an unknown environment must start from scratch and work out where it is. At the start, uncertainty is at its maximum, so the robot's belief is modeled as a uniform distribution.

  • The probability density function of the continuous uniform distribution is:

    f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 otherwise,

  • where a and b are the two input values obtained by the robot when it detects an obstacle in the environment.

  • So the model may give a very high probability, a very low probability, or even zero probability, which makes it difficult to trace a minor deviation.
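The uniform density above can be written as a tiny Python function, which also shows why the belief is flat between the two bounds:

```python
def uniform_pdf(x, a, b):
    """PDF of the continuous uniform distribution on [a, b]."""
    if a <= x <= b:
        return 1.0 / (b - a)   # constant density inside the interval
    return 0.0                 # zero density outside it

# Every position between the detected bounds is equally likely:
# on [0, 4] every x inside has density 0.25, and anything outside has 0.
print(uniform_pdf(2.0, 0.0, 4.0))  # 0.25
print(uniform_pdf(5.0, 0.0, 4.0))  # 0.0
```

Because the density is the same everywhere inside [a, b], a small shift in position changes nothing, which is exactly the "minor deviation" problem described above.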

  • To overcome this problem of tracing minor deviations, we use the "Markov property" to model the localization problem. Markov localization addresses the problem of state estimation from sensor data.

  • This Markov representation is based on the famous "Bayes Theorem", a very powerful tool for solving the localization problem under uncertainty.

  • Bayes Theorem

  • P(A|B) = P(B|A) · P(A) / P(B)

  • Now we will study the theorem with a practical example:


  • Suppose the robot obtains a measurement z. What is P(open|z)?
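A single Bayes update for this door question can be sketched in Python. The sensor-model numbers below are assumed illustrative values, not taken from the article:

```python
def bayes_update(p_z_given_open, p_z_given_closed, p_open):
    """Compute P(open | z) from the causal sensor model via Bayes rule."""
    # Total probability of the measurement z over both door states
    p_z = p_z_given_open * p_open + p_z_given_closed * (1 - p_open)
    return p_z_given_open * p_open / p_z

# Assumed values: the reading z is twice as likely if the door is open,
# and the prior is 50/50 before any measurement.
posterior = bayes_update(p_z_given_open=0.6, p_z_given_closed=0.3, p_open=0.5)
print(posterior)  # 2/3: the measurement raises P(open) from 0.5 to ~0.667
```

Here the easy-to-state causal quantities P(z|open) and P(z|closed) are turned into the diagnostic quantity P(open|z), which is the direction the robot actually needs.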


  • P(open|z) is diagnostic.

  • P(z|open) is causal.

  • Often causal knowledge is easier to understand.

  • So Bayes Rule allows us to understand and use causal knowledge.

  • Thus, z raises the probability that the door is open. Suppose the robot now obtains another observation z2. How can we integrate this new information? How can we compute P(x|z1…zn)?


  • z2 lowers the probability that the door is open.

  • Markov localization is a probabilistic algorithm: instead of maintaining a single hypothesis about where in the world the robot might be, Markov localization maintains a probability distribution over the space of all such possibilities. The probabilistic representation allows it to weigh these different possibilities in a mathematically sound way.

  • This is how Bayes theorem is used to solve the localization problem under uncertainty in an obstacle-avoiding smart robot.
