Basics before starting with Robotics — Part 1

Open House

It was our open house in 2017 when I realized the meaning of the saying

Jo dikhta hai, woh bikta hai (what is seen is what sells)

We were all standing in front of our final-year project posters, demos running on our laptops, waiting for company representatives to question us in the hope of getting an interview call that would ideally end in a job offer. What I noticed was that everyone entering the room was drawn to a robot photographer, one of the projects on display. The attention it got really shook me. After some thinking, I realized that all the other projects were similar, while the robot was different: it was a physical object rather than a demo on a computer screen. From that day, I decided to always work on things that can be physically seen outside the screen.

4 years ago

I started off by working on driving simulators for autonomous driving, which I love: implementing AlphaZero, lane detection, object tracking, cyclist detection, and many similar projects. Still, those were all demos on a screen, nothing physical. So I have now decided to implement such algorithms on robots, for which I need some background on how they work in real life. To that end, I have started following Professor Cyrill Stachniss, with the hope of ideally working with him one day on an end-to-end autonomous car.

Starting off with the basics, this is what I have learned so far:

State Estimation

In robotics, state estimation refers to estimating where things are in space, i.e., estimating geometry: where is an obstacle, where is the platform, and so on. Some objects are dynamic, such as a car driving or a platform moving through the environment, while others are static, such as trees, lane markings on roads, and traffic signs. We want to estimate the locations of these objects in order to build a map of the environment.

SLAM (Simultaneous Localization and Mapping)

Estimating where the system is (where am I?) and what the world looks like are coupled problems. For example, seeing a street sign 20 metres in front of you only tells you where the sign actually is if you also know where you are.

It also works the other way around: if you don’t know where you are but you have a map of the environment and you see a street sign, you can estimate where you are, given that you see the sign and you know where it is located from your map. A typical example is a 3D model built with a vehicle equipped with 3D lidar scanners. The lidar beams reflect off obstacles and objects, which gives you range information about how far each object is. By estimating the poses of the platform and integrating the range information with that positional information, you can build a 3D model of the environment, or at least a 3D point cloud, and use it to localize, navigate, and do other things.
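As a rough sketch of that last step, here is a minimal example of my own (not from the lectures) that turns range measurements taken from a known pose into points in a world frame. It uses a simplified 2D setup with NumPy, and the pose, beam angles, and ranges are made-up values for illustration.

import numpy as np

def ranges_to_points(pose, angles, ranges):
    """Convert lidar ranges taken from a known 2D pose (x, y, heading)
    into points expressed in the world frame."""
    x, y, theta = pose
    # End point of each beam in the sensor frame
    px = ranges * np.cos(angles)
    py = ranges * np.sin(angles)
    # Rotate by the heading and translate by the position
    wx = x + np.cos(theta) * px - np.sin(theta) * py
    wy = y + np.sin(theta) * px + np.cos(theta) * py
    return np.stack([wx, wy], axis=1)

# A scan of three beams taken from position (2, 1) with a 90 degree heading
pose = (2.0, 1.0, np.pi / 2)
angles = np.array([-0.1, 0.0, 0.1])
ranges = np.array([4.8, 5.0, 4.9])
print(ranges_to_points(pose, angles, ranges))

Accumulating such points over many poses is what gives the point cloud; in 3D the idea is the same, just with an extra coordinate and a full rotation.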

Recursive Bayes Filter

The recursive Bayes filter is a framework for state estimation, e.g., estimating the position of a mobile robot in the environment based on sensor data and control commands such as steering commands. Realizations of the framework include the Kalman filter and the particle filter. These probabilistic techniques are used in systems that navigate through the environment, perceive it, and derive actions.
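To make the recursion concrete, here is a minimal histogram-style sketch of the two steps of a discrete Bayes filter. The three-state motion model and the sensor likelihood below are invented purely for illustration and are not from the lectures.

import numpy as np

def predict(belief, motion_model):
    # Prediction step: push the belief through the motion model,
    # bel_bar(x') = sum_x p(x' | x, u) * bel(x)
    return motion_model @ belief

def correct(belief, likelihood):
    # Correction step: weight by the sensor likelihood p(z | x) and normalize
    posterior = likelihood * belief
    return posterior / posterior.sum()

# Three discrete states, uniform prior belief
belief = np.array([1/3, 1/3, 1/3])
# motion_model[j, i] = probability of moving from state i to state j
motion_model = np.array([[0.8, 0.1, 0.0],
                         [0.2, 0.8, 0.2],
                         [0.0, 0.1, 0.8]])
# Likelihood of the current observation under each state
likelihood = np.array([0.1, 0.7, 0.2])

belief = predict(belief, motion_model)
belief = correct(belief, likelihood)
print(belief)

Running predict and correct once per time step, with the new control and observation each time, is the whole recursion; the Kalman and particle filters replace the histogram with a Gaussian and with samples, respectively.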

Why Probability

Because the system doesn’t know its current state, it can only estimate it, and this is best done with probabilistic approaches because they let us explicitly encode uncertainty. We can also take this uncertainty into account in order to be more robust to noise, since we never live in a perfect world.

Common Applications of State Estimation

Mapping: Estimating what the world looks like.

Localization: Estimating where we are in the world.

SLAM: Solving localization and mapping jointly.

Motion Planning: Planning which action to execute in order to reach a goal or come closer to a goal and then using controls to execute these high-level motions.

Four Basic Axioms of Probability Theory

Axioms are definitions, and everything else can be derived from them.

The first axiom states that a probability is always a value between 0 and 1.

The second axiom states that the probability of a true event, i.e., an event that always holds, is 1.

The third axiom states that the probability of an event that never holds is 0.

The fourth axiom states that for two propositions A and B, the probability that A or B holds is equal to the probability that A holds, plus the probability that B holds, minus the probability that A and B hold together.

Pictorially, Axiom 4 corresponds to a Venn diagram of A and B: the overlapping region is counted twice when the two probabilities are added, so it is subtracted once in order to compute the probability of A or B.
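In symbols, the four axioms read:

0 \le P(A) \le 1

P(\text{True}) = 1, \qquad P(\text{False}) = 0

P(A \vee B) = P(A) + P(B) - P(A \wedge B)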

Using the Axioms

Based on the axioms, we can derive the probability of not A:

We start from the last axiom and replace B with not A.
The probability that A or not A holds is always 1, since at any given instant one of the two is true. Similarly, the probability that A and not A hold together is always 0, since that is impossible.
Substituting these values and rearranging gives us the required result.
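Written out, the derivation is:

P(A \vee \neg A) = P(A) + P(\neg A) - P(A \wedge \neg A)

1 = P(A) + P(\neg A) - 0

P(\neg A) = 1 - P(A)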

Discrete Random Variables
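In standard notation, a discrete random variable X takes values from a countable set, and we write

p(x_i) := P(X = x_i)

for the probability that X takes the value x_i.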

Continuous Random Variables
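A continuous random variable is described by a probability density function p(x), and the probability of X falling into an interval is obtained by integrating the density:

P(X \in [a, b]) = \int_a^b p(x)\, dx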

Probability Sums up to One
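The corresponding normalization constraints, in the discrete and continuous case respectively, are:

\sum_x p(x) = 1 \qquad\qquad \int p(x)\, dx = 1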

Joint and Conditional Probability
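The standard definitions here are the joint probability, the conditional probability, and independence as a special case:

p(x, y) = p(x \mid y)\, p(y), \qquad p(x \mid y) = \frac{p(x, y)}{p(y)}, \qquad p(x, y) = p(x)\, p(y) \text{ if } X \text{ and } Y \text{ are independent}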

Law of Total Probability
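In standard notation, the law of total probability reads, in its discrete and continuous forms:

p(x) = \sum_y p(x \mid y)\, p(y) \qquad\qquad p(x) = \int p(x \mid y)\, p(y)\, dy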

Marginalization
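Marginalization, written out:

p(x) = \sum_y p(x, y) \qquad\qquad p(x) = \int p(x, y)\, dy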

Example

The marginal probability of Y is obtained by summing the joint distribution over X, and vice versa.

S — Smokers
M — Males
These two events were used in some basic examples of applying the rules of probability theory described above.

Bayes’ Rule
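Bayes’ rule itself, in standard notation:

p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}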

Bayes Rule with Background Knowledge
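With additional background knowledge z, the same rule holds with every term conditioned on z:

p(x \mid y, z) = \frac{p(y \mid x, z)\, p(x \mid z)}{p(y \mid z)}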

Conditional Independence
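Conditional independence of x and y given z means:

p(x, y \mid z) = p(x \mid z)\, p(y \mid z), \qquad \text{equivalently} \qquad p(x \mid y, z) = p(x \mid z)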

Normal Distribution

If we are living in a continuous world, we have to provide a parametric function in order to describe a probability distribution, and a very popular choice is the normal or Gaussian distribution. It has a mean and a standard deviation: the mean tells you where the peak of the distribution is, while the standard deviation tells you the width of the distribution. The more peaked the distribution, the smaller its uncertainty.
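The density described here is the one-dimensional Gaussian with mean \mu and standard deviation \sigma:

p(x) = \frac{1}{\sqrt{2\pi}\, \sigma} \exp\!\left( -\frac{(x - \mu)^2}{2\sigma^2} \right)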

Multivariate Normal Distribution
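Its d-dimensional generalization has a mean vector \mu and a covariance matrix \Sigma:

p(x) = \frac{1}{\sqrt{(2\pi)^d \det \Sigma}} \exp\!\left( -\frac{1}{2} (x - \mu)^\top \Sigma^{-1} (x - \mu) \right)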

Gaussian Mixture
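A Gaussian mixture is a weighted sum of such Gaussians, with non-negative weights that sum to one:

p(x) = \sum_k w_k\, \mathcal{N}(x;\, \mu_k, \Sigma_k), \qquad w_k \ge 0, \qquad \sum_k w_k = 1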

That’s it for now. 3D Coordinates and Representations of Rotations is next; I hope to write about it soon.

