Hi there folks. It’s been a long time since I wrote on this blog. I have been very busy with university applications. A lot has happened recently that I would love to share with you. Firstly, I got news from a friend that my book is being used at McGill University to teach Python programming. That is something I have always wanted. Continue reading Your first talk
The WordPress.com stats helper monkeys prepared a 2015 annual report for this blog.
Here’s an excerpt:
The Louvre Museum has 8.5 million visitors per year. This blog was viewed about 640,000 times in 2015. If it were an exhibit at the Louvre Museum, it would take about 27 days for that many people to see it.
Hi there folks! I am very busy nowadays, as you might have guessed from the long pauses between posts. Therefore, I am searching for guest bloggers who would like to write about Python, its frameworks, or literally anything interesting and informative related to Python. Continue reading Looking for guest bloggers
Hi there folks. I have good news for you. I have got two coupons for a video course on Udemy. The course is called “Learn Python GUI programming using Qt framework” and normally costs $79. It is taught by Bogdan Milanovich. Continue reading Free Python GUI development course
Hey folks! I am really proud to announce the completion of my very own book. After a lot of hard work and sheer determination this became possible, and “Intermediate Python” saw the light of day. It will receive updates over time 🙂 Continue reading Intermediate Python Released!
A Markov chain is a random process in which we assume the previous state(s) contain enough information to predict the next state. Unlike flipping a coin, these events are dependent. It’s easier to understand through an example.
Imagine the weather can only be rainy or sunny; that is, the state space is {rainy, sunny}. We can represent our Markov model as a transition matrix, with each row corresponding to a current state and each column giving the probability of moving to another state.
In other words, given that today is sunny, there is a 0.9 probability that tomorrow will be sunny, and a 0.1 probability that tomorrow will be rainy.
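The weather example above can be sketched in a few lines of Python. This is a minimal illustration, not code from the original post: the sunny row (0.9, 0.1) comes from the text, while the rainy row is an assumed value since the excerpt does not give it.

```python
import random

# Transition matrix: outer key is the current state, inner keys are the
# possible next states with their probabilities.
# Sunny row is from the post; the rainy row is an assumed example.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},  # assumed values
}

def simulate(start, days, rng=random):
    """Walk the chain for `days` steps, returning the sequence of states."""
    state, path = start, [start]
    for _ in range(days):
        row = transitions[state]
        state = rng.choices(list(row), weights=list(row.values()))[0]
        path.append(state)
    return path

# Simulate a week of weather starting from a sunny day.
print(simulate("sunny", 7))
```

Each step only looks at the current state, which is exactly the Markov assumption described above.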
One cool application of this is a language model, in which we predict the next word based on the current word(s). If we just predict based on the last word…
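A word-level language model of the kind described can be sketched as follows. This is an illustrative toy (the corpus string and function names are my own, not from the original post): each word maps to the list of words that follow it, and we sample the next word from that list.

```python
import random
from collections import defaultdict

def build_model(text):
    """Map each word to the list of words that follow it in the text."""
    model = defaultdict(list)
    words = text.split()
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length, rng=random):
    """Generate up to `length` words, always conditioning on the last word."""
    word, out = start, [start]
    for _ in range(length - 1):
        followers = model.get(word)
        if not followers:  # dead end: no word ever followed this one
            break
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
model = build_model(corpus)
print(generate(model, "the", 8))
```

Because duplicates are kept in the follower lists, sampling uniformly from a list reproduces the transition probabilities observed in the corpus.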
Hi guys. Recently my interview was published over at the Mouse vs Python blog, which is run by Mike. I am glad that I was able to be a part of Mike’s PyDev of the Week series. This post is not going to be technical; I am just going to use it to clear my mind.