Your first talk

Hi there folks. It’s been a long time since I wrote on this blog; I have been very busy with university applications. A lot has happened recently that I would love to share with you. Firstly, I got news from a friend that my book is being used at McGill University to teach Python programming. That is something I have always wanted. Continue reading Your first talk


2015 in review

The WordPress.com stats helper monkeys prepared a 2015 annual report for this blog.

Here’s an excerpt:

The Louvre Museum has 8.5 million visitors per year. This blog was viewed about 640,000 times in 2015. If it were an exhibit at the Louvre Museum, it would take about 27 days for that many people to see it.

Click here to see the complete report.

Free Python GUI development course

Hi there folks. I have good news for you. I have got two coupons for a video course on Udemy. The course is called “Learn Python GUI programming using Qt framework” and normally costs $79. It is taught by Bogdan Milanovich. Continue reading Free Python GUI development course

Intermediate Python Released!

Hey folks! I am feeling really proud to announce the completion of my very own book. After a lot of hard work and sheer determination this became possible, and “Intermediate Python” saw the light of day. It will receive updates over time 🙂 Continue reading Intermediate Python Released!

Generating Text Using a Markov Model

alexhwoods


A Markov chain is a random process in which we assume the previous state(s) hold enough information to predict the next state. Unlike a series of coin flips, these events are dependent. It’s easier to understand through an example.

Imagine the weather can only be rainy or sunny; that is, the state space is {rainy, sunny}. We can represent our Markov model as a transition matrix, with each row corresponding to a current state and each entry giving the probability of moving to the column’s state.

However, it’s easier to understand with this state transition diagram.

[State transition diagram for the sunny/rainy weather model]

In other words, given that today is sunny, there is a 0.9 probability that tomorrow will be sunny and a 0.1 probability that tomorrow will be rainy.
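The weather model above can be sketched in a few lines of Python. Note that the excerpt only gives the probabilities for the sunny row; the rainy row below (0.5/0.5, as in the common version of this example) is an assumption for illustration, and the function name is made up here.

```python
import random

# Transition probabilities, keyed by current state.
# The "sunny" row comes from the text; the "rainy" row is assumed.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},  # assumption, for illustration
}

def simulate(start, steps, transitions, rng=random.Random(0)):
    """Walk the chain for `steps` days and return the visited states."""
    state, path = start, [start]
    for _ in range(steps):
        states = list(transitions[state])
        weights = [transitions[state][s] for s in states]
        # Sample the next state in proportion to its transition probability.
        state = rng.choices(states, weights=weights)[0]
        path.append(state)
    return path

print(simulate("sunny", 7, transitions))
```

Because each step depends only on the current state, the whole simulation is just repeated weighted sampling from one row of the matrix.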

Text Generator

One cool application of this is a language model, in which we predict the next word based on the current word(s). If we just predict based on the last word…
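A first-order word model like the one the excerpt describes can be sketched as follows. This is not the original post’s code: the corpus string and the function names here are invented for illustration, and the approach (map each word to the words that follow it, then sample) is just the simplest version of the idea.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    chain = defaultdict(list)
    words = text.split()
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length, rng=random.Random(0)):
    """Generate up to `length` words by repeatedly sampling a follower."""
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:  # dead end: this word never appears mid-corpus
            break
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran"  # toy corpus, made up
chain = build_chain(corpus)
print(generate(chain, "the", 6))
```

Storing duplicate followers (rather than deduplicating) means frequent continuations are sampled more often, which is exactly the transition-probability weighting from the weather example.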

View original post 507 more words