
Generating Text Using a Markov Model

Originally posted on alexhwoods:


A Markov chain is a random process in which we assume the previous state (or states) holds sufficient predictive power to predict the next state. Unlike flipping a coin, successive events are dependent. It’s easier to understand through an example.

Imagine the weather can only be rainy or sunny. That is, the state space is {rainy, sunny}. We can represent our Markov model as a transition matrix, with each row corresponding to the current state and each entry in that row giving the probability of moving to the corresponding next state.

However, it’s easier to understand with a state transition diagram.


In other words, given today is sunny, there is a .9 probability that tomorrow will be sunny, and a .1 probability that tomorrow will be rainy.
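This two-state chain can be sketched in a few lines of Python. The sunny row matches the probabilities above (.9 sunny, .1 rainy); the rainy row is not given in the excerpt, so the 0.5/0.5 values below are an assumption for illustration.

```python
import random

# Transition matrix: keys are the current state, values map each
# possible next state to its probability.
transitions = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},  # from the post
    "rainy": {"sunny": 0.5, "rainy": 0.5},  # assumed for illustration
}

def next_state(current, rng=random):
    """Sample tomorrow's weather given today's state."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

# Simulate a week of weather starting from a sunny day.
state = "sunny"
week = [state]
for _ in range(6):
    state = next_state(state)
    week.append(state)
print(week)
```

Because each row sums to 1, every step of the simulation is a valid draw from the distribution conditioned only on the current state.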

Text Generator

One cool application of this is a language model, in which we predict the next word based on the current word(s). If we just predict based on the last word…
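A minimal bigram generator in that spirit might look like the sketch below. The corpus, function names, and parameters are illustrative assumptions, not code from the original post.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the corpus."""
    words = text.split()
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=10, rng=random):
    """Walk the chain, sampling each next word from its followers.
    Sampling from the raw list weights words by how often they follow."""
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:
            break  # dead end: the last corpus word has no successor
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus)
print(generate(chain, "the", length=8))
```

Conditioning only on the last word is exactly the Markov assumption from the weather example, applied to text.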



Optimise Python with closures

Originally posted on Wrong Side of Memphis:

This blog post by Dan Crosta is interesting. It discusses how Python code for operations that are called many times can be optimised by avoiding object orientation and using closures instead.

While the closures get the highlight, the main idea is a little more general: avoid repeating work that is not necessary for the operation.

Comparing the first proposed code, written in an OOP style, with the final version, the main difference is that neither the config dictionary nor the methods (which are also implemented as a dictionary) are looked up on every call. We create direct references to the values (categories and mode) instead of making the Python interpreter search self’s attributes over and over.
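The pattern can be sketched as follows. The class, function, and parameter names here are illustrative, not Dan Crosta’s actual code; the point is only the difference in how categories and mode are resolved.

```python
# OOP version: every call pays for attribute lookups on self.
class Classifier:
    def __init__(self, categories, mode):
        self.categories = categories
        self.mode = mode

    def classify(self, value):
        # self.mode and self.categories are resolved on each call
        return self.mode(value) in self.categories

# Closure version: categories and mode are bound once as closure
# variables, so the interpreter resolves them directly instead of
# searching self's attribute dictionary on every call.
def make_classifier(categories, mode):
    def classify(value):
        return mode(value) in categories
    return classify

obj = Classifier({0, 1}, mode=lambda v: v % 3)
fn = make_classifier({0, 1}, mode=lambda v: v % 3)
assert obj.classify(4) == fn(4)
```

Both versions compute the same result; the closure simply front-loads the lookups that the method repeats on every invocation.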

This yields a significant increase in performance, as described in the post (around 20%).

But why stop there? There is another clear win in terms of access, assuming that the…
