Shoematt

Matthew Shoemate

Mal3volution's channel. Whitest Kids U'Know: Sex Robot. VentertainMe's channel. Quantitative Easing Explained. SuperVideoCompiler's channel. How Liberals Argue. EAT DA POO POO. shoerob's channel. Annoying Orange: Grandpa Lemon. Boonehams' channel. Honey Bear.

Programming

Chris Smith's completely unique view : Awesome F# - Decision Trees

In my previous post I went over the theory behind the ID3 algorithm. Now that we've got all that painful math out of the way, let's write some code! Here is an implementation of the algorithm in F#. (It is also attached to this blog post; download it via the link at the bottom.) The entropy and informationGain functions were covered in my last post, so let's walk through how the actual decision tree gets constructed. There is a little work to calculating the optimal decision tree split, but with F# you can express it quite beautifully:

    let attributeWithMostInformationGain =
        attributesLeft
        |> List.map (fun attrName -> attrName, (informationGain data attrName))
        |> List.maxBy (fun (attrName, infoGain) -> infoGain)
        |> fst

First, it takes all the potential attributes left to split on:

    attributesLeft

Then it maps each attribute name to an attribute name / information gain tuple:

    |> List.map (fun attrName -> attrName, (informationGain data attrName))

Next, it selects the tuple with the highest information gain:

    |> List.maxBy (fun (attrName, infoGain) -> infoGain)

Finally, fst pulls the attribute name back out of that tuple:

    |> fst

The code is very straightforward.
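Since the entropy and informationGain definitions live in the previous post rather than here, a minimal sketch of what they could look like is shown below, assuming each training row is a Map of attribute names to values paired with a class label. The data representation and every name in this sketch are illustrative assumptions, not the post's actual code:

    // Illustrative sketch only: the real entropy / informationGain definitions
    // are in the previous article. Here a row is assumed to be a
    // (Map<string, string> * string) pair of attribute values and class label.

    // Shannon entropy of a list of class labels.
    let entropy (labels : string list) =
        let total = float (List.length labels)
        let sum =
            labels
            |> List.countBy id
            |> List.sumBy (fun (_, count) ->
                let p = float count / total
                p * System.Math.Log(p, 2.0))
        -sum

    // Information gain from splitting on attrName: the parent entropy minus
    // the size-weighted entropy of each attribute value's subset.
    let informationGain (data : (Map<string, string> * string) list) attrName =
        let total = float (List.length data)
        let parentEntropy = entropy (data |> List.map snd)
        let childEntropy =
            data
            |> List.groupBy (fun (attrs, _) -> Map.find attrName attrs)
            |> List.sumBy (fun (_, subset) ->
                (float (List.length subset) / total) * entropy (subset |> List.map snd))
        parentEntropy - childEntropy

With definitions along those lines, the split-selection pipeline above can be exercised on a made-up toy data set:

    // Hypothetical sample data: "Windy" perfectly separates the labels here,
    // so it comes out as the attribute with the most information gain.
    let data =
        [ Map.ofList [ "Outlook", "Sunny"; "Windy", "No"  ], "Play"
          Map.ofList [ "Outlook", "Sunny"; "Windy", "Yes" ], "Don't Play"
          Map.ofList [ "Outlook", "Rain";  "Windy", "Yes" ], "Don't Play"
          Map.ofList [ "Outlook", "Rain";  "Windy", "No"  ], "Play" ]

    let attributesLeft = [ "Outlook"; "Windy" ]

    let attributeWithMostInformationGain =
        attributesLeft
        |> List.map (fun attrName -> attrName, (informationGain data attrName))
        |> List.maxBy (fun (attrName, infoGain) -> infoGain)
        |> fst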