Playing with GUIs in R with RGtk2. Sometimes, when we create some nice functions that we want to show to other people who don't know R, we can do one of two things.
We can teach them R, which is not an easy task and also takes time, or we can build a GUI that lets them use these functions without any knowledge of R. This post is my first attempt to create a GUI in R. Although it can be done in many ways, we will use the RGtk2 package, so before we start you will need: I will try to show you how to build a GUI by example. I want to make an application that works like a calculator. First we need to make a window and a frame:

window <- gtkWindow()
window["title"] <- "Calculator"

frame <- gtkFrameNew("Calculate")
window$add(frame)

It should look like this: Now, let's make two boxes.
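A minimal sketch of how the calculator might continue from here. Only `gtkWindow()` and `gtkFrameNew()` come from the post itself; the box, entry, button, and signal handler below are my own assumptions about where the example is headed:

```r
library(RGtk2)

window <- gtkWindow()
window["title"] <- "Calculator"
frame <- gtkFrameNew("Calculate")
window$add(frame)

box <- gtkVBoxNew(FALSE, 5)     # vertical box to stack the widgets
frame$add(box)

entry <- gtkEntryNew()          # field where the user types, e.g. "2+2"
box$packStart(entry)

button <- gtkButtonNewWithLabel("Calculate")
box$packStart(button)

# When the button is clicked, evaluate the typed expression and
# write the result back into the entry field
gSignalConnect(button, "clicked", function(widget) {
  result <- eval(parse(text = entry$getText()))
  entry$setText(as.character(result))
})
```

Using `eval(parse(...))` keeps the sketch short; a real calculator would validate the input first.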
Simple UI in R to get login details. Occasionally I have to connect to services from R that ask for login details, such as databases.
I don't like to store my login details in the R source code file; instead I would prefer to enter my login details when I execute the code. Fortunately, I found some old code in a post by Barry Rowlingson that does just that. It uses the tcltk package in R to create a little window in which the user can enter her details, without showing the password. The tcltk package is part of base R, which means the code will run on any operating system. Nice! Mechanical Turk Workflow with MTurkR. I've been using Thomas Leeper's MTurkR package to administer my most recent Mechanical Turk study—an extension of work on representative-constituent communication claiming credit for pork benefits, with Justin Grimmer and Sean Westwood.
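The tcltk login window described above can be sketched roughly as follows (the function and variable names here are my own, not Barry Rowlingson's original code):

```r
library(tcltk)

get_login <- function() {
  tt <- tktoplevel()
  tkwm.title(tt, "Login")
  user <- tclVar("")
  pass <- tclVar("")
  tkgrid(tklabel(tt, text = "Username:"),
         tkentry(tt, textvariable = user))
  # show = "*" masks the password as it is typed
  tkgrid(tklabel(tt, text = "Password:"),
         tkentry(tt, textvariable = pass, show = "*"))
  done <- tclVar(0)
  tkgrid(tkbutton(tt, text = "OK",
                  command = function() tclvalue(done) <- 1))
  tkwait.variable(done)   # block until the OK button is pressed
  tkdestroy(tt)
  list(user = tclvalue(user), password = tclvalue(pass))
}

# credentials <- get_login()
```

Because nothing is hard-coded, the source file can be shared or committed without leaking credentials.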
MTurkR is excellent, making it quick and easy to: Programming a Twitter bot in R. A considerable share of Twitter accounts is not actually run by humans.
According to a recent release by Twitter, "up to approximately 8.5%" of the active users are bots or third-party software that automatically aggregates tweets. Bots can follow other users, retweet content or post content on their own. What they say is essentially generated by scripts. Take @TwoHeadlines, for example.
The bot, hosted by Darius Kazemi, scrapes headlines from Google News and replaces one of the nouns with another trending noun, which generates hilarious and sometimes thought-provoking tweets. If you are familiar with R, such projects are well within your reach. Step 1: Create content for the bot's tweets. The PhD whipping bot is inspired by @indiewhipbot, an equally tedious contemporary who pushes indie game developers back to work by shouting orders and closing with a mild insult. Dave Harris on Max Like Est. At our last Davis R Users’ Group meeting of the quarter, Dave Harris gave a talk on how to use the bbmle package to fit mechanistic models to ecological data.
Here’s his script, which I ran through the spin function in knitr: Fitting a Model by Maximum Likelihood. Maximum-Likelihood Estimation (MLE) is a statistical technique for estimating model parameters.
It basically sets out to answer the question: what model parameters are most likely to characterise a given set of data? First you need to select a model for the data. Printing R help files in the console or in knitr documents. Yesterday, I was creating a knitr document based on a script, and was looking for a way to include content from an R help file.
The script, which was a teaching document, had a help() command for when the author wanted to refer readers to R documentation. I wanted that text in my final document, though. There’s no standard way to do this in R, but with some help from Stack Overflow and Scott Chamberlain, I figured out I needed some functions hidden in the depths of the tools package. So I wrote this function: help_console prints the help file to the console or lets you assign the help file text to a character vector. help_console(optim, "html", lines = 1:25, before = "<blockquote>", after = "</blockquote>") Description General-purpose optimization based on Nelder–Mead, quasi-Newton and conjugate-gradient algorithms. The function is part of my noamtools package on GitHub, where I keep various convenience functions. A Null Model for Age Effects in Disease with Multiple Infections. Here’s a little thought exercise I did that has caused me to go back and restart my Sudden Oak Death modeling in a new framework.
Feedback welcome. I’m especially interested in relevant literature – I haven’t found many good examples of macroparasite/multiple infection models with age structure. Introduction Cobb et al. (2012) develop two models of forest stand demography in the face of Sudden Oak Death. The first, a statistical survival model, estimated the rates of infection and time-to-mortality as functions of density of infected trees and tree size. Optimization - How to optimize for integer parameters (and other discontinuous parameter space) in R. Automated Archival and Visual Analysis of Tweets. ## Most of this code was adapted near-verbatim from Neil's post about ISMB 2012. ## ## Modify this.
The Power of the Evaluate, Parse, Paste Combo. # The Power of the Evaluate, Parse, Paste combination # One of the powerful features of Stata that I have missed the most when working with R is its macros, which allow the user to construct bits of Stata code from anything and combine them into commands or variable names. # I know many a programmer who has little use for Stata might sneer at this ability, since it seems to imply some kind of laziness or lack of precision.
However, I have found myself on many an occasion forced to use inefficient structures when coding in R that could easily have been simplified in Stata. # At last I have stumbled upon a solution! # By combining the commands eval(parse(text=paste("String Command"))) I am able to do exactly what I want. Poor man’s integration – a simulated visualization approach – R is my friend. Every once in a while I encounter a problem that requires the use of calculus. This can be quite bothersome since my brain has refused over the years to retain any useful information related to calculus. Most of my formal training in the dark arts was completed in high school and has not been covered extensively in my post-graduate education. Fortunately, the times when I am required to use calculus are usually limited to the basics, e.g., integration and derivation.
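The "simulated" approach to integration the post's title alludes to can be as simple as Monte Carlo integration: throw uniform random points at a bounding box and count the fraction that land under the curve. The example below is my own illustration, not the post's code:

```r
# Estimate the area under the standard normal density between -1 and 1
# by rejection sampling inside the bounding box [-1, 1] x [0, dnorm(0)]
set.seed(123)
n <- 100000
x <- runif(n, -1, 1)
y <- runif(n, 0, dnorm(0))        # dnorm(0) is the curve's maximum height

box_area <- 2 * dnorm(0)          # width * height of the bounding box
est <- mean(y < dnorm(x)) * box_area
est                               # should be close to pnorm(1) - pnorm(-1)
```

No calculus required, and plotting the accepted points doubles as a visualization of the integral.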
P-values are (possibly biased) estimates of the probability that the null hypothesis is true. Last week, I posted about statisticians’ constant battle against the belief that the p-value associated (for example) with a regression coefficient is equal to the probability that the null hypothesis is true, for a null hypothesis that beta is zero or negative. I argued that (despite our long pedagogical practice) there are, in fact, many situations where this interpretation of the p-value is actually the correct one (or at least close: it’s our rational belief about this probability, given the observed evidence).
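One way to see the argument is with a toy simulation (my own illustration, not the post's code): when true nulls and real effects are equally likely a priori, the fraction of true nulls among tests whose p-values land near some observed value is itself an estimate of the probability the null is true given that evidence.

```r
set.seed(42)
n_sim <- 20000
null_true <- runif(n_sim) < 0.5        # half the nulls are true a priori
effect <- ifelse(null_true, 0, 0.5)    # effect size when the null is false

# One-sample t-test on n = 30 observations for each simulated experiment
pvals <- vapply(effect,
                function(d) t.test(rnorm(30, mean = d))$p.value,
                numeric(1))

# Estimated P(null true | p-value near 0.05)
mean(null_true[pvals > 0.03 & pvals < 0.07])
```

Changing the prior share of true nulls, the effect size, or the sample size shifts this conditional fraction, which is exactly why the interpretation is an estimate rather than an identity.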
Download Files from Dropbox Programmatically with R. Normal distribution functions. Ah, the Central Limit Theorem. The basis of much of statistical inference and how we get those 95% confidence intervals. It's just so beautiful! Lately, I have found myself looking up the normal distribution functions in R. Sports Data and R. F1Stats – Visually Comparing Qualifying and Grid Positions with Race Classification. [If this isn't your thing, the posts in this thread will be moving to a new blog soon, rather than taking over OUseful.info...] Following the roundabout tour of F1Stats – A Prequel to Getting Started With Rank Correlations, here’s a walk through of my attempt to replicate the first part of A Tale of Two Motorsports: A Graphical-Statistical Analysis of How Practice, Qualifying, and Past Success Relate to Finish Position in NASCAR and Formula One Racing.
Specifically, a look at the correlation between various rankings achieved over a race weekend and the final race position for that weekend. So… let’s finally get started. The data I’ll be using for now is pulled from my scraper of the FormulaOne website (personal research purposes, blah, blah, blah;-). If you’re following along, from the Download button grab the whole SQLite database and save it into whatever directory you’re going to be working in, in R… When did “How I Met Your Mother” become less legen.. wait for it… …dary! Or, as you’ll see below, when did it become slightly less legendary?
The analysis in this post was inspired by DiffusePrioR’s analysis of when The Simpsons became less Cromulent. Parameter Fitting for Models Involving Differential Equations. MATLAB, Octave and Python seem to be the preferred tools for scientific and engineering analysis (especially those involving physical models with differential equations). SIR Model - The Flu Season - Dynamic Programming. Stats and things: Visualizing Bus Stops with rCharts. I wanted to create a quick visualization of Bloomington IL bus stops. This data is in pdf file format spread across multiple files. The first step, before any mapping can occur, is to download those files and parse them to get the bus stop locations and times.
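Once the stop coordinates are extracted from the PDFs, getting them onto a map is only a few lines of rCharts. The coordinates and stop names below are made up for illustration; only the rCharts/Leaflet approach comes from the post:

```r
library(rCharts)

# Hypothetical stop data standing in for the parsed PDF output
stops <- data.frame(lat  = c(40.484, 40.478),
                    lon  = c(-88.993, -88.960),
                    name = c("Example stop A", "Example stop B"))

map <- Leaflet$new()
map$setView(c(40.48, -88.99), zoom = 13)   # center on Bloomington, IL
for (i in seq_len(nrow(stops))) {
  map$marker(c(stops$lat[i], stops$lon[i]), bindPopup = stops$name[i])
}
map
```

Each marker gets a popup with the stop name, so the same frame could carry arrival times once the PDF parsing is done.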