Finding the ROI of Title tag changes using Google's CausalImpact R package

After a conversation on Twitter about this new package, and mentioning it in my recent MeasureCamp presentation, here is a quick demo on using Google's CausalImpact applied to an SEO campaign.

CausalImpact is a package that provides statistical backing for changes you may have made in a marketing campaign.  It examines the time series of data before and after an event, and gives you an idea of whether any changes were down to random variation, or whether the event actually made a difference.

You can now test this yourself in my Shiny app that automatically pulls in your Google Analytics data so that you can apply CausalImpact to it.   This way you can A/B test changes for all your marketing channels, not just SEO.  However, if you want to try it manually yourself, keep reading.

Considerations before getting the data

Suffice to say, it should only be applied to time-series data (i.e. there is a date or time on the x-axis), and it helps if the event was rolled out at only one of those time points.  This may influence the choice of time unit: if, say, the change rolled out over a week, it's probably better to use weekly data exports.  Also consider the time period you choose.  The package uses the time series before the event to construct what it thinks should have happened versus what actually happened, so anything unusual, such as spikes in the pre-event period, may affect your results.

Metrics-wise, the example here uses visits.  You could perhaps do it with conversions or revenue, but then you may be affected by factors outside of your control (the buy button breaking, etc.), so for clean results try to remove as many confounding variables as possible.

Example with SEO Titles

In my case, some title tag changes went live on a single day, so I could compare the SEO traffic before and after to judge whether they had any effect and, more importantly, estimate how much extra traffic they generated.

I pulled in the data with my go-to GA R import library, rga by Skardhamar.


I first set up, importing the libraries if you haven't got them already, and authenticating the GA account you want to pull data from.
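The setup could look like this - a minimal sketch assuming the rga package from GitHub; the instance name "ga" and the install commands are illustrative:

```r
## install once if needed (rga lives on GitHub, not CRAN)
# install.packages("devtools")
# devtools::install_github("skardhamar/rga")
# install.packages("CausalImpact")

library(rga)
library(CausalImpact)

## opens a browser to authenticate with GA, and creates the 'ga' object
rga.open(instance = "ga")
```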

Import GA data

I then pull in the data for the time period covering the event: SEO visits by date.
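A sketch of that query - the profile ID, date range and organic-medium filter are placeholders, not my actual values:

```r
## SEO (organic) visits by date; replace the view ID and dates with your own
gadata <- ga$getData(ids = "ga:12345678",
                     start.date = "2014-01-01",
                     end.date = "2014-08-31",
                     metrics = "ga:visits",
                     dimensions = "ga:date",
                     filters = "ga:medium==organic")
```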

Apply CausalImpact

In this example, the title tags were updated on the 200th day of the time period I pulled.  I want to examine what happened over the next 44 days.
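Applied to the data above, the call might look like this - a sketch assuming `gadata` holds one row per day, as pulled earlier:

```r
## days 1-200 are the pre-period, days 201-244 the post-period
pre.period  <- c(1, 200)
post.period <- c(201, 244)

impact <- CausalImpact(gadata$visits, pre.period, post.period)
```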

Plot the Results

With the plot() function you get output like this:

  1. The left vertical dotted line marks the start of the period used to estimate what should have happened.
  2. The right vertical dotted line is the event itself. (SEO title tag update)
  3. The original data you pulled is the top graph.
  4. The middle graph shows the estimated impact of the event per day.
  5. The bottom graph shows the estimated impact of the event overall.

In this example it can be seen that after 44 days there are an estimated 90,000 extra SEO visits from the title tag changes. This can then be used to work out the ROI of that change over time.

Report the results

The $report method gives you a nice overview of the statistics in verbose form, to help qualify your results.  Here is a sample output:
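A sketch of producing that report from the `impact` object above:

```r
## the verbose prose write-up of the analysis
summary(impact, "report")

## the same text is also stored on the result object
cat(impact$report)
```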

"During the post-intervention period, the response variable had an average value of approx. 94. By contrast, in the absence of an intervention, we would have expected an average response of 74. The 95% interval of this counterfactual prediction is [67, 81]. Subtracting this prediction from the observed response yields an estimate of the causal effect the intervention had on the response variable. This effect is 20 with a 95% interval of [14, 27]. For a discussion of the significance of this effect, see below.

Summing up the individual data points during the post-intervention period (which can only sometimes be meaningfully interpreted), the response variable had an overall value of 4.16K. By contrast, had the intervention not taken place, we would have expected a sum of 3.27K. The 95% interval of this prediction is [2.96K, 3.56K].

The above results are given in terms of absolute numbers. In relative terms, the response variable showed an increase of +27%. The 95% interval of this percentage is [+18%, +37%].

This means that the positive effect observed during the intervention period is statistically significant and unlikely to be due to random fluctuations. It should be noted, however, that the question of whether this increase also bears substantive significance can only be answered by comparing the absolute effect (20) to the original goal of the underlying intervention.

The probability of obtaining this effect by chance is very small (Bayesian tail-area probability p = 0.001). This means the causal effect can be considered statistically significant."

Next steps

This could then be repeated for things like UX changes, TV campaigns, etc. You just need the time of the event and the right metrics or KPIs to measure them against.

The above is just a brief intro; there is a lot more that can be done with the package, including custom models.  For more, see the package help file and documentation.

My Google Webmaster Tools Downloader app

Here is a tool that I have used for SEO analytics, that I am now making publicly available. It extends Google Webmaster Tools to help answer common SEO questions more easily.

Visit the Google Webmaster Tools Downloader

Here are a few example questions it helps answer:

  • SEO keyword rankings taking into account personalisation and localisation for Google, in this age of (not provided)
  • SEO keyword performance beyond the 90 days available by default e.g. year on year comparisons
  • How a segment of keywords have performed over time e.g. brand vs non-brand
  • How click through rates change over time e.g. after a website migration.
  • How new/old website sections perform in Google search via the Top Pages reports

These things were a lot easier before (not provided) took keywords out of web analytics.  That left Google Webmaster Tools as the only reliable source of rankings, but it is not an ideal replacement, with limitations that need to be worked around by downloading data via an API - an API that rarely gets updated.

I'm aware this app could quickly become obsolete if Google updated GWT, but it has also served as a great project for me to get to know working with App Engine, jinja2 templating, Google Charts, caching, Stripe, Bootstrap, etc., so it's all been worthwhile.  I think I can safely say it's been the most educational project I've done, and it can serve as a template for more sophisticated APIs (the Google Tag Manager API is in my sights).

It's also my first app that will be charged for, simply because keeping a daily breakdown of keywords in a database carries a cost, which is probably why Google doesn't offer it for free at the moment. There are other web apps on the market that do downloads for free, but I am wary of those, by the adage "if you don't pay for a service, you pay with your data".

I plan to follow it up with deeper features, including Tableau examples of what you can do with this data once you have it at such a granular level.

For now, if you want to sign up to test the alpha, please check out the signup page here

My Weekend Away To London at #MeasureCamp

This weekend I was in London for MeasureCamp, which is an analytics (un)conference with the emphasis on practical advice and knowledge sharing.  You can follow the tweets on their hashtag here: #MeasureCamp

It was my first conference in 5 years, since my general opinion of conferences is that they are largely a waste of time apart from meeting up with people.  But my tweetstream was filled with talk of this new conference format: no sponsored content; all the sessions provided by the attendees; "the law of two feet" encouraging a culture where you can walk out of a session if it's too boring; and free food.  Since I'm always hungry, and the attendees are all world experts in their fields, the content looked like it would be top-rate.

After the morning introduction, there was a scrum to get your session up on the board.

Again from Twitter feedback, I was encouraged to prepare a practical session on using R as a digital analyst.  My session in the morning was a bit nerve-racking, but I managed to finish on time and have had great feedback - thanks very much to all who attended.  My presentation is here: Using R in a digital analytics workflow

I wish I could have attended more of the sessions, such as Simo's GTM wizardry or @fastbloke's Embed API session, but in the ones I did attend I generally got inspired or reassured about what we're doing at work.  One thing I'd have liked to see more of was how people are approaching attribution, but it's a pretty old subject and maybe people were a bit loath to do sessions on it.

The venue was great, and free food and beer were provided.  Lots of rooms, big and small, with a central area to bump into people.

My quick review of my sessions:

  • Unifying Customer Data - this was probably my worst session of the day, as it seemed a bit salesy and it wasn't very clear what the takeaways were.  I left early to prepare for my session.
  • Custom dimensions and metrics - a discussion on what people were using custom metrics/dimensions for.  No real surprises.
  • From Anonymous to Identify - an interesting discussion on whether, once you ID a user, you could use older data on that same cookie to reliably make offers to the user.  You probably could if you don't need a 100% match, but for making financial offers you really need that 100% authentication post-login.
  • Backroom to boardroom - probably the most useful takeaways of the day for me, on how to communicate the tools of analytics to the C-suite to get investment.
  • Retargeting discussion - comparing notes on how others are utilising remarketing.  It was a bit focused on GA segment remarketing lists and not other platforms such as AdForm, but interesting to hear what people are doing.
  • Pings to Predictions - a talk by a tech team on how they are making a real-time, machine-learning behavioural analytics tool.  Very cool, and a nice team from Latvia whom I met in the pub the night before and afterwards.

But the sessions were just a part of it; around them was a lot of meeting up with people I've known digitally for a long time but never met face-to-face, old friends, and new friends from Poland, France, Denmark and the UK.

All in all I would recommend going, and look forward to being able to attend again.  Thanks to all the #MeasureCamp team on their hard work and enthusiasm :)

Run R, RStudio and OpenCPU on Google Compute Engine [free VM image]

File this under "what I wished was on the web whilst trying to do this myself."

edit 20th November, 2016 - now everything in this post is abstracted away and available in the googleComputeEngineR package - I would say its a lot easier to use that.  Here is a post on getting started with it.

edit 30th April, 2016: I now have a new post up on how to install RStudio Server on Google Compute Engine using Docker, which is a better way to do it. 

edit 30th Nov, 2015: Oscar explains why some users couldn't use their username

edit 5th October: Added how to login, add users and migrated from gcutil to gcloud

Google Compute Engine is a very scalable and quick alternative to Amazon Web Services, but a bit less evolved in the images available to users.

If you would like to have a VM with R 3.0.1, RStudio Server 0.98 and OpenCPU installed, then you can click on the link below, and install a pre-configured version for you to build upon.

With this image, you have a cloud server with the most popular R / Cloud interfaces available, which you can use to apply statistics, machine learning or other R applications on web APIs.  It is a fundamental building block for a lot of my projects.

The VM image is here. [940.39MB]

To use, follow these steps:

Downloading the instance and uploading to your project

  1. Create your own Google Cloud Compute project if you haven't one already.
  2. Put in billing details.  Here are the prices you'll pay for running the machine; it's usually under $10 a month.
  3. Download the image from the link above (and here) and then upload it to your own project's Cloud Storage. Details here
  4. Add the uploaded image to your project with a name made up only of lowercase letters, numbers or hyphens (-).  Details here. You can do this using gcloud by typing: 
$ gcloud compute images create IMAGE_NAME --source-uri URI

Creating the new Instance

  1. Now go to Google Compute Engine, and select Create New Instance
  2. Select the zone, machine type you want (i.e. you can select a 50GB RAM machine if needed for big jobs temporarily)
  3. In the dropdown for images you should be able to see the image from step 4 above.  Here is a screenshot of how it should look, I called my image "r-studio-opencpu20140628"

Or, if you prefer using command line, you can do the steps above in one command with gcloud like this:

$ gcloud compute instances create INSTANCE [INSTANCE ...] --image IMAGE

Using your instance

You should now have RStudio running at http://your-ip-address/rstudio/, OpenCPU running at http://your-ip-address/ocpu/test, and a welcome homepage at the root http://your-ip-address

To log in, your Google username is an admin, since you created the Google Cloud project. See here for adding users to Google Cloud projects

If you don't know your username, try this command using gcloud to see your user details:

$ gcloud auth login

Any users you add to the Debian instance will also get a user in RStudio - to log into Debian and add new users, see below:

$ ## ssh into the running instance
$ gcloud compute ssh <your-username>@new-instance-name
$ #### It should now tell you that you are logged into your instance #####
$ #### Once logged in, add a user: example with jsmith
$ sudo useradd jsmith
$ sudo passwd jsmith
$ ## give the new user a directory and change ownership to them
$ sudo mkdir /home/jsmith
$ sudo chown jsmith:users /home/jsmith

Oscar in the comments below also explains why sometimes your username may not work:

Like other comments, my username did not work.

Rather than creating a new user, you may need to simply add a password to your user account:

$ sudo passwd .

Also, the username will be your email address with the '.' replaced with '_'; so, for example, xx.yy became xx_yy

You may also want to remove my default user the image comes with:

$ sudo userdel markedmondson

...and remove my folder:

$ sudo rm -rf /home/markedmondson

The configuration used

If you would like to look before you leap, or prefer to install this yourself, a recipe is below. It largely cobbles together the instructions around the web supplied by these sources:

Many thanks to them.

It covers installation on the Debian Wheezy images available on GCE, with the necessary backports:

How To Use R to Analyse and Plot Your Twitter Use

Here is a little how-to if you want to use R to analyse Twitter.  This is the first of two posts: this one covers the How, the second will cover the Why.

If you follow all the code you should be able to produce plots like this:

As with all analytics projects, it's split into four different aspects: 1. getting the data; 2. transformations; 3. analysing; 4. plotting.

All the code is available on my first public github project:

I did this project to help answer an idea: can I tell by my Twitter when I changed jobs or moved country?

I have the feeling that the more I do SEO, the more I rely on Twitter as an information source; whereas for Analytics it's more independent research, which takes place on StackOverflow and Github. Hopefully this project can see if this is valid.

1. Getting the data

R makes getting tweets easy via the twitteR package.  You need to install that, register your app with Twitter, then authenticate to get access to the Twitter API.
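A minimal sketch of that setup, assuming a newer twitteR version with setup_twitter_oauth(); the keys and screen name are placeholders from your own Twitter app settings:

```r
library(twitteR)

## hypothetical credentials - use your own app's keys from dev.twitter.com
setup_twitter_oauth(consumer_key    = "YOUR_KEY",
                    consumer_secret = "YOUR_SECRET",
                    access_token    = "YOUR_TOKEN",
                    access_secret   = "YOUR_TOKEN_SECRET")

## fetch up to the API's 3200-tweet limit and flatten to a data.frame
tweets   <- userTimeline("your_screen_name", n = 3200)
tweetsDF <- twListToDF(tweets)
```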

An alternative to the API is Twitter's data export, which lets you go beyond the API's 3,200-tweet limit. This gives you a CSV which you can load into R using read.csv().

2. Transforming the data

For my purposes, I needed to read the timestamps of the tweets, and put them into early, morning, afternoon and evening buckets, so I could then plot the data.  I also created a few aggregates of the data, to suit what I needed to plot, and these dataframes I outputted from my function in a list.

Again, as with most analytics projects, this section represents most of the work, with to-ing and fro-ing as I tweaked the data I wanted in the chart.  One tip I've picked up is to do these data transformations in a function that takes the raw data as input and outputs your processed data, as this makes it easier to repeat for different data inputs.
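A sketch of that kind of transformation function - the bucket boundaries and the tweetTD aggregate are illustrative, not the exact code from the repo:

```r
## assumes a data.frame with a POSIXct 'created' column, as from twListToDF()
transformTweets <- function(tweetsDF) {
  hour <- as.numeric(format(tweetsDF$created, "%H"))
  tweetsDF$dayPart <- cut(hour,
                          breaks = c(-1, 5, 11, 17, 23),
                          labels = c("Early", "Morning", "Afternoon", "Evening"))
  tweetsDF$yearWeek <- format(tweetsDF$created, "%Y-%W")

  ## aggregate tweet counts per week and day part
  tweetTD <- aggregate(list(tweets = tweetsDF$text),
                       by = list(yearWeek = tweetsDF$yearWeek,
                                 dayPart  = tweetsDF$dayPart),
                       FUN = length)

  list(raw = tweetsDF, tweetTD = tweetTD)
}
```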

3. Analysing the data

This will be covered in the second post, and is usually the point of the whole exercise - it only takes about 10% of the time on the project, but is the most important part.

4. Plotting the data

This part evolves as you go to and fro between steps 2-3, but what I ended up with were the functions below.

theme_mark() is a custom ggplot2 theme you can use if you want the plots to look exactly the same as above, or at the very least to see how to customise ggplot2 with your own fonts/colours.  It also uses choosePalette() and installFonts(). "mrMustard" is my name for the colour scheme chosen.

I use two layers in the plot - one is the area plot to show the total time spent per Day Part, the second is a smoother line to help pick out the trend better for each Day Part.
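The two layers could be sketched like this, assuming a tweetTD data.frame with a week Date column, a tweets count and a dayPart factor (the column names are illustrative):

```r
library(ggplot2)

ggplot(tweetTD, aes(x = week, y = tweets, colour = dayPart, fill = dayPart)) +
  geom_area(alpha = 0.5, position = "stack") +  # total tweets per day part
  geom_smooth(se = FALSE, method = "loess")     # trend line per day part
```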

plotTweetsDP() takes as input the tweetTD (weekly) or tweetTDm (monthly) dataframes, and plots the daypart dataframe produced by the transformations above.  The timeAxis parameter expects "yw" (yearWeek) or "ym" (yearMonth), which it uses to make the x-axis better suited to each.

plotLinksTweets() is the same, but works on the tweetLinks dataframe.

I hope this is of some use to someone - let me know in the comments!  Also, any ideas on where to go from here are welcome - at the moment I'm working through some text-mining packages to try and get something useful out of those.

Again the full project code is available on Github here:

Five Things The UK Could Learn From Denmark, And Vice Versa

I first arrived in Denmark October 1st, 2010, so I guess I'm qualified enough now to see the highs and lows of living in Denmark versus the UK.  I thought I'd list them here, if only for you to see what my Danish experience is like day to day.

Five Things The UK Should Learn From Denmark

I apologise if the below already exist in the UK, it'll be mostly because I came from a small town in deep dark Cornwall, and civilization hasn't reached us down there yet.

  1. A single duvet for each person in a double bed.  This was a revelation.  No more bed wars fighting for your corner of a compromised duvet, your own as you like it but close enough to cuddle your partner.
  2. Supermarket logistics.  First, you can have a shopping basket with little wheels that you pull around behind you, like a shopping trolley but without the commitment. Second, once you arrive at the checkout there is a slidey thing that means you can pack your shopping whilst the customer behind you is served.  Genius. (edit: have been reliably informed by Steph that the slidey thing is just lacking in Cornish supermarkets.  Oh well.)
  3. No cash.  Everything is paid for by your Dankort (a debit card).  Even things like a coffee or papers etc.  No minimum charges, no whinging from the shopkeeper about transaction costs.
  4. A commitment to bicycles.  50% of people in Copenhagen commute by bicycle every day.  I have an hour of enforced exercise a day, no road rage stress, I breathe clean air, buildings aren't caked in soot, and the dull roar of cars is absent.  I think this is a major contributor to why Denmark is voted the happiest country in the world.
  5. Work/Life balance.  Soon after I arrived from the UK and was working in the office, I found myself confused at around 4pm one Friday.  Looking around, I realised the office was empty.  People had gone home to eat dinner with their families.  Most had arrived at 8am after taking their kids to school.  I admit this isn't a habit I have yet cultivated, but its implications for the happiest country in the world are obvious.  It's not to say that the Danes are work-shy: there is more personal responsibility given to people to finish their tasks, so they will work late when needed, just not to a prescribed 9-5.  In general there are more "flat" management structures.

Five Things Denmark Should Learn From The UK

Obviously I like living in Denmark, as I'm still here, but the honeymoon and disillusionment periods all expats seem to have mean I can see where improvements could be made, if not just for me then for all Danes:

  1. Banter with strangers.  In my first months I mistakenly sat in a cafe (the closest I could find to an English pub) with the hope that if I frequented it enough, I could strike up conversations with the locals.  All I got was embarrassment from the waiters and uncomfortable glances.  The way to meet people in Denmark is to arrange common-interest groups; you can't just rock down to a local pub, as the equivalent isn't frequented by normal people, and even dropping in on a friend without pre-arrangement is uncomfortable for Danes.
  2. Queueing. It's most marked when flying to and from the UK.  In the UK, the queue for passport control is based on order of arrival, and it doesn't matter whether you're old, young, or English.  When you arrive in Denmark, the queue is based on elbows, size, whether you're Danish, and sheer determination. Danes will even start a new mini-queue heading off the normal one that the English try to start, just to avoid an extra 2 minutes' wait. Danish shops have to employ ticketing systems, just to avoid the inevitable ruckus if queueing were left to politeness.
  3. Animal Welfare.  The debate in the UK on vegetarianism and fur is light years ahead of Denmark's.  This may have been smokescreened by the giraffe incident, where people got upset about lions doing something they do every day, done with animal welfare in mind regarding genetics.  The real story is that Denmark is one of the biggest exporters of fur in the world, as well as the biggest provider of bacon.  Factory farming is prevalent, with techniques such as sow stalls banned in the UK but only just becoming law in Denmark due to EU rulings.
  4. Ease of starting a small business.  My wife is starting up as a games designer, but often comes up against what seem to be nonsensical blocks to my capitalism-soaked British brain.  I have often piped up with suggestions that would work in the UK, but not in Denmark.  It covers things like VAT registration, which is obligatory in Denmark for any business(?), whereas the UK requires a turnover of £70k, and tax credits - Denmark actually seems to encourage staying on welfare.
  5. ......Racial Tolerance? Up until I left the UK I would have ranked it over Denmark, being proud of the UK's integration of cultures from around the world.  But now, with the rise of UKIP and the rhetoric I read from over the channel, I question whether this value is still part of being British.  I hope so.  The Danes are very protective of being Danish, which for a country of 5.5 million is understandable, but Denmark is also paradoxically suffering from a declining population and shrinking workforce, whilst at the same time making it hard for people outside the EU to become Danish: in 2012, 3,000 people became Danish citizens, compared with 194,000 who became British citizens.  Dual citizenship is not an option (yet) but is being considered, and I will get it if I can.  Citizens of the EU have an easier time, so I've been OK, but from those outside the EU I've heard of nightmare scenarios, even with good jobs and fluent Danish.  My personal concern is that if the UK talks itself out of the EU my status may be questioned, which in this age of integration seems a step backwards - surely as the world gets more global, nationalism becomes less and less relevant?

What would you pick, if you had to?  Whether you're an expat or not, I'd be interested - seeing people's perceptions of the differences between Denmark and the UK is as interesting as hearing from those with direct experience of both.

    Goodbye, NetBooster

    After 400 weeks, 2800 days, 7.67 years and 3 re-brands: I am moving on.

    I've witnessed the company go from a local SEO agency to an international digital agency, and it's been an experience that has changed my life for the better in many ways. I'd like to thank all the people who made it possible along the way.

    Trying to condense 7.67 years' worth of gratitude into one blog post is an impossible task, but I'd like to at least try to chart my journey below, as every one of these people has helped shape where I am now (very sorry if I leave anyone out!!):

    • August 2007 - join Neutralize as an "Internet Assistant" in the Tolvaddon Energy Park office, Camborne.
    • I really needed a job as I had been in a band for the past year and had no income, so I turned up for work in my flip-flops and in my big brown VW van, which I had to use all my wages to fill with the diesel to get to work.  But I met a team of people working in this strange field of internet marketing which offered almost boundless opportunities.  From there, I'd like to thank Lucy and Janine for hiring me in the first place, Teddie for infusing the enthusiasm and expertise he would offer over the years afterwards, Mark, Ingo, Chris, Andrew, Adam, Stuart, Martin, Nigel and John for making me feel welcome.
    • May 2008 - Neutralize becomes Guava, and we move into the Nordics. Special fun times for the Falmouth carpoolers: Ingo, Luke, Lotte and the legend that is Daz.  New clients, new Google algos, new adventures with Gary, Rachel, Lotte, Paul, the Toms: Telf, Wigley, Birmingham and Bailey, Hug, Sam, Ugo, George, Will, Clemi, Dan and John - rest in peace. 
    • September 2010 - my first visit to Denmark for a 3 month placement, where I first experience hygge. Many thanks to Morten for asking me over, the great SEO team of Kristoffer N and Erik, and to Kristoffer E, Dan, Ann-Sofie, Kasper, Hans Peter, Christian K and P, Karen, Marie-Louise, Sidse and Andreas for making me feel welcome enough that I wanted to come back
    • July 2011 - I go part-time to focus on long term projects on my days off, something which was one of my best life decisions. I work as Social Media and Analytics Manager in the UK.  I move to Denmark again for a 9 month placement, but not before meeting Amy, Eve, Tim, Matt, Charlie, Sian, Charlotte, Mandy, Hollie, Alan, Lyndsey, Jowita, Emmanuel and Peter.  In Denmark great working with Line, Mads, Jens, Hans-Jørgen, Christian, Michaela, Ewa, Martin and Katrine
    • March 2012 - Guava becomes NetBooster. I join the NetBooster DNA team as an analytics consultant, headed up by Kristoffer Ewald, with the world-class experts Christian Pluzek and Dan Pahlen, where I learn lots. Helle, Alun and Mia add to a team that I will miss, and one that made the decision to move very hard. Not forgetting the French team whom I had the pleasure to work with: Thomas, Pierre, Emmanual, Vania and Jerome.

    I'm sorry if you're not mentioned above and I have worked with you - I haven't mentioned the cool clients and the international offices, but if you see the list above I hope you can see why. 

    Looking back over the list of names, I can see that the majority are off on their own ventures around the world, and I get a sense of pride, like you are members of a family sent out into the world.  I would very much like to hear from anyone - do let me know how you're doing if we haven't spoken in a while.  I'll still be around on SoMe, so do get in touch on LinkedIn, Facebook, Twitter or G+ if any of them are your thing.

    My die is cast: I'm off for new adventures; but I hope the people I meet are half as lovely as those I have met.

      My Google Analytics Time Series Shiny App (Alpha)

      There are many Google Analytics dashboards like it, but this one is mine:

      My Google Analytics Time Series App

      It's a bare-bones framework where I can start to publish publicly some of the R work I have been learning over the past couple of years. 

      It takes advantage of an alpha of Shinyapps, a public hosting offering for R Shiny, which I love and adore. 

      At the moment the app has just been made to authenticate and show some generic output, but I plan to create a lot more interesting plots/graphs from it in the future.

      How To Use It

      1. You need a Google Analytics account.  
      2. Go to
      3. You'll see this screen.  Pardon the overly heavy legal disclaimers, I'm just covering my arse.  I have no intention of using this app to mine data, but others' GA apps might, so I would be wary of giving other webapps access to Google Analytics, especially now it's possible to add users via the Management API.
      4. Click the "GA Authentication" link.  It'll take you to the Google account screen, where you say its ok to use the data (if it is), and copy-paste the token it then displays.
      5. This token allows the app (but not me) to process your data.  Go back to the app and paste the token in the box.
      6. Wait about 10 seconds, depending on how many accounts you have in your Google Analytics.
      7. Sometimes you may see "Bad Request", which means the GA call has errored.  If you hard-reload the page (on Firefox this is SHIFT + RELOAD), you will need to reauthenticate, starting from step 2 above. Sorry.
      8. You should now see a table of your GA Views on the "GA View Table" tab.  You can search and browse the table, and choose the account and profile ID you want to work with via the left hand drop downs. Example using Sanne's Copenhagenish blog:
      9. If you click on "Charts" tab in the middle, you should see some Google Charts of your Visits and PageViews. Just place holders for now.
      10. If you click on the "Forecasts" tab you should see some forecasting of your visits data.  If it doesn't show, make sure the date range to the far left covers 70 days (say 1st Dec 2013 to 20th Feb 2014). 
      11. The forecast is based on Holt-Winters exponential smoothing, to try and model seasonality.  The red line is your actual data, the blue is the model's estimate, including 70 days into the future. The green area is the margin of error at 50% confidence, and the Time axis shows the number of months.  To be improved.
      12. Under the forecast model is a decomposition of the visits time series. The top graph is the actual data, the second is the trend without the seasonal component, the third is the seasonal component, and the fourth is the random everything else.
      13. In the last "Data Table" tab you can see the top 1000 rows of data.
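      The forecast and decomposition in steps 10-12 could be sketched in R like this - a minimal sketch assuming `visits` is a daily numeric vector of GA visits covering the date range, not the app's actual code:

      ```r
      ## assumes 'visits' is a daily vector of GA visits covering 70+ days
      library(forecast)

      visits.ts <- ts(visits, frequency = 7)    # weekly seasonality
      fit <- HoltWinters(visits.ts)             # Holt-Winters exponential smoothing
      fc  <- forecast(fit, h = 70, level = 50)  # 70 days ahead, 50% interval
      plot(fc)

      ## decomposition: observed, trend, seasonal and random components
      plot(decompose(visits.ts))
      ```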

      That's it for now, but I'll be doing more in the future with some more exciting uses of GA data, including clustering, unsupervised learning, multinomial regression and sexy stuff like that.

      Update 24th Feb

      I've now added a bit of segmentation, with SEO and Referral data available trended, forecasted and decomposed.

      Map of the European Migration of Languages

      Just a quick note about this nice language migration map, found on reddit linguistics (reddit is amazing) at this website

      Imagine a world where the greatest technological achievement is the wheel. Mental.  It's why I love playing Civ 5 so much, getting a tiny sense of those times.

      Early on, around 1000 BC, we have Proto-Germanic developing in Denmark, which would eventually invade the British Isles twice: as Old Norse, and as the Northmen who settled in France, the Normans.  Any concept of nationality is ridiculous; we are all from everywhere else within a 2,000-mile radius.

      Looking at my locales, we can see Brythonic starting around 400 AD, which turned into what we now know as the Britons, Cornish and Welsh; this then got over-washed by the Saxons and Old Norse, and then by the Normans, giving us the mongrel English language of today.

      Doesn't help

      I wish this helped a bit with learning Danish, but where I often trip up is where it's too close but different - for instance using går ("to walk") interchangeably with "go". 

      But there are a lot of Danish grammar rules that are similar to English, something which I guess wouldn't even be close to applying for languages such as Chinese. 

      The most alien Danish grammar-wise to me is putting the word "the" at the end of a noun - "huset" meaning "the house", with et = "the" and hus = "house" - and even that doesn't apply in Jutland, the closest part of Denmark to Britain.

      Learning another language definitely makes you think more about your own, which is worthwhile. And as they say, each language you learn is like having another soul :D Mine is slowly being built.

      EDIT 4th Feb, 2014

      Alun, who comments below, has often recommended this Danish Red Book for English speakers learning Danish, as it's written for English people in particular, highlighting where the two languages differ.