Migrating reports based on custom reports to custom dimensions and ensuring continuity

EDIT! – read note at footer – issues with overcounting using this method!

We have a blog report that used a custom variable to log the author's name. This was fairly standard practice in order to group together blog posts and content – and we have a wide range of bloggers, both guesting and on the payroll, so there are a multitude of reasons why we need to know which bloggers' contributions are performing and which are not.

During our recent transfer to Universal Analytics, we obviously followed best practice and configured all these custom variable elements as custom dimensions, working on the same basis.

So during the crossover, custom variables died a death on the report – cue 'where have our stats gone? / GA is broken, please fix' email requests, and re-referencing the notification of the transfer, which explained that UA would now be reporting these under custom dimensions and to look there. The issue is one of convergence: is it possible to set up a report that 'munges' these two elements together seamlessly to give continuity? Or does the business have to run two reports until such time as the new one is the only one of interest? Obviously the former is the preferred option, in order to allow historical comparison.

So the first stage would be to set up a segment which allows both options to be driven from the data – using a custom segment, setting a condition rule to return all traffic that had a blog author attached to it, whether variable or dimension.

This materialised as a simple OR query using the regex '.' (no quote marks in the actual regex!) in order to bring back entries where any characters were attached, for both variables and dimensions:


Edited – because it's always a lower-case slug of the blog author's name, I could use matches regex [a-z] instead of '.'…
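As a quick sanity check on those two patterns (the author slugs here are hypothetical – the real values will differ):

```javascript
// The segment condition uses "matches regex" against the author value.
// '.' matches any single character, so it returns every non-empty value;
// '[a-z]' is tighter when the value is always a lower-case slug.
const anyChar = /./;
const lowerSlug = /[a-z]/;

const authors = ['joe-bloggs', 'jane-doe']; // hypothetical slugs
authors.forEach(a => {
  console.log(a, anyChar.test(a), lowerSlug.test(a)); // both true for slugs
});

console.log(anyChar.test(''));      // false – empty values are excluded
console.log(lowerSlug.test('123')); // false – numeric-only values excluded
```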

This works inasmuch as it will generate an overall visit statistic across both types of data – and therefore allows comparison with previous periods:


However, the table-based output will not allow the two to be joined. The method attempted was to output the custom variable and then add the custom dimension as a secondary dimension – but as the custom variable does not return non-matching data, there is nothing for the custom dimension to reference against. If you attempt to open up the segment to include non-custom-variable data, this effectively says 'return all data' and nullifies the segment – it is the output that will not allow you to say 'return all custom variable entries, and group all page views that did not have a custom variable value under a blank value'. A possible answer would therefore be to set the custom dimension on EVERY page, listing a blank as the value for pages without an author. However, as the pages are logged historically and this was not done previously, it would only work moving forward – and since the problem is historical continuity rather than anything moving forward, it is not a feasible solution!

EDIT! – this does not actually work. There seems to be a fundamental difference between the way GA adds the custom variable to the page hit (and then recalls a count of all page hits by custom variable) and the way GA allocates a custom dimension to a page hit (and then subsequently samples the results in order to attribute that custom dimension across all page hits). The end result is that blog author counts are over-counting by approximately 25%, and blog authors are being credited with the same blog entry, as well as with pages they did not edit (home page, training lists etc.).

This does not appear resolvable, so the only avenue left is to extract the information using SSIS and match up the page views to the authors in a back-office function, or to add the blog authors to the page names. A shame – that was a nice use of custom variables, and I'm sure as I look further into this its limitations will become more and more apparent.



Never thought it would be so complicated to remove a customised object and put everything back, across a business, to the standard out-of-the-box Salesforce instance. The hardest element – without doubt – has been the cultural changes to business processes. No matter how simple the approach technically, I have spent more time in meetings and on calls discussing *why* things are different as opposed to *how*.

What has been refreshing is having genuinely intelligent users to work with. On one hand, the asking of *how* has not been an issue because people 'get it'; on the other, the asking of *why* could be indicative of one of two things. It could be the negative approach, which finds it difficult to cope with the change itself as it affects working practice – people are scared of change – or alternatively it could be the 'master's thinking' approach of understanding the philosophical repercussions of the change itself. And I'm quite happy to report it seems to be the latter. Although you would rather people just *did*, I'd much rather be surrounded by people who want to understand the logic underpinning the changes than moan about how crap life is…


Just a note: one day I must spend a lot more time reviewing the details around transaction logging and restores for production databases. The fact that I generally work on marketing systems, where the server tends to be more for experimentation, means I'm usually really lax about organising the setup. With that in mind, there are always a couple of business-critical elements that need to be resolved, all of which become a priority when things start to go a bit sideways.


Which I stole from this thread: http://www.sqlservercentral.com/Forums/Topic650142-357-1.aspx

It seems to be a good place to start. In future, setting the recovery model to Simple when relying on a restore without the need to roll back transactions should also be a good idea…

Upgrade experience from Google Analytics to Universal Analytics via Google Tag Manager

Comparison of expected benefits and realised benefits:

The Universal Analytics upgrade is purely a change in the way the system gathers data – a change to the new 'Measurement Protocol', which means you can fire data in from any source (including offline). It allows easier customisation of options by presenting them in Analytics rather than having to code them into the tags, it replaces custom variables with custom dimensions and metrics (20 instead of the previous 5), and it allows you to 'stay up to date with new features and updates'.
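As a sketch of what the Measurement Protocol means in practice, the snippet below builds a minimal pageview hit payload by hand – the property ID, client ID and page path are all placeholders, not real values:

```javascript
// Minimal sketch of a Measurement Protocol (v1) pageview hit payload.
// Any HTTP-capable source – server, kiosk, offline batch job – can POST
// this body to the collection endpoint; no JavaScript tag is required.
const params = new URLSearchParams({
  v: '1',               // protocol version
  tid: 'UA-XXXXX-Y',    // tracking/property ID (placeholder)
  cid: '555',           // anonymous client ID (placeholder)
  t: 'pageview',        // hit type
  dp: '/blog/example'   // document path (placeholder)
});

const body = params.toString();
console.log(body); // v=1&tid=UA-XXXXX-Y&cid=555&t=pageview&dp=%2Fblog%2Fexample
// e.g. POST https://www.google-analytics.com/collect with this body
```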

Much has been blogged about the ability to enable User ID support across the account in order to generate multi-touchpoint, single-user-style reports – this feature is not yet available: https://groups.google.com/forum/#!topic/google-analytics-measurement-protocol/b4wn1b9GY5A

Steps Taken:

  • Removed all previous implementations of Universal Analytics created prior to upgrade path being chosen
    • GA > Admin > Choose relevant Account and Property to delete > View Settings > Delete View
    • Delete the tags created in GTM that point to the old UA implementation
  • The actual upgrade process (see the To Do reference URL) is fairly straightforward – you select the transfer property option, and wait. Once complete, it generates the new code you need to implement, which includes the new property ID.
  • Establish tags in GTM (remove GA tags from GTM) – in order to do this (and to make sure I did not miss any) I just updated the existing tags to point to the new instance.
    • Update tag type to Universal Analytics
    • Point to new property ID
    • Check any firing rules or Ecommerce settings still worked
  • Migrate any custom variables into custom dimensions
    • Set up in Google Analytics Admin (Property > Custom Definitions)
    • Define as per requirements (noting the index number)
  • Set up rules in Google Tag Manager as to when to fire – Tag > More Settings > Custom Definitions > Index number, and choose the macro containing the data
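The custom dimension steps above amount to something like the following in plain analytics.js terms – a sketch only, with ga() stubbed out so it runs standalone, and with the property ID, dimension index and author slug purely hypothetical:

```javascript
// Stub for ga() so the sketch runs standalone; in the browser the real
// analytics.js snippet defines this and queues/sends the hits.
const calls = [];
function ga(...args) { calls.push(args); }

ga('create', 'UA-XXXXX-Y', 'auto');     // placeholder property ID
ga('set', 'dimension5', 'joe-bloggs');  // index 5 and slug are hypothetical
ga('send', 'pageview');                 // dimension rides along with the hit

console.log(calls.length); // 3
```

In GTM you never write these calls yourself – the tag's Custom Definitions setting (index number plus macro) produces the equivalent 'set' before the pageview is sent.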

Reference URLS:

https://developers.google.com/analytics/devguides/collection/upgrade/ (Benefits)

https://developers.google.com/analytics/devguides/collection/upgrade/guide (To Do)

https://developers.google.com/analytics/devguides/collection/upgrade/reference/gtm (GTM SetUp)

https://support.google.com/analytics/answer/2795983?hl=en-GB (UA Usage guidelines)

http://www.analyticsmarket.com/freetools/ipregex – IP address range regex creator

https://support.google.com/analytics/answer/1070983 (Data Limits on Tag Collection)

Issues Identified and things of note:

  • During implementation we noticed that we also did not filter new office traffic from Analytics, so used the IP address range regex creator to generate new rules.
  • UA uses a new cookie to track – lasting 2 years from the last visit – and although it improves the experience with subdomains, it does mean a temporary increase in new user visits in Analytics post-implementation
  • Does not support Remarketing, Content Experiments, Google Display Network Impression Reporting (inc. Adsense), DoubleClick Campaign Manager Integration, and the Google Analytics Demographics and Interests Reports as yet.
  • Ecommerce tracks through its own code block, not embedded within GA code as before.
  • I have completed the upgrade on the basis that no specific tracking customisations had been made around timeouts or groupings of organic traffic or referrals etc.
  • If you use multiple trackers on each page and are upgrading to Universal Analytics, expect to see a small, temporary increase of new visitors relative to return visitors after updating your tags. This is expected behaviour and due to differences in the way sessions are processed in Universal Analytics.
  • I have assumed the data layer elements are still relevant and work fine with GTM
  • UA has a 10m hits-per-month limit – Google suggests self-sampling, or that you review your event tracking
  • Latency on data processing can be between 24–48 hours. UA accounts with < 200k visits per day get refreshed once a day. Processing runs from 12:00 UTC and can take approx. 10hrs, which can delay updates to reports and metrics for up to two days. The only ways to counter this are to upgrade to Premium or to reduce visits.
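On the point about ecommerce tracking through its own code block, a rough sketch of the UA ecommerce plugin calls – ga() is stubbed so it runs standalone, and the transaction values are hypothetical:

```javascript
// UA ecommerce is a separate plugin required on top of the base tracker,
// rather than being embedded in the GA snippet as before.
const calls = [];
function ga(...args) { calls.push(args); }

ga('create', 'UA-XXXXX-Y', 'auto');  // placeholder property ID
ga('require', 'ecommerce');          // load the ecommerce plugin
ga('ecommerce:addTransaction', {
  id: '1234',                        // hypothetical order ID
  revenue: '11.99'                   // hypothetical order value
});
ga('ecommerce:send');                // fire the queued ecommerce data
```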

Using Google Tag Manager to Add Events


Just a quick reference around adding events to websites using Google Tag Manager – events are managed through event listeners, which themselves need to be set up as tags in the container.

So you fire the event listener tag on pages you wish to have event tracking on, in order to have GA listen for events. These events can be clicks, form submits, link clicks and timers. The difference between the two types of click tag seems to be that the 'Click' listener records all clicks and will not allow you to specify which link you want tracked, whereas Link Click events would be used to pick up individual links.

You check for the event listener firing in Google Tag Manager using {{event}} equals gtm.click

And then check for a specific element using {{element id}} equals [id used in HTML].

The guide notes that "checking the value only works in a rule that's used by the tag that fires in response to the listener" – so 'checking the value' (i.e. using {{element id}} equals [HTML ID]) only works if it is in the same rule as the listener check.

So, two tags:

1) Add a site-wide tag to cover the pages you want event listening on – choose the tag type and add the rule for tag firing (matching regex .* for all pages)

2) Add a Google Analytics tag of type 'event', and add the rules:

  1. {{event}} equals gtm.click
  2. {{element id}} equals [HTML ID]
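Under the hood, the click listener is essentially pushing an event object onto the dataLayer, which the rules above then match against. A simplified model (real GTM pushes richer gtm.* keys, and the element ID here is hypothetical):

```javascript
// Rough model of a GTM click listener: on a click it pushes an event
// object onto the dataLayer; firing rules then match against that object.
const dataLayer = [];

function onDocumentClick(elementId) {
  dataLayer.push({ event: 'gtm.click', 'element id': elementId });
}

// Rules like {{event}} equals gtm.click / {{element id}} equals 'buy-now'
// are just checks against the pushed object:
onDocumentClick('buy-now'); // 'buy-now' is a hypothetical HTML ID
const hit = dataLayer[0];
console.log(hit.event === 'gtm.click' && hit['element id'] === 'buy-now'); // true
```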






Issues Identified:

Dramatic drop in bounce rate

On implementation of a specific set of event listeners/click tags, we noticed a significant drop in bounce rate. Our site receives a lot of blog traffic and therefore bounce is relatively stable for this type of viewing behaviour. We introduced two events to track some product recommendations on the page: the first fired whenever the recommendations were loaded, and the second if any click event was fired from them.

As the first event (on the load of the recommendations) fired, GA interpreted this as the first page load having a second interaction, and therefore the visit did not bounce. The fix for this is identified in the old GA (ga.js) description of events: a flag can be set for non-interaction events (which should be set to true in this instance), so the event still fires but the interaction is not attributed to the user, and therefore does not affect the bounce rate.
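A sketch of that flag in both syntaxes – the trackers are stubbed so it runs standalone, and the category/action/label values are hypothetical:

```javascript
// Stubs standing in for the real trackers, so the sketch runs standalone.
const _gaq = [];
const calls = [];
function ga(...args) { calls.push(args); }

// Classic ga.js: the fifth argument (opt_noninteraction) set to true
_gaq.push(['_trackEvent', 'Recommendations', 'load', 'product-recs', 0, true]);

// Universal Analytics equivalent: nonInteraction in the fields object
ga('send', 'event', 'Recommendations', 'load', 'product-recs', {
  nonInteraction: true
});
```

With the flag set, the load event no longer counts as a second interaction, so single-page visits are still reported as bounces.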

Configuring SMTP services on Windows 7

Following a successful implementation of SQL Server Reporting Services on Windows 7 Professional, I then set out to try and figure out how to add on SMTP services, so that I could generate scheduled emails.

SMTP services are not installed by default on Windows 7, so the relevant information seems to be in http://social.msdn.microsoft.com/Forums/vstudio/en-US/ad9e940b-fe29-49fc-9bc4-6e572d505b2f/how-to-install-and-configure-smtp-server-in-windows-7?forum=csharpgeneral

It states that you can install the services through the Remote Server Admin Tools (http://www.microsoft.com/en-us/download/details.aspx?id=7887) and that, once installed, the services can be switched on using Windows Features.

I was already on Service Pack 1 (included in the download) but ran the install anyway, and then found SMTP services in Windows Features. Pleased the job was so easy, I restarted the computer to start configuring the service.

The service didn’t exist in the list, so I started looking at the IIS console to switch it on. IIS 6.0 had been installed, but trying to open the instance for the machine stated SMTP was not installed.

After more googling, people were suggesting that various settings in Windows Features should be switched on, mainly around IIS. Once these were switched on, IIS 7 was installed – again with no access to SMTP servers.

Further checking seems to suggest that SMTP services are only available if you installed Remote Server Admin Tools BEFORE installing Service Pack 1 (which is odd, as that seems to be the first thing installed by the RSAT download) – http://geekswithblogs.net/ferdous/archive/2011/03/15/smtp-setup-for-windows-7.aspx. I have not been able to prove or disprove this, as I’m not prepared to roll back a work machine I use every day to check.

I think (and I am no expert here) that the issue relates to what the Remote Server Admin Tools actually are – people are incorrectly interpreting the question ‘how can I get an SMTP server operational on a computer running Windows 7’ and answering it with ‘is there a piece of software that contains an SMTP server that I can get a Windows 7 install package for’. The true use of the Remote Server Admin Tools is described on the download page as: “Remote Server Administration Tools for Windows® 7 with SP1 enables IT administrators to manage roles and features that are installed on computers that are running Windows Server® 2008 R2, Windows Server® 2008, or Windows Server® 2003, from a remote computer that is running Windows 7 or Windows 7 with SP1.” The element I believe to be important is that the install contains “SMTP Server Tools” – not an SMTP server – which is described as “SMTP Server Tools includes the Simple Mail Transfer Protocol (SMTP) snap-in”.

So what will be installed is the ability to load a snap-in through an MMC, which in turn allows you to control the SMTP services on a Windows Server remotely. Again, this is only an interpretation, as I’ve not actually loaded the snap-in in my MMC – but that’s probably more my ineptitude 😉

It would appear you need to get a third-party SMTP server and install that – MS don’t provide one.

First Post!

Hello! If you have found this then you’ll not necessarily want to hang around, as I’m using this blog to record professional and academic progress, discussion points and lists of things I find interesting as a bookmarking exercise. I’m sure there will be some content in here that’s written for an audience, but that is not my intention for every post.

Either way, I work in digital marketing, mainly looking at data issues and management, web analytics and technical integration. I am also working towards (was once) Econsultancy and (still) Manchester Met’s acclaimed MSc programme in Digital Marketing Communications.