KISSmetrics vs Mixpanel

While trying to decide between KISSmetrics and Mixpanel, I figured I’d write a blog post about it, since I’m guessing other people are asking the same question. I am not in any way affiliated with either of them.

Analytics Impact is all about converting data into actionable insights. But in order to find good insights, you need the right data and the ability to easily slice and dice it as needed.

Google Analytics can usually give you 90% of the “right data” for most sites, but it has a few major shortcomings that seriously limit its usefulness when trying to gain insight into a SaaS site.

  • It does not allow you to track data down to the individual visitor across visits
  • It doesn’t have time-based cohort analysis

As I am now in charge of a SaaS site, I found myself needing answers to questions Google Analytics just couldn’t answer. I know there are free add-ons and work-arounds that could handle most of my needs with Google Analytics alone, but I would rather pay a reasonable monthly fee than spend hours gluing everything together, and even then I wouldn’t have an easy-to-use reporting solution. I know because I’ve done it in the past.

What I need is a system to fully understand what visitors are doing on my website and then continue to track them when they sign up for a free account and ultimately become customers. Once they are customers I need to understand how they are using my SaaS site (what features they are or aren’t using) and why we lose customers.
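To make that concrete, here’s roughly what the tracking code looks like with KISSmetrics’ JavaScript API (a minimal sketch; the event names and properties are my own examples, not from their docs). Mixpanel’s alias/identify calls work along the same lines.

  var _kmq = _kmq || [];
  // Anonymous visitors are tracked automatically; once someone signs up,
  // tie their anonymous history to a known identity:
  _kmq.push(['identify', 'user@example.com']);
  // Then record events at each stage of the lifecycle:
  _kmq.push(['record', 'Signed Up', { 'Plan': 'Free' }]);
  _kmq.push(['record', 'Used Feature X']);
  _kmq.push(['record', 'Became Customer', { 'Plan': 'Pro' }]);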

I’ve been using web analytics for a while (even before Urchin became Google Analytics), so I already knew what my shortlist would be:

KISSmetrics or Mixpanel

Let me start by saying that both of them are excellent choices. Neither is “better” in the absolute sense, but I need to decide on one or the other so I started looking deeper into which one would better meet my needs.

I found an excellent blog post on this exact topic by Sacha Greif: http://sachagreif.com/analytics-showdown-kissmetrics-vs-mixpanel/

A great read, but with one major problem: it’s from March 2012. I know that’s just 8 months ago, but a lot has changed since then.

Here’s a request for both KISSmetrics and Mixpanel. Please provide a simple “changes.txt” type page that easily shows me what’s changed over time. That way if I read an old product review (like this one will be in a year) I’ll be able to easily see what’s changed. Mixpanel kinda has something like this for major changes on their about page.

Back to the comparison. I personally don’t need real-time data, so I’m fine with KISSmetrics not being real-time (though it can make debugging a pain).

Since I really need to be able to easily look at individual user history, I was originally leaning towards KISSmetrics, as I thought Mixpanel didn’t support this feature. I soon found that they do, but they only introduced it in July 2012, as a paid add-on.

I wonder why the “people feature” https://mixpanel.com/people/ isn’t linked from the main site. If anything, it makes the pricing page a bit confusing, since they talk about the People plan add-on but don’t provide any further details.

As an ex-coder, I must say the online documentation for KISSmetrics seems more comprehensive than Mixpanel’s. I was also surprised that Mixpanel doesn’t even link to their documentation from the main site (it’s at https://mixpanel.com/docs/ ). KISSmetrics links to theirs from the footer at http://support.kissmetrics.com/

Next I wanted to look into revenue reporting. I’m guessing that you can store revenue just like any other number in Mixpanel, though I’m a bit concerned that revenue isn’t mentioned anywhere on their site or in their docs (I searched).
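For what it’s worth, here’s how I’d expect it to work (a hedged sketch; the event and property names are mine, and I’m assuming the People add-on exposes a per-person charge call like track_charge):

  // Revenue stored as a plain numeric property on a purchase event:
  mixpanel.track('Purchase', { 'Plan': 'Pro', 'Revenue': 49 });
  // With the People add-on, charges can be recorded per person,
  // which is what lifetime value reporting would need:
  mixpanel.people.track_charge(49);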

KISSmetrics, on the other hand, talks about lifetime value on their homepage and even has a revenue report, as I found in their docs.

At this point I was just about ready to go with KISSmetrics when I stumbled across Mixpanel’s new Engage feature: http://blog.mixpanel.com/2012/10/19/insights-are-just-the-start/ Basically, you can now send targeted emails or notifications using Mixpanel’s targeting criteria.

This is the kind of feature that was science fiction (for an analytics service) a few years ago. It’s interesting to see analytics and marketing automation services like Marketo or Eloqua really start to overlap.

I’m betting that in a few years we’ll see content targeting as an additional feature, so you’ll also be able to easily show dynamic content based on user behavior (though this has existed for a while as stand-alone products).

BTW, I came across https://www.klaviyo.com/ which seems very similar to Mixpanel and KISSmetrics, though they heavily promote their email integration as one of the main features (rightfully so). They are pretty new (April 2012) but I’d keep an eye on them.

I also wanted to mention http://customer.io/ which seems like a no-brainer if all you want is very smartly targeted emails.

UPDATE:

I just wanted to include some other services that look interesting and are worth looking into for SaaS analytics:

http://totango.com looks interesting as well. It’s laser-focused on SaaS sites, which I like. Very strong in natively identifying the kind of real-world data I’d want to look at (i.e. customers at risk of leaving). It does seem a bit behind in terms of reporting (I didn’t see any time-based cohort analysis). Also, no pricing info on their site, though they were very responsive when I contacted them (a good indicator that they value good customer service).

I’d love to hear your thoughts – KISSmetrics or Mixpanel and why!


Are you making this common split testing mistake?

I was reading a simple case study today.

They were testing two different versions of a banner that was advertising a webinar.
One of the banners had an image of the presenter, while the other did not.
The banner without the image of the presenter won (by over 50%).

One of the comments was something along the lines of:

I guess this audience prefers banners without an image of a person.

*sigh*

If you don’t immediately realize the mistake the commenter made, don’t feel bad. It’s a very common mistake.

Beyond the fact that a specific banner (which did not have an image of the presenter) won over a different specific banner (which did), you really can’t be sure of anything.

The losing banner might have won with:

  • An image of a different person
  • A different image of the same person
  • The same image of the same person in a different position or size on the banner
  • The same image of the same person in the same position and size, but with different elements on the banner changed

The point is:

Don’t jump to generalized conclusions based on the outcome of a specific experiment.

Should You Test or Target?

Recently I’ve been hearing more and more online buzz about the benefits of delivering targeted content to your visitors. In simple terms, this means a customized message based on information you know about the visitor (as opposed to a generic message which all visitors see).

A simple example would be adding a message for international visitors that your site ships to their country. Something more complex would be a 20% discount on ink cartridges for customers that purchased a printer in the past year but have not purchased any ink in the past 90 days (and of course the message would include the name of the printer they already purchased).
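Under the hood, that second rule is just a predicate over your customer data. Here’s a minimal sketch (all the field names are hypothetical):

  function qualifiesForInkOffer(customer, now) {
    var ONE_YEAR = 365 * 24 * 60 * 60 * 1000;
    var NINETY_DAYS = 90 * 24 * 60 * 60 * 1000;
    // Bought a printer in the past year...
    return customer.printerPurchasedAt &&
      (now - customer.printerPurchasedAt) < ONE_YEAR &&
      // ...but no ink in the past 90 days:
      (!customer.lastInkPurchaseAt ||
        (now - customer.lastInkPurchaseAt) > NINETY_DAYS);
  }

  // If it returns true, show:
  // '20% off ink cartridges for your ' + customer.printerModel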

Serving up targeted content is indeed a valuable tool which I have used for many of our clients (I work for Adobe), though I invite you to take a step back and look at the greater question:

What content on my website will bring me the best results?

Intuitively it makes sense that targeted content will resonate better with visitors, and ultimately get more sales (or leads, etc).

On the other hand, you can simply test changes on your site which will affect everyone, in order to try to improve your conversion rates.

Both are valid methods for optimizing your site and in an ideal world your company would be doing both.

In reality though, you have limited resources to improve your online marketing efforts and you’ll need to prioritize how much targeting you’ll do and how much user experience (common content) testing you’ll do.

Based on my personal experience, most websites still have huge room for improvement by simply optimizing the user experience through split testing. I’ve discussed this with a few other conversion rate professionals who agree. Just look at the case studies out there and you’ll see dozens of examples of how making relatively simple changes to your website can increase conversion rates by double digits.

In other words, you should initially focus on improving the common user experience and then test and test and test and then test some more. Only then does it make the most sense to start targeting (and of course test to see what targeted message performs best).

If your site sucks, it will still suck with targeted messaging.

I will add, though, that some targeting opportunities are very low-hanging fruit, and I would implement them without even testing. For example, for any traffic you send to your web site where you know what visitors clicked on to get there (search, display, email, etc.), make sure the main message on the landing page is the same as the message they clicked on.
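Here’s a minimal sketch of that idea, assuming you tag your inbound links with a utm_content (or similar) parameter naming the message (the parameter values, headlines, and the “headline” element ID are all hypothetical):

  // e.g. http://example.com/landing?utm_content=free-trial
  function getParam(name) {
    var match = new RegExp('[?&]' + name + '=([^&]*)').exec(window.location.search);
    return match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : null;
  }

  // Map message keys to matching headlines:
  var headlines = {
    'free-trial': 'Start your free 30-day trial',
    'save-20': 'Save 20% when you sign up today'
  };

  var key = getParam('utm_content');
  if (key && headlines[key]) {
    document.getElementById('headline').innerHTML = headlines[key];
  }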

I’d love to hear your targeting success and failures (and I’ll even provide feedback if you want).

Thanks
Ophir

Test Fatigue – Conversion Optimization’s Dirty Little Secret

I’m going to expose a phenomenon that’s fairly common in split testing, but no one seems to be talking about it (other than veteran split testers), and I don’t think it’s ever been blogged about (please add a comment if I’m wrong).

It has to do with the question:
“Will the lift I see during my split test continue over time?”

Let’s start by looking at a scenario commonly used by practically everyone in the business of split testing.

Your web site is currently generating $400k a month in sales, which has been steady for the past few months. You hire a conversion optimization company, which runs a split test on your checkout page.

After running the test for 3-4 weeks, the challenger version provides a 10% lift in conversion rate and RPV at a 99% statistical confidence level. The conversion optimization company turns off the test and you hard-code the winning challenger.

First of all – Wooohoo!!! (Seriously, that’s an excellent win.)

A 10% lift from $400k a month is an extra $40k a month. Annualized that amounts to an extra $480k a year. So your potential increased yearly revenue from using the winning checkout page is almost half a million dollars. Sounds pretty good to me.

Here’s the problem.

All things being equal, by using the winning version of the checkout page and not your old checkout page, there is a good chance you won’t be making an extra $480k in the next 12 months.

Don’t get me wrong. You will indeed be making more money with the winning checkout page than with the old one, but in all likelihood, it will be less than simply annualizing the lift from during the test itself.

The culprit is what I like to call “Test Fatigue” (a term I think I just coined).

Here’s what often happens if, instead of stopping your split test after 3-4 weeks, you let it run for an entire year. There is a phenomenon that I’ve often (but not always) seen with very long-running split tests: after a while (this might be 3 weeks or 3 months), the performance of the winning version and the control (original) version start to converge.

They usually won’t totally converge, but that 10% lift, which was going strong for a while with full statistical confidence, is now a 9% lift, or an 8% lift, or a 5% lift, or maybe even less.

As I mentioned before this doesn’t always happen and the time frame can change, but this is a very real phenomenon.
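To put numbers on it, here’s a toy calculation (all the figures are illustrative assumptions, not data). If that 10% lift decays linearly down to 5% over the year, the extra revenue comes in well under the naive $480k:

  var baseline = 400000; // monthly sales without the winning version
  var extra = 0;
  for (var month = 0; month < 12; month++) {
    // Assume the lift decays linearly from 10% to 5% over 12 months:
    var lift = 0.10 - 0.05 * (month / 11);
    extra += baseline * lift;
  }
  console.log('Extra annual revenue: $' + Math.round(extra)); // $360000

That’s still real money, but about 25% less than what the test window projected.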

Why does this happen?

Please read my next post, Why Test Fatigue Happens, where I provide some explanations.

Also, I’d love to hear if you have also seen this phenomenon with your own tests and what your personal theories are as to why it happens.

Thanks
Ophir

How can I help you with conversion optimization?

I just realized it’s been almost six months since I last posted on this blog. While I have plenty of ideas for posts, I figured it might be best to ask you – my readers (all three of you) – how I can help you. Specifically, there are two major ideas I’ve had in my head for a while, and I’m debating which one to write about next.

The first idea is a technical overview of how the web works, going into detail on web analytics and split testing: everything a non-techie needs to know in order to better understand what the data really means from a technical perspective, as well as how technical decisions impact business decisions.

The second idea is making conversion rate optimization more of a science and less of an art. I’ve read just about every book out there that deals with site and page optimization. I’ve also conducted countless split tests and analyzed more sites than I can remember. What I’ve found is that there’s a major gap in the process, where deciding what to do next and how to do it becomes more of an art than a science.

Plenty of smart marketers can look at a web page and know intuitively that it won’t convert well. Often it’s even easy to identify specific elements which are “broken” and need to be fixed, but more often than not (at least for me), it’s not so simple to explain the internal thought process of turning an OK page into a great one. This is something I’d like to address.

So, my loyal readers, please let me know what I should write about. Even if it’s something other than the two topics I’m thinking about, let me know.

Thanks,
Ophir

Four New Additions to the Google Analytics API

For those of you not following the GA API changelog, Google just added four new data points:

Dimension

  • ga:dayOfWeek

Metrics

  • ga:percentVisitsWithSearch
  • ga:visitsWithEvent
  • ga:eventsPerVisitWithEvent

All of the new data points are essentially “calculated metrics”, meaning you could compute them yourself by downloading the data and doing the calculations offline. Still, I applaud Google for continuing to make it easier to get the data without having to resort to offline processing.
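To illustrate, this is all the offline math would have been (the field names mirror the API metrics; the sample numbers are made up):

  // Raw metrics you could already pull from the API:
  var row = { visits: 1200, visitsWithSearch: 90, totalEvents: 450, visitsWithEvent: 150 };

  // The new metrics are simple ratios of existing ones:
  var percentVisitsWithSearch = 100 * row.visitsWithSearch / row.visits; // 7.5
  var eventsPerVisitWithEvent = row.totalEvents / row.visitsWithEvent;   // 3

  console.log(percentVisitsWithSearch + '% of visits used site search');
  console.log(eventsPerVisitWithEvent + ' events per visit with an event');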

Personally, I’m most excited about the dayOfWeek dimension. If you’ve never segmented your traffic by day of week, you really should. Do you know which day of the week has the highest conversion rates? Maybe you should be sending out your emails that morning :)

Google Analytics Report Permalinks

Just a quick post on a very cool trick I recently learned from my colleague Mike Plummer.

If you’ve ever tried to bookmark a report in Google Analytics or share a report URL via chat, you might have noticed that some of the report’s criteria aren’t included.

For example, here is the URL in my browser bar for the top content report for Jan 1-31, 2011:

https://www.google.com/analytics/reporting/top_content?id={profile id}&pdr=20110101-20110131&cmp=average#lts=1296578679097

Now let’s say I want to add a page filter (at the bottom of the page) to only show URLs with “google” in them. After I’ve added the filter and clicked Go, the report only shows URLs with “google” in them, but the URL has not changed!

If I were to bookmark the URL and come back to it later (or send it to a colleague), the report would not include the “google” filter I just added.

But there is a simple solution!

1. Click on the “Email” button on the top of the report (next to Export button)

2. Click on the “Back to report” text link at the top of the page – right underneath the “Set Up Email:”

3. You’re done!

Now the link in the address bar looks like this:

https://www.google.com/analytics/reporting/top_content?id={profile id}&pdr=20110101-20110131&cmp=average&rpt=TopContentReport&segkey=request_uri&q=google&qtyp=0&tst=0

As you can see, the page filter information is now included in the URL, and when returning to this URL you’ll get exactly the same view!

Thanks again to Mike for the awesome tip.

Google Analytics API – Now With New Dimensions and Metrics

Google just added a boatload of new dimensions and metrics to the Google Analytics API:

http://googlecode.blogspot.com/2011/01/127-new-dimensions-and-metrics-through.html

I’ll spare you the technical details (you can read the official post), but I do want to comment on what I think is the most important change: 10 new Adwords dimensions.

Here’s why –

I admit I’m not an expert regarding Adwords administration and optimization tools, but until recently they’ve had what I consider one very big flaw. Initially, Adwords tools would look at the beginning of a visit (what happened on Google and the Google network, such as impressions, CTR, CPC, etc.) and then at the end of a visit, IF it ended with a conversion and you had Adwords conversion tags.

Then Google integrated Google Analytics goals into the mix, which provided some additional data, but we’re still only looking at the start and the end of a visit.

For sites that have a zillion visits and a few hundred conversions a day, you have enough data for analysis, though for the average small business site, there just isn’t enough data if you’re just looking at the end goal (sales, leads, etc).

In order to analyze the vast majority of visits that don’t end up in a conversion, you really need to look at metrics that serve as indicators for traffic quality such as bounce rate, time on site, page views, viewing key pages, etc.

This means that either the Adwords tool has its own internal analytics system (and you need to install yet another tag on your site) or it can take advantage of your existing analytics data.

I know a few vendors recently added Google Analytics metrics to the mix, which is a very welcome addition, but some key Adwords dimensions were still missing from the API.

Now that we have almost every Adwords dimension you could want in the API, I foresee a new wave of Google Analytics / Adwords integration, and eventually tools that will truly be able to automatically optimize your campaigns.
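As a sketch of what this enables, a single Data Export API query can now pull Adwords dimensions side by side with quality metrics (the profile ID and date range below are placeholders; double-check the exact dimension and metric names against the API docs):

  https://www.google.com/analytics/feeds/data
    ?ids=ga:12345678
    &dimensions=ga:campaign,ga:adGroup,ga:keyword
    &metrics=ga:visits,ga:visitBounceRate,ga:avgTimeOnSite
    &start-date=2011-01-01
    &end-date=2011-01-31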

The Future of Split Testing and Conversion Rate Optimization

I’ve been fortunate enough to see and experience first-hand the evolution of the Internet, from before the web until today.

I’ll spare you a lengthy history lesson explaining how we’ve gone from brochureware sites to where we are today, but I do want to share some thoughts and perspective on where I think things are going.

When marketers started to understand the potential of dynamic web sites, there were two terms everyone was throwing around:

Personalization & Customization.

Fast forward to today (2011). The user experience is still exactly the same for all visitors (other than on a handful of sites such as Amazon.com).

For the most part, web site personalization has failed. Sure, it sounds good in theory, but trying to tailor the web site experience at the individual level is extremely difficult: partly for technological reasons, but mostly because it’s very hard to create an optimal user experience based on data from a single individual.

There is no doubt in my mind that in the future (and to some extent today), the user experience when visiting a web site will be created dynamically based on what gets the best results, driven by “anonymous” information that is common to large groups of visitors rather than by data about a single person.

This reminds me of the concept of Psychohistory from the science fiction series “Foundation” by Isaac Asimov.
Wikipedia explains it better than I can:

The premise of the series is that mathematician Hari Seldon spent his life developing a branch of mathematics known as psychohistory, a concept of mathematical sociology (analogous to mathematical physics). Using the law of mass action, it can predict the future, but only on a large scale; it is error-prone on a small scale. It works on the principle that the behaviour of a mass of people is predictable if the quantity of this mass is very large. The larger the number, the more predictable is the future.

I also like to think of this in terms of what usually happens at (successful) brick and mortar stores.

When you walk into a store, the salesperson probably doesn’t know you personally, but will probably try to help you based on certain public traits such as gender, age, if you’re by yourself or with someone else, etc.

Which brings me back to what actually prompted me to write this article in the first place :)

While I’ve been split testing since 2005 in order to improve conversion rates, the majority of the time it’s still about what works best for the site as a whole, as opposed to split testing together with segmentation (which is what we really want).

Until recently, there haven’t been many options out there to achieve this level of targeting and testing (at least not priced for small to mid-sized businesses), but over the past few months I’ve been starting to see more and more startups trying to bring this level of sophistication to the masses.

While I haven’t had a chance to use any of these services first-hand, there is no doubt in my mind that businesses that truly embrace this level of targeting and split testing will eventually lead the pack and leave most one-size-fits-all web sites in the dust.

Google Analytics on Intranets and Development Servers / FQDN

Just a quick post about using Google Analytics on pages that don’t use a fully qualified domain name.

If you’re using Google Analytics on a site with a URL like http://intranet/ or something like http://mydevserver:12345, it won’t work.

Specifically, the Google Analytics JS code will not send the tracking hit (__utm.gif) to the GA servers.

I don’t really know the specifics, but I’m guessing that the domain hashing code looks for at least one period in the hostname and won’t work if it doesn’t find one.

Two alternatives come to mind:

1. Use an IP address if one will work. If you’re testing on a local machine 127.0.0.1 should work fine (that IP always resolves to the machine you’re on)

2. Turn off domain hashing. Simply using _setDomainName("none") in your code should also fix the issue.
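If you’re using the newer asynchronous snippet, it would look something like this (a minimal sketch; replace the account ID with your own):

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
  // Turn off domain hashing so tracking works without a dot in the hostname:
  _gaq.push(['_setDomainName', 'none']);
  _gaq.push(['_trackPageview']);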

Hope that helps someone who might be pulling their hair out trying to figure out why the page is not being tracked :)