Monitoring for Fun and Profit II (RUM/RUA vs. Synthetic)

For the second piece in this series on monitoring for fun and profit, we will focus on RUM/RUA. As was done previously, we will generically define RUM/RUA and cover a typical set of RUM metrics as a starting point. I prefer the term Real User Analytics (RUA) and will use it interchangeably with RUM (Real User Monitoring).

RUA is complementary to Synthetic

With RUA, there are NO agents, only beacons that send back analytics information. RUA data can come from any location, any ISP, any type of device (desktop, tablet, phone, other), at any level of bandwidth, at any time, and from any RUA-tagged page. If a page has a RUA tag but is never requested, there are NO analytics that can be derived from that page. It is a passive technology that waits for activation when the page is requested/processed.
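
To make the beacon idea concrete, here is a minimal sketch, not any particular vendor's tag; the endpoint path and field names are hypothetical. The payload builder is kept pure so it can run anywhere; in a real tag, the timing values would come from the browser's Performance API.

```typescript
// Hypothetical RUM beacon payload; field names are illustrative only.
interface BeaconPayload {
  url: string;        // page that was delivered
  timestamp: number;  // when the page view happened (epoch millis)
  loadTimeMs: number; // navigation start to load-event finish
}

// Build the analytics payload from raw timing values.
function buildBeacon(url: string, navStart: number, loadEnd: number): BeaconPayload {
  return { url, timestamp: navStart, loadTimeMs: loadEnd - navStart };
}

// In a browser, the tag would fire passively on page load, e.g.:
//   window.addEventListener("load", () => {
//     const p = buildBeacon(location.href, t0, Date.now());
//     navigator.sendBeacon("/rum-collect", JSON.stringify(p)); // endpoint is hypothetical
//   });
// If the page is never requested, none of this ever runs -- hence "passive".
```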

So the technologies, while very different, are also very complementary to each other.

The case for RUM/RUA

When you want granular analytics of the page/website experience, that is when you want to deploy RUM. There is no scripting for RUM because the RUM tags simply beacon back analytics when the page they are on is displayed in a browser. What you don’t get with RUM is predictability. You are waiting for pages, tagged with a RUM beacon, to send you back whatever information they have, whenever they have it.

So you can see that the use case is very different.

Because of the sheer amount of data that RUM can generate (trillions of data points in a short period; imagine all the pages that are tagged being hit by all the potential users), some RUM platforms either aggregate the data or sample a percentage of the total beacon data. Say what is kept and reported on in a RUM system is 5% of the total beacon data. This is a disadvantage because, by definition, we are missing the other 95% of the page views, but that is the point: to simply sample and extrapolate with little fuss over which pages are tagged and how customer development teams must modify them.
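
The sample-and-extrapolate idea can be sketched as follows; the 5% rate matches the example above, and the function names are my own illustration, not any RUM vendor's API:

```typescript
// Keep every Nth beacon: a deterministic stand-in for random sampling.
function sampleEveryNth<T>(beacons: T[], n: number): T[] {
  return beacons.filter((_, i) => i % n === 0);
}

// Estimate the true total from the sampled count and the sampling rate.
// With a 5% rate, each kept beacon stands in for roughly 20 real page views.
function extrapolateTotal(sampledCount: number, sampleRate: number): number {
  return Math.round(sampledCount / sampleRate);
}
```

For example, `extrapolateTotal(500, 0.05)` returns 10000: if 500 beacons survived a 5% sample, roughly ten thousand page views actually happened.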

That said, we can still extract some very important information from RUM data that we cannot get from synthetic: who accessed which page, from what kind of browser and OS; which pages were being delivered, and to what regions/locations they were going.

Most RUM beacons can answer some/most of these questions about the WHO:

  1. What Browser was it?
    1. What Browser Version was it?
  2. What location did they come from?
    1. Continent
    2. Country
  3. Did they use HTTP/2?
  4. What end-user IP Address was it?
    1. What IP Version was that?
  5. What was the Network Carrier’s Name?
  6. What Network Type was this?
    1. Cable, Dial-up, DSL, Fiber Optic, ISDN, or mobile.
  7. What Operating System/Platform was the browser using?
  8. What Time did this happen?
  9. What was the URL that was delivered?
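
The WHO questions above map naturally onto a per-beacon record. The shape below is my own illustration, not a standard beacon format, along with one example of the kind of rollup (page views per country) these fields make possible:

```typescript
// Illustrative per-beacon record covering the WHO fields above.
interface RumRecord {
  browser: string;
  browserVersion: string;
  continent: string;
  country: string;
  usedHttp2: boolean;
  ipAddress: string;
  ipVersion: 4 | 6;
  carrierName: string;
  networkType: "cable" | "dial-up" | "dsl" | "fiber" | "isdn" | "mobile";
  os: string;
  timestamp: number; // epoch millis
  url: string;
}

// Example rollup: count page views per country from a batch of beacons.
function viewsByCountry(records: RumRecord[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of records) {
    counts.set(r.country, (counts.get(r.country) ?? 0) + 1);
  }
  return counts;
}
```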

Conclusion

So, in the world of monitoring and getting feedback about your website, these two technologies occupy complementary use cases and should be used cooperatively with each other. Depending on who is consuming the data, the choice is not one over the other, but which use case/scenario is the best fit.

Marketing Team