My First Boosted Facebook Post

I’ve recently finished a 4-day boosted post on Facebook. It was about a class I’ve got coming up, and included an image because posts with images are more likely to be interacted with, and interaction with a post loosens the death-grip of Facebook’s filter.  It had a link to more information about the class, which the text of the post led up to. I put $20 into it–a rather inexpensive promotion, provided it gave decent results.

Here’s what I discovered once it was over.

The First Red Flag

The first thing I noticed, even while it was running, was where the “ad” was being run. It was shown 2,941 times to mobile users, and 27 times to desktop users. However, the trend from the start was that the desktop users were the ones most likely to interact. Of the 56 “engagements,” 11 were desktop users. That makes the “engagement rate” of desktop 40.7%, while mobile was 1.5%. And yet Facebook served the ad to the least likely audience 99.1% of the time.
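For anyone who wants to check the arithmetic, here’s a minimal sketch of those placement numbers (the counts are the ones from Facebook’s own report, quoted above):

```python
# Engagement rate by placement, from the boosted-post report.
impressions = {"mobile": 2941, "desktop": 27}
engagements = {"mobile": 45, "desktop": 11}  # 56 engagements total

def pct(part, whole):
    """Percentage rounded to one decimal place."""
    return round(100 * part / whole, 1)

desktop_rate = pct(engagements["desktop"], impressions["desktop"])    # 40.7%
mobile_rate = pct(engagements["mobile"], impressions["mobile"])       # 1.5%
mobile_share = pct(impressions["mobile"], sum(impressions.values()))  # 99.1%
```

Desktop engaged at roughly 27 times the mobile rate, yet got less than 1% of the impressions.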

No points to Facebook for effectiveness on that point.

Waitaminute … this is Seattle, right?

I had targeted the post to people with interest in business (specifically, “Business, Retail, Wholesale, Website, Communication”) within 10 miles of Seattle, to get some of the outlying communities who might still come to the class.  That’s roughly going to correspond with King County, WA.  The demographics of King County are roughly 71% white, 15% Asian, 9% Hispanic/Latino.  About a quarter of King County residents speak a language other than English at home.

I’d expect that the interactions with my post would, very roughly, approximate that demographic…more than half white, for example, and mostly English speakers.

But what I got was very much not that.

I can only see the people who Liked (35 people), Commented (0), or Shared (0) the post.  Of those, more than 1/3 had what appeared to be Hispanic/Latino family names, and only one had a name consistent with China, Taiwan, Japan, India, Vietnam, or Cambodia.

That got me thinking, so I looked at profiles, and about half of the people whose posts I could see posted predominantly in non-English languages, with Spanish being by far the most common.  Very strange for King County and its 9% Latino population if my post is only being shown within 10 miles of Seattle.

So I did a little more digging and looked at where people lived.   A third didn’t have any geographic information at all.  Just over a quarter came from Seattle and neighboring communities I’d expect to see with the “+10 miles” modifier.  And the rest came from other places.  Like Ohio, and Florida, and New York.  Puerto Rico.  Suffice it to say that none of those places, while still in the United States, is within 10 miles of Seattle.

Nor are the people from  France.  Tonga.  Ecuador.  Mexico.  El Salvador.

Well, at least the overabundance of Spanish started making sense.  But either lots of people were lying about where they lived, or Facebook was doing a terrible job of actually delivering the audience I was paying for.

Or, I suppose, since they said that 100% of the impressions were US based, maybe the French, Tongan, Ecuadoran, Mexican, and Salvadoran people who saw my post were coincidentally all part of a tour group that was visiting Seattle while my boosted post ran, which they saw on their mobile devices.  But I don’t think that’s too probable.

And you’re how old?

While I was looking up where people lived, I noticed another strange pattern.   Exactly 2/3 of the profiles had a birth date listed between June and September, 2014.   Only 2 people had a birth date prior to 2014.  At first I figured this must not really be their birth date but the date they joined Facebook.  After all, no babies are going to be Liking my posts about business classes.  Looking further, though, virtually all of those suspiciously young Facebook users had made posts on Facebook prior to the date of their “birth.”  So unless the most common thing to do is to join Facebook using your join date as your date of birth, post, edit those posts to backdate them to before you joined so you look like you’d been there longer (but not edit the birth date), do all this in Seattle while saying you’re in Tonga or Mexico, and then go Like business ads, I’m not sure how to explain what I was seeing.

Oh, right.  The Results.

Really, none of that matters, right?  I mean, from a business perspective, what I wanted was for Seattle(ish) people to see that I had a class coming up, to click the link to learn more about it, and, hopefully, to sign up.  Maybe if there’s a little bit of donked up data and some hinky behavior it doesn’t really matter, as long as my business goals actually get met.

Well, here’s what I paid for with my $20.  I got 57 interactions total, for a cost-per-interaction of $0.35.  A pretty good cost for online marketing…as long as you don’t look further.

That 57 interactions breaks down as follows:

  • 31 Post Likes (might be tangentially helpful because friends-of-friends could see that behavior, but not directly a business goal result)
  • 24 Photo Clicks (essentially useless–someone looked at the image close up)
  • 1 Page Like (Okay, so that one person has a 5% chance of seeing things I post in the future)
  • 1 Link Click (my preferred outcome)

In typical online advertising, you count cost-per-click (CPC), with the click resulting in the goal behavior, such as going to your website or lead generation page.  Of my 57 interactions, only one of those did that.  So if we apply typical online advertising terminology and definitions instead of Facebook’s, I had a CPC of $20 per click.  Which really is not that great.  If I could be sure that person was really interested in business and lived in Seattle, it might still be worth it…but clearly I can’t make that assumption.  For all I know, someone in Puerto Rico has now seen the information about my class in Seattle and will not be attending.  $20 wasted.
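The gap between Facebook’s framing and the traditional one is easy to see side by side:

```python
# Facebook's flattering metric vs. the traditional CPC, using the numbers above.
spend = 20.00
interactions = 57
goal_clicks = 1  # only the single link click advanced the actual business goal

cost_per_interaction = round(spend / interactions, 2)  # Facebook's framing: $0.35
true_cpc = spend / goal_clicks                         # traditional framing: $20 per click
```

Same $20, two very different stories, depending on whether you count every interaction or only the ones that serve the goal.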

I could have, I suppose, sent out bulk mail fliers.  I could be assured, at the very least, that they would go to Seattle-area people and that there were real people at the other end.  My Facebook ad supposedly reached 3064 people and 57 interacted, or 1.8%.  That’s about the typical response percentage that junk mail gets.  Only “touched the piece of mail” wouldn’t be part of the count, so the junk mail route might be more effective, if more expensive.

The Silver Lining

So what if my ad was shown to infants, a third of whom lived in other countries and didn’t speak my language?  At least I have “exposure,” right?  Well, that assumes none of those profiles are fake, of course.  (One has been deleted already.)  Okay, so exposure isn’t even a good silver lining…

The silver lining, then, is that at least I got a blog post out of it.

And maybe your comments.  Have you advertised on Facebook and had actual business come out of it?  Or have you done so and found similar weird things?  Tell me your experiences.  (If I seem to harp on Facebook a lot, it’s because I’ve never known someone personally who showed me data-centered evidence of benefit from Facebook.  I’ve heard people making the claims without any evidence, and seen second-hand or third-hand data, but nothing that I’d classify as a “primary source.”)

Of course, if you’d like me to do some analysis of your data to see if your Facebook activities have been effective at reaching your goals, I’d be happy to discuss it with you.  Just send me a message and we can get started.

Otherwise, leave your stories below in the comments!


Posted by Michael J. Coffey  |  4 Comments  |  in Social Media

A Month Marketing on Tsu

Hardly marketing on Tsu still yielded noticeable traffic to my site

Marketing on Tsu (the new social media site) is in some ways a whole new beast for small business owners.  In other ways it’s the same.  I’ve seen lots of articles on the pros and cons of Tsu as a social media site, its prospects for the future, and so forth, but not really a look at how an entrepreneur might use the site to develop business prospects.

But first, a brief introduction to Tsu for those who haven’t tried it yet.

What is Tsu?  An Overview as a User

Before you start marketing on Tsu, you should know what Tsu is.  The easiest way to conceptualize it is Facebook, only with YouTube’s advertising model.  That is, the people who create the best content and get the most views on what they post are the people who have the most to earn.  Yes:  Tsu pays its users to use the site (which works much like Facebook).

Without going into too much detail, you need to join through an invitation.  (Here’s one for you if you haven’t joined yet.)  When you do that, you’ll become my “child” (or the child of the person whose invite you used) on the Tsu Family Tree.

That family tree comes into play really only when the ad revenue bit happens.  Basically, when people click on ads–and yes, they do–the advertiser pays money to Tsu.  The site takes 10% off the top.  The rest gets divided among the users in a particular way.  The creators of the content (i.e., posts) that were associated with the ad clicks get half of it.  The other half gets distributed, mainly to your ancestors up the family tree in quickly diminishing amounts.

While some focus on that last bit as being basically multi-level marketing, note that while having prolific and popular children can bring you in a little money, the single biggest way to increase your earnings is to actually post great content.  At least in theory.
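The split described above can be sketched as follows.  The 10% platform cut and the 50/50 creator/tree division come from Tsu’s description; the one-third-per-generation decay up the family tree is my assumption for illustration, since Tsu doesn’t publish the exact formula:

```python
# Hypothetical division of a single ad payment under Tsu's revenue scheme.
# NOTE: the per-generation decay rate is an assumed figure, not Tsu's actual one.
def split_revenue(payment, generations=3, decay=1/3):
    site_cut = payment * 0.10     # Tsu takes 10% off the top
    pool = payment - site_cut
    creator = pool / 2            # half the remainder to the content creator
    tree_pool = pool / 2          # other half flows up the family tree
    ancestors, remaining = [], tree_pool
    for _ in range(generations):  # each ancestor takes a quickly diminishing share
        share = remaining * decay
        ancestors.append(round(share, 4))
        remaining -= share
    return round(site_cut, 4), round(creator, 4), ancestors

site, creator, tree = split_revenue(1.00)  # one $1.00 ad payment
# site -> 0.1, creator -> 0.45, tree -> [0.15, 0.1, 0.0667]
```

Whatever the real decay rate, the shape is the same: the creator’s 45% dwarfs what any single ancestor earns, which is why posting great content beats recruiting “children.”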

Beyond that, it’s pretty much like using Facebook.  You can post pictures or links, comment, Like posts or comments, send private messages, mention people with their @username, install their app on your phone, and so forth.  One notable difference is that it’s much easier to become someone’s Friend (mutual connection) or Follower (one-way, like Twitter or Google Plus).  Other differences include the Bank, where you can see how much ad revenue you’ve earned in your social media activity, and Analytics, where you can get more information about how many views, Likes, and comments there have been on the posts of the last 30 days so you can learn what your audience wants more of.

Finally, you can write your posts on Tsu and, if you’ve connected your Facebook or Twitter accounts, send that same post to those social sites directly.

Marketing on Tsu:  Being an Advertiser

I can’t tell you much about this because currently there’s no interface for becoming an advertiser.  You simply have to email them and set something up.  That’s kind of a bummer because it would be great to be able to do some poking around before getting one of their sales people involved.

There are three ways you can spend your advertising budget at Tsu:

  1. Pushing content one layer out through your network.  For those familiar with the Google+ “Extended Circles,” this is similar.  It gets your post seen not only by your friends and followers, but also by their friends and followers.
  2. Targeting an audience.  This is a more traditional advertising approach.  You tell them the demographics you want to reach (whether they’re connected to you or not) and they try to get your post seen by those kinds of people.  They specify geography, age, and gender as demographic groups you can target.  Given the simplicity of the profile section, they might be the only demographics possible for targeting.
  3. Display ads.  This is the oldest and most established form of Internet marketing.  You pick the demographics, you make an ad image, and they show it.

It’s all pretty straightforward, but I haven’t been privy to any data in terms of effectiveness or pricing.  I’ve sent a message to the Tsu folks so if you’re seeing this sentence, I haven’t heard back yet.

Marketing on Tsu: Generating Traffic & Leads

Something that really stuck out to me about Tsu this last month was the traffic.  After doing some slight massaging of the data to make it a little simpler, the top traffic sources for my website look like this over my first 30 days on Tsu:

  1. Direct (41%)
  2. Email (18%)
  3. Google+ (15%)
  4. Organic Search (9%)
  5. Tsu (6%)
  6. Facebook (5%)
  7. LinkedIn (2%)
  8. Other traffic sources (2% or less, each)

I’ve been on Google+ and Facebook (and Twitter, which didn’t send me any traffic at all in this time period) basically since the beginning of July when my business launched.  That’s over 120 days.  I’ve been on Tsu for 1/4 of that time and it’s already sending me more traffic than Facebook, Twitter, or LinkedIn, and about half as much as Google+.  Now, that could just be my own pattern, or the fact that there are more avidly-interacting early adopters on Tsu than on other platforms, or simply a small sample size.  But it at least suggests there may be some marketing power in terms of traffic generation.  And if you’re curious, the other new social site, Ello, sent me no traffic either…though I’ve been less active there than on Tsu.
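One rough way to control for the different lengths of time on each platform is to normalize share of traffic by days active.  The share percentages are from my Google Analytics numbers above; the days-active figures are approximate round numbers:

```python
# Per-day normalization of traffic share by platform (days active are approximate).
shares = {"Google+": 15, "Facebook": 5, "LinkedIn": 2, "Tsu": 6}   # % of all site traffic
days_active = {"Google+": 120, "Facebook": 120, "LinkedIn": 120, "Tsu": 30}

per_day = {site: round(shares[site] / days_active[site], 3) for site in shares}
# Tsu works out to 0.2 share-points per day vs. 0.125 for Google+ --
# suggestive of traffic-generating power, though the sample is small.
```

By that crude measure Tsu is outpacing even Google+, though early-adopter enthusiasm could account for much of it.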

I should also point out that my Tsu account is not particularly branded as business.  It’s mostly set up as a personal account, but it does talk about what I do in my business and has a link to my business website.  I’m assuming the impact would have been greater had I really focused on the branding/marketing aspect of the account.

Some Notes on Tsu and Google Analytics

I also wanted to quickly note something for those who are running to their Google Analytics to check their own Tsu stats.  Google Analytics doesn’t quite treat Tsu uniformly throughout.  My Tsu visits show up as “tsu / social” when I’m looking at the All Traffic section in Acquisition.  But in the Social section of Acquisition, Tsu isn’t shown at all.  So although All Traffic shows Tsu is recognized as a social network, its numbers aren’t included in social-specific sections.

Like I said, some of this may be because of small sample size or my own business’ patterns.  If you’re on Tsu and using Analytics, I’d love to hear if your patterns confirm what I’m seeing, or contrast in some way.  Let me know in the comments (or on Tsu!)




Posted by Michael J. Coffey  |  3 Comments  |  in Social Media

What Do You Want from the Internet? A Look at Key Performance Indicators

A man looking at his mobile device, probably deciding on what key performance indicators he wants to use for targeting his online marketing

Shareaholic’s recent 3rd Quarter Social Media Traffic Report has me thinking about key performance indicators and what my clients really want from the Internet.  There was a very interesting discussion between a number of internet-marketing type people related to the report over on a Google+ post by Ana Hoffman of Traffic Generation Cafe.

While I want to get back to that discussion and what it has to do with you in just a moment, I first would like to point out that while I kind of criticized their 2nd Quarter report in a previous blog post, and might say a couple of things that seem critical in this one, I harbor no ill will toward Shareaholic.  My purpose is to go a little deeper and not take their data at face value, as I see some people doing.

This might go without saying, but you are not the average Shareaholic website, and by extension, your clients are not average either.  But I’ll get to that in more depth after we cover what you want the Internet to do for your business.

Okay, so let’s look at teasing out the bits that are useful so we can avoid being led astray by data that doesn’t apply (even if lots of folks online are telling you that broad averages are somehow equal to the reality of your specific business).

What are Key Performance Indicators (KPIs)?

Key performance indicators, or KPIs, are measurements of success.  Where many business owners miss the mark is ignoring the first word: key.  In other words, the indicators that are key to success.  Lots and lots of things can be measured.  But the question that doesn’t get asked enough is, “Is this indicator critical to measuring success?”

Why is this important?  Why do I often limit clients to just 3 (or maybe 4) KPIs?  Because there’s just too damned much information out there and you’ll get overwhelmed looking at everything.  So you first need to figure out what is most critically important.

How Do I Get More Traffic To My Website?

That’s a great question. It’s an important question. But is it the most critical one?  Here we go back to Shareaholic and Ana Hoffman’s post.  Shareaholic’s Danny Wong posted some of the data on their blog in an article entitled “In Q3, Facebook Drove 4x More Traffic Than Pinterest [REPORT].”

“OMG!  Quadruple the traffic?  We need to be on Facebook NOW!”  Yes, I hear you thinking that.  But what are we talking about here?  KEY performance indicators.  Is the number of people who come to your website the most important thing?  Maybe not.

On Hoffman’s post, different people brought up a bunch of “yes, but…” scenarios, detailing things they thought were more important than simply the number of visitors.  These include:

  • Engagement level: the likelihood that someone will interact with your brand by endorsing (Like, +1), sharing, commenting, replying, etc.
  • Conversion rate: the likelihood that a visitor will subscribe, or purchase
  • Return on Investment (ROI):  Mentioned by Thomas E. Hanna in relation to bloggers, this could refer to either time invested, or money
  • Audience: Are the right people–your target customers–the ones who are coming?
  • Shares/endorsements from your website via a social-share button:  Visitors already on your site giving your social media exposure
  • Shares that spread your brand page content through social media: Social media exposure that might not mean people going to your website
  • Connecting with previously-established “fans” (and whatever value there is in those connections)
  • Connecting with new “fans” (and the value that those people have)

Now that you’ve seen some of those other options, how convinced are you that the number of people who visit your website is the Key Performance Indicator for you?  If you were given this list and told you could only look at one number, what would you choose?

Ardea’s 3Q 2014 Social Media Traffic Report

Okay, now back to that other point.  You are not an average business, and your customers are not average people.  To illustrate why you might not want to take aggregated reports like the Shareaholic numbers at face value, I’ve done my own version.  What they did (at least for the bit mentioned in the title of the blog post) was look at how much traffic came to all of the websites that used their service, and see what percent of that traffic came from each of the “top 8” social media sites.  Using their data from September 2014, the top sites look like this:

  1. Facebook (22.36%)
  2. Pinterest (5.52%)
  3. Twitter (0.88%)
  4. StumbleUpon (0.41%)
  5. Reddit (0.18%)
  6. Google+ (0.07%)
  7. YouTube (0.04%)
  8. LinkedIn (0.04%)

Now I’m going to do the same.  I’m taking the visits from sites I run, and the sites of my clients for whom I have Google Analytics data (for reasons I won’t go into here, it’s difficult to properly mix-and-match data sources, so I’m trying to keep the data clean).  Using my own set of “websites that used my service” I’ll do the same calculation–what percentage of total traffic came from each of these 8 social media sites in September 2014.  My rankings are as follows:

  1. Facebook (0.42%)
  2. StumbleUpon (0.31%)
  3. Twitter (0.15%)
  4. Google+ (0.09%)
  5. Reddit (0.08%)
  6. Pinterest (0.03%)
  7. LinkedIn (0.02%)
  8. YouTube (0.01%)

I should point out that my clients tend to be very small businesses, many of whom haven’t been on social media for very long.  But notice that Facebook doesn’t even account for half a percent of traffic.  It’s still the top position, though.  Pinterest sank from #2 to #6, while Google+ rose from #6 to #4.

In addition to the shifts in rank and in total percentage, my numbers showed WordPress at 0.08%–as much as Reddit and nearly as much as Google+.  Blogger beat the bottom three with 0.05%, and Yelp matched Pinterest.  So for my clients, the “top 8” social sites are a different set of social media sites than they are for Shareaholic’s clients.

If I do that “grading on the curve” technique of dropping the highest and lowest traffic sites, the rankings change again:

  • StumbleUpon (1.8%)
  • Twitter (1.0%)
  • Google+, Reddit, and Facebook (tied at 0.7%)
  • WordPress (0.6%)
  • LinkedIn and Blogger (tied at 0.3%)
  • Pinterest and YouTube (tied at no traffic whatsoever)
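Both calculations are simple to reproduce against your own Analytics exports.  Here’s a sketch with made-up visit counts (my actual per-site numbers aren’t published here), showing the straight share-of-total-traffic version and the trimmed, “grading on the curve” version that drops the highest- and lowest-traffic websites:

```python
# Hypothetical per-website data: {network: referral visits}; "total" is all traffic.
sites = {
    "site_a": {"total": 9000, "Facebook": 40, "Twitter": 5},
    "site_b": {"total": 600,  "Facebook": 2,  "Twitter": 4},
    "site_c": {"total": 150,  "Facebook": 1,  "Twitter": 2},
}

def network_shares(site_data, ndigits=2):
    """Percent of combined traffic each network drove across the given sites."""
    total = sum(d["total"] for d in site_data.values())
    nets = {n for d in site_data.values() for n in d if n != "total"}
    return {n: round(100 * sum(d.get(n, 0) for d in site_data.values()) / total,
                     ndigits)
            for n in nets}

all_shares = network_shares(sites)

# "Grading on the curve": drop the highest- and lowest-traffic websites,
# then recompute the network shares with what's left.
ranked = sorted(sites, key=lambda s: sites[s]["total"])
trimmed_shares = network_shares({s: sites[s] for s in ranked[1:-1]})
```

Notice how a single large site (here, site_a) dominates the untrimmed aggregate, which is exactly why one big Shareaholic client can skew the whole report.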

To belabor my point a little: if you see a big report on what the best sites are for a business to market on, take it with a grain of salt.  Don’t assume that a large collection of websites will return the same results as your website will, particularly if the sites aren’t of the same size or industry as yours.

And to bring this around to the larger issue, what you need to do as an entrepreneur or the owner of a small business or startup is to zero in on where these two things intersect: your critical few indicators and your own data.  Once you determine your key performance indicators (and make sure they’re the “critical few,” as author and analyst Avinash Kaushik puts it), measure them against your own numbers rather than against industry-wide averages.

A Tiny Bit of Criticism

As MaAnna Stephenson notes, the Shareaholic data is pulled from Shareaholic users, which may have a certain bias toward brands that have used Facebook heavily for some time and therefore weight in favor of Facebook.  Ana Hoffman and Danny Wong both point out that the data includes paid traffic.  Since Facebook ads happen on Facebook, but Google ads do not appear on Google Plus, that also weights the Shareaholic results in favor of Facebook.  In essence, Facebook gets credit for organic traffic and paid traffic, while Google Plus only gets credit for organic traffic but not paid.  So it’s not really comparing oranges to oranges; it’s more like fruit salad to oranges and saying the fruit salad is better because you get more types of fruit.


I’ve already mentioned several people above who were involved in the discussion, but those I didn’t mention by name yet who contributed to the list of possible Key Performance Indicators you might want for measuring your online success are Eli Fennel, Thomas E. Hanna (whose “Weekly Insider Goodies” list provided the image for this blog post), Justin Chung, and Kymmberly Gail.  Thanks for a great conversation.

Posted by Michael J. Coffey  |  3 Comments  |  in Analysis & Testing

A Great Tweet from the Department of Transportation

Sometimes, spending as much time as I do on social media isn’t fun.  Sure, I get to watch people look envious when I say I can go on Facebook whenever I want because I work for myself.  But really, things all start looking the same after a while.  “Really?  Another baby?”  “Oh, look!  Another kitten!”  “Oh, how about that…another picture of someone’s lunch.”

Every so often, though, something comes up which breaks the mold and it’s very refreshing.

Whoever was posting for the Washington State Department of Transportation’s Seattle area account was on top of it the other day.  They put together a tweet—about traffic, no less—that got me (who has never had a driver’s license in my life and who can only tell cars apart by color, not by make/model) forwarding a tweet about road conditions.

Here’s the Tweet

WSDOT Traffic Tweet

Why is this such a great tweet?  It’s got it all.  It’s informative, telling about a real, on-the-ground (or on-the-trestle in this case) situation that drivers should be aware of.  It handles a problematic situation (can’t verify via camera) in a humorous way.  And chickens—because everyone knows the Internet runs on cute animal pictures.  By which, I really mean that it’s entertaining.


By the way, if you want to see the original tweet, you can find it here.  At time of posting, it had been retweeted 48 times and favorited by 65 people.  Not bad for a traffic update, eh?

Posted by Michael J. Coffey  |  0 Comment  |  in Social Media

Testing Your Website: Are You Doing It?

Testing in an educational setting

Testing is something everyone with a website should be doing, but almost nobody does.   It’s the crucial activity to take the guesswork out of your site, whether you’re a blogger, a small business, or a nonprofit organization.  It impacts your written copy, your design…everything. I believe this so strongly, my professional recommendation is that you slap together a non-embarrassing website for cheap (more on this in another post), and then start testing different options.  Don’t hire a photographer, a web designer, or a developer until you’ve done your testing to discover what works for your audience.  Then you can tell them what you actually need.

Why should I be testing on my site?

Here’s why:  You’re just not going to guess correctly on a regular enough basis to be useful.  Professional designers and developers will probably be able to implement things that are more likely to get it right, but still not often enough.  People basically suck at guessing what’s going to work.  I’m talking to you.  And I’m talking about myself.

An Example: Testing vs. Guessing

To prove my point, here’s a little quiz: Let’s say you have a registration form you want people to fill out.  You wonder if adding some text about privacy will improve registrations.  Do you:

  1. Add “100% Privacy — we will never spam you!”
  2. Add “We guarantee 100% privacy. Your information will not be shared.”
  3. Leave it alone — nobody’s complained about it, so don’t mess with what isn’t broken.

Give yourself a moment to choose what you would do.  This process is how most companies go about making decisions about their website.  They come up with some ideas and pick one.  In this case, these options are actually from a real series of tests that were written up as a case study, meaning you can find out the “right” answer without doing the test.

But first you have to guess.  There’s a 33% chance of getting it right randomly (even less so if you were to brainstorm more options).

Now for the data

For starters, the third option is what the case study already had in place, so it was the control.  No change in what you’re showing means no change in your results.  If you chose the first option, you have done yourself a disservice.  That change reduced registrations by almost 19%.  But the second option on the list (the one that doesn’t mention spam) actually increased registrations by nearly 20% compared to the control.

So should you add a comment about privacy to your form?  Maybe…and maybe not. You won’t know for sure unless you do these statistical tests and see what impact they have.  I’ve done enough tests (like the coupon testing I talked about in a previous post) and read enough case studies like the one this example is from to know that sometimes big changes make no difference at all, and tiny changes can have a huge impact.  I saw one example where simply changing the color of the background, or a single word on a button, or the number of bullets in a list either boosted the goals of the page or made them plummet!
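If you want to run a test like this yourself, the core statistics are within reach of a short script.  Here’s a minimal sketch of a two-proportion z-test, the standard way to check whether a variation’s lift is more than noise; the visitor and registration counts are hypothetical:

```python
# Two-proportion z-test: is variation B's conversion rate really different from A's?
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # combined conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 x normal tail
    return round(z, 2), round(p_value, 4)

# e.g. control: 200 registrations from 4000 visitors; variation: 240 from 4000
z, p = z_test(200, 4000, 240, 4000)
```

A p-value below the conventional 0.05 cutoff suggests the change itself, not chance, moved your registrations; until then, keep collecting data before declaring a winner.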

Test first.  Get the real answers.

THEN go out and tell your developers and designers what they need to do.  Because, with data in hand, you’ll know.  You’ll have eliminated the guesswork.

Posted by Michael J. Coffey  |  0 Comment  |  in Analysis & Testing

Lessons Learned in Two Months of Social Media Engagement

Before you get the wrong impression about my social media engagement experience, I’ve been using social media for more than two months.  My first personal status update on Facebook was on 26 October 2006.  My first tweet was on 18 April 2008.  And I got an early invitation to Google+, so I was on it before the public release: 30 June 2011 (the public launch was in September).  We won’t talk about MySpace, but I was on that for a bit before Facebook came along.

In contrast, Ardea Coaching as a brand page has only two months of data on these “big three” social sites.  That gives me some insight into what new businesses confront when they’re just getting started.

Caveat: Is Social Media Engagement Right For You?

A frequent refrain when I’m training someone on social media marketing is that success depends on what you’re looking to achieve.  “What’s your goal?” I often ask.  This blog post will assume that audience interaction assists those goals.  (Maybe your business goals focus on funneling people from social media sites to visit your website for more information.  Perhaps you don’t care if they interact with you.  Or maybe you’ve found that those who interact don’t visit your site.  In that case, recognize that what I say here may not apply to you.)

From my perspective, interaction is good.  For Ardea, I look at it this way: for every interaction, even if it’s a simple +1 on Google Plus or a Like on a Facebook post, there’s a person who is indicating they’re more interested in what I do than all the (let’s face it) millions of people who didn’t take the time to click that button.  They’re that much closer to referring business to me, or signing up for a class, or writing a recommendation, or asking for a proposal.  Therefore, I find interaction one way to pre-qualify leads.

Note that although I did start my G+ and Twitter accounts earlier than FB, I gathered data starting July 7th, my first official day of relaunching Ardea.  As mentioned in a social media post recently (Google+, Facebook, Twitter), my follower growth has been fastest on G+ by a fairly wide margin.


Endorsements

An endorsement is the term I’m using for the lowest-commitment social media engagement on the big three sites: a Like on Facebook, a favorite (the star) on Twitter, or a +1 on Google Plus.  It’s an interaction, so it shows interest, but not much more.  I compared the total of all endorsements, as well as endorsements-per-post, to see where I got the most interaction; there wasn’t too much difference by percentage.  The chart below shows percentages by total endorsements.  By per-post, Facebook gained about 8 percentage points and the other two showed roughly proportional decreases, so the chart looked similar enough I didn’t include it here.

Social Media Engagement Pie chart for Endorsements: Google Plus 66.7%, Twitter 11.8%, Facebook 21.5%


Comments & Mentions

This is the next level up, in my opinion.  If someone comments on a post, or mentions you in one of theirs (or in another comment, or whatever), we’re talking about that portion of your audience who has risen above the single, easy, one-button click to actually typing things like words or even sentences.  Sure, occasionally it’s not more than “LOL” but usually there’s more thought, time, and effort expended on the part of the audience member.   As with endorsements, I looked at totals and per-post averages.  On a percent basis the two were even closer (Facebook gained only 4% share), so again only one chart.  Note that Twitter might be so big because of a disproportionately long conversation with Danny Wong sparked by my earlier blog post riffing off one of his articles.  His comments were about a third of all the Twitter mentions.  And because only Legolas’ elf eyes would be able to discern the difference in the chart, Google Plus is at 48.3% and Twitter is second at 46.7%.

Social Media Engagement Pie chart for Comments & Mentions: Google Plus 48.3%, Twitter 46.7%, Facebook 5.0%


You should note something else here, too.  Above, I linked to my social media posts about follower growth.  The people who commented on that post in G+ were industry leaders.  I’d consider Danny Wong to be an influencer or “big name” (big enough that I’m happy to have received so many mentions by him on Twitter).  In comparison, 100% of the comments on Facebook were made by myself (replies, follow-up information, etc.) or by people that I already know in my day-to-day life.  In other words, from a quality standpoint, Twitter and Google Plus are delivering interaction and connection with new and influential people.  They are delivering high volume.  In comparison, Facebook is low volume connection with people I already know.  That’s not to say the Facebook comments aren’t from quality people, rather, that I can connect with all those people without ever using Facebook.


Shares & Retweets

Here’s the gold medal level of interaction.  A share is love.  A share is required for “going viral.”  A share is, to use Seth Godin’s terms, a “sneeze” that passes on the “ideavirus.”  It’s what exposes new people to your brand, your services, your marketing messages.  It’s an advertisement, a recommendation, and an endorsement all wrapped into one.  The note here is mainly that I specifically asked people on Facebook to share, whereas I didn’t on either G+ or Twitter.  I was trying to reach a minimum number of followers on Facebook so that they would give me stats about my page (a requirement the other two sites don’t have, and which, even if the rule applied to all three, wouldn’t have been a problem).  Again, there wasn’t much shift between totals and shares-per-post; the difference is more or less the same as I described in “Endorsements,” above.

Social Media Engagement Pie chart for Shares & Retweets: Google Plus 67%, Facebook 29%, Twitter 5%

Conclusions

Based on my own pages’ data, it’s pretty clear that from both a quality and a quantity standpoint, Google+ is the big winner.  It takes the top spot in all three areas in both total numbers and per-post averages.  Twitter still provided me with better visibility to, and connection with, new people than Facebook did, even though it came in last in two of the areas.  Facebook only really supplied me with connections to people I already know, and only after I made the ask of my contacts.

If this seems like I’m bashing Facebook, I’m not.  If my goal were to concentrate my marketing on my friends and family, Facebook would have come out on top (remember, “what’s your goal?”).  There are, however, many people who think being on Facebook is the best thing they can do online for their business.  My intent is to show those people that they should probably explore more sites as part of their online marketing strategy.  That’s particularly true if interaction with new and influential people is important to them.  Finally, the Twitter comments example shows that sometimes the needle moves in a big way because of some chance, serendipitous conversation.

Have you seen measurable success on Facebook relative to other sites?  Or are you seeing the same trend I am?  I’d love to hear your stories in the comments below, or on one of these social media sites!

Posted by Michael J. Coffey  |  0 Comment  |  in Social Media

Embarrassing Insights Revealed by Web Data

Be Careful: This Machine Has No Brain. Use Your Own.

You do your best to look good, to be professional, and to get the job done right.  But sometimes you just mess up.  It’s human.  And sometimes, it’s a little (or a lot) embarrassing.

Luckily, a regular review of your website data can help catch some of those mistakes before too many people notice.  Here, for your viewing pleasure, are a few of the errors I’ve caught while swimming in the data stream.  The identities of those at fault, and the businesses they worked for, have been anonymized to protect their dignity.  Because really, wouldn’t we all like to be seen as a little better than “merely human”?

I’ll refer to all of the websites as “” and whoever is in charge at the company as “Mr. Owner”.

Web Data Revealed These Goofs

  • I’m Sending Away Visitors I Already Have…to Myself!  Looking at the data from this site, I noticed a strange pattern.  Lots of visitors were leaving—abandoning it to go instead to Also, the #1 source of incoming traffic to was the site  Luckily for Mr. Owner, I knew immediately what the problem was and set out to fix it.  These so-called “self referrals” are usually due to incorrectly installed tracking code.  Sure enough, after a little exploration, I found some pages that didn’t have the code on them, so if a visitor went to one of those pages, it was as if they vanished (at least from the point of view of the tracking software).  Then, when they came back to a page with the code on it, the tracking program thought, “Hmmm.  Where’d this new visitor come from?  Looks like it was!”  Oops.
  • It’s a Miracle!  I’m Making Sales Without Even Having A Store!  In this case, Mr. Owner had more than one site.  One had an e-commerce store as well as some other sections.  Another website was only a blog.  But the person who had installed the tracking codes had gotten things mixed up and installed the blog-only account number on the blog…but also in the store section of the other website.  From the dashboard, then, it looked like his store was making no sales (despite the fact that he was getting orders), but somehow people were buying things directly from his blog.  Crazy!  I made sure all the pages on both sites had the right account number, and now everything looks like it should.
  • You Know What’s Better Than What’s In Your Cart?  Knockoff Drugs From Overseas Somewhere!  Mr. Owner was hacked!  But it was a secret hack.  All that happened was that every once in a while (not every time, mind you), a person might randomly be sent to a drug-peddling site overseas when they clicked the “checkout” button in the store.  The issue was revealed, in part, because it looked like a strangely high number of people were supposedly clicking a link that we couldn’t find on the site, and leaving for a sketchy-looking website.  We had the site’s web host do a security check and that discovered the rest.
  • Button? What Button? Oh, You Didn’t Want To Sign Up, Did You?  Yeah, speaking of clicking on a button: if you have a page that says “Just fill in this brief form and click ‘Sign Up’!” you should probably have a button that says ‘Sign Up’… or, you know, any button at all.  Mr. Owner felt very embarrassed about this one.  But he put it in the middle of this list in the hopes that readers don’t notice (and if you tried to sign up for the Ardea Coaching mailing list in the last week or so, there’s a button now).  The data that revealed this problem?  I’d set up an alert to specifically track button clicks as a goal conversion, and I was alerted that there had been no conversions in over a week.  So hurray, it wasn’t broken longer than that!
  • We’ll Provide Your Professional Services!  And a Russian Mail Order Bride, Too.  The last site I’ll talk about had indicators similar to the knockoff drugs issue.  There was an oddly high number of exits through a single link.  And that link happened to go to a site that… well, it was one that would not be good for your boss to see you looking at in the office.  But in this case, the site hadn’t been hacked.  What had happened was that the person who had originally designed the website had included a link back to their own site.  Over time, they must have let the domain name expire (or sold it).  Whatever happened, the domain changed hands from a website template designer to a mail order bride site.  Suddenly, every page on this professional services website had a link in its footer to a somewhat naughty website.  That wasn’t the kind of “professional services” they were selling, either.  I quickly removed that link so nobody would question the integrity of the site owner.

What Do You Do?

While ‘saving clients from embarrassment’ isn’t what I usually say when someone asks what I do, it’s sometimes a side effect.  And it’s a reason you should regularly review the data coming in from your site: you never know what surprises you might find.

Image source: (no modifications except resizing)

Posted by Michael J. Coffey  |  0 Comment  |  in Analysis & Testing

Low-Tech Copy Testing: A Real-World Example

Antique typewriter with typed words, "Testing...testing... 1... 2... 3..."

Testing my 1924 Underwood 3-Bank Portable typewriter

In this post, I’m going to talk about testing.  In particular, an example of low-tech testing that I really did for a former employer and some of the non-intuitive things we learned without investing in a ton of technology.

Right now, you might have some part of your business that you suspect could work more effectively.  How do you know if you should make a change or not?  Or maybe you know something must change, but you don’t know what the new thing should look like.  How do you choose?

The short answer is: data.   You want to test different options in the real world in a way that will allow you to see what option works best before committing to one.

High Tech and Low Tech Testing

At a basic level, all testing works the same way: you have at least two options, and you measure how the options do against one another.  Low-tech testing of marketing has been going on for at least a century (depending on what kind of analysis you’re looking at).  In the 1920s, for example, a number of studies were published about which coupons printed in newspapers and magazines had the highest response rates.

Over time, of course, technology has increased our ability to fine tune what we learn.  In the 1940s, magazine publishers were able to more easily do “tip-ins” (ads or features or whatever that are added to a magazine, usually after it’s been bound).  This opened the door for tipping in different versions of an ad to see which sold better.

In the era of the Internet, back in the early days of, you used to be able to hit the refresh button on your browser and see different front-page layouts.  You could actually reveal the different test versions they were doing at any given time.  Now, however, that testing has gotten even more subtle.  But it’s still all the same idea.  Does Version A perform better than Version B?  Let’s try them both on real users and see!

Back to Basics: My Mail Order Coupon Tests

As some people know, I’m something of a tea geek.  As such, I’ve worked for a number of different tea businesses (as well as having my own educational/consulting company related to tea).  In one of the businesses I worked for, we noticed that we had quite a lot of first-time mail order customers, but a very low percentage of repeat business.  It seemed like a reasonable goal to increase repeat business.  The way we decided to do that was to add a coupon in every outgoing mail order package.  The more coupons that got redeemed, by definition the more repeat orders we were getting, since the only way to get a coupon was to have already ordered at least once before.

We set up a baseline by sending out coupons for a month or two.  Our main metric was essentially the percent of total orders in a month that used a coupon.  We could have looked at how many orders were repeat orders, since, at least theoretically, the coupon could have reminded people to reorder without their actually using it.  But as a measure of the coupon’s effectiveness as a repeat-order-building tool, a straight percentage was going to be accurate enough and easier to calculate given our order system.

Once that baseline had been established, we started testing.  Each month, I would design two coupons.  Version A was the “control”: whatever version had been performing the best so far.  It was our standard.  I’d also make a Version B that changed something about the coupon to see if that change increased the redemption rate.  I’d make an equal number of copies of each coupon, collate them so they alternated in the stack, and give the stack to the warehouse folks.  That way, as each order went out, the first one would get Version A, the next Version B, then Version A, and so forth.  For tracking purposes, each version got a slightly different coupon code to use when redeeming, so we knew which one the customer had gotten.
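The collate-and-alternate procedure described above can be sketched in a few lines.  This is only an illustration, and the coupon codes are made up:

```python
def collate(version_a, version_b):
    """Interleave two equal stacks of coupons so outgoing orders
    alternate: A, B, A, B, ... (a simple low-tech randomization)."""
    stack = []
    for a, b in zip(version_a, version_b):
        stack += [a, b]
    return stack

# Each version carries its own redemption code so we can tell,
# at redemption time, which coupon the customer received.
stack = collate(["TEA-A"] * 100, ["TEA-B"] * 100)
print(stack[:4])  # ['TEA-A', 'TEA-B', 'TEA-A', 'TEA-B']
```

Alternating the physical stack means each version goes out to roughly the same mix of customers, which is what makes the later comparison fair.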

What We Tested

Over the course of about two years, we tested lots of things.  We tested the color of the paper it was printed on.  We tested the amount of the discount.  We tested a discount described as a dollar amount (“$10 off your next order”) versus a percentage (“10% off your next order”).  We tested serif vs. sans-serif typefaces.  We tested the location of different elements, such as whether putting the discount information at the top, so it was the first thing read, made a difference compared to placing it in the middle of the coupon after some kind of headline text.  Font size.  Line weight.  Wording.  All kinds of variations on all kinds of things.

The Results

After each pair of coupons expired, I’d do the math to figure out if we had a winner.  Often, we didn’t.  Statistically speaking, although one might have been a little ahead of the other in raw numbers, it wasn’t enough of a difference.  This is where the “null hypothesis” comes in: you assume any difference is due to chance (or inaccuracy in the numbers, or whatever) unless the difference is “statistically significant.”  We had lots of null-hypothesis results.  And that was important.  On the one hand, we’d just spent a bunch of time testing and found, essentially, nothing.  On the other, we could safely use either option knowing it wouldn’t hurt.  If the boss really liked Version B much better than Version A?  Great.  Doesn’t matter.  Go for it.  We’ve essentially got proof that it’s not worth worrying about any more.
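The “was the difference big enough?” check can be done with a standard two-proportion z-test.  Here’s a minimal stdlib-only sketch; the redemption counts are hypothetical, not the tea company’s actual numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is the difference between two
    redemption rates larger than chance alone would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical month: 500 of each coupon mailed, 42 A's vs. 55 B's redeemed.
z, p = two_proportion_z(42, 500, 55, 500)
print(round(z, 2), round(p, 3))  # p above 0.05 here: a null result
```

With numbers like these, the p-value comes out well above the usual 0.05 threshold, so you’d keep the null hypothesis and treat the versions as interchangeable, exactly as the post describes.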

Some of these “no result” tests were still instructive, however.  A 10% off coupon pulled just as many responses as one for 40% off (which, if I recall correctly, was the highest amount we tested).  Why use a larger discount if it didn’t get you any more purchases?  We also found no significant difference between percent-off and dollars-off, but that case was a little trickier.  The percent-off coupons didn’t result in more frequent purchases, which is really what we were looking for, but the average order size was much larger.  Once I noticed that, we stuck with percentages.

And there were also some surprising results that did show a difference.  An expiration date 2 months in the future didn’t do nearly as well as one with an expiration date 3 months in the future.

One big result was a “positive message” vs. “negative message” test.  One version was something like “Thank you for your order!  We appreciate your business and would like to offer you this discount on your next order to show how much we love our customers!”  Something like that.  Gratitude, customer-focused, that kind of good stuff.  Version B was totally playing off fear and loss.  It went along the lines of “You don’t want to find yourself without tea, do you?  You’d better use this coupon to reorder early or you might run out!”  Being negative soundly defeated the nicer version.

But the biggest, most decisive victory between versions was perhaps the least intuitive.  All of our coupons for all of the tests were printed on quarter sheets (4.25″ x 5.5″).  On one version, I used no border.  On the other, I used a dotted-line border around the outside edge.  The one with the border showed a huge increase in redemption rate: not just 5% or 10% more, but on the order of multiple times as many.  Following that, I did lots of testing of the border, but I’d apparently hit the right one the first time.  A double solid line, a fancy scroll border, and every other variation made no difference.  But a dotted line around the outside was our golden ticket.  Why?  Maybe because it reminded people of a coupon you had to cut out of a magazine?  I don’t know.  Customers just needed to enter the coupon code on our website, or tell it to the person on the phone, so no cutting out needed to happen.  But for whatever reason, we found success in an otherwise completely irrelevant dotted line.

What This Means To You

Think back to that thing you’re not sure what to do in your business.  Apply some testing.  There’s lots of online technology to help you do that, or you may be able to devise something low-tech to track, like our two coupons with different coupon codes.  But do start thinking of making decisions based on data.  You may reveal your own “dotted line”—that non-intuitive thing that for whatever strange reason, makes your customers more likely to buy and buy again.

Posted by Michael J. Coffey  |  0 Comment  |  in Analysis & Testing

Watch Out For This Problem with Facebook Ads

Since Facebook went public, they’ve been trying to make a transition.  They started as a site centered firmly on a free service that electronically replicates the connections we have in “real life.”  They now need to be a site centered on what the shareholders want: money.  One of the primary ways they’ve done that is with what I’ll call “Facebook ads”—though it’s broader than just ads.  It’s the various ways a business (or individual, but let’s stick to business) can promote itself.  This includes paying to “Boost Post” or to “Promote Page,” in addition to using the Facebook Ads Campaign Manager or posting Offers.

However you want to do it, you’ve chosen to drop some money in Facebook’s pocket for some extra exposure to new (or existing) people.  You go through the steps to let them know the audience you’d like to reach.  You send them the cash, and they promote you to the right people.

Or, at least that’s how it’s supposed to work.  But there’s one problem with Facebook ads: they’re not that smart.

A Personal Example: The Facebook Ads Shown To Me

Here’s a screenshot of some ads I saw last week:

Facebook ads for and the Pilot Metal Falcon fountain pen

Facebook ads I saw last week

Seems like they’re targeting pretty well.  I’ve been talking about having started a business recently, so the Customer Relationship Management (CRM) site is a good bet.  I’m also a big fan of fountain pens, so the Pilot pen is a pretty good match for me.  Not only that, but Facebook has shown me those ads not just based on a guess, but based on web data collected behind the scenes.  These are exquisitely tailored to me in particular because I’d been talking about both in private messages on Facebook.  Yes, and not just a Pilot fountain pen, but the Pilot Metal Falcon, Extra Fine Nib, with a black barrel.

The Problem

What’s the problem?  I was talking about both of those things because I already have them.  I talked about because I was asking another business owner what CRM system he used and whether he’d tried Insightly or not.  I always want to be aware of alternatives so I can change if it makes sense to do so.  And I was talking about the Pilot Metal Falcon because I was telling a friend I’d just bought it and loved how it felt while writing.  (Here’s a German word for you: Schreibvergnügen — writing pleasure.)

Pilot Metal Falcon fountain pen

Gratuitous pen shot. The soft nib allows variation in line thickness based on pressure.

Still–why does it matter?  Think of it from Insightly’s or Amazon’s point of view.  They paid to have their ads seen by relevant people.  Although Facebook did a pretty good job at picking what to show me based on what it knows about me, it was too late on both guesses by a couple of weeks.  It wasn’t smart enough to understand the context in which I was talking about those products.  It couldn’t draw the conclusion, “Michael already has those things.  Although he’s talking about them, it’s not because he’s a potential buyer.”

In other words, I’m a false positive for an ad match.

If you choose to use Facebook ads, then, you need to be aware that some portion of your money will go to advertising to people who have precisely zero chance of buying, because they just got—and may be highly pleased with—your product or an alternative to it.  (I bought my pen, by the way, from The Lost Quill, a small, local, woman-owned business in Bainbridge, WA, just a short scenic ferry ride and walk from downtown Seattle.  Annabella made me try different pens and inks until I was sure the Metal Falcon elicited the most Schreibvergnügen.)

The Solution

I’m all about solutions, not just complaining about Facebook and showing off my lovely pen.  Here’s what you can do to avoid wasting money: measure your return on investment (ROI).  Set up your ads so that you can tell which ad or campaign each visitor came from—for example, give each ad a unique name, or if you’re using Google Analytics, be sure to use UTM tagging on your link.  That way you have a chance of seeing how many people came to your site because of the ad.  Depending on your tracking options and the goals for the ad, you should be able to determine how many of those visitors ended up converting.  (A conversion could be buying, creating an account, downloading, subscribing, watching a video, or whatever action you’re looking for.)  Then you can see whether the cost of getting those conversions was worth what you paid.
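UTM tagging just means appending a few query parameters to the link you put in the ad, so the visit shows up under its own campaign in your reports.  A minimal sketch; the source, medium, and campaign names below are made-up examples:

```python
from urllib.parse import urlencode

def utm_tag(url, source, medium, campaign):
    """Append Google Analytics UTM parameters to a landing-page URL
    so ad-driven visits are attributed to the right campaign."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Use '&' if the URL already carries a query string, '?' otherwise.
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"

# Hypothetical boosted-post link:
link = utm_tag("https://example.com/class", "facebook", "paid-social", "class-boost")
print(link)
```

Any values work as long as you use them consistently; the point is that every campaign gets a distinguishable link.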

And here’s the best part: whether it was worth it or not, it doesn’t matter how many of the people who saw it were false positives.  If Insightly got enough paying members because of that ad that it was profitable to do so, it doesn’t matter if there were 10,000 false positives in addition to me.  The end result is the same: profit.  On the other hand, if Amazon finds that they didn’t sell enough pens with their campaign, it doesn’t matter if I was the only false positive out of 10,000 people.  It still wasn’t worth it.
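The profit-is-what-matters arithmetic above is easy to make concrete.  A tiny sketch with entirely hypothetical numbers:

```python
def campaign_roi(ad_spend, conversions, value_per_conversion):
    """ROI = (revenue attributable to the ad - ad cost) / ad cost.
    Positive means the campaign paid for itself, false positives and all."""
    revenue = conversions * value_per_conversion
    return (revenue - ad_spend) / ad_spend

# Hypothetical: a $20 boost that produced 4 sign-ups worth $15 each.
print(campaign_roi(20, 4, 15))   # 2.0, i.e. 200% ROI: worth it
print(campaign_roi(100, 1, 50))  # -0.5: not worth it
```

Note that the number of false positives never appears in the formula; only spend and attributable conversions matter, which is the post’s point.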

The moral of the story, then, is twofold:  First, you should know that Facebook ads sometimes (often?) are shown to inappropriate targets.  Second, if you set up your ad campaign and web tracking correctly, the first item can be safely ignored.

If you have specific questions related to this, please feel free to contact me and I’ll do my best to answer them.

Posted by Michael J. Coffey  |  0 Comment  |  in Social Media

Thinking Critically About Social Referral Data

What should you do for the best social referral data? Like the picture says: Test, test, test!

A couple of weeks ago, Shareaholic published its social media traffic report for the second quarter of 2014.  In it they looked at social media data from 8 of the more well-known sites:  Facebook, Pinterest, Twitter, StumbleUpon, Reddit, YouTube, Google+, and LinkedIn.  Based on their social referral data, which come from over 300,000 websites, Facebook drives the most traffic to websites, hands down.  Like nearly 25% of all visits.  You might then immediately jump to the conclusion that your business needs to be on Facebook because it’s the best.

Let’s assume the numbers are correct and accurate.  That’s not something I always assume, but in this case I want to think about what they mean.  To do that, we have to think about the situation that might have contributed to those numbers.  How do people on those social media sites end up on the various websites that were measured?  In other words, what is the pathway being measured?

A Visitor’s Path

The most likely way, of course, is the social share.  Someone shares a link to one of the websites in a social media post.  You know, probably with a comment like, “OMG. This is so cool, you have to see it!” or “Great article” or whatever.  Their friends/followers see the post, click on the link, and *ding*!  They’ve been tracked as referral traffic from a social media site.

So far, so good.  The unsuspecting business owner still says, “Great.  If I’m on Facebook, more people are there, more people will see the post, and more people will click on the link and visit my site!”  Eh.  Not so fast.

The Other Half

If we apply systems thinking concepts to this, we can recognize that we’re only looking at the second half of the full system.  At this point we haven’t looked at how and what gets shared, and that’s where some issues start to be revealed.

Many sites include social sharing buttons.  You know, the little buttons for all the social media sites, where you can just hit one to create a new post with the link to that article already filled in.  A person reading an article could copy the web address and paste it into their own post manually, but many people will use those buttons to share instead.  And here’s where we start really thinking critically: how much influence do those buttons have on the referral traffic back to the site?

I’ve been on many sites that only have a Facebook and a Twitter button.  I know that I sometimes want to share an article on Google Plus, and I have, in a few cases, not shared it because there was no G+ button.  As a result, my G+ followers didn’t see a link to the page, meaning they didn’t follow it and become a social-referral visitor.  Meanwhile, Facebook and Twitter get extra exposure because of the buttons, thereby increasing the social shares on those sites.

So here’s the thing:  Is Facebook really driving more traffic, or are websites just more likely to have a “Share on Facebook” button?

We could even go further with this…for example, some social share systems have a place where you can click to reveal buttons for all the sites you’ve never even heard of.  But data show that the more clicks required, the fewer people follow through.  And people are more likely to click the first item than the second, and so forth.  So even the order of the buttons might make a difference in how and where an article is shared.

The Answer: Look at Your Social Referral Data

The answer to this confusion brings us back to the image on this post: test, test, test!  The only way to really be sure for your website is to look at your own visitor data.  Just now, I checked the last two months of data for and my other site,  One site got a strangely round 60.0% of its social referral traffic from Facebook.  The other got 9.86% of its social referral traffic from Facebook.

Looking at Shareaholic’s article, you might be tempted to think you have to be on Facebook.  In fact, they say in a bold header, quite definitively, “Facebook is, by an extremely wide margin, the #1 source of social referrals.”  But I’ve got a site where over 90% of social referrals come from somewhere other than Facebook.  It gets roughly three times as much traffic from StumbleUpon as from Facebook, even though from Shareaholic’s standpoint, StumbleUpon drives a measly 0.6% of traffic.

Who am I to believe?  Well, my own numbers, of course.  It doesn’t matter what the average is.  It matters what is working for you.  And if you don’t like what you’re getting, experiment with new things and see if you can improve your results.  That’s why systematic experimentation and testing is usually part of my digital strategy recommendations.  You can’t tell what results you’ll get by looking at averages, and you can’t tell what will work best unless you try different things and measure your outcomes.
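Turning your own raw referral counts into the kind of percentages quoted above takes only a few lines.  The counts below are made up purely for illustration:

```python
def referral_shares(visits_by_source):
    """Convert raw social-referral visit counts into percentage shares,
    the same shape of number Shareaholic reports—but for YOUR site."""
    total = sum(visits_by_source.values())
    return {src: round(100 * n / total, 1) for src, n in visits_by_source.items()}

# Hypothetical two months of social referrals for one site:
counts = {"StumbleUpon": 300, "Facebook": 98, "Google+": 400, "Twitter": 196}
print(referral_shares(counts))
```

Run this against your own analytics export and compare the shares to the industry averages; as the post argues, your numbers are the ones that should drive your strategy.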

Which sites work best for you?  (Or, if you’re not sure, what challenges do you face finding out?)  Leave your answer in the comments.

And don’t forget to share this article on social media by using one of the buttons!

Image source: my own notebook where a friend and I were testing different fountain pens and different inks.  Not social referral data, perhaps, but testing nonetheless.

Posted by Michael J. Coffey  |  2 Comments  |  in Analysis & Testing