
Considering Different Communication Strategies? Try A/B Testing

Approaches to event marketing and online content presentation can be more effective with “split testing” conducted at the right moments.

When associations’ and companies’ event-planning and event-marketing teams work together to create compelling email copy, call-to-action buttons, and landing pages, it’s natural to rely on the teams’ collective wisdom and intuition, as well as prior data, to predict what will get members to respond.

The problem is, that mixture rarely points to one definitive path to follow—and sometimes, informed intuition can simply be wrong. To bring more certainty to a short list of preferred strategic options, teams can vet those alternatives against one another through what’s known as A/B or split testing.

Essentially an experiment in which a subset of the association’s full audience, or of a company’s customer and prospect audience, is split in two so that each half is exposed to a different approach, A/B testing needs certain parameters to ensure its success. First, you must create two versions of one body of content, with changes to only one variable. Then, you use one version with one subset of your total audience and the other version with a different, equally sized subset. Afterward, you analyze which version performed better in that time frame based on the intended goal: more time spent, more click-throughs, more conversions, or another metric.
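For teams new to the mechanics, here is a minimal sketch of such a split in Python, assuming the audience is a simple list of member identifiers and the goal metric is email open rate; the function names, sample data, and seed value are illustrative assumptions rather than any association’s actual setup.

```python
import random

def ab_split(audience, seed=42):
    """Randomly split an audience into two equal-sized groups (A and B)."""
    random.seed(seed)                      # fixed seed so the split is reproducible
    shuffled = random.sample(audience, len(audience))
    midpoint = len(shuffled) // 2
    # If the audience has an odd count, the extra record is simply left out
    # so that both groups stay the same size.
    return shuffled[:midpoint], shuffled[midpoint:midpoint * 2]

def open_rate(open_flags):
    """Share of recipients who opened the email; `open_flags` is a list of 0/1 values."""
    return sum(open_flags) / len(open_flags) if open_flags else 0.0

# Hypothetical usage: 1,000 member IDs, split in half, each half sent one version.
members = [f"member_{i}" for i in range(1000)]
group_a, group_b = ab_split(members)

# These flags would normally come from the email platform's reporting;
# the values here are placeholders.
opens_a = [1, 0, 1, 1, 0]
opens_b = [1, 1, 1, 0, 1]
print(f"Version A open rate: {open_rate(opens_a):.1%}")
print(f"Version B open rate: {open_rate(opens_b):.1%}")
```

Splitting randomly, rather than alphabetically or by join date, helps keep the two groups comparable, so that any difference in the outcome can be credited to the single variable being tested.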

Success with Split Testing
Mackenzie Clauss, senior associate of marketing communications for association-management firm Smithbucklin, recalls a recent A/B test she helped run for a client, Financial and Insurance Conference Planners (FICP). With the goal of improving the open rate for emails sent to members, Clauss and FICP decided to test a personalized subject line for an email campaign. The hope was that “seeing their name would slow people down enough to engage them,” she says.

The split test featured one batch of emails that, over the course of two months, used each member’s name in the subject line; another batch sent out during that time frame did not use members’ names. So that any difference in open rates could be attributed to the use of names, no other element of the campaign was changed from previous campaigns, including the time of day the emails were sent.

The result: a roughly five-percent increase in emails opened among the group that saw their names in the subject line. “It’s great to have this information,” says Clauss. “Going forward, we’re not necessarily going to put first names in every email we send, but we will adjust so that we’re using personalization when we think we might need it.”
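Whether a lift like that is meaningful depends on how many recipients were in each group, figures the example above does not include. A common way to check is a two-proportion z-test; the sketch below uses hypothetical recipient counts and open totals, not FICP’s actual numbers.

```python
from math import sqrt, erfc

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for the difference between two email open rates."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)             # pooled open rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                        # two-sided p-value
    return z, p_value

# Hypothetical figures: 2,000 recipients per group, with the personalized
# subject line lifting opens from 600 (30 percent) to 700 (35 percent).
z, p = two_proportion_z_test(700, 2000, 600, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

If the resulting p-value is small (commonly below 0.05), the observed difference in open rates is unlikely to be due to chance alone.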

One example of split testing that generated an unexpectedly strong return comes from the American Society of Transplant Surgeons. To promote its annual meeting a few years back, the group embedded brief videos featuring past attendees into one batch of emails. In the videos, attendees spoke for about 45 seconds about the most important thing they learned at the previous meeting and what they most looked forward to at the upcoming meeting. Another batch of emails was sent with no video embedded, just with text and hotlinks as in previous years. Emails that contained an attendee video got 320 percent more opens than emails that did not use a video.

Jennifer Kasowicz, senior manager of marketing communications for Smithbucklin, is not surprised by such a large rise in engagement when video is involved. “Anytime we use video, it performs better—whether it's in email or on social media or on a web page, it beats straight copy by a lot,” she says, which makes video ripe for A/B testing across any of the communication channels associations use.

When it comes to online content presentation, associations can also glean valuable insights about engagement preferences through split testing. For instance, Kasowicz cites the choice of whether an association disables the audio as the default setting for all self-starting videos, leaving it up to the viewer to activate the sound. Two factors make this a good area for associations to test: First, surveys of the general workforce reveal that more people each year actively multitask—doing two or more tasks simultaneously—while at their desks. And second, closed captioning is becoming increasingly common for virtual sessions, so multitasking can involve watching those sessions with the sound turned off.

Rather than simply setting all self-starting video content to mute, an association might want to conduct an A/B test to measure the average time spent watching by those who encountered default muting and those who did not. The reason: “Every audience behaves differently. Something that works for one organization might not work for another,” writes marketing veteran Lindsay Kolowich Cox in an article on conversion-rate optimization. “In fact, conversion-rate optimization experts do not use the term ‘best practices’ because a particular tactic might not actually be the best practice for the members of a given organization.”

That last thought sums up the case for adding A/B testing to an association’s marketing and communication toolbox.
