Katryn Geane

While sitting in the second row of seats looking at heat and confetti maps of sample websites, I was reminded of the number one reason I love attending the National Arts Marketing Project Conference (NAMPC): all these smart people are sharing information that I get to go home and use, and everyone else will think I’m a genius.

OK, maybe not that last part, but how lucky can we get with colleagues who are willing to help us out like this? I’m as much of an internet nerd as the next new media manager, but it seems that there’s a new resource or tool every week that promises to track, update, monitor, and help you do something with your website, and I can’t be the only one who doesn’t have oodles of extra time to be cruising the internet testing new tools.

In the measuring and improving your ROI session, Caleb Custer and Dan Leatherman presented a metrics-driven and scientific method-inspired “try, learn, think” cycle for testing and implementing changes to an organization’s website.

By using tools they introduced as well as now-familiar standards like Google Analytics, they urged us to “prove the user’s expectations right and they will feel more in control” (paraphrased from Jakob Nielsen), and therefore happier with their experience of your site.

Plunk, Clue, Crazy Egg, and others were offered as options for testing user interfaces, and there were resources for tracking links, segmenting visitors, optimizing landing pages, and still more on email layout and design, A/B testing…and so on, and so on.
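For anyone wondering what an A/B test actually boils down to once the data is collected, here is a minimal sketch. It uses a standard two-proportion z-test (not any particular tool from the session), and the click-through numbers are made up for illustration.

```python
# Hedged sketch: a two-proportion z-test for a simple A/B test,
# e.g. comparing click-through rates on two email layouts.
# The figures below are invented for illustration only.
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return (difference in rates, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided normal tail
    return p_b - p_a, p_value

# Suppose layout B lifted clicks from 50/1000 to 75/1000:
diff, p = ab_test(50, 1000, 75, 1000)
print(f"lift: {diff:.3f}, p-value: {p:.4f}")
```

A p-value under 0.05 is the usual (if rough) threshold for calling the difference real rather than noise — which is the question every one of these tools is ultimately answering for you.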

As Caleb and Dan presented a slew of resources for testing various aspects of website and email communications, I realized, from the questions and comments around me, that some attendees were feeling overwhelmed and probably wondering if they were hopelessly behind the times for not running these tests already.

As someone who has had this feeling many (many!) times, let me say it’s not too late.

After the session, I tweeted: “Must run home and heat map the org’s website. Like, yesterday.” And I totally meant it.

This may sound simple—and likely more than a few of you reading this are saying, “um, duh”—but after ingesting a brain-stuffing amount of information about one thing, like website optimization, the first step is to figure out what’s most important and what you can accomplish.

Maybe your organization does have the time and resources to redesign your whole website, but maybe it’s more important to figure out how to optimize your weekly emails first.

My process for taking information home from NAMPC (and it’s how I’ll implement what I learned in Caleb and Dan’s session) starts with reviewing my notes for anything I wrote down more than three times, because if three or more NAMPC presenters said it, it’s probably worth looking into. Then I craft a plan for presenting new ideas in a way that will get my co-workers on board with something new and help move the organization forward.

One of the new tactics I picked up at NAMPC 2012 came from Nina Simon’s keynote: frame every proposal as “we should do xyz because it supports the mission in this way.”

I know the senior-level staff at our organization are keyed in to the mission and want to make sure our efforts support and advance it, so with this kind of tactic and supporting information, they’re going to feel better prepared to try something new.

This is my second favorite thing about NAMPC: synthesizing all my new information and making it actionable.

Here’s to new and improved websites and knowing way more about testing than ever before!

4 Responses to “Testing, 1, 2, 3: Measuring and Improving Your ROI”

  1. Brett says:

    The ROI session with Custer and Leatherman was one of my favorite NAMP sessions as well. The information was new to me, but very accessible. I look forward to using heat mapping and other tools to track ROI, and to sharing them with the ~250 other arts organizations in our region.

    • Katryn Geane says:

      Agreed—learning that new resources are out there is always a win. Not all of them prove useful for what you’re trying to get done, but I usually try each one at least once to see what type and quality of information I can get.

      • Dan Leatherman says:

        So glad you all are implementing this into your process! It’s a great feeling to gather as much data as possible about your user base, and then segment it to make data-driven design decisions.

        Just for clarification, the simplified method was, “think, try, learn” :)

        Let me know if you have questions—I’m happy to help!

  2. Tim Mikulski says:

    Thanks for the follow-up, Dan!
