August 25, 2022

Double Response Rate by Identifying Donor Interest


Historically, camp appeals have lost tests against other offers, but one of our Salvation Army partners wanted to increase their camp fundraising. Through donor research, we know there is a group of donors with a strong commitment to summer camp. How do we find these donors? With our new data warehouse, one option was to identify them based on their past giving behavior.

In a recent test, TrueSense identified every direct mail piece from the past 20 years that focused on children and youth and tagged each one in the database. Using these tags, we created a test panel of donors who had historically given to appeals with children and youth messaging.

We then created a control panel of general donors with similar seasonal giving patterns and value, and sent both audiences our camp appeal.
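As an illustration, here is a minimal sketch of how tagged giving history could drive both panels. The file names, column names, and the "middle 80%" matching rule are all assumptions for the example, not our production selection logic:

```python
import pandas as pd

# Hypothetical warehouse extracts; names and layouts are illustrative only.
gifts = pd.read_csv("gift_history.csv")    # donor_id, appeal_id, gift_date, amount
appeals = pd.read_csv("appeal_tags.csv")   # appeal_id, content_tag

history = gifts.merge(appeals, on="appeal_id")

# Test panel: donors with at least one gift to a children/youth-tagged appeal.
test_ids = set(history.loc[history["content_tag"] == "children_youth", "donor_id"])

# Profile every donor on value (total giving) and a simple seasonal signal
# (share of gifts made June-August); gift_date assumed to be YYYY-MM-DD.
history["is_summer"] = history["gift_date"].str[5:7].isin(["06", "07", "08"])
profile = history.groupby("donor_id").agg(
    total_given=("amount", "sum"),
    summer_share=("is_summer", "mean"),
)

test_panel = profile.loc[profile.index.isin(test_ids)]
pool = profile.loc[~profile.index.isin(test_ids)]

# Control panel: general donors whose value and seasonality fall inside the
# middle 80% of the test panel's range, sampled to roughly the same size.
lo_v, hi_v = test_panel["total_given"].quantile([0.1, 0.9])
lo_s, hi_s = test_panel["summer_share"].quantile([0.1, 0.9])
eligible = pool[pool["total_given"].between(lo_v, hi_v)
                & pool["summer_share"].between(lo_s, hi_s)]
control_panel = eligible.sample(n=min(len(test_panel), len(eligible)), random_state=42)
```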

The result was a 100% lift in response rate and a 54% lift in net revenue.

Past giving behavior based on content was a strong indicator of response.

                  Children and Youth Audience   General Audience
Mailed            10,000                        10,008
Response Rate     13%                           6%
Average Gift      $111                          $176
Net Revenue       $30,347                       $19,700
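For readers who want to check the arithmetic, the lifts fall straight out of the table. Note that the table's response rates are rounded to whole percents, so recomputing from them overstates the ~100% lift measured on the unrounded results:

```python
# Net revenue lift: (30,347 - 19,700) / 19,700
net_lift = (30_347 - 19_700) / 19_700
print(f"Net revenue lift: {net_lift:.0%}")                    # -> 54%

# Response-rate lift from the rounded table values: (13 - 6) / 6
rr_lift = (0.13 - 0.06) / 0.06
print(f"Response-rate lift (rounded inputs): {rr_lift:.0%}")  # -> 117%
```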

 

The results of this test are significant: in addition to providing a very real revenue optimization for camp fundraising, they point to new models of testing and deployment.

In traditional A/B testing, we create two audience panels to compare the performance of an existing communication against something new. Based on the success criteria, a winning package is selected. It is a helpful and efficient method, but it guides communication decisions with a binary outcome: everyone gets the winner, and no one gets the loser.

The problem with over-reliance on standard A/B testing is that the same creative goes to all donors in the future, regardless of their personal interests. It doesn’t account for the people who responded to the losing creative, or for the vast majority of people who didn’t respond to either message.

Our donor interest test approached the results of mail appeals differently. It looked at donor response to a particular type of messaging and then optimized selection based on the donors’ demonstrated preferences. The test produced two outcomes.

First, it validated the idea that, at some level, donors respond to offers they care about and are more likely to respond to similar offers in the future.

Second, it added new donors to the children and youth affinity group. The 303 donors from the general audience who gave to the camp appeal have demonstrated a potential interest in future camp messaging. Offer testing isn’t just about picking a winning package; it also gives us insight into the behavior and interests of the donors who respond to every package being tested!
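As a sketch of that feedback loop (again with hypothetical file and column names), responders from any panel can be folded back into the affinity group they just demonstrated an interest in:

```python
import pandas as pd

responses = pd.read_csv("camp_appeal_responses.csv")  # donor_id, panel ("test"/"control")
affinity = pd.read_csv("donor_affinity.csv")          # donor_id, affinity_tag

# General-audience responders (the 303 donors) earn the children/youth tag too.
new_rows = pd.DataFrame({
    "donor_id": responses.loc[responses["panel"] == "control", "donor_id"],
    "affinity_tag": "children_youth",
})

affinity = (
    pd.concat([affinity, new_rows], ignore_index=True)
      .drop_duplicates()            # avoid double-tagging existing members
)
affinity.to_csv("donor_affinity.csv", index=False)
```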

In conclusion, our camp audience testing resulted in a strong control strategy for identifying donors who should receive a summer camp appeal. It also points to a broader strategy of identifying interest by cataloging creative and selecting donors based on past giving to different creative offers. Over time, this additional data element could enrich our machine learning scoring and deliver content that is better aligned with donors’ giving interests.

 
