Case study with Beautonomy (E-Commerce Beauty Startup)
In my latest role, I worked with Beautonomy, an e-commerce site that sold customisable beauty
palettes. During that time, I managed most of the paid marketing channels as well as the email
automation, and assisted with sourcing and managing partnerships with influencers and other
publishers.
FACEBOOK ADS
We started with Facebook ads. The creatives were provided by other team members; I managed the
targeting, reporting and calibration together with my boss (the head of marketing).
We tested multiple audiences (I can't remember the exact number, but probably about 10 different
audiences). For example, one audience would be a list of beauty bloggers as interests, and another
would be a list of beauty magazines.
I used audience narrowing to find people interested in several of these; the idea being that
someone who follows 3 or 4 different beauty bloggers is more likely to be seriously into beauty
than someone who follows just one.
I'd also intersect these with an interest in online shopping, to focus on people likely to purchase
online. In addition, we tested broad interest groups, such as the intersection of people
interested in "beauty" and "shopping". As we got more visitors to the website, and more customers,
I also built retargeting and lookalike audiences based on our visitors, newsletter subscribers, and
customers.
These Facebook ads were an ongoing calibration. Every week, I would look at which ad sets were
performing better/worse. I'd then:
- increase the budgets of the best-performing ones
- decrease the budgets of ones that weren't doing so well
- pause ad sets that were performing very badly
- add variations of high-performing ad sets (for example, if the beauty bloggers ad set was
  performing well, we'd create another ad set with a different group of beauty bloggers to
  test)
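The weekly calibration above can be sketched as a simple set of rules. The ad-set names, spend figures, and CPA thresholds below are invented for illustration; in practice these decisions were made by hand from the weekly Facebook Ads reports:

```python
# Hypothetical sketch of the weekly ad-set calibration rules.
# Thresholds and data are invented; this is not Beautonomy's actual logic.

def calibrate(ad_sets, target_cpa=10.0):
    """Return an action per ad set based on cost-per-acquisition (CPA)."""
    actions = {}
    for name, stats in ad_sets.items():
        cpa = stats["spend"] / stats["purchases"] if stats["purchases"] else float("inf")
        if cpa <= target_cpa * 0.8:
            actions[name] = "increase budget"     # well under target CPA
        elif cpa <= target_cpa * 1.2:
            actions[name] = "hold"                # roughly on target
        elif cpa <= target_cpa * 2.0:
            actions[name] = "decrease budget"     # underperforming
        else:
            actions[name] = "pause"               # performing very badly
    return actions

weekly = {
    "beauty-bloggers": {"spend": 80.0, "purchases": 12},       # CPA ~6.7
    "beauty-magazines": {"spend": 90.0, "purchases": 9},       # CPA 10
    "broad-beauty-shopping": {"spend": 60.0, "purchases": 2},  # CPA 30
}
print(calibrate(weekly))
```

A high-performing ad set flagged "increase budget" would also be a candidate for the variation step described above (e.g. a sibling ad set with a different group of beauty bloggers).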
After gathering data for a few weeks, I also ran a simple machine learning analysis on our Facebook
ad data (testing both decision tree models and regression models) to determine which aspects of
the ads were making the most impact (copy vs day of week vs device vs color of image, etc.). This
helped us narrow down which aspects to focus on (though I can't remember what the exact results
were).
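As an illustration of the idea (not the original analysis, which used decision tree and regression models), this dependency-free sketch ranks ad attributes by how much the mean conversion rate varies across each attribute's values. All rows, attribute names, and figures are invented:

```python
# Simplified stand-in for the "which ad attribute matters most" analysis:
# rank attributes by the variance of per-value mean conversion rates.
# Data is invented for illustration.
from collections import defaultdict
from statistics import mean, pvariance

ads = [
    {"copy": "A", "device": "mobile",  "image": "pink", "cvr": 0.030},
    {"copy": "A", "device": "desktop", "image": "gold", "cvr": 0.020},
    {"copy": "B", "device": "mobile",  "image": "gold", "cvr": 0.050},
    {"copy": "B", "device": "desktop", "image": "pink", "cvr": 0.044},
]

def attribute_impact(rows, attrs, target="cvr"):
    """Return attrs sorted from most to least impact on the target metric."""
    impact = {}
    for attr in attrs:
        groups = defaultdict(list)
        for row in rows:
            groups[row[attr]].append(row[target])
        # variance of per-value means: higher = attribute matters more
        impact[attr] = pvariance([mean(v) for v in groups.values()])
    return sorted(impact, key=impact.get, reverse=True)

print(attribute_impact(ads, ["copy", "device", "image"]))
```

With real data you would also want interaction effects and significance checks, which is what the tree and regression models provide over this one-way comparison.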
As the creative team came up with more creative assets/visuals for ads, we would test these too,
running all the new creatives for a week, and then turning on/off some ads based on performance.
GOOGLE ADS
As we went on, we also tested Google Ads. I worked with the creative team on the copy, and
managed the targeting/bidding myself.
For Google Ads, we used a tiered bidding strategy with enhanced bids. For each ad group (each ad
group basically contained one main key phrase and some close variations on it), I would have a
broad match, a modified broad match, a phrase match, and an exact match, with increasing bids in
that order (because exact matches are more valuable to us). We also ran ads targeted at
remarketing audiences, and at similar audiences built from that remarketing list.
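The tiered bids can be sketched as a mapping from match type to a bid multiplier. The base bid and multipliers here are invented, and the real bids were set per ad group in the Google Ads interface:

```python
# Hypothetical sketch of tiered bidding: the bid rises with match precision,
# since an exact-match click is more likely to convert. Multipliers invented.

MATCH_TIERS = {  # multiplier relative to the broad-match base bid
    "broad": 1.0,
    "modified_broad": 1.2,
    "phrase": 1.4,
    "exact": 1.6,
}

def tiered_bids(base_bid):
    """Return a per-match-type bid for one ad group's key phrase."""
    return {match: round(base_bid * mult, 2) for match, mult in MATCH_TIERS.items()}

print(tiered_bids(0.50))
```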
We started with about 4 ad groups, and calibrated as we went. Each week, I would:
- look through the ad groups to see how they were performing
- add bid adjustments for devices, time/day, and demographics based on performance
- go through the list of actual search queries, and add negative keywords to the groups for
  queries that were performing badly
- look through the search queries to find keywords that were performing well but that we
  weren't targeting specifically, and add them either to existing ad groups or as a new ad
  group (depending on how closely related they were to the existing ad groups)
- pause ad groups that were performing very poorly
- brainstorm new keywords similar to the high-performing ones, and create new ad groups to
  test
- adjust keyword bids as necessary
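The search-query review step can be sketched as two simple filters: flag queries with plenty of clicks but no conversions as negative-keyword candidates, and flag converting queries we weren't already targeting as new-keyword candidates. The queries and the click threshold below are invented:

```python
# Hypothetical sketch of the weekly search-query review.
# Queries, stats, and the min_clicks threshold are invented.

def review_queries(queries, targeted, min_clicks=20):
    """Split search queries into negative-keyword and new-keyword candidates."""
    negatives, candidates = [], []
    for q in queries:
        if q["clicks"] >= min_clicks and q["conversions"] == 0:
            negatives.append(q["query"])      # spending clicks, never converting
        elif q["conversions"] > 0 and q["query"] not in targeted:
            candidates.append(q["query"])     # converting but not yet targeted
    return negatives, candidates

queries = [
    {"query": "custom eyeshadow palette", "clicks": 40, "conversions": 3},
    {"query": "free makeup samples",      "clicks": 55, "conversions": 0},
    {"query": "build your own palette",   "clicks": 25, "conversions": 2},
]
targeted = {"custom eyeshadow palette"}
negatives, candidates = review_queries(queries, targeted)
print(negatives)    # add as negative keywords
print(candidates)   # add to an existing ad group, or spin up a new one
```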
To further enhance results, I also created personalisations of the homepage using VWO, varying the
headlines and CTAs based on the search keyword/ad the visitor had clicked to arrive at the page;
the personalised pages had more than double the conversion rate of the default home page.
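A minimal sketch of the mapping behind those personalisations; the keywords and copy below are invented, and in practice the swap was configured inside VWO rather than written as code:

```python
# Hypothetical keyword -> (headline, CTA) mapping for homepage personalisation.
# All keywords and copy are invented for illustration.

PERSONALISATIONS = {
    "custom palette": ("Design Your Own Palette", "Start Designing"),
    "vegan makeup":   ("Vegan, Cruelty-Free Palettes", "Shop Vegan"),
}
DEFAULT = ("Makeup, Made By You", "Shop Now")

def headline_for(search_keyword):
    """Pick the homepage variant for the keyword the visitor arrived on."""
    return PERSONALISATIONS.get(search_keyword, DEFAULT)

print(headline_for("vegan makeup"))
print(headline_for("some other query"))  # falls back to the default homepage
```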
EMAIL FUNNELS
I used Drip to build our email automation funnels. We had a few different automated sequences
with different triggers, and the users were tagged in Drip based on actions they performed on the
site. The emails were written and designed by the creative team; I set up the funnels, segmentation
and tagging, and all the technical aspects of the automation.
NEWSLETTER SIGN UPS
5 days after a user signed up / created an account, we would send a promotional email with a
coupon code, to encourage them to purchase.
WHEN THEY ABANDONED CHECKOUT
For users who added items to the cart but didn't complete checkout, we would send an email
reminder the day after, to check if they had any problems, and offer to help. We'd then send
another email 5 days later (if they still hadn't purchased), to offer a promotional code.
POST PURCHASE
After purchase, we wanted to keep the customers engaged beyond just the standard order
confirmation.
A week after their order, we'd send an email asking what they thought. Customers would then be
segmented based on their response, and get slightly different follow-up emails. For those who left
a bad review, we would reach out to find out what was wrong and see if we could help.
For those who left a good review, we wanted to keep engaging them. So a week later, we'd send
another email asking them to share a photo of their new look (since we were selling makeup
palettes) and tag us on IG. A month later, we would send another email asking them to recommend us
to a friend if they were enjoying their palette, with a link to a referral program.
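The post-purchase branching above can be sketched as follows; the tag names and the rating threshold are invented, and in Drip this was expressed as tags and trigger rules rather than code:

```python
# Hypothetical sketch of the post-purchase email branching.
# Tag names and the "bad review" threshold are invented.

def followup_for(rating, days_since_review):
    """Pick the next automated email for a customer who left a review."""
    if rating <= 3:
        return "support_outreach"        # bad review: ask what went wrong, offer help
    if days_since_review >= 30:
        return "referral_program"        # a month on: ask them to recommend us
    if days_since_review >= 7:
        return "share_your_look_on_ig"   # a week on: ask for a tagged photo
    return None                          # nothing due yet

print(followup_for(5, 7))
print(followup_for(2, 7))
```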
INFLUENCER ENGAGEMENT
When I was at Beautonomy, we were only in the early stages of engaging with influencers. My boss
shortlisted a few influencer search platforms. I wrote the brief, shortlisted and selected
influencers to reach out to (based on their reach, how much they would charge, etc.) and worked
with them as they made their posts.
REPORTING
Every day, we would generate an in-depth report. Using Supermetrics and Google Sheets, I had built
a dashboard that pulled together data from Google Analytics, Facebook Ads, and Google Ads, as well
as our own direct sales data from WooCommerce. I used various (and sometimes complex) Google
Sheets formulas to combine these data sources into a digestible report. The report was
customisable by date: start and end date cells updated the rest of the document when changed.
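The real dashboard lived in Google Sheets (fed by Supermetrics); as a minimal stand-in, this pure-Python sketch shows the same kind of join, combining per-channel spend with revenue into one blended summary. All figures are invented:

```python
# Hypothetical sketch of the blended channel report: join spend and revenue
# per channel and compute return on ad spend (ROAS). Figures are invented.

def blended_report(spend_by_channel, revenue_by_channel):
    """Combine per-channel spend and revenue into one summary dict."""
    report = {}
    for channel, spend in spend_by_channel.items():
        revenue = revenue_by_channel.get(channel, 0.0)
        report[channel] = {
            "spend": spend,
            "revenue": revenue,
            "roas": round(revenue / spend, 2) if spend else None,
        }
    return report

spend = {"facebook_ads": 500.0, "google_ads": 300.0}
revenue = {"facebook_ads": 1400.0, "google_ads": 990.0}
print(blended_report(spend, revenue))
```

The Sheets version did the equivalent with lookup formulas keyed on the start/end date cells, so changing the dates re-filtered every channel at once.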
Based on this data, I would create a weekly list of actionable insights. This included ad
performance and tests I wanted to run the next week, but also suggestions for the development
team. For example, if I noticed a big drop in engagement on a certain page, I'd dive into the
stats to try and figure out why. If it was a load time issue, or if I had other suggestions (copy,
design, etc.), I would pass them to the team to address.
END RESULTS
Overall, through these efforts, we increased the website's conversion rate over those 6 months,
and took the company to over 15,000 GBP of monthly revenue.