Using AI to help small business users write better job ads

I was responsible for introducing Seek's first AI feature on the employer side of the business. The experiment saw a 6% increase in conversion and 36% adoption, and the feature has since been scaled to all of Seek's markets across Australia and Asia.

Timeline

3 weeks (research & design)

Responsibilities

Interaction & visual design

Role

Lead Designer, Manager

The successful design variant increased yield by 5.3%

Problems

Many small business hirers don't know where to start when writing a job ad, let alone an effective one.

Based on prior research and analytics, users were continually defaulting to the least expensive product option without considering other, higher-value products that might better meet their needs. We classified this as 'autopilot behaviour', which was characterised by:

The ‘write’ step of the job posting flow has long been a problematic experience for hirers. Analytics data reflects this: the step has both the highest bounce rate and the longest time spent on page of any step in the flow.

Page bounce and conversion impact

Adoption metrics for performance and premium products hadn't changed despite extensive experimentation to improve the inclusions and pricing.

Writer's block

Over 50% of SME users interviewed in past research stated they didn't have clear performance expectations for any of Seek's products.

How can I stand out?

Metrics indicated that users were simply defaulting to the cheapest product without considering higher-value, better-fit alternatives for their hiring needs.

The current state experience (control group)

Opportunity

How can we help hirers create effective job ads and build confidence to complete their purchase?

Our team's going-in hypothesis was that we could do significantly more to encourage users to consider higher-value products. We were constrained to using only the data points and features already available in the current experience, but had the flexibility to restructure their presentation altogether.

Goals

  • 5% increase in performance ad adoption

  • Increase consideration of higher value products by 15%

Metrics to measure

  • Ad mix (percentage of different product types purchased)

  • Time spent on the selection page per user (both metrics are sketched below)
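
As a rough illustration of how these two measures could be computed from purchase events, here is a minimal sketch. The event fields, product names, and values are hypothetical placeholders, not Seek's actual instrumentation or data.

```python
from collections import Counter

# Hypothetical purchase events: one record per completed job ad purchase.
purchases = [
    {"product": "Classic", "seconds_on_selection_page": 95},
    {"product": "StandOut", "seconds_on_selection_page": 140},
    {"product": "Classic", "seconds_on_selection_page": 80},
    {"product": "Premium", "seconds_on_selection_page": 210},
]

# Ad mix: each product type's share of all purchases.
counts = Counter(p["product"] for p in purchases)
total = sum(counts.values())
ad_mix = {product: count / total for product, count in counts.items()}

# Average time spent on the selection page per purchasing user.
avg_time_on_page = sum(p["seconds_on_selection_page"] for p in purchases) / len(purchases)

print(ad_mix)            # {'Classic': 0.5, 'StandOut': 0.25, 'Premium': 0.25}
print(avg_time_on_page)  # 131.25 (seconds)
```

Comparing these two figures between the control group and each variant is what the experiment results below report.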

Process

Clearly communicating the problem we wanted to solve

Managing stakeholder expectations

Through cross-functional workshops as well as 8 qualitative research sessions run over 2 days, we landed on 3 key themes that were driving the observed behaviour.

Existing ad selection design

Copying and pasting generic content

The current design emphasised price and the product name, which encouraged users to make a purchase decision based on price rather than on the value received.

Lack of structure

It was unclear to users what they could expect from actually selecting a product. Feature information was positioned below the CTA, making it difficult to compare product features and inclusions.

Lack of confidence

The current designs made it difficult for users to understand why they should pick a particular ad over another, and the similar terminology used across products caused confusion.

Existing ad selection design

Interaction explorations

Next, I ran a fast but structured exploration of numerous approaches to presenting the existing ad content in a way that clearly communicated value and resonated with users.

Design explorations

Final designs

Showing restraint in order to ship fast and learn

Ensuring differentiation across variants

A key part of preparing for on-platform experimentation was ensuring both variants were clearly differentiated. I emphasised different elements of the product hierarchy in each variant so we could understand specifically what moved the metrics.

Unoptimised hierarchy of information

The current design emphasised price and the product name, which encouraged users to make a purchase decision based on price rather than on the value received.

No laddering of value

It was unclear to users what they could expect from actually selecting a product. Feature information was positioned below the CTA, making it difficult to compare product features and inclusions.

Unclear use case for selection

The current designs made it difficult for users to understand why they should pick a particular ad over another, and the similar terminology used across products caused confusion.

Results

Relevance and simplicity best resonated with users

We had some exciting results, with Variant 1 increasing performance ad adoption by 5.2% and time on page by 20% per visit. This outcome was a fantastic demonstration of the power of user-centred thinking, and it means the team can isolate specific changes, such as hierarchy, composition and copy, and point directly to their impact on business success.

Variant 1

  • Yield: +5.3%
  • Time on page: +20%
  • Conversion: Steady

Variant 2

  • Yield: +1.7%
  • Time on page: +10%
  • Conversion: Steady