Hotwire Tech Blog

Scribes from Hotwire Engineering

It’s no secret that marketing spend is a significant expense for any consumer company. There are numerous ad networks and platforms that marketing teams use to run their campaigns. Given the plethora of choices, it is hard to decide how to distribute marketing spend across these different channels. Marketing attribution provides the actionable data: it is the practice of tracking the source/touchpoint for incoming traffic and attributing transactions to those touchpoints based on specific attribution models [1].

At Hotwire we treat marketing attribution as a first-class citizen. We recently simplified the attribution flow to make it much easier for client applications.

PROBLEM

Our initial architecture for attribution was tightly coupled with search transactions. Clients persisted any marketing parameters that identified a channel, then passed those parameters along with every search and purchase API call. The marketing parameters were extracted in the API layer and persisted into the backend tables alongside a lot of other transaction-related data. BI processes would then run ETL on the relevant tables at regular, pre-defined intervals and apply the attribution model.
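To make the coupling concrete, here is a minimal sketch of what each client had to do under the old flow. The parameter names (utm_source, etc.) and request fields are illustrative assumptions, not Hotwire's actual API: the client extracts marketing parameters from a campaign URL, stores them, and then threads them through every subsequent search or purchase request.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical marketing parameters a campaign link might carry.
MARKETING_KEYS = {"utm_source", "utm_medium", "utm_campaign"}

def extract_marketing_params(url: str) -> dict:
    """Pull marketing parameters out of an incoming campaign URL."""
    query = parse_qs(urlparse(url).query)
    return {k: v[0] for k, v in query.items() if k in MARKETING_KEYS}

def build_search_request(destination: str, stored_params: dict) -> dict:
    """In the old flow, every search call had to carry the locally stored
    marketing parameters so the API layer could persist them."""
    return {"destination": destination, **stored_params}

url = "https://www.hotwire.com/?utm_source=email&utm_campaign=summer_sale"
stored = extract_marketing_params(url)          # client persists these locally
request = build_search_request("SFO", stored)   # ...and repeats them on every call
```

Note that if the user never issues a search through this client, `stored` never reaches the backend, which is exactly the lossiness described below.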

Figure 1 below depicts the flow of attribution data.


There were several glaring issues with this approach:

  1. Client applications had business logic for attribution. They had to parse the parameters, persist them locally, and pass them along with every search and purchase request.
  2. It violated separation of concerns in the backend. Core transaction related logic was tightly coupled with attribution.
  3. Attribution was lossy. A user could have installed the app as a result of a particular marketing campaign, but if they did not initiate at least one search through that client, cross-device attribution [2] would not work.

SOLUTION

We decoupled the collection of touchpoints and moved that business process into a new microservice called Usher. If a campaign ad or e-mail results in a launch of our native applications, the client parses the incoming deeplink for marketing parameters and simply passes the data to Usher through its API. Our BI team then correlates these touchpoints with transactions during their ETL process to determine attribution.
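The revised client-side responsibility can be sketched as follows. The Usher endpoint URL and the payload shape are assumptions for illustration only (the post does not document the real API): the client parses the deeplink once and hands the touchpoint off, keeping no attribution state or business logic.

```python
import json
from urllib.parse import urlparse, parse_qs
from urllib.request import Request

# Hypothetical Usher endpoint and payload shape -- treat these names
# as illustrative, not Hotwire's actual API.
USHER_ENDPOINT = "https://usher.example.com/v1/touchpoints"

def touchpoint_from_deeplink(deeplink: str, device_id: str) -> dict:
    """Parse an incoming campaign deeplink into a touchpoint record."""
    params = {k: v[0] for k, v in parse_qs(urlparse(deeplink).query).items()}
    return {"device_id": device_id, "marketing_params": params}

def usher_request(touchpoint: dict) -> Request:
    """Build (but do not send) the one-shot POST reporting the touchpoint."""
    return Request(
        USHER_ENDPOINT,
        data=json.dumps(touchpoint).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

deeplink = "hotwire://deals?utm_source=display&utm_campaign=fall_promo"
tp = touchpoint_from_deeplink(deeplink, device_id="device-123")
req = usher_request(tp)  # fired once at launch; no params threaded through search calls
```

The design win is that search and purchase requests no longer carry marketing parameters at all; correlation happens downstream in the BI layer.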

Figure 2 shows the revised attribution flow.

We also took this opportunity to simplify install attribution on apps, moving from AdX to Tune as our attribution vendor (install attribution [3] is a complicated process in the app world). With AdX, we had to poll their servers on every launch to determine the marketing parameters, whereas with Tune the process is transparent to the client: Tune simply sends install attribution data to the same endpoint provided by Usher, so there is no attribution logic on the client at all.

NEXT STEPS

  1. Usher can be improved for scalability and fault tolerance by buffering incoming touchpoints through Kafka queues.
  2. Currently only the iOS and Android clients have moved to Usher. We need to move all our client applications (e.g. mobile web) onto Usher.

[1] https://en.wikipedia.org/wiki/Attribution_(marketing)#Attribution_Models

[2] http://www.visualiq.com/products/iq-envoy/cross-device

[3] https://help.tune.com/marketing-console/main-methodologies-for-attribution/
