Dear Booking.com, A/B testing is all about conversion rate, not about good user experience

By Sergey Bryukhno

Nearly a year ago I got a call from a recruiter asking whether I was interested in interviewing with Booking.com. I was curious enough to say yes again 😅. It was actually the third time hiring managers from Booking.com had approached me. The hiring procedure is quite standard, and this time it was very similar to my previous experience: a phone interview with a recruiter, then a technical interview with two employees. Same questions, same answers. I easily passed to the final round and was invited to Amsterdam for an on-site interview.

I was asked to prepare actionable improvements to the booking.com website and share my ideas with the designers during the interview.

I performed a heuristic analysis and defined the problem: when the dates are changed and the ‘Check availability’ button is pressed, the user is taken to a new hotel search results page.

To me, the area marked with a red border reads like a control for changing the dates of stay at the current hotel, so I expected to see available rooms for the new dates after pressing the “Change search” button (the CTA wording itself is questionable, in my view).

Instead of seeing available options for the new dates, I was taken back to the search results page with the list of available hotels. This behaviour confused almost everyone I showed the case to.

During the interview, I proposed letting users change the dates and see available room options without being taken back to the hotel list page.

Booking.com is well known for its A/B testing culture and data-driven design. “If it can be a test, test it. If we can’t test it, we probably don’t do it,” as they say.

The reaction to my proposal was something like this:

– But you found what you were looking for, didn’t you?

– No, I lost my hotel. I just wanted to change dates.

– Believe me, everything you see on booking.com has already been tested and converts better.

The position I interviewed for was called UX Designer. I was expected to propose actionable design improvements and validate them with A/B tests. The case described here is not about using a dark UX pattern to sell the user something extra; we just need to let the user finish the booking. It’s a mystery to me how booking.com designers arrived at the hypothesis that users should be taken back to the hotel list instead of being allowed to finish booking the hotel they had already selected. To me and my fellow colleagues, this is not good user experience. It also lengthens the conversion funnel.

Why did the A/B test show a better conversion rate, as they claim?

My hypothesis is that users’ intention to achieve their goal is strong enough to overcome obstacles in the user interface. In such situations, an A/B test may not reflect a conversion rate driven by better user experience.
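The intuition can be sketched with a toy simulation (entirely hypothetical numbers, not Booking.com data): if a fixed share of visitors is determined to finish the booking no matter what, UI friction barely moves the conversion metric, and only shows up in the effort converters spend. The `extra_steps` parameter and the 30% high-intent share are assumptions made up for illustration.

```python
import random

random.seed(42)

N = 10_000          # simulated visitors per variant (hypothetical)
HIGH_INTENT = 0.30  # assumed share of visitors determined to book

def simulate(extra_steps):
    """Return (conversion_rate, avg_steps_per_converter) for one variant.

    extra_steps models UI friction, e.g. being bounced back to the
    hotel list after changing the dates.
    """
    conversions, total_steps = 0, 0
    for _ in range(N):
        determined = random.random() < HIGH_INTENT
        steps = 3 + extra_steps   # baseline funnel length plus friction
        if determined:
            conversions += 1      # strong intent overcomes the obstacle
            total_steps += steps
        # low-intent visitors drop out regardless of the variant
    return conversions / N, total_steps / max(conversions, 1)

smooth = simulate(extra_steps=0)   # variant A: stay on the hotel page
bounced = simulate(extra_steps=2)  # variant B: back to the hotel list

print(f"A: conversion={smooth[0]:.3f}, avg steps={smooth[1]:.1f}")
print(f"B: conversion={bounced[0]:.3f}, avg steps={bounced[1]:.1f}")
```

Under this model both variants land at roughly the same conversion rate, while converters in the bounced variant do measurably more work — exactly the kind of friction a conversion-only A/B test cannot see.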

I noticed similar user behaviour when I worked as a UX architect at a fintech startup. Users put in every effort to complete the loan application and get their money fast, even when the application funnel did not work as intended. The UI errors were only identified later through remote session recordings. Even critical interface errors that confused users and took time to overcome did not cause a decline in conversion.

A conversion event in an A/B test is not necessarily driven by a good user experience.