We're talking A/B testing today, i.e. testing two versions of the same thing to see which option yields better results.
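To make the idea concrete, here's a minimal sketch of comparing two creative versions statistically. The audience sizes and response counts are invented for illustration; this uses a standard two-proportion z-test, which is one common way (not the only way) to read A/B results.

```python
from math import sqrt, erfc

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does version B's response rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                      # two-sided p-value
    return p_a, p_b, z, p_value

# Hypothetical numbers: version A shown to 1000 people (120 positive
# responses), version B shown to 1000 people (150 positive responses).
p_a, p_b, z, p = ab_test(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.3f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two versions is unlikely to be noise.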

A simple use case: A/B testing creative concepts against herd mentality

Problem: It's not uncommon to see brands/ads get #cancelled online every now and then. People are more aware of and sensitive towards social issues, and as marketers/advertisers we need to be mindful of this rather than dismiss audiences as a mindless bunch who don't know better.

No consumer is immune to herd mentality; we all sit under the larger blanket of culture.

An example: The recent & much talked about #FabIndia campaign 

Solve: A/B testing your Ad concept before actually rolling it out to the public

Machine Learning, quite literally, learns past patterns from the #data we feed it.
So we can use #ML to A/B test our creatives by building a dataset of past consumer reactions that have called out brands.

The dimensions of the dataset could be as follows:

What were the key points of outrage?
Racism, sexism, wage gap, religion, language used?
Who was outraging more?
Ages 18-24? 25-35? Males? Females? The LGBTQIA community?
What geographic areas were involved in the outrage?
Tier 1, Tier 2 cities? 
What words were used?
Cancel? Boycott? Sentiments hurt? 
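The dimensions above could be captured as simple records. Every field name and row below is hypothetical, just to show the shape such a dataset might take:

```python
from collections import Counter

# Invented example rows mirroring the dimensions listed above:
# issue of outrage, who was outraging, where, and what words were used.
past_callouts = [
    {"issue": "religion", "age_group": "25-35", "gender": "female",
     "city_tier": 1, "keywords": ["boycott", "sentiments hurt"]},
    {"issue": "sexism", "age_group": "18-24", "gender": "male",
     "city_tier": 2, "keywords": ["cancel"]},
]

# A first question you might ask: which issues drove the most callouts?
issue_counts = Counter(row["issue"] for row in past_callouts)
print(issue_counts.most_common())
```

With real volumes of such rows, the same structure supports slicing by age group, city tier, or vocabulary.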

The more attributes we add to the dataset we feed the machine, the better it learns the patterns of socially conscious consumers.
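A toy sketch of that learning step: score new ad copy against word frequencies from past posts labelled as outrage vs. neutral. The tiny training texts and labels here are invented; a real system would train on thousands of labelled consumer reactions and richer features than single words.

```python
from collections import Counter
from math import log

# Hypothetical labelled examples of past consumer reactions.
outrage = ["boycott this brand now", "cancel them sentiments hurt",
           "boycott sexist ad"]
neutral = ["love this festive collection", "great colours this season"]

def word_log_odds(pos_docs, neg_docs):
    """Laplace-smoothed log-odds of each word appearing in an outrage post."""
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    return {w: log((pos[w] + 1) / (n_pos + len(vocab)))
             - log((neg[w] + 1) / (n_neg + len(vocab))) for w in vocab}

weights = word_log_odds(outrage, neutral)

def risk_score(text):
    """Positive score: copy resembles past outrage language."""
    return sum(weights.get(w, 0.0) for w in text.split())

print(risk_score("boycott festive ad"))   # positive: leans outrage-like
```

The point isn't the exact model; it's that past callout data can flag which of two creative concepts carries more social risk before launch.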

For FabIndia, life would have been easier had they A/B tested whether "Riwaaz" or "Riwaaj" worked better for the festive season they were launching in.

We can always learn from the past and better fit our brand to the social environment of our consumers.
At the end of the day, we really can't be at war with our consumers.

It's good to undertake a PESTEL analysis, then top it off by A/B testing versions of your concept against this data: the machine points out the social blind spots in your concept and saves you from being the next brand under fire.