My "rule of thumb" for each activation type is below:
Sigmoid - Sigmoids squash any score into the 0-to-1 range, so they're great for yes/no marketing decisions - what's the probability this will sell or not?
ReLU - With no upper bound on its output, ReLU suits quantitative forecasts and analysis, for example predicting how many items might sell at different prices.
Tanh - Its smooth -1 to 1 range helps grade subjective criteria on a negative-to-positive scale, like monitoring social media sentiment changes about products.
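Here's a minimal NumPy sketch of the three functions above; the sample scores and the labels in the comments are just my own illustration:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real value into (0, 1) -- readable as a yes/no probability
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive values through, zeroes out negatives -- no upper bound
    return np.maximum(0.0, x)

# Raw scores a model might produce for three different business questions
scores = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print("sigmoid:", np.round(sigmoid(scores), 3))   # probabilities, e.g. "will this sell?"
print("relu:   ", np.round(relu(scores), 3))      # non-negative quantities, e.g. units sold
print("tanh:   ", np.round(np.tanh(scores), 3))   # -1..1 scale, e.g. negative-to-positive sentiment
```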
Of course, it's worth testing different activation functions for different business tasks - a quick comparison sketch is below. A model that fits the task better tends to deliver more value!
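And a rough sketch of what trying different activations can look like in practice, assuming a Keras setup; the layer sizes and the random placeholder data are purely illustrative:

```python
import numpy as np
import tensorflow as tf

X = np.random.rand(200, 10)           # placeholder features
y = np.random.randint(0, 2, size=200) # placeholder yes/no labels

for act in ["relu", "tanh", "sigmoid"]:
    # Swapping the hidden-layer activation is a one-word change
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(16, activation=act),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # binary output stays sigmoid
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=5, verbose=0)
    loss, acc = model.evaluate(X, y, verbose=0)  # evaluated on the same data purely for illustration
    print(f"{act:8s} hidden activation -> accuracy {acc:.2f}")
```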