Why do you think regions of the United States are more conservative & other regions are more liberal? (For example, the West coast is more liberal & the Southern states are more conservative)
Because the West Coast is not where most businesses are concentrated (so it tends to be more liberal as a result), while the Southern states are where most businesses are (so they tend to be more conservative).