The US should go to war if it believes war is absolutely necessary. Apparently, most Americans think it is, though I doubt that opinion is well informed. But they shouldn't expect the world, even their traditional allies, to follow suit.
What happens after the war is more important. Occupation?