A core concept of “Americanism” is the belief that the United States has a God-given right to control all of the Americas in the name of democracy and freedom, though in reality for plunder and commercial interest. Historian Gerald Horne joins Paul Jay of www.therealnews.com.