Guns have been part of the United States since that fateful day in 1776 when the Founding Fathers declared independence. The right to bear arms is so fundamental to the nation that it's enshrined in the Second Amendment of the Bill of Rights, and American...