A cultural meme has been permeating America: that the country is in the midst of a financial crisis, also called an economic crisis. But I pose the question: is America really in a financial crisis?
If America truly were in the midst of a financial crisis, wouldn't America need to bail out America, as opposed to just bailing out Wall Street? Wouldn't America need to be bailing out individuals rather than corporations? Judging from where the bailout money is going to be appropriated, it appears that only Wall Street and a select few financial firms are in a financial crisis, not America as a whole.
America is NOT in a financial crisis.