Hi guys,
Years ago, I remember that doctors in the 1940s-1970s were treated like Greek gods. They were admired by all, made huge incomes, stood at the center of American society, and had real authority.
Today, it's the opposite of what I just described. All I see in the news and newspapers, and hear on the radio, is doctors being sued, malpractice lawsuits, insurance fraud, misdiagnosed patients, incompetent decision making, and outright resignations! I was told the M.D. at the end of a name doesn't carry the status it had years ago.
At one point growing up, I was told to become a doctor. Now it seems the profession has lost its allure, and today's parents tell their children to go into computers & technology instead! Many feel it's not worth the long years and the debt.
What happened to the field of medicine? Why did it swing from one extreme to the other? What made doctors heroes back then, while today they're treated like abused, blue-collar-paid workers?