Has anyone noticed how many young American families are moving to other countries, saying those countries offer better healthcare and a less stressful lifestyle?
What is your opinion?
I haven't noticed that. My opinion is that if you've been educated in America and your family is here, you wouldn't want to live anywhere else.