Hi everyone! I'm aware most of you on this website are American. I'm Canadian, and I've been very interested in moving to the U.S., but I'm not very educated about the differences. I've never been there, but California and Florida look very nice. Where are some nice places in the U.S. to live? Where are the best places to make a good living? Is it very different from Canada? What differences can I expect if I move there? I have many questions! Thanks, guys.