LucyDog
Well-known member
So I am asking this in all seriousness... not in any way trying to be snarky, but a great deal of the time on this board I see people from outside the U.S. write things like, "vets in the U.S. often miss this" or "vets in the U.S. don't really know about that." Are American vets just not educated as well as vets in other countries? Does the U.S. not do as much research on animal disease as other countries? If you think that is the case, why do you think it is that way? I don't have any emotional attachment to the issue... it's not like I have some American pride thing going on... so please be honest.

It just seems odd to me that in a country where pet owners spend millions... probably billions of dollars... on their pets, our vets wouldn't be pretty good and that money wouldn't go toward research on animal medicine. The comments don't bother me, but it does bother me to think that I might be spending a ton of money on vet care that is substandard. Not to mention that I adore my animals and want them to get the best care possible. I look forward to your thoughts.