Most people in the world know healthcare in the USA is bonkers and that doctors often won’t treat you unless you can prove you have the insurance to cover it. But what non-Americans might not know is the lengths that medical providers go to extract profit from you at every turn. Here’s a real thing that …