When I was growing up, I was under the impression that colonialism is when you take over some territory and exploit its resources/people for your own needs. But now it's starting to look like doing anything at all in other countries is considered colonialism as well. Interestingly, this view aligns very well with "America First" (or any other country-first) radicals who want to ignore what is going on in the world. Horseshoe theory in practice, I guess.