I will guess. I might be wrong. The West I will define as Western Europe, extended to its close allies. The United States broke away from Britain but re-established ties during the World Wars, so the US and Western Europe became close again. The US also subsumed Japan and made it a vassal, so Japan is part of the West. South America came from Portugal and Spain, but didn't get involved in the World Wars (maybe?), and so it remains independent from the military ties that bind the rest. So the West has come to be a name for the allies of WW2.
All countries in South America were on the US side of the war, and Brazil at least actually sent a sizeable number of troops to fight in Europe. Also, Japan is not considered part of the West anywhere, I think, but it would make some sense.
I guess we don't talk much about South America in our history of WW2 because we are racist. As for Japan, it is the Far East, I guess. But in a lot of contexts it groups well with the rest of the West.
I was going to give my opinion, but thought maybe I should see what Wikipedia says. I'm glad I did, because I would have left out NZ and Australia: https://en.m.wikipedia.org/wiki/Western_world

"The Western world, also known as the West, primarily refers to various nations and states in the regions of Australasia, Western Europe, and Northern America; with some debate as to whether those in Eastern Europe and Latin America also constitute the West. The Western world likewise is called the Occident (from Latin occidens 'setting down, sunset, west') in contrast to the Eastern world known as the Orient (from Latin oriens 'origin, sunrise, east'). The West is considered an evolving concept; made up of cultural, political, and economic synergy among diverse groups of people, and not a rigid region with fixed borders and members. Definitions of 'Western world' vary according to context and perspectives."