This is a question that has been bothering me as someone whose country was colonized by the British Empire. We were taught in school about the Empire and how it lost power over time, but never how the USA came to take its place, especially over such a short period compared to the British Empire.
Also important: the US came in royally late, just as it did in WW1. By the time it made a real impact in Europe, the Battle of Stalingrad had already decided the eventual winner.
They massively profited from the war without putting much effort into it, and made vassal states of Europe.
Something a lot of people don’t want to hear.
We learn how we got ‘liberated’ in school.
In reality we got taken over.