Is Japan considered a formerly colonized country? Why does it seem to have gone better for Japan than for many other colonies?

I'm not saying that things were hunky-dory by any stretch, but looking at the horror stories from most of the world, Japan seems to have been considerably less fucked over. Why?