Well, Germany had to be made to do that. We had to remake the nation into one that has changed its dogma from "we were right" to "we were wrong". The fact that the populace accepted it is awesome, actually, but it needed to be done to them.
There is a theory about why Germans accepted this culture of Vergangenheitsbewältigung. While I don't necessarily believe it is 100% true, it would explain the incentives. The theory goes as follows:
How would Germans be able to tell other Europeans what they need to do in which way if they didn't have the moral high ground of Vergangenheitsbewältigung?
While this might not be true for a big chunk of the German population (which is truly horrified by and sorry for the harm their nation caused), it is certainly part of the reason this has been so successfully accepted.
Blaming WWI entirely on Germany is myopic, and is in part what set off the social unrest in the Weimar Republic in the first place. A huge debt was put on Germany, and the country was heavily dependent on loans. When the US economy crashed in 1929, they were done for. Had Germany not been forced to take the full blame for WWI, WWII arguably would never have happened, because someone like Hitler wouldn't have been able to rise to power on the back of massive social unrest.
That is a hard counterfactual to prove. Germany still lost WWI, which was a massive sticking point for them. The "stab in the back" myth would have existed either way, because Germany, with its highly militarized culture, could not admit defeat.
The revolution in Bavaria would still have happened, and would still have irked the right wing. Democracy would still have been rejected, and the Weimar Republic would still have been one massive mess.
Hitler did not gain power just because of social issues.
(And also, less importantly: had Germany won WWI, it had plans to demand large reparations from other countries. Reparations were not something unheard of at that time.)