What happened after the war?
Many German men had been killed in the First World War, and losing the war was deeply humiliating for Germany. Germany was forced to sign a peace treaty (the Treaty of Versailles), which meant it was not allowed any real military power. It had to accept responsibility for the war, give up land, and pay other countries for the damage done during the fighting. Many German citizens were angry about this and felt the need for vengeance. They thought the treaty was unfair because ordinary people had had little choice about going into battle, and their leaders had never let on how badly the war was going. This left people confused about why Germany had been defeated at all.
What changes occurred in Germany?
Germany's government changed. Before the war, ordinary people had little say in how the country was run, whereas afterwards many people could vote. This even included women, which was quite unusual in the world at that time. Then, in 1929, a worldwide depression struck. The banking system collapsed, leaving many people poor and unemployed, and people's savings were wiped out. Germany had borrowed heavily from America, so when the American economy fell apart, Germany was hit hard and became very economically unstable. So Germany had gone through a great many changes. The new government was unfamiliar and struggling, and one reason Hitler was taken seriously was that people were so worried about it. The Nazis owed much of their success to these problems in Germany.
Now you have an overview of what happened in Germany before Hitler came to power. Look out for an upcoming post about how the Nazis changed life for Jews in Germany.