Germany
Germany is a country in Western and Central Europe. It has a long history, from the Holy Roman Empire through unification in 1871 to the First and Second World Wars.