

World War 1

World War One (WWI) changed women's roles in Europe. While men were away at war, women took over their jobs and kept their countries functioning. They worked at post offices and bus stations, often for long hours and little pay, in unsafe and dangerous conditions. Women took on jobs that had been considered masculine before the war; although most had never worked in factories, they adjusted quickly to the new conditions, learned new trades, and became an important part of the economy. Many also served as nurses at the battlefront. WWI changed women's role in society, giving them greater freedom to work. After the war, women faced a ...

Posted by: Rainey Day
