How Women Claimed Their Place In America’s History Books

By Special To The Black Star News

Photo: Black Women Radicals

Women have always been part of history. But for centuries, their contributions were largely overlooked: early history texts often excluded women altogether, aside from accounts of powerful figures such as queens.

Historians—who were almost entirely men—often saw the past through the lens of the “great man” theory, which holds that history is largely shaped by male heroes and their struggles.

That changed in the 20th century, with the birth of women’s history as an academic discipline, a push to recognize the achievements of women—and a movement to ensure women had equal access to the academic institutions where their history might be taught. In the United States, the result was National Women’s History Month, an annual celebration born from the activism of historians intent on making sure women got their due.

The U.S. has celebrated Women’s History Month every March since the 1980s.