I think, Tom, that the answer depends on the culture you're in. In traditional US culture, the notion that women were the weaker sex is an old belief, and one that stemmed from very old thinking indeed. Historically, women were viewed as ornaments or property, valued for their beauty rather than their minds. As time marched on, women challenged this belief, and it later became more chivalrous, or courtly, to treat women as the fairer sex rather than the weaker sex.
While I cannot speak for other cultures, this is just my personal opinion, based on what I've seen over time in the US.