War seems to benefit only a very few at the top, and they will gladly sacrifice their own people to conquer others and steal treasure. We are witnessing PTSD, and even mass shootings, because the myth of male aggression is lauded rather than condemned. Nature shows that most animals, much of the time, posture but do not kill. Do you think that state-funded murder and the glorification of “manliness” remain problematic?