The United States Women’s National Soccer Team has historically been bad at showing any level of pride in the country it represents.
However, even the most woke group of athletes can come back to their senses, and there are signs of normalcy returning to the women’s soccer team.