This is an automated archive made by the Lemmit Bot.

The original was posted on /r/AskMen by /u/Kayoo38 on 2023-10-04 11:39:12.


I’ve heard women say that to men (and men have told me they were told that), and I’ve always wondered how it lands on you.

Edit: does that actually happen to you, or is this an urban legend?

Second edit for clarification: I ask because I’ve heard this myself quite a bit (apparently I show quite masculine behavior, whatever that means), and I always find it invasive and not at all inviting to share anything about myself.