Feet are erotic. This is fact. It's present across all cultures and periods of history. Fucking cavemen knew this shit. Stop pretending it's weird.
You can even go out today and just observe what everyone is wearing. All of the "pretty" girls, the ones fishing for a man, are baring their feet. Footwear is actually a big point of contention among women because they know their feet have appeal. Meanwhile, men's footwear comes in muted colors and is most often closed-toed. That obvious double standard exists because women's feet are sexy.
Now we get to part two. Feet aren't just sexy, they're a symbol of intimacy. Think about how many people you have let touch your feet besides family or medical professionals. I bet it was almost none, and for those few who did, it was weird as hell. This is because feet are considered "dirty," like the mouth, but women still put on lipstick to dress the mouth up, and kissing is often a prelude to sex between two people. Similarly, the privilege of touching the feet requires quite a bit of familiarity between people, and that intimacy is attractive. Consider this: would you give a man a foot massage? I bet the answer is no, because on some level you do recognize the romantic significance of the feet. That's all I have to say for now.