Is it normal for Americans to equate relationships/love with sex?
It seems that, for Americans, a relationship is incomplete, failed, or not meant to be if there isn't sex involved during the dating phase. "Getting laid" is somehow a goal that everyone there must achieve, preferably while still in high school or college. The very first thing an onscreen couple does when they get together is have sex.