Since the film industry's beginnings in the early 1900s, Hollywood has significantly shaped perceptions of beauty in the US and worldwide. In the modern era, television and social media surround us with countless images of beautiful Hollywood stars.