What Is Bikini Medicine?

Bikini medicine is an informal term for the tendency of some medical professionals to focus on health issues relating to the breasts and genital areas when treating women, rather than assessing a woman's overall health. As can be surmised, the term is taken from the popular swimwear, which covers the two body parts in question. Bikini medicine has been criticized by advocacy groups because it does not take the health of the entire body into consideration. As such, it can be unhealthy - and even dangerous - for women.

It is important to note that the breasts and genital areas do warrant medical attention, whether or not there is an existing condition. That is why women need to pay regular visits to specialists such as gynecologists. The issue women's health advocates raise arises from the experiences of some women when visiting a general practitioner: according to these accounts, what they receive is bikini medicine.

While the term may carry feminist overtones, the concern behind bikini medicine has a legitimate basis. If a general practitioner focuses narrowly on women's health issues, other health concerns may be overlooked. For example, if a doctor tends to immediately assume that a condition is hormone-related, other possible causes may go unexamined, putting the patient at risk of a wrong diagnosis. The medical community has, in fact, gone as far as to acknowledge the existence of bikini medicine. What women can do is open up to their doctors and let them know about their concerns.