Podiatrist

Podiatrists are health professionals who diagnose and treat medical and surgical problems and injuries of the feet and ankles, such as corns, warts, plantar fasciitis, bunions, or hammer toes. They also perform reconstructive surgery.

Podiatrists also provide ongoing foot care for people at higher risk of complications, such as those who have foot problems caused by diabetes.

A podiatrist completes an undergraduate degree and then a degree at a college of podiatric medicine. After earning the podiatry degree, a podiatrist typically completes a hospital-based residency program.

Podiatrists can be board-certified through certifying boards recognized by the Council on Podiatric Medical Education. Board certification is not required in every state.

Current as of: July 31, 2024

Author: Ignite Healthwise, LLC Staff

Clinical Review Board
All Healthwise education is reviewed by a team that includes physicians, nurses, advanced practitioners, registered dietitians, and other healthcare professionals.