Should Schools Offer Dental Care?
- Posted on: Oct 13 2015
A child’s dental hygiene is the responsibility of their parents, but if dental care were offered in school, would it make children more likely to care for their teeth at home? The importance of dental care and hygiene was once taught in elementary schools, but very few schools discuss it nowadays. This means children only learn about dental care if their parents teach them.
Do you think dental care should be taught in school or is it a parent’s job to care for their children’s teeth alone?
Posted in: Quick Reads