Until very recently, the role of fathers in raising children was considered all but unnecessary. For hundreds of years, people believed that women, simply by virtue of their instincts, were far better suited to child-rearing; they were the ones with the right equipment, so to speak. But the present tells a different story: the world changed, homes changed, and the gender gap narrowed. Today it is widely held that a man who is actively involved in and takes responsibility for bringing up his children has a tremendous (positive!) impact on the children, the home, and the world. So why do most men still feel like they need a BA in "fatherhood" every time they hold their children?