Naturism

Webster's Dictionary of the English Language

n. The belief or doctrine that attributes everything to nature as a sanative agent.