Is organic food really better for you? It's a question I'm sure many of you have asked yourselves, and the answer is: not always.
Just because something is organic doesn't automatically make it good for you. You can buy organic cakes, biscuits, pies and fizzy drinks, and whilst it's nice to know that these products come from organic sources, they're still not good for us.
However, I do believe that when it comes to food from nature (real food, clean food), choosing to buy organic is always going to be better for you. A cocktail of different chemicals is sprayed onto crops as they grow, and whilst I can see that their purpose is backed by good intentions (they allow crops to grow as well as they can without disease or pests), do I really want to be putting these chemicals into my body?