So Earth Day is just a couple days away, and I'm starting to see all these articles and posts asking what everyone is going to do for Earth Day. This kinda irritates me, mainly because EVERY DAY should be treated like Earth Day. If it weren't for Mother Earth, I wouldn't be sitting here typing this, and you wouldn't be reading it, because we wouldn't exist! What do people not understand about going green and trying to treat nature a little better? We act like the Earth is going to be here forever, and who knows, maybe it will, but we (humans) are creating SO much destruction and just plain bad bad bad juju with nature that we could be given the boot any day now. If I were Earth, I'd be pretty pissed right now.
Don't get me wrong, I'm glad there's so much advertising for Earth Day and that it brings much needed awareness to the cause. Come April 23rd, I guarantee I'll be seeing and hearing people saying how proud they are that they picked up that Coke bottle on the side of the road, or planted a flower, or chose to buy an organic cookie rather than a Snickers, but that needs to be going on EVERY. SINGLE. DAY.
For some people, it's a start to something great. Maybe some people will become enlightened to the idea that the world might not be around much longer if we don't change our habits. I'm no saint, trust me, but over the past few years I've been trying to go greener and greener. I'm conscious of what I'm spending my money on (making sure I'm supporting eco-friendly companies over the bad ones) and will go out of my way to pick up some trash if I see it, and have *almost* gone completely organic in the kitchen.
Anyways, my rant is over. If you have a bit of free time, check out these helpful books I've read that have really opened my eyes to going green. *And best of all, the Kindle editions are FREE!*