Why Do Many Americans Avoid Planting Vegetables Directly in the Ground?

Asked By SunnySkyz123

I've noticed a trend where a lot of Americans seem hesitant to plant vegetables directly in the soil. It feels like there's a widespread belief that ground soil isn't good enough for growing produce. What could be driving this perspective? Is it capitalism at play, the influence of gardening magazines like Better Homes and Gardens, or maybe the constant stream of social media posts? It's interesting to see people opting for raised beds or buckets instead of using the free dirt right below their feet. I'm curious about the cultural factors behind this aversion to planting directly in the ground.

3 Answers

Answered By GardenGuru88

I actually think the idea of 'terraphobia' is a bit overstated. It seems like you might be projecting your recent discovery of that term onto a larger group. From what I can tell, a lot of people still plant in the ground, even if it's not always the hip choice.

CuriousCabbage45 -

Fair point! But I feel like I've seen more pictures of people's raised beds than of them actually planting in the dirt.

Answered By PesticidePenny

For me, it’s a bit risky to plant directly in the backyard. We have a lot of chemicals in our soil from pesticides and runoff from neighbors. Plus, I have no idea what the previous owners did with the lawn. It’s why I stick to safer options.

Answered By VeggieViking42

Personally, I love planting in the ground, but I'm considering raised beds soon. Weeds are a hassle, and I've got gophers to think about too! It's all about practicality for me.

WeedWarrior99 -

Yeah, but even in raised beds, weed seeds can blow in on the wind, right?
