While we generally consider soil as the dirt on which we walk and that we use to grow our plants, it also serves as a useful analogy for cancer. One of the most dreaded maladies of our time is like a seed, and it can only take root if we provide the right kind of tissue in which it can nest. And that’s exactly what’s problematic with most cancer research and almost all anti-cancer therapies, including chemotherapy and radiation. They emphasize the seeds and disregard the soil.
Although largely ignored for more than a century, the “seed and soil hypothesis” of cancer is not new. It was proposed in 1889 by the English surgeon Stephen Paget, who observed that the spread of tumors to other places in the body, a process called “metastasis,” does not occur randomly.
Instead, he suggested that tumor cells that have detached from the primary tumor only grow in a distant organ if that organ offers the right soil. Owing to the revived interest in the seed and soil hypothesis over the past few decades, we now know that the same is true for tumor initiation. Young and healthy microenvironments can prevent cancer. Damaged soil, on the other hand, can facilitate tumor development by cells that would otherwise not form a tumor.
An exclusive focus on cancer cells is an incomplete analysis, as not everyone carrying faulty genes will develop cancer. For example, a large fraction of women carrying a mutation in a BRCA gene, similar to the one Angelina Jolie has, will not develop breast cancer. On the other hand, women without obvious genetic abnormalities may still get the disease, as only 5-8 percent of all breast cancer cases seem to be linked to inherited genetic mutations.
The underlying conditions—the soil—significantly contribute to these clinical outcomes, yet they receive little scrutiny.