The Seeds of Change

Crop rotation – in its earliest form, the practice of leaving some portion of one's fields fallow during a given season – developed early in the history of agriculture. Apparently, it wasn't too difficult to realize that if you keep planting in the same fields, after a couple of years, your crops will do poorly and you will have little to eat. Depending on the area, farmers adopted either a "two-field" system (where half of their fields lie fallow each year) or a "three-field" system (where one-third of the fields lie fallow each year).*

It was considerably harder, if history is any indication, for farmers to come up with a solution that didn't involve leaving a third to half of their fields uncultivated, even after the invention of many labor-saving devices like advanced plows and domesticated work animals.

But someone figured out a better way. In the Norfolk four-course system, farmers plant a different crop each year, in a four-year cycle: wheat, turnips, barley, and clover. No fields lie fallow (though animal grazing is permitted during some seasons). Although originally developed in what is now Belgium, with variations sporadically implemented in spots across Europe, it was the British who most readily adopted the four-course system. This led to agricultural surpluses in Britain during the 1700s, which – along with its vast colonial empire, among other factors – set Great Britain on a different course from many Continental powers.

From the comfortable position of hindsight, the four-course system is just better. Fields are productive every year. Each crop plays a role in improving the health of the soil and ensuring excellent grazing for livestock. In some cases, farmers could double their productivity. All of this in an era when famine was still a routine occurrence.

So what took continental Europe so long to adopt such a beneficial practice? The innovation worked, the benefits were about as obvious as benefits of this sort might be, and the idea could be communicated in a few sentences. The main barrier, as I understand it, was infrastructure. Or, perhaps, infrastructure with a dash of misunderstanding.

It's easy to imagine farms in the past to be like farms today. If someone owns a farm, they can do what they like to it. Not so in Europe during this time period. Land wasn't collectively owned, exactly, but multiple parties held various rights over the same plot of land. I might let my field lie fallow this year while you shared your crops with me, and anyone – peasant or noble – could graze their animals on the fallow land. Collective decisions about land management were routine. So the first barrier to changing practice was a very common one, found across domains of all sorts: because of the infrastructure of land management, the decision to switch practices was a collective one – not an individual one.

This wasn't the only infrastructural barrier, however. My nearest neighbors didn't just happen to graze their flock on my fields – they had a legal right to do so. They might also have had a legal right to hunt game, tramping through my fields in certain seasons, or other legal rights that would impair my ability to grow crops every year. So it wasn't just a collective practice that needed to change – it was an interconnected set of enforceable legal rights that needed to change.

These barriers were embedded within a highly fractured society. Regional variation – not standardization – reigned, and monarchs did not have the power to unilaterally change land management practices even if they wanted to.

Note that these barriers to change lay in a domain outside of the difficulty of the new practice itself. It wasn't as though farmers couldn't get barley or clover seeds. Or that planting in the four-course system was very burdensome. Or that farmers had to learn some special techniques to use the four-course system. Rather, it was other systems that impeded the process of change.

A third form of infrastructure – beyond custom and law – also played a critical role: fences. Without fences, it's hard to stop your neighbors from grazing their animals on your land, and it's hard to commit to a four-course system. With fences, individuals could do whatever they thought would lead to greater productivity. Adding fences (along with modification of legal rights and duties) moved decision-making back toward the individual, which – in this case – also enabled the practice to spread.

Europe's transition to the four-course system came in fits and starts, much to the consternation of agricultural researchers, who (in my mind at least) must have pulled their hair out witnessing the two- and three-field systems lead to food shortages when a solution was readily available.

Now for the misunderstanding. One argument against implementing the four-course system was that it wouldn't provide adequate grazing for livestock. Without a whole field lying fallow for the year, how would we feed our animals? This seems like a legitimate question until you consider all of the excess food you have from the four-course system. You can feed the animals the extra food you've grown, rather than have them graze on grass stubble.

In some cases, arguments like this represent a sincere misunderstanding of the benefits of a new practice. New practices usually require us to think differently about the situation, not just change what we do. But in other cases, arguments like these are really excuses or rationalizations. The arguer is thinking something like, "I don't want the four-course system – what is a reasonable argument against it?"

I suspect that you can find parallels between this story and many modern situations. But I'll leave that as an exercise for the reader.

*The three-field system was itself an important innovation in Europe during the 11th century.


As with any historical example, "it's a bit more complicated than that". But hopefully I've got the essence of the story right. My views on this topic come from Blanning, T. (2007). The Pursuit of Glory: Europe 1648-1815. Viking Penguin, pages 143-153. But you can read plenty of background on Wikipedia.