I ran across an archaeological discovery confirming that cheese was being made 7,000 years ago.
I'm sure most people know that cheese is an old-fashioned, pre-refrigeration way of preserving milk. It's a pretty interesting preservation method, because it involves a specific kind of bacterial growth—and bacterial growth is usually what's involved in things going bad. I started to wonder: how did people figure out that if you let milk rot in just the right conditions, it doesn't actually rot but turns into cheese?
Which bacteria grow depends a lot on the conditions. Given specific nutrients and temperatures, certain bacteria will come to dominate. It's sort of like my home bioreactor, where I kept the conditions right for the bacteria I wanted to dominate.
To my surprise, this one turned out to be quite easy, and not as much of a stretch as chocolate was.
The most basic process for making cheese is to curdle milk, cook it, drain off the water and whey left behind when the curds form, and salt it. More detailed instructions are specific to the type of cheese being made: whether it's curdled with acid (and which acid) or rennet, how hot and how long it's cooked, how hard the whey is squeezed out, and how much salt is added.
But, I ask myself, isn't milk generally bad when it's curdled, unless you are specifically making cheese? How would somebody look at milk that's gone bad and figure out that it isn't actually bad, it just needs a few more steps and then it's a good way to store the stuff?
It turns out that fresh raw milk kept clean at room temperature doesn't actually go bad; it just goes sour (and can apparently taste quite good at that stage too). This lady already knew how to make a simple cottage cheese, so she knew about the heating step. And it seems you don't always have to add an acid to curdle the milk: given a few days, the bacteria already present will convert lactose to lactic acid on their own.
Interestingly, the more acid is present, the less heat is needed. I wonder whether well-soured milk with no acid added could get sour enough on its own to curdle at room temperature, or at a warm room temperature (a hot summer, or near the kitchen fire), or whether somebody tried to heat soured milk for some other reason and found themselves with curds.
I'm guessing here, but since it seems really easy to get curds from milk, cheese-making was likely discovered many times in many places, so there is probably no single way that the sour + heat = curds step was first discovered.
Once you have the curds, you separate them from the whey, and that's the device the archaeologists found: a strainer, with milk proteins in the pores. This is a fairly obvious step, once you taste the curds and realize that they're safe to eat. After that, the differences between cheeses come down to what curdling agents are available, and how dry and how salty they need to make the cheese to suit their environment. (Drier, saltier cheeses keep longer, for example, just as drier, saltier meat keeps longer.)