Acid rock drainage is one of the biggest environmental problems facing hard rock mines: it keeps going for decades after a mine is closed and abandoned, poisoning everything downstream with toxic metals leached from the rock. It's a natural process that occurs wherever sulphide-bearing rock is exposed to oxygen and water; the oxygen oxidizes metal sulphides into dissolved metal and sulphuric acid. The exposed rock is eventually consumed, until little to no unreacted metal or sulphide remains accessible to oxygen.
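For pyrite, the most common culprit, the overall weathering reaction is the standard textbook one (a simplified summary; the real process goes through several intermediate steps):

    2 FeS2 + 7 O2 + 2 H2O → 2 Fe(2+) + 4 SO4(2-) + 4 H+

The hydrogen and sulphate ions on the right are the sulphuric acid; the dissolved iron goes along for the ride downstream.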
However, since the process requires oxygen, it couldn't happen until the earth's atmosphere actually had oxygen in it. I haven't yet found a definitive description of exactly what the earth's atmospheric composition was before the change, but the difference between chemistry with and without oxygen is pretty clear, as oxygen is highly reactive and tends to get into everything. One example is the acid rock drainage I mentioned above. Another indication is the type of iron minerals deposited, which depends on whether oxygen was present and how much. Since iron combines very easily with oxygen, if you find an iron deposit whose minerals contain no oxygen, there's a good chance no oxygen was available at the time it formed.
A new study just published last week adds another piece to the puzzle. It turns out that, at least in the case of pyrite, the acid rock solubilization reactions are very slow unless a particular type of bacteria gets involved and makes the reaction about 1,000,000 times faster by munching on the iron in the rock for energy. It's an aerobic bacterium, which means this could only happen once the free oxygen content of the atmosphere started to grow, and it's happiest in acid water, which it produces itself as part of its diet of iron and which dissolves quite a few different metals from the surrounding rock.
This bug actually has recyclable food, which is cool and disturbing at the same time. It eats ferrous iron (2+ charge) and excretes ferric iron (3+ charge), getting its energy by stealing that extra electron from the iron. The ferric iron then reacts with the pyrite, releasing sulphuric acid and turning back into ferrous iron, at which point the bacteria can both eat it again and enjoy the sulphuric acid bath its own waste products just created for it.
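Written out as reactions, the loop looks something like this (a simplified sketch that glosses over the intermediate chemistry):

    bacteria:  4 Fe(2+) + O2 + 4 H+ → 4 Fe(3+) + 2 H2O
    pyrite:    FeS2 + 14 Fe(3+) + 8 H2O → 15 Fe(2+) + 2 SO4(2-) + 16 H+

Notice that the pyrite step hands back fifteen ferrous ions for every fourteen ferric ions consumed, plus a fresh dose of acid, so the cycle feeds itself for as long as there's pyrite and oxygen to go around.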
Now, the iron I mentioned earlier is very common, so there are iron deposits all over the place. A less common element is chromium, and it is known to appear in pyrite deposits in small quantities. It is also known to be solubilized by the conditions those iron-hungry bacteria create—and to re-precipitate very quickly when the pH rises again.
Absent human intervention, acid rock drainage is a slow, stable trickle of metals here and there from the weathered surfaces of exposed rock. Mines expose a lot of fresh rock, and all that fresh rock is ripe for producing acid mine drainage at vastly higher rates than the norm.
But when the earth's atmosphere started carrying free oxygen, all the pyrite deposits on the planet were effectively freshly mined. They had, until that point, experienced no oxidative weathering, because there wasn't an oxidative atmosphere. As soon as they were colonized by the bacteria, there was an explosion of acid rock drainage and a major dump of acidic iron—and chromium—into the waterways. As soon as that water mixed with a non-acid stream, or reached the ocean, the pH went back up and the chromium precipitated.
By studying samples from ancient estuaries, the researchers found a significant spike in chromium deposits, caused by the onset of oxidative weathering, starting about 2.48 billion years ago—right around the time other methods say the oxygenation of the atmosphere occurred.