While going through some blog backlog on the train, I came across a term that I had not seen before. SSoR is the Source System of Record.

Here are some quotes to explain the concept.

From Robert McIlree:
By definition, an SSoR is the final authority on the enterprise value of every piece of data so designated to it. Once exceptions to this start being made, the scheme breaks down rapidly into the data value and multiple movement/storage morass that they’re in now.

from Sandy Kemsley:
When data is replicated between systems, the notion of the SSoR, or “golden copy”, of the data is often lost, the most common problem being when the replicated data is updated and never synchronized back to the original source. This is exacerbated by synchronization applications that attempt to update the source but were written by someone who didn’t understand their responsibility in creating what is effectively a heterogeneous two-phase commit — if the update on the SSoR fails, no effective action is taken to either rollback the change to the replicated data or raise a big red flag before anyone starts making further decisions based on either of the data sources. Furthermore, what if two developers each take the same approach against the same SSoR data, replicating it to application-specific databases, updating it, then trying to synchronize the changes back to the source?
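The failure mode Kemsley describes can be sketched with a toy example. Here is a minimal Python sketch, using a hypothetical in-memory `Store` class standing in for real databases: the replica is updated, the write-back to the SSoR fails, and the error is swallowed with no rollback and no red flag.

```python
class Store:
    """A toy key-value store standing in for a real database."""
    def __init__(self, data, fail_writes=False):
        self.data = dict(data)
        self.fail_writes = fail_writes  # simulate the SSoR rejecting an update

    def write(self, key, value):
        if self.fail_writes:
            raise IOError("write to source system failed")
        self.data[key] = value

ssor = Store({"customer_42": "old address"}, fail_writes=True)
replica = Store(ssor.data)  # application-specific copy of the golden data

# Naive sync: update the replica first, then try to push back to the SSoR.
replica.write("customer_42", "new address")
try:
    ssor.write("customer_42", "new address")
except IOError:
    pass  # the bug: swallow the failure, with no rollback and no flag

# The two systems now silently disagree about customer_42's address,
# and anyone reading either copy has no way to know which is "golden".
```

A correct implementation would treat the pair of writes as one unit: on failure, either roll the replica back to the old value or record the discrepancy loudly before anyone makes decisions based on either copy.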

So what does this have to do with testing and quality? Every time data gets moved from one system to another, there is an increased chance of data loss or loss of data synchronization. We can add this to the list of things to look for when testing and a concept to use when fighting for our bugs.