Quality data is like the Holy Grail: every business wants to achieve it.
By “quality data” we mean clean, organized, actionable data from which relevant information and insight can be extracted. Data quality does not end with managing incorrectly entered information; the logic of the data has to be taken into account too. If by 'logic' we mean how quality measurements are interpreted, then yes, certainly. You are generally measuring your data in some fashion and recording the metrics. The process of arriving at each measurement needs to be documented carefully, so there is no misinterpretation of what the value means or how it is meant to be used. Statistics can easily be misused or abused, to be sure.
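One way to keep a measurement from being misinterpreted is to record how it was derived alongside the value itself. The sketch below illustrates this idea; the class, field names, and the metric shown are hypothetical, not taken from any particular tool.

```python
# Hypothetical sketch: store a quality metric together with the method used
# to compute it, so the value carries its own interpretation.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class QualityMetric:
    name: str          # what is being measured
    value: float       # the measurement itself
    method: str        # how the value was arrived at
    measured_on: date  # when the measurement was taken

completeness = QualityMetric(
    name="customer_email_completeness",
    value=0.97,
    method="non-null email rows / total rows in customers table",
    measured_on=date(2024, 1, 15),
)

# The method travels with the number, so a later reader knows what 0.97 means.
print(f"{completeness.name} = {completeness.value:.0%} ({completeness.method})")
```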
So what is the common element? The label. This is the key to data quality: giving users a common language for universal elements of the data that they can use to determine its "quality" for their usage, because quality is determined by the user or community. Imagine the difference between user groups when it comes to "quality": a casual user may want the most easy-to-use data, someone in an operational environment may want the shortest latency possible, and an analyst focused on a study area may want the highest resolution possible. So one data quality perspective is that, instead of a variety of data quality properties, unrelated or independent of one another and scattered across different organizational formats that users have to translate or decipher, the focus should be on one standardized format that can be used by many users within classified user groups.
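The idea of one standardized label that different communities interpret in their own terms can be sketched as follows. The dimension names, weights, and scoring functions here are purely illustrative assumptions, not an established standard.

```python
# Hypothetical sketch: one standardized quality label, read differently by
# different user groups. All dimensions and scoring rules are invented.
quality_label = {"ease_of_use": 0.9, "latency_s": 2.0, "resolution_m": 30.0}

# Each community scores the same universal elements by its own priorities.
profiles = {
    "casual_user": lambda q: q["ease_of_use"],               # wants usability
    "operations":  lambda q: 1.0 / (1.0 + q["latency_s"]),   # lower latency is better
    "analyst":     lambda q: 1.0 / q["resolution_m"],        # finer resolution is better
}

for group, score in profiles.items():
    print(f"{group}: {score(quality_label):.3f}")
```

The point is that the label itself stays in one format; only the interpretation varies per user group, so producers maintain one set of properties rather than one per audience.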
Business context is another very important perspective: you can walk through all the various dimensions of data quality, such as accuracy and consistency, but business context frames how they apply. Data can be accurate, consistent, and timely, yet it is also shared among many different business groups and transformed, aggregated, and derived for various business needs, each group with possibly its own view of what the expected definition and quality of the data should be.
Data governance: Performing data quality checks as close to the source as possible is important, but knowing who your data consumers are and how they plan to use the data is equally important, and both form the foundation of the main pillars of Data Governance capabilities such as MDM, metadata management, and integration. Performing any of these capabilities in a vacuum, without a broader data strategy, will only yield limited benefits.
Data quality metrics are a form of metadata. They provide supporting information that helps you interpret or assess the raw data. If you are managing the quality of your data from these measurements, then they need to be taken again once corrections have been made. Data quality measurements can also highlight problems in other supporting metadata.
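The need to re-measure after corrections can be shown in a few lines. This is a minimal sketch with invented records and a single completeness metric; real quality checks would cover many more dimensions.

```python
# Hypothetical sketch: a quality metric describes the data as it was at
# measurement time, so it must be recomputed after corrections.
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},            # incomplete entry
    {"id": 3, "email": "c@example.com"},
]

def completeness(rows, field):
    """Fraction of rows where `field` is present - metadata about the data."""
    return sum(1 for r in rows if r[field] is not None) / len(rows)

before = completeness(records, "email")   # describes the flawed data

records[1]["email"] = "b@example.com"     # correction applied at the source

after = completeness(records, "email")    # the old value is now stale metadata
print(f"before={before:.2f}, after={after:.2f}")
```

If the metric were not recomputed, the recorded 0.67 would misrepresent data that is now fully complete, which is exactly the kind of stale supporting metadata the measurements themselves can expose.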
Quality data is like the Holy Grail: every business wants to achieve it, but it is not clear how achievable it is. Business operates in the real world, and the real world is muddy and chaotic. Organizations need tools that deal with muddy and chaotic data, not a focus on making the data adapt to somewhat weaker tools. In short, data quality does not mean pursuing perfect data; it means good-enough data being transformed into information, business insight, and human wisdom.