Friday, December 28, 2007

ROI Tip 1 - Save operating costs from month 1.


Through our tests in the Nordic region we have found that organizations and businesses have between 5 and 30 percent duplicates in their databases.  What is the price of these duplicates?

In an article in DM Review, Thomas C. Redman offers this assessment of the cost of bad data:

"Consider first the cost of efforts to find and fix errors. While organizations do, from time to time, conduct massive clean-up exercises, most efforts to find and fix errors are embedded in day-in and day-out work. Over the years, we developed the Rule of Ten: If it costs $1.00 to complete a simple operation when all the data is perfect, then it costs $10.00 when it is not (i.e., late, hard to interpret, incorrect, etc.)."

In my example I will use a price of 1 DKK per record when the data is correct and 10 DKK when it is incorrect, and I will use the conservative estimate of 5% duplicates.

Cost of poor data
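As a minimal sketch of the arithmetic, assume a hypothetical database of 100,000 records (the record count is my own illustrative figure; the per-record prices and the 5% duplicate rate are from the example above):

# Rough cost-of-duplicates sketch based on the Rule of Ten.
# The database size of 100,000 records is a hypothetical figure for illustration.
RECORDS = 100_000
DUPLICATE_RATE = 0.05          # conservative 5% duplicates
COST_CLEAN_DKK = 1.0           # cost per operation on a correct record
COST_DIRTY_DKK = 10.0          # cost per operation on an incorrect record (Rule of Ten)

duplicates = RECORDS * DUPLICATE_RATE
clean = RECORDS - duplicates

cost_with_duplicates = clean * COST_CLEAN_DKK + duplicates * COST_DIRTY_DKK
cost_if_clean = RECORDS * COST_CLEAN_DKK
extra_cost = cost_with_duplicates - cost_if_clean

print(f"Extra operating cost per run: {extra_cost:,.0f} DKK")
# With 100,000 records and 5% duplicates this comes to 45,000 DKK extra per run.

Run the same calculation with the 30 percent duplicate rate we have seen in the worst databases, and the extra cost rises to 270,000 DKK per run.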



The Solution:
I have used the rental price of Omikron AddressCenter.  With rental you can deduct the whole cost as an operating expense, whereas if you buy the solution it will appear under investment costs.



In our tests, Omikron AddressCenter and Data Quality Server have proved to be the most intelligent, efficient and easy-to-use tools for finding and matching duplicates.
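To give an idea of the kind of matching such a tool performs, here is a minimal sketch of fuzzy duplicate detection in plain Python. It only illustrates the general principle, not Omikron's algorithm; the field names, example records and similarity threshold are my own assumptions.

from difflib import SequenceMatcher

def normalize(record):
    """Lower-case, strip and join the fields we compare on."""
    return " ".join(str(record.get(f, "")).lower().strip()
                    for f in ("name", "street", "postal_code", "city"))

def find_duplicates(records, threshold=0.8):
    """Return pairs of records whose normalized text is at least `threshold` similar."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            ratio = SequenceMatcher(None, normalize(records[i]), normalize(records[j])).ratio()
            if ratio >= threshold:
                pairs.append((i, j, ratio))
    return pairs

customers = [
    {"name": "Jens Hansen",  "street": "Nørregade 12", "postal_code": "1165", "city": "København"},
    {"name": "Jens  Hansen", "street": "Norregade 12", "postal_code": "1165", "city": "Kobenhavn"},
    {"name": "Anna Larsen",  "street": "Storgata 3",   "postal_code": "0155", "city": "Oslo"},
]

for i, j, ratio in find_duplicates(customers):
    print(f"Possible duplicate: record {i} and record {j} (similarity {ratio:.2f})")

A real data quality tool uses far more sophisticated, language- and address-aware matching and avoids comparing every pair of records, but the basic idea of scoring similarity between normalized records is the same.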

Tip 1 to avoid Poor Data Quality - Easy and Efficient monitoring of data entries

Poor data quality can have different sources:
  • System inefficiencies. Neither your software nor your internal control system catches the poor data
  • We are only human.  Poor data is entered for several reasons: typos, data entered in the wrong fields, 0000000 entered in postal or telephone fields, or simply filling in as little as the system allows

Customers set up our Data Quality Server to do continuous cleansing across databases, with intelligent search to find customers or addresses.  In theory this results in clean master database(s), but because of system inefficiencies and the human factor, some poor data is still found.

We have a feature that can monitor data entries.  You can analyze why the poor data is entered, and then correct it.  A mail is generated every day to a supervisor, who can analyze the result.  If it is human error, the person who entered the wrong or insufficient data is informed, so they have a chance to improve their quality.  If it is a system inefficiency, it is corrected.  With this monitoring service the amount of poor data entered is drastically reduced within a short time span.
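As a rough sketch of what such a daily monitoring routine can check, here is a small example in Python. The field names, the rules and the example records are my own illustrative assumptions, not Omikron's implementation.

import re
from collections import defaultdict

# Simple entry-quality rules; placeholders like 0000000 and empty mandatory fields
# are exactly the kind of "human factor" errors described above.
RULES = {
    "postal_code": lambda v: bool(v) and not re.fullmatch(r"0+", v),
    "phone":       lambda v: bool(v) and not re.fullmatch(r"0+", v),
    "name":        lambda v: bool(v and v.strip()),
    "street":      lambda v: bool(v and v.strip()),
}

def daily_report(entries):
    """Group yesterday's rule violations by the user who entered the record."""
    violations = defaultdict(list)
    for entry in entries:
        for field, ok in RULES.items():
            if not ok(entry.get(field, "")):
                violations[entry.get("entered_by", "unknown")].append((entry.get("id"), field))
    return violations

# Example: two records entered yesterday, one with a placeholder phone number.
entries = [
    {"id": 1, "entered_by": "mads", "name": "Jens Hansen", "street": "Nørregade 12",
     "postal_code": "1165", "phone": "0000000"},
    {"id": 2, "entered_by": "lise", "name": "Anna Larsen", "street": "Storgata 3",
     "postal_code": "0155", "phone": "22334455"},
]

for user, problems in daily_report(entries).items():
    print(f"{user}: {len(problems)} problem field(s) -> {problems}")
    # In practice this summary would be mailed to the supervisor every morning.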

Thursday, December 27, 2007

"Without proper data, BI is meaningless"

This claim is put forward by BI analyst Lyndsay Wise in an article in DM Review.

She also writes: "Data cleansing tools are a critical component to ensure that dirty data is not brought into the organization's data warehouse. Non-cleansed data may lead to capturing duplicate, nonvalid records, building reports based on wrong data and decision-making based on inaccurate data. Data quality requires the ongoing management of data cleansing activities to meet and exceed defined data quality standards. Practical applications might include the data being used to run month-end reports or a sales manager using data to monitor the ongoing performance of her sales staff. Lastly, data delivery completes the cycle, distributing the right data at the right time."

The most efficient cleansing tools on the market are Omikron AddressCenter and Omikron Data Quality Server.  You can also contact me for more information.

Study shows CFOs stressed over Poor Data Quality

An IDC study called ‘Understanding Data Quality Needs And Practices In Asia Pacific’ shows that a majority of finance executives admit to facing stress in their decision making due to poor data quality; 22 percent found it highly or quite stressful.

Another interesting aspect of the study is that 30 percent of the respondents spent 8 hours per week or more verifying the accuracy and quality of their data.  That is 20% of the scheduled work time.  If a CFO earns 500.000 DKK a year, that means bad data quality costs 100.000 DKK just on the salary budget for the CFO.
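The arithmetic behind that estimate, assuming a standard 40-hour work week:

# Hours spent verifying data quality, as a share of a 40-hour work week.
hours_per_week = 8
work_week = 40
share = hours_per_week / work_week          # 0.20, i.e. 20% of scheduled work time

cfo_salary_dkk = 500_000
cost_dkk = cfo_salary_dkk * share           # 100,000 DKK of the CFO's salary per year
print(f"{share:.0%} of the work week, {cost_dkk:,.0f} DKK of the salary budget")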

Ian Parker, Product & Solutions Marketing Manager, Business Objects Asia Pacific & Japan says: “Instead of being an asset for finance, the data becomes a major source of angst and frustration, wasting resources, money, and contributing to poor decision making. The report shows that on average across the region, only 60 percent of respondents considered the purchase of a data quality software tool to solve their problem, despite wasting 100-300 hours of their finance executives’ time each year on the issue. One can argue that there are plenty of hidden costs when it comes to data quality and that CFOs should examine how this is affecting their organisation.”

I find this a little disturbing, given that efficient, inexpensive, high-quality software is available to help lower your stress and get better data.

You can read more about the survey here.

Deloitte's take on Data Quality

I stumbled over this quote from Deloitte:

"Ultimately, poor data quality is like dirt on the windshield. You may be able to drive for a long time with slowly degrading vision, but at some point, you either have to stop and clear the windshield or risk everything."

I like the analogy :-)

Friday, December 21, 2007

Data Quality is not an IT issue

Data Quality is often seen as an IT-Department issue.  It is time to change this perception once and for all.  Data Quality influences all departments in a business: sales, logistics, customer care, finance - the list goes on.  You cannot solve the challenges of Data Quality unless all departments are involved.

In my work with organizations and businesses, it is interesting to see that the ones complaining about poor Data Quality are often the ones creating the bad data.  Example: I have worked in various sales organizations, and salespeople are notorious for taking shortcuts when entering data. Then they complain about the system when they cannot find the right customer, or when the goods are sent to the wrong address.  To solve this, the sales managers need to focus on the problem and help find the right software and solutions together with the IT Department.

To keep the data good, it is important to choose easy-to-use and efficient software that can be operated by marketing, sales, and logistics as well as the IT-Department.

CIO's Highest Priority: Business Intelligence

A Gartner Annual CIO report showed that BI applications remain the highest technology priority for CIOs today.

However, only 36% of CIOs believe that management is using the right information to run the business.

CFOs find Data Quality more important than Data Security

A new study shows that CFOs are increasingly worried about the state of their firms' data quality.

Data quality is now rated as more important by CFOs than data security, which has dropped to fourth place in the table. This means that data quality will be an important focus area for CFOs.

For 58 per cent of the respondents to the ninth annual Technology Issues for Financial Executives survey, data quality or data integrity were the top issues of concern.

The research, carried out by Financial Executives Research Foundation (FERF) in conjunction with Computer Sciences Corporation, highlights the importance of maintaining a high level of data quality in businesses and the growing recognition of the impact of data quality on a firm's finances.

You can download the research here.

Thursday, December 20, 2007

Laissez-Faire: the biggest obstacle to improved Data Quality

Out of Sight - Out of Mind.  Is this the reason why businesses don't do more about the challenges they face concerning their data quality?

A lot of the costs are hidden and cannot easily be quantified.  How do you measure:
- Customer dissatisfaction from multiple listings
- Lower employee satisfaction
- Data ownership conflicts
- Difficulties in decision making
- Time delays
- Correcting Errors
- Double work and rollback of work already done.

And how do you find where the poor data is entered?

Therefore many close their eyes and wait for better times.  In later posts I will show how some of these challenges can be solved.

Wednesday, December 19, 2007

Some more estimates of the cost of Poor Data Quality

Larry English writes the following in this article:

Quality experts agree that the costs of non-quality are significant. Quality consultant Philip Crosby, author of Quality is Free, identifies the cost of non-quality to manufacturing as 15-20 percent of revenue. Juran pegs the costs of poor quality, including "customer complaints, product liability lawsuits, redoing defective work, products scrapped . . . in most companies they run at about 20 to 40 percent of sales." A. T. Kearney CEO Fred Steingraber confirms that "We have learned the hard way that the cost of poor quality is extremely high. We have learned that in manufacturing it is 25-30 percent of sales dollars and as much as 40 percent in the worst companies. Moreover, the service industry is not immune, as poor quality can amount to an increase of 40 percent of operating costs."

Poor Data Quality costs 10-20% of revenue

Thomas C. Redman writes in this article that poor data quality costs the typical company at least ten percent (10%) of revenue; twenty percent (20%) is probably a better estimate.

Larry English writes the same in this article.

Poor Data Quality costs the 100 largest Danish companies 4 billion DKK

This is the result of a study PA Consulting has conducted in Denmark. They also believe this is a conservative figure and that the real cost is probably higher.

Another result is that you can save 3,5% in purchasing if you have good Data Quality.

The question is why management lets this happen, and the conclusion is that the costs are hidden and you cannot pull the numbers from the accounting.

You can read more about the study here. The article is in Danish.

Tuesday, December 18, 2007

Gartner Report about Data Quality

In this report Gartner concludes that through 2007, at least 25% of critical data within Fortune 1000 companies will continue to be inaccurate.

Just stop for a minute to think.  

No wonder the cost was 600 billion USD in 2002.

You can download the report here.

In 2002, Poor Data Quality cost US companies $600 billion a year.

This was the result of The Data Warehousing Institute (TDWI) study in 2002. The study was based on survey data from 647 respondents.

What I find interesting in this study is this:

"Almost 50 percent of survey respondents express no current plans to implement an initiative to improve data quality, while 78 percent said their organizations need additional education about the importance of data quality and methods to maintain and improve it.

"The good news is that achieving high quality data is not beyond the means of any company," continued Eckerson. "Companies that have invested in managing and improving data quality can clearly cite the tangible and intangible benefits of doing so." Benefits include improved customer satisfaction, a "single version of the truth," and "greater confidence in analytical systems.""

These are the same conclusions I have reached after working with several customers in the Nordic area. With a small investment in cutting-edge technology, you get more satisfied customers and employees, better decision data, and a significant reduction in operating costs.

You can read more about the study here.
 
By the way, 600 billion USD is 3.928.310.000.000 SEK, 3.340.560.000.000 NOK or 3.105.720.000.000 DKK.

Poor Data Quality costs Dutch companies €400 million per year

This is the result of a survey of 20.000 Dutch organizations employing 10 or more people. Another interesting aspect of the survey is that even though 92,6% of the organizations find maintaining the quality of relationship data important, it appears that only 52% of the companies actually monitor the entry of data.

I believe the reason for this is that most organizations don't know how they can monitor this effectively and at low cost.  You can set up routines for this very easily and at virtually no cost. When you consider the costs organizations incur from bad quality, you wonder why they let this happen.

The problem is that the costs are generally hidden and can therefore not be easily accessed and analyzed.

After New Year 2008 we will present several approaches to assessing the cost of Data Quality issues.

You can read more about the Dutch survey here.