linking the knowledge of today, with the power of tomorrow
As you read this I will be on my way back from Riyadh, Saudi
Arabia where I will have presented the results of the ECCMA work
on the phase I pilot of the Kingdom of Saudi Arabia Common
Open Catalog (KSA-COC). The project created a common
catalog from the data provided by some of the world’s largest
buyers in order to provide local market intelligence for both
government planners and local manufacturers. In phase II, local manufacturers and suppliers are
expected to be encouraged to use this unique platform as well, increasing the local and
international visibility of their goods and services.
Sheron and his team did an incredible job not only in mapping
the data from all the project participants to the eOTD and to
common templates, but also in creating concept equivalence
tables. The results are truly amazing, providing the ability to view
the original data in a common framework and in a combined
ISO 8000-120 compliant view where every data element is
displayed with its provenance. For those of you who have taken
the time to read Edition II of Managing a Data Cleansing
Process for Materials or Services, published as an ECCMA white
paper and available under the Resources > Downloads > White
Papers section, you will recognize the screens with characteristic data
and identification data along with short and long descriptions. In
phase II I expect to see classification data added to the KSA-
COC records as well.
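The ISO 8000-120 compliant view pairs every data element with its source. As a rough sketch of what such a provenance-annotated record can look like (the class and field names below are illustrative assumptions, not taken from the standard or from the KSA-COC application):

```python
from dataclasses import dataclass

@dataclass
class PropertyValue:
    """One data element together with its provenance (illustrative only)."""
    property_name: str   # e.g. a property drawn from a common template
    value: str
    source: str          # organization that supplied the value
    captured_on: str     # date the value was captured

# A combined catalog record merges values contributed by several
# participants, with each element keeping its own provenance.
record = [
    PropertyValue("thread size", "M10 x 1.5", "Buyer A", "2013-06-14"),
    PropertyValue("material", "stainless steel", "Buyer B", "2013-07-02"),
]

for pv in record:
    print(f"{pv.property_name}: {pv.value}  [from {pv.source}, {pv.captured_on}]")
```

The point of the structure is simply that provenance travels with each element, not with the record as a whole.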
Sheron and his team also built and successfully tested a web
services interface that allows the content of the KSA-COC to be
extracted and loaded into third party applications. This is very
significant and may prove to be a game changer in cataloging
and data cleansing by providing the tipping point for cataloging
FROM THE EXECUTIVE DIRECTOR

UPCOMING EVENTS
October 8th, 2013 ISO 8000 Data Quality Webinar
8:30 AM - 12:30 PM EDT
Conference Call
October 10th, 2013 eCDM Training Webinar
8:00 AM - 9:00 AM EDT
Conference Call
October 14th, 2013 Member Q&A Webinar
8:00 AM - 9:00 AM EDT
Conference Call
October 16th, 2013 Board of Directors Meeting
9:30 AM - 10:30 AM EDT
Conference Call
October 22nd, 2013 ISO 8000 Master Data Quality
Manager Workshop Travaasa Resort
Austin, Texas
October 22nd - 24th, 2013 14th Annual ECCMA
Data Quality Solutions Summit Travaasa Resort
Austin, Texas
November 6th - 7th, 2013 World Codification Forum
Copenhagen, Denmark
November 14th, 2013 Data, Tech & Ops Conference
Boston, MA
OCTOBER NEWSLETTER
IN THIS ISSUE:
Page 4: "Common Reasons Data Cleansing Projects Fail," article by I.M.A. Ltd.
Page 14: "Beware of Frankendata," article by Erie Insurance
Page 16: "5 Ways to Win with Standards," article by CADENAS PARTsolutions
October 2013 – ECCMA Newsletter 2
at source. Currently it is buyers that are sharing data through the common open catalog but in
phase II the catalog is expected to be open to manufacturers and suppliers. This will improve the
local and international visibility of their products and services by allowing international
eMarketplaces and exchanges to access and redistribute the data in the KSA-COC. The KSA-COC is
not limited to the products and services provided by local manufacturers; in fact, it is expected that
the great majority of the items in the catalog will have been imported. Several of the buyers see the
catalog as an ideal solution to lowering the cost of obtaining from suppliers the characteristic,
identification and classification data needed for import documentation. This is particularly relevant as
we are seeing several countries requiring quality item descriptions that specifically contain the
characteristic data necessary to confirm the assigned customs code (HTS). As suppliers have to
provide this information to all their customers, the best place to do this in the Middle East will be in
the KSA-COC.
Also, as part of this project, I reached out to a number of our
members who provided advice as well as letters of support. I would
like to express my thanks to Ariba, Aura, Java Gulf, Kontenix and
PartNet for their valuable advice and support with the project.
The common open catalog application is web-based, and we are
already seeing interest from ECCMA members in using the
application for other local manufacturing initiatives and to view
master data from multiple sources within a group, as the COC does
not require any modifications to the source applications. Using the
application in this way would be a good first step in any master
data consolidation or harmonization project.
While designing and implementing a successful KSA-COC pilot has been one of our main
challenges over the last few months, it has not been the only one. In September we completed the
design and implementation of the new ECCMA Natural Location Identifier (eNLI) registry. Wasim
Akram Syed has been responsible for its development under the guidance of Elizabeth
Green. The best way to understand an eNLI is to visit the new ECCMA ePROP website
http://eccma-eprop.org; under the Standards tab you will find a link to the free eNLI generator
and eNLI decoder. I can guarantee that you will be surprised to learn that named street addresses
may be following the path of named telephone numbers. Following close behind the eNLI is the
ECCMA Controlled Property Identifier (eCPI) registry, essentially a public library of KML files that
represent legal property boundaries. Both the eNLI and the eCPI were developed to offer a solution
to property identification in the mortgage and finance industries, and when used correctly they will
provide a way to make visible the underlying assets of a collateralized debt obligation (CDO).
These two additional registries bring the ECCMA count of managed registries to six: the eOTD, a
registry of concepts and associated terminology; the eDRR, a registry of data requirements; the
eRGR, a registry of rendering guides (under development); the eGOR, a registry of organizations; the
eNLI, a registry of natural location identifiers; and the eCPI, a registry of controlled property identifiers
(under development). In addition to these six registries, we are also working on the ECCMA Quality
Identifier Registry (eQIR). This seventh and final registry is designed to provide the application-
processable information necessary to resolve and validate any of the ECCMA or third-party
registered quality identifiers. A Quality Identifier (QI) is an identifier that can be publicly validated
using a standard application-processable data exchange. The eQIR is designed to support many of
our existing master data quality initiatives and is expected to provide our members with a way to
leverage their expertise in the use of ISO 22745 and ISO 8000.
It is no surprise that we often run into companies that have experienced a
failed data cleansing project and no longer believe in the value that cleansing provides. It is
unfortunate to hear about companies that have invested thousands of dollars in data cleansing
projects, only to end up going back and correcting the data afterwards. When meeting with many
of the material, purchasing and procurement managers in these situations, we typically discover a
few common reasons why their previous cleansing projects failed.
One of the most common reasons for project failure is that the
previous service provider simply used automated software to
rapidly extract and classify thousands of existing items without
human review. While the speed and efficiency of this method
may have been impressive, the end result was not. In these
cases, data was returned to the customer with incorrectly
classified items, inconsistent descriptions, and often, inadequate
information. Although the quality of these automated software
applications has come a very long way and is continuously
improving, the truth is there is no software application that can reliably transform large files of
unstructured data into accurately standardized, enhanced and structured descriptions without
human intervention.
Another common reason why data cleansing projects fail is a lack of flexibility to
accommodate customer requirements and an unclearly defined Standard Operating Procedure.
Many data cleansing companies are very rigid and will only cleanse and format data to their own
standards. Obviously, this can become a significant issue as every company is unique and has
different business requirements when it comes to format, standards, abbreviations and project
timeline. If data is not standardized and structured according to customer requirements, it not only
defeats the purpose of implementing a data cleansing project, but also requires a significant
amount of time and effort for the customer IT department to re-work and prepare the data before
uploading. Project timeline is also critical as Data Cleansing is often part of a larger ERP
implementation. If the data cleansing deliverable is not completed on time and within scope, the
entire project will be delayed, costing the company valuable time and money.
The final common reason why data cleansing projects fail is the absence of a long-term
strategy to maintain ongoing data quality as items are added, modified and suspended within the
catalogue. If a catalogue management process is not implemented after the cleansing project is
complete, the data will quickly revert to its previously corrupt state. Once again, this common
mistake defeats the purpose of investing thousands of dollars into a data cleansing project.

COMMON REASONS WHY DATA CLEANSING PROJECTS FAIL
SUBMITTED BY: Jocelyn Facciotti, Marketing Manager, I.M.A. Ltd.
Data cleansing can provide many short and long-term benefits for all units of the business when
implemented properly. If you are one of the unlucky companies that have invested thousands of
dollars into a failed data cleansing project, don't feel bad; you're not alone. While Gartner claims
that the MDM market is still immature, there are a few solutions available that have been proven
and perfected over many years of experience. Although every data cleansing company uses
software to a certain extent, the best results can only be achieved through a combination of
software and human intervention. When considering a data cleansing project, it is well worth the
time and effort to research various service providers to understand their cleansing methodology
and ability to meet company-specific requirements. After all, data is the foundation for business
decisions, and if the foundation isn't constructed properly, the entire investment will come crumbling
down.
About I.M.A. Ltd.
As a results-oriented company, I.M.A. Ltd. is dedicated to providing the most accurate, consistent
and reliable data available in the industry, while continuously developing and improving solutions
based on the changing market and feedback of customers. Although many competitors have
chosen to sell software and services based solely on speed and efficiency, I.M.A. Ltd. believes that
quality remains the most important factor when dealing with critical inventory data. The I.M.A.
theory suggests that a balanced combination of technology and human intervention is required to
achieve the highest level of data quality.
For more information, please visit www.imaltd.com or contact [email protected].
The ECCMA Member Q&A webinar is being made available exclusively to its members. This one-hour
webinar is hosted by Sheron Koshy, President of ECCMA India and head of the R&D team. During
this session, we encourage you and your team to ask any questions you may have regarding your
membership and ECCMA's available resources. Questions can range from use of the eCDM to mapping
support for the eOTD or implementing ISO 22745 and ISO 8000. Join us and take this opportunity to speak
with Mr. Koshy directly and get your questions answered!
Check out our available sessions through December 2013 on ECCMA’s Upcoming Events Page. If you
are interested in learning more, please contact Vicky Falcone at [email protected] for
complete details.
GPS, TRADITIONAL ADDRESSES AND EMERGING DATA STANDARDS
SUBMITTED BY: Elizabeth Green, Chair, ECCMA eProp Workgroup
September has been an exciting month with the announcement of the "ECCMA Property Team," or
"eProp" for short, which is responsible for soliciting and developing terminology to standardize how
digital content about real property is represented in the ECCMA Open Technical Dictionary (eOTD).
The group is also charged with building and maintaining domain-specific data requirements and
registering these in the ECCMA Data Requirements Registry (eDRR). This effort includes the identification
of properties in terms of location, type, use and purpose, as well as characteristics of properties such as
structural details and land attributes.
The group is made up of a cross-section of real property professionals,
including appraisers, city planners, assessors, surveyors, inspectors,
engineers, architects and technologists. eProp seeks to align real world
concerns and information about real property with the application of
modern technology-based data governance, management, exchange
and analytics. I am pleased to be the Chairperson of this group and am
assisted by my friend, John Cirincione, Chief Appraiser at Collateral Analytics, as Vice Chair.
I began working with ECCMA in the fall of 2012 in the quest for improving
real property identification for electronic data interchange in the
mortgage banking industry where I chair a property workgroup for the
standards organization, MISMO. I became a member of ECCMA in early
2013 and achieved the ISO 8000 Master Data Quality Manager Certificate.
Technology is leveraging property information in ever-increasing ways, from simple directions to a
restaurant to the complex tracking of real property as collateral underlying a mortgage-backed
security, and I am committed to demystifying the inherent complexities in property information.
Comprehensive location identity and core knowledge about the property are both fundamental
and foundational to any property-related activity. Common definition and data standards are the
key to unlocking the full potential of property data in modern society.
Another exciting development is the release of the ECCMA Natural Location Identifier, or eNLI™.
GPS technology is without a doubt one of the wonders of our generation, but it is still
hampered by the street address system. Switching to pure latitude and longitude coordinates is a
challenge, as there are four different formats and, whichever way you look at it, they are all a lot of
numbers. It is from this challenge that a new standard has emerged that gives GPS coordinates a
human-friendly 14-character code.
The eNLI encompasses the postal address details, computed geocodes (a single latitude/longitude
point on the map) from a service such as Google Earth, as well as actual coordinates collected
electronically from the site or provided by the user, including latitude, longitude and elevation.

The eNLI is a "natural" identifier that uses the ECCMA 1-4 Property Natural Lot and ECCMA 1-5
Property Natural Unit open standards published by ECCMA. As a "natural" identifier, the eNLI is very
different from a licensed location identifier: not only do you not need to pay a license fee to get
one, you also do not need to retrieve the actual location information from a registry, because the
location is encoded in the eNLI itself.

An eNLI can be created and used to identify any location, from a mailbox, a front gate, or a front
or back door to the location of a propane tank. It can be used to locate the front door of an
apartment or condominium unit, any door in an office building, or any piece of equipment in any
office or factory.
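Because the location is carried in the identifier itself, no registry lookup is needed to recover it. The actual ECCMA encoding is defined in the published ECCMA 1 standards; purely to illustrate the principle of a self-describing identifier, here is a toy scheme (an assumption, not ECCMA's algorithm) that packs a latitude/longitude pair into a short base-32 string and unpacks it again:

```python
ALPHABET = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"  # Crockford base-32 (no I, L, O, U)

def encode(lat, lon):
    """Toy location code: quantize lat/lon, pack into 45 bits, base-32 encode."""
    y = round((lat + 90) / 180 * ((1 << 22) - 1))   # 22-bit latitude grid
    x = round((lon + 180) / 360 * ((1 << 23) - 1))  # 23-bit longitude grid
    n = (y << 23) | x
    chars = []
    for _ in range(9):                              # 45 bits -> 9 characters
        chars.append(ALPHABET[n & 31])
        n >>= 5
    return "".join(reversed(chars))

def decode(code):
    """Recover the approximate coordinates directly from the code itself."""
    n = 0
    for c in code:
        n = (n << 5) | ALPHABET.index(c)
    x = n & ((1 << 23) - 1)
    y = n >> 23
    return (y / ((1 << 22) - 1) * 180 - 90,
            x / ((1 << 23) - 1) * 360 - 180)

code = encode(40.656364, -75.354772)   # sample coordinates (Bethlehem, PA)
lat, lon = decode(code)                # round-trips to within a few metres
```

The grid resolution here is roughly five metres; the decoder needs nothing but the code and the alphabet, which is the essential property of a natural identifier.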
These new initiatives are important and exciting steps in the post Dodd-Frank economy and for both
the ECCMA community and the international data quality mission. Please check out our new web
destination: www.eccma-eprop.org. I look forward to meeting some of you next month in Austin!
About the Author
Elizabeth Green, Principal Consultant with rel-e-vant Solutions, is a strategist,
solutions architect, speaker and valuation advocate. A recognized mortgage
technology veteran in software product leadership for solutions in residential property
valuation, loan origination, mortgage servicing and secondary marketing, Green is
helping to foster a new level of understanding in property valuation and collateral risk
assessment through the application of digital intelligence. She is the third-term
chairperson of the MISMO Property and Valuation Services Workgroup, chair of the
Property Identification Development Workgroup and a member of the Governance Committee. Ms.
Green is a new member of ECCMA and recently completed her ISO 8000 Master Data Quality
credential.
Street Address: 2980 Linden Street, Suite E2, Bethlehem, PA 18017-3283, USA
GPS Coordinates, calculated geocode: 40.656364, -75.354772
GPS Coordinates, actual collected: 40.656135, -75.354125
eNLI: 945647-5AB23T-H1
Take a look at the ECCMA Data Quality Solutions Summit agenda!
Wednesday, October 23rd, 2013
8:00-9:00am  Breakfast (Jean's Kitchen)
8:00-9:00am  Registration & Badge Pick-up (Lone Star Foyer)
9:00-9:15am  Welcome & Opening Address (Lone Star Room) – Peter Benson, Executive Director, ECCMA
9:15-9:50am  Big Data & Data Quality: Experience & Perspective from Consumer Internet – Dan Defend, Data Quality Manager, Yahoo!
9:50-10:25am  Big Data Quality: Can Data Quality Survive the Big Data Revolution? – Roger Wahman, VP, InterSys Consulting
10:25-10:35am  Big Data Open Discussion
10:35-11:00am  Morning Break (Lone Star Foyer)
11:00-11:35am  The NATO Codification System: Linking Industry & Government Worldwide – Steven Arnett, Chief, Codification Services, NATO Support Agency (NSPA)
11:35-12:10pm  A Standards-Based Approach to Bridge the Gap Between Electronic Catalogues & Engineering – Dan Carnahan, Program Manager, Rockwell Automation
12:10-12:20pm  Data Provenance Open Discussion
12:20-1:20pm  Lunch (Jean's Kitchen)
1:20-1:55pm  Unique Identification of Real Property Using Open Standards – Elizabeth Green, Principal Consultant, Rel-e-vant Solutions
1:55-2:30pm  Reference Data in the Cloud – Diane Schmidt, Managing Director, Noetic Partners; Justin Magruder, Senior Managing Director, Noetic Partners
2:30-2:55pm  Data Quality: Foundation of Good Spend Visibility – Fred Henien, Master Data Management Solutions, Ariba, an SAP company
2:55-3:05pm  Measuring Data Quality (Part 1) Open Discussion
3:05-3:30pm  Afternoon Break (Lone Star Foyer)
3:30-4:05pm  What a Data Quality Tool Can Do for You – Mark Hudson, Senior Manager, CapTech
4:05-4:40pm  Master Data Acquisition: A Project Necessity – Pieter Strydom, Director, PiLog
4:40-4:50pm  Measuring Data Quality (Part 2) Open Discussion
4:50-5:00pm  Closing Remarks – Peter Benson, Executive Director, ECCMA
6:00-8:00pm  Dinner (The Spur Room)
Thursday, October 24th, 2013
8:00-9:00am  Breakfast (Jean's Kitchen)
8:00-9:00am  Registration & Badge Pick-up (Lone Star Foyer)
9:00-9:15am  Welcome & Opening Address (Lone Star Room) – Peter Benson, Executive Director, ECCMA
9:15-9:50am  Data Alliances: A Foundation for Success – Thomas Tong, VP of Client Engagements, Knowledge Transformation Partners
9:50-10:25am  The Road to Noncompliance is Paved with Good Data – Craig Laufer, QA Engineer, Erie Insurance
10:25-10:35am  Data Governance Open Discussion
10:35-11:00am  Morning Break (Lone Star Foyer)
11:00-11:35am  The Secret to a Successful "Big Data" Initiative: An Effective MDM Platform – William Miller, Director, Product Management - MDM, Oracle
11:35-12:10pm  MDM as a Business Tool for Data Standardization – Luuk van den Berg, Data Governance Lead, Cisco Systems
12:10-12:20pm  Master Data Management Open Discussion
12:20-1:20pm  Lunch (Jean's Kitchen)
1:20-1:55pm  A Holistic Approach to Data Quality – Timothy King, Executive Consultant, LSC Group
1:55-2:30pm  Making Predictions with Data Mining Tools – Mark Hudson, Senior Manager, CapTech
2:30-2:55pm  A SaaS Approach to Entity Identity Management – Dr. John Talburt, Director, ERIQ Research Center, UALR
2:55-3:05pm  Turning Quality Data into Valuable Information (Part 1) Open Discussion
3:05-3:30pm  Afternoon Break (Lone Star Foyer)
3:30-4:05pm  TBA – TBA, Officer, Kaygen
4:05-4:40pm  TBA – Hannes De Bruin, Manager, Supply Management, Exxaro; Pieter Strydom, Director, PiLog
4:40-4:50pm  Turning Quality Data into Valuable Information (Part 2) Open Discussion
4:50-5:00pm  Closing Address & End of Summit – Peter Benson, Executive Director, ECCMA
Visit http://www.eccma.org/2013dqss/register.php to register for this year's conference!
The Oil and Gas industry provides a key source of energy as well as feedstock for the petrochemical,
fiber and pharmaceutical industries. Individual Oil and Gas firms are amongst the largest global
business enterprises and as such are generally seen as being vital to the national interest of the
countries in which they operate. The sheer size of major Oil and Gas firms, the large volume of
industry-specific language, technology and methods, as well as the vertically integrated nature of
major Oil and Gas firms, spanning from exploration to retail delivery, make them particularly suited
to an interoperability solutions project that will focus on their unique requirements for physical asset
management.
In 2012, an Oil & Gas interoperability solutions project was
approved by the Automation Systems and Integration
technical committee, ISO/TC 184. The intent of this project is
to address interoperability requirements for Oil and Gas
industry upstream and downstream facilities with a focus on
Operations & Maintenance and its associated requirements
for physical asset life-cycle engineering. The scope will also
address appropriate, closely related industry groups such as
the Petrochemical and Power Generation industries.
This project takes a full life-cycle approach to asset
management interoperability spanning from
conceptualization through remediation, while also having a substantial focus on establishing a
sustainable Execution Environment for Operations and Maintenance (O&M) based on open and
non-proprietary data, information and knowledge management practices.
Key objectives for the Oil and Gas Solutions Project include:
- Enabling improved data, information and knowledge management related to Oil and Gas asset management integration.
- Enabling open and non-proprietary data and information exchanges throughout the complete life-cycle and resource hierarchy of a platform, plant, facility, rig or reservoir, including development, procurement, deployment, and restoration.
- Establishing both an open and non-proprietary Reference Information Environment and O&M Execution Environment, with proper synchronization, in order to enable a more sustainable Oil and Gas industry.
As shown in figure 1, ISO 15926 is seen to play a key role in establishing the Reference Information
Environment, while ISO 18435, ISO 22745, ISO 13374, along with key industry standards associated
with the OpenO&M Initiative, are expected to form the core of the Execution Environment.
THE OIL AND GAS INTEROPERABILITY SOLUTIONS PROJECT, ISO 18101
SUBMITTED BY: Dan Carnahan, Program Manager, Rockwell Automation
The project scope will be bounded by specific use cases, such as the digital handover of O&M
information from an Engineering Procurement Contractor (EPC) to enable automatic provisioning of
O&M systems, production optimization and maintenance. Real industry data sets for both upstream and
downstream will be utilized in validating the interoperability requirements for the project.
ECCMA's role as US TAG administrator for TC 184, SC 4, and SC 5 TAGs provides the opportunity for
interested individuals and organizations in the US to engage in standards development, such as this
project, that affects a global community. The return on investment by participating in a TAG can be
measured not only by your ability to influence a global set of standards, but also by the career and
cultural opportunities provided that are not easily accessed in any other forum. For more information on
joining TAGs, please contact Sheron Koshy.
About the Author
Mr. Carnahan is a principal engineer for Rockwell Automation, Advanced Technology. He has
worked for Rockwell Automation for over 25 years in various program, project management
and systems, product, network-related, electronic catalog and safety standards areas. He has
several patents related to industrial automation technologies and applications. Mr. Carnahan
is the Chairman of the USTAGs for ISO/TC 184 (Automation Systems & Integration) and ISO/TC
184/SC 05 "Interoperability, integration, and architectures for enterprise systems and
automation applications." He also leads and convenes the ISO/TC 184/SC 05/WG 07
"Diagnostic and maintenance applications integration" working group, engaged in specification standards for
interoperability of diagnostics and production systems (ISO 18435). He is also currently participating as an expert in
ISO/TC 184/WG 6 (Oil & Gas Interoperability Project) and IEC/TC 65/WG 16 (Digital Factory), as well as being a
USTAG member for IEC/TC 65, SC65B, and SC65E.
GET INVOLVED
The ECCMA Corporate Dictionary Manager (eCDM) is an online dictionary
linked to the world's largest open technical dictionary, the eOTD. The eCDM
assists companies with the creation and maintenance of a multilingual
corporate dictionary that all colleagues throughout the corporation can see
and use. This process avoids confusion in labeling data, the first step to data
quality.

To sign up for a one-hour complimentary training with a 30-day trial, where you
will be able to utilize all the major functionalities of the eCDM and become familiar
with this open source tool, please visit www.eotd.org/ecdm.
To view available training sessions or to sign up,
click here
ECCMA 1 specifies requirements for identifying
a unit space. The identifier is an encoding of
the latitude, longitude, and floor of the front
door of the unit space.
Members can download this standard FREE by visiting the ECCMA Members Area.
Non-members can download it for $25.00 USD by visiting:
http://www.eccma.org/ECCMAstandards/index.php.
GET INVOLVED
ECCMA holds monthly four-hour webinars that provide an overview of ISO 8000, the international
standard for data quality. Individuals who attend are able to receive their ISO 8000-110 Master Data
Quality Manager certificate by taking an online test. Any previously certified individual may attend
this webinar free of charge. For complete details on the next webinar and certification, check out
our website at www.eccma.org/webinar or feel free to contact [email protected]. Sessions are
now available through December 2013.
The Fellow Membership Award recognizes an individual or company that has
had a significant impact on the data quality or standards world in recent
years. Let your voice be heard and help ECCMA continue to recognize
leaders in this industry. If you feel someone or some organization deserves to be
recognized, we are accepting nominations. To submit your nomination, please
visit www.eccma.org/2013dqss/award.php#nomination. The award is
presented on Wednesday evening at the social event dinner.
An ECCMA Scoping Study analyzes up to three years of PO transactions and
compares these to the material and vendor masters to identify frequently
purchased items as well as the vendors from whom they are purchased.
ECCMA then analyzes the material master characteristic data, names and
descriptions, as well as the "free text" descriptions in POs, and uses these to
create an initial corporate dictionary and cataloging templates (data
requirements). The initial corporate dictionary and the corporate templates are
modified during the actual data cleansing project. Finally, ECCMA revisits the
materials and ranks them according to a priority scale based on three dimensions: item unit cost,
total spend and number of suppliers. The end result is a much clearer picture of the goal and the
path to the goal. This makes it easier for contractors to quote accurately, and it makes it easier to
quantify the internal effort required to complete the project with or without contractors.
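The ranking step can be pictured as a simple weighted scoring pass over the three dimensions. The weights, normalization and field names below are invented for illustration; they are not ECCMA's actual methodology:

```python
# Toy priority ranking across item unit cost, total spend and supplier count.
# Weights and normalization are illustrative assumptions, not ECCMA's method.
items = [
    {"id": "MAT-001", "unit_cost": 125.0, "total_spend": 50_000,  "suppliers": 4},
    {"id": "MAT-002", "unit_cost": 3.5,   "total_spend": 210_000, "suppliers": 1},
    {"id": "MAT-003", "unit_cost": 880.0, "total_spend": 12_000,  "suppliers": 7},
]

def normalize(values):
    """Scale each dimension to [0, 1] so the dimensions are comparable."""
    hi = max(values)
    return [v / hi for v in values]

cost_n  = normalize([i["unit_cost"] for i in items])
spend_n = normalize([i["total_spend"] for i in items])
supp_n  = normalize([i["suppliers"] for i in items])

for item, c, s, n in zip(items, cost_n, spend_n, supp_n):
    item["priority"] = 0.3 * c + 0.5 * s + 0.2 * n  # illustrative weights

ranked = sorted(items, key=lambda i: i["priority"], reverse=True)
for item in ranked:
    print(f'{item["id"]}: priority {item["priority"]:.2f}')
```

In practice the weighting would be tuned with the customer; the value of the exercise is that every item ends up with a defensible place in the cleansing queue.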
If you would like to discuss your specific requirements, please do not hesitate to contact Peter
Benson at [email protected].
BEWARE OF FRANKENDATA
SUBMITTED BY: Craig Laufer, QA Engineer, Erie Insurance
My girlfriend has a lawnmower that has been dubbed Frankenmower. It has parts that have been
cannibalized from many other mowers. Bent wheel? No problem. Just take one from another mower.
Broken pull cord? That’s what knots are for. It continues to do a remarkably good job. Why? Because it
has been watched over and given the attention it requires.
Test data is often cobbled together like this. Start with a regression test suite;
stitch in a few artificially created files; bolt on some production data to fill in the
gaps. And thus is born Frankendata.
Is Frankendata bad? Not necessarily. In fact, it is often essential to provide
good test coverage. However, like Frankenmower, it needs to be monitored
and maintained. Is the regression test suite still valid, or has it gone stale? Are
the generated files robust enough to meet the testing needs? Has the
production data been scrubbed to remove personal information? It is not
enough to patch together your test data; you need to control the individual
aspects, and you need to apply quality principles to the whole. In other words,
you need Test Data Governance.
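The scrubbing question above is a good example of a governance control that can be automated. As a minimal sketch (the field names and masking rule here are assumptions for illustration, not Erie Insurance's practice), production records might be masked before entering a test suite:

```python
import hashlib

# Fields treated as personal information in this illustrative example.
PII_FIELDS = {"name", "ssn", "email"}

def scrub(record):
    """Replace PII values with stable pseudonyms so joins still work in tests."""
    clean = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            # Deterministic digest: the same input always maps to the same
            # pseudonym, preserving referential integrity across test files.
            digest = hashlib.sha256(str(value).encode()).hexdigest()[:8]
            clean[field] = f"{field}-{digest}"
        else:
            clean[field] = value
    return clean

prod_row = {"policy_no": "P-1001", "name": "Jane Doe", "ssn": "123-45-6789",
            "email": "jane@example.com", "premium": 412.50}
test_row = scrub(prod_row)   # non-PII fields survive; PII is pseudonymized
```

A real program would also govern where the PII field list lives and who reviews it; the point is that the control is repeatable rather than ad hoc.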
You may be thinking, “This isn’t a problem. I have a good Data Governance program. It will
automatically take care of test data.” This is only true to a degree, because for test data, there are
additional governance aspects that are not encountered with other forms of data. Remember the data
scrubbing mentioned above? Privacy is a big topic, and it is only getting bigger. Consider that there are
significant external drivers for a company to invest in data governance, such as SOX and HIPAA. There
are also significant drivers for a company to safeguard privacy, such as GLBA Title V – Privacy. Use of
production data for testing is also getting attention. In reference to primary account numbers, PCI DSS v
2.0, requirement 6.4.3 states, “Production data … are not used for testing or development”. There is little
doubt that all of these concepts will eventually merge into an overarching theme, and the ramifications
will be far-reaching and costly. How much is being done in this discipline? Try an internet search on the
quoted phrase “test data governance”. There are about 30 meaningful hits. Compare this to thousands
of hits for “data governance”.
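One hedged sketch of what honoring that PCI DSS requirement can look like in practice: instead of copying production account numbers into a test bed, generate fictitious numbers that still pass a Luhn check, so downstream validation logic exercises normally. The prefix and length below are illustrative only and are not tied to any real card range.

```python
import random

def luhn_check_digit(partial):
    """Compute the Luhn check digit for a digit string (check digit excluded)."""
    total = 0
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:  # these positions are doubled once the check digit is appended
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def synthetic_pan(prefix="4000", length=16, rng=random.Random(42)):
    """Generate a Luhn-valid but entirely fictitious account number."""
    body = "".join(str(rng.randint(0, 9)) for _ in range(length - len(prefix) - 1))
    partial = prefix + body
    return partial + str(luhn_check_digit(partial))
```

Numbers built this way validate like real ones but correspond to no cardholder, which is exactly what a test bed needs.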
My recommendation? Don’t focus only on the present and the near-term tasks you need to accomplish,
because then you will not see the storm on the horizon. Rather, be forward thinking, and develop your
own Test Data Governance program now. Otherwise, Frankendata will assuredly grow bigger, and it will
very likely turn into a monster if it is not given proper oversight. And when that happens, the powers that
be will eventually run for torches and pitchforks, and you will be caught in the middle.
October 2013 – ECCMA Newsletter 15
linking the knowledge of today, with the power of tomorrow
About the Author
Craig Laufer has many years of IT experience, ranging from in-house developer to
consultant to testing engineer. He has spent the last five years leading a test data
management initiative designed to improve productivity by automating many aspects of
test data procurement. His involvement in this initiative from its inception gives him unique
insights into this discipline, which is becoming more significant with the heightened
awareness of privacy concerns, both from a legislative perspective and from increased
public sensitivity to the issue.
UPCOMING ISO 8000 MASTER DATA QUALITY MANAGER WORKSHOP Tuesday, October 22, 2013
Travaasa Resort, Austin, TX
PARTS MANAGEMENT– 5 WAYS YOU CAN WIN WITH STANDARDS SUBMITTED BY: Jay Hopper
VP of Marketing CADENAS PARTsolutions
Wherever you are reading this, take a moment to look around. How many items
can you find that are based on a standard? How about the light bulbs over your head, electric outlets
on the walls, USB ports on your computer, that stapler on your desk and even the bottle opener in your
drawer? Standards are everywhere and make our lives so much easier, yet the day-to-day benefits are
so often overlooked.
Thanks to planners, product designers, engineers and industry visionaries, our world is made simpler, easier, more efficient, cheaper and more compatible because of the effective management and proliferation of standards.
Now think about what you design and build. How many
components within those products are considered standard parts? How much more efficient could you be if you could better manage how you store, find, and re-use those parts in your designs? How much better would your product be if you could spend more time engineering solutions rather than redrawing those parts?
The advantages we enjoy in our everyday lives thanks to
universal standards can also be applied to standard supplier parts used in your job. By recognizing and
managing standards in your designs, your processes can also be simpler, easier, more efficient, cheaper
and more compatible.
At PARTsolutions, we call that “standardizing standards” and we’ve outlined 5 ways any company, large
or small, can benefit from standards parts management:
1. Save Design Time
Engineers can literally save thousands of hours of design time utilizing a parts management system to
find, re-use and control standard supplier parts and internal standard parts as well. By centralizing standard parts, engineers access parts (or models) from a single source. Every standard part they need is instantly available at their fingertips. If each engineer at your company could save 1 hour per day, what
would that mean for your company?
2. Spend Time on Design That Matters
That time savings can also free up engineers so they can focus on solving the problem, perfecting the
product or engineering the solution, rather than wasting time monkeying around with standard parts. Engineers spend way too much time searching for parts and redrawing them. The majority of an engineer’s time should be spent on what we call “value-added design,” i.e., design that adds value to the business. Searching for and redrawing common parts doesn’t add value.
3. Speed Up Product Development & Time to Market
By eliminating menial engineering tasks, like searching for or redrawing parts, you can speed up product
development and manufacturing automation design and get your end-products to market faster.
4. Let your Standards Supplier Do the Updating
By cataloging your standard supplier content in a central location, you create a “single authoritative
source” for your engineering teams. Using our PARTsolutions parts management system, commercial
parts suppliers provide their catalog content to engineers. The suppliers maintain and update all that
data, so engineers don’t have to waste their time doing it. Also, orders are accurate because product
data is always current. So, you won’t order an obsolete part, for example.
5. Standardization Gives You Purchasing Power
Best-in-class companies are taking parts management to the
next level. They are creating visibility between engineering
and purchasing departments to achieve unprecedented
purchasing power. By seeing what parts are being designed
into products and what vendor parts are being specified
across the company, purchasers can analyze this data to
identify opportunities for a leveraged buy from a supplier. Not
only can the company get a better deal on parts through a volume purchase, but order accuracy also goes up substantially because the part data is accessed from a single source. Essentially, purchasing and engineering are drawing their data from the same place, so
it’s super accurate. It’s a win/win/win – for the engineer, purchaser and the company.
The engineering world needs more visionaries who understand the value of embracing standardization
and can promote it within their companies. Standards help everybody. Good luck to you and your
efforts to help standardize standards!
About CADENAS PARTsolutions
PARTsolutions® LLC is a leading provider of PLM solutions for next-generation 3D part catalog management and hosting, delivering solutions since 1992. For large manufacturers, the PARTsolutions product suite provides centralized 3D standard part catalogs, making it easy for global design teams to find, re-use, and control standard and proprietary 3D parts.
To learn more, please visit: www.partsolutions.com.
Brady Corp Associate Member
Camcode Full Member
Camcode, a division of Horizons Incorporated,
provides a vast array of technical services involved in
all facets of tracking and identification labeling and
marking. We provide all forms of industrial
identification, including labels for production items, warehouses, locations, assets and racks, and even high-quality placards.
A particular specialty is our in-depth knowledge and
experience in Unique Identification (UID) policy
including development, implementation planning and
project assessment, data validation and cleansing,
engineering of the marking approach, label
production and installation, and item registration.
Camcode has also begun to explore data resolution
as a means of confirming authenticity and pedigree
of these items. Camcode’s full suite of technical
services can be customized to accommodate your
specific needs and requirements.
To learn more about this member, please visit:
www.camcode.com.
Minera San Cristobal S.A. Associate Member
To learn more about this member, please visit:
www.minerasancristobal.com.
Quantum Semantics, Inc Associate Member
Xtivity, Inc Full Member
To learn more about this member, please visit:
www.xtivity.com.
Have You Seen ECCMA’s Upcoming Events?
Which Events Will You Attend?
ISO 8000 Webinars, eCDM Training, ISO 8000 Workshops,
Member Q&A webinars now have scheduled sessions
through December 2013.
Follow Us on Twitter @ECCMA
WELCOME NEW MEMBERS
DISA Full Member
Exxaro Associate Member
I.M.A. Ltd. Associate Member
In many cases asset-intensive organizations have
multiple plants spread across large geographic
regions, each with thousands of MRO spare parts on
hand. In such large organizations inventory data
becomes inconsistent and inaccurate, resulting in
large amounts of excess inventory, duplication and
false stock-outs. Each of these inefficiencies greatly
ties up time and money and will only continue to
worsen if appropriate steps are not taken to resolve
the problem. I.M.A. Ltd. offers solutions to cleanse,
manage, view and optimize Materials Master Data,
ensuring clients capture maximum cost savings.
To learn more about this member, please visit:
www.imaltd.com.
Kingshir Technology Associate Member
Kingshir Technology Solutions (P) Ltd provides
affordable, accurate, reliable, standards-compliant solutions for managing Master Data across the organization. Kingshir provides solutions for master data management in the areas of Materials, Assets, Vendors, Services, HR and other master records that an organization requires throughout its life cycle.
To learn more about this member, please visit:
www.kingshir.com.
Oniqua Associate Member
Oniqua provides advanced analytics-based Asset
Performance Management solutions that dramatically
improve the efficiencies of asset-intensive
organizations across inventory, maintenance and
procurement operations. With cross-functional
integration, data cleansing services and 20 years of
industry experience and expertise, Oniqua helps
customers cut costs, minimize risk and achieve
significant savings in a matter of months. Oniqua is
proud to serve the world’s leading asset-intensive
organizations in the oil and gas, mining, utilities and
transportation industries.
To learn more about this member, please visit:
www.oniqua.com.
XSB, Inc Associate Member
XSB, Inc. is a privately held Product Content and Data
Management (PCDM) technology company. XSB’s
automated data management solutions transform
unstructured jargon-rich data into information that is
rich, structured, precise and timely; we call this
Coherent data. Coherent Data is the backbone of
any organization; it is highly structured, readily
queryable, and integrated across the entire enterprise. XSB’s patent-pending approach to data
standardization and management represents the
convergence of ontologies and automated software
agents. Our solutions have set new standards for
information retrieval, data classification, master data
management and supply chain optimization.
To learn more about this member, please visit:
www.xsb.com.
RENEWED MEMBERS
ADMINISTRATIVE DIRECTOR MELISSA M. HILDEBRAND
MEMBERSHIP ADMINISTRATOR VICTORIA M. FALCONE [email protected]
EXECUTIVE DIRECTOR PETER R. BENSON
CHIEF TECHNICAL OFFICER DR. GERALD M. RADACK [email protected]
PRESIDENT, ECCMA INDIA SHERON KOSHY
ECCMA is a not-for-profit International Association of Master Data Quality
Managers set up in 1999, to develop and maintain open solutions for
Faster – Better – Cheaper access to authoritative master data.
ECCMA is the original developer of the UNSPSC, the project leader for ISO
22745 (open technical dictionaries and their application to the exchange
of characteristic data) and ISO 8000 (information and data quality), as well
as the administrator of the U.S. TAG to ISO TC 184 (Automation systems and
integration), TC 184 SC 4 (Industrial data) and TC 184 SC 5 (Interoperability,
integration, and architectures for enterprise systems and automation
applications) and the international secretariat for ISO TC 184 SC 5.
MEET ECCMA
STEVEN ARNETT NATO Support Agency
Larry Barth Vermont Energy Investment Corp.
Peter Benson ECCMA
Don Brown (Emeritus Director) PartNET
Chris Haydon Quadrem (Ariba)
Donald Hillman Lehigh University
Sheron Koshy ECCMA, India
Pieter Strydom PiLog International
Bern Werner Salar, Inc.
UPCOMING ARTICLES DUE BY:
NOVEMBER 18TH, 2013
NEXT RELEASE DATE:
DECEMBER 1ST, 2013
If you are interested in submitting articles for our readers, please contact [email protected]. Articles may cover data quality issues, cataloging projects, or interesting news or tips you’d like to share with our members and audience. THANK YOU!
2980 LINDEN STREET, SUITE E2│BETHLEHEM, PA 18017 USA P: +1 610 861 5990│F: +1 610 625 4657
WWW.ECCMA.ORG
ABOUT ECCMA