by Jason Williscroft
Any MEDM implementation is, at its heart, a data mapping exercise: data from input objects is mapped from table to table through a series of transformations that cleanse the incoming data, align it from source to source, master it, stage it for output, and ultimately generate an artifact for delivery to a target system. Tracking these mapping requirements is the fundamental design challenge of every MEDM implementation, since additions or changes can have effects that reverberate from one end of the system to the other. Even something as minor as a change to the decimal precision of a single table column can induce corresponding changes along the length of the pipeline, potentially affecting a dozen tables or more and scores of MEDM components.
In early 2013 HotQuant arrived at an implementation in progress, where the client had put in place a SharePoint-based system for tracking and managing mapping requirements and logic. This simple device, really nothing more than a connected set of custom SharePoint lists, provided the following capabilities:
- Characterization of input columns and data types.
- Definition of abstract mapping logic and application against multiple mapping scenarios.
- Tracking of requirements analysis and implementation status at a very granular level, enabling the relevant players at any point in the implementation effort to know precisely which pieces required more analysis, which were ready for implementation, and which were queued for deployment to Test.
A year later, at a complex new engagement in Milwaukee, HotQuant replicated this system with one significant enhancement. Since the data in the SharePoint system constituted a complete, low-level characterization of the data pipeline, containing every column, data type, and key requirement, HotQuant set out to create a tool that coupled this information directly to the development environment, so that any change to requirements could be realized as objects in the development database at the click of a button.
HotQuant accomplished this by crafting an Excel VBA application that performed the following tasks:
- Consumed the relevant SharePoint lists as data sources;
- Performed integrity checks against these data sources and provided indications when requirements were incompletely or incorrectly specified on SharePoint;
- Applied design rules to determine the set of tables implied by the requirements;
- Exposed in summary form to project management the analysis and development status of the entire implementation;
- At the click of a button, leveraged the specified data types and other metadata to generate DDL that could be run against the database to create or replace tables and indices, so that the data layer exactly reflected any new or changed requirements on the SharePoint system.
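The last step above, turning requirement rows into runnable DDL, is the heart of the tool. The following is a minimal sketch of that idea in Python rather than the Excel VBA actually used on the project; the table name, column names, and requirement fields are hypothetical stand-ins for what the real tool read from the SharePoint lists.

```python
def generate_ddl(table_name, columns):
    """Render a CREATE TABLE statement from requirement rows.

    Each requirement row supplies a column name, a SQL data type, and a
    flag indicating whether the column participates in the primary key.
    """
    col_defs = [
        f"    [{c['name']}] {c['type']} {'NOT NULL' if c['key'] else 'NULL'}"
        for c in columns
    ]
    key_cols = ", ".join(f"[{c['name']}]" for c in columns if c["key"])
    if key_cols:
        col_defs.append(
            f"    CONSTRAINT [PK_{table_name}] PRIMARY KEY ({key_cols})"
        )
    return f"CREATE TABLE [{table_name}] (\n" + ",\n".join(col_defs) + "\n);"

# Hypothetical requirement rows, as they might arrive from a SharePoint list.
requirements = [
    {"name": "AccountId", "type": "INT", "key": True},
    {"name": "Balance", "type": "DECIMAL(18, 4)", "key": False},
]

print(generate_ddl("stg_Account", requirements))
```

Because the DDL is regenerated from the requirements on demand, a change such as widening that `DECIMAL(18, 4)` precision flows into the database the next time the button is clicked, with no hand-edited scripts in between.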
This is the Requirements Dashboard. Its effects on the new client were immediate:
- Complete transparency for project management, as the relevant players were able to view a summary of development status and drill down to specific lists of requirements that were awaiting BA review, in development, or pending deployment to Test.
- A rapid acceleration in the development cycle, as the realization of table requirements into code was reduced to a trivial task that could be performed on demand, as frequently as necessary.
- A near elimination of data-layer implementation errors, as a consequence of automated DDL generation.
- The elimination of a whole class of component errors driven by imperfect data layer support, as the addition and proper configuration of system-level table columns and related objects were hard-coded into the Dashboard’s code generation routines along with other best-practice boilerplate.
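The last point above can be illustrated with a short sketch: because boilerplate lives in the generation routine, every generated table receives the same system-level columns and no developer can forget them. The audit column names below are hypothetical, not taken from the actual project.

```python
# Hypothetical system-level audit columns injected into every generated table.
SYSTEM_COLUMNS = [
    "    [LoadBatchId] INT NOT NULL",
    "    [CreatedDate] DATETIME NOT NULL",
    "    [UpdatedDate] DATETIME NULL",
]

def with_system_columns(business_column_defs):
    """Append the standard audit columns to a table's business columns."""
    return business_column_defs + SYSTEM_COLUMNS

cols = with_system_columns(["    [AccountId] INT NOT NULL"])
```

Centralizing this boilerplate in one routine is what eliminated the class of component errors described above: the data layer could no longer be missing a system column, because no table was ever created by hand.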
Client-side project leadership has cited the Requirements Dashboard as a major innovation, and an approach to requirements gathering and code automation that will be applied to future projects across the enterprise.