
Controlled Test Data for Payment Processing Applications

Technology Category
  • Analytics & Modeling - Predictive Analytics
  • Application Infrastructure & Middleware - Data Exchange & Integration
  • Application Infrastructure & Middleware - Data Visualization
  • Application Infrastructure & Middleware - Database Management & Storage
Applicable Industries
  • E-Commerce
  • Finance & Insurance
  • Healthcare & Hospitals
Applicable Functions
  • Business Operation
  • Quality Assurance
Use Cases
  • Fraud Detection
  • Predictive Maintenance
  • Process Control & Optimization
  • Regulatory Compliance Monitoring
  • Remote Asset Management
Services
  • Software Design & Engineering Services
  • System Integration
  • Testing & Certification
  • Training
The Challenge
To test their payment processing application, the QA team at this financial services company determined that their data feeds must be simulated in a highly controlled fashion. To reproduce complex transaction data feeds, the team copied a subset of their production data and prepared it for testing. Production data is attractive because it contains real transactions in the proper data interchange format. To prepare it for testing, however, the data had to be laboriously reworked by hand to create the variations and permutations needed for test cases while removing all sensitive customer and merchant information. It took the QA staff 160 man-hours (an entire man-month) to build a test data set, and because the data interchange format was revised every six months, the man-hours required for test data provisioning effectively doubled over the course of a year. The tedious provisioning process limited the variety of test data available for functional, integration, and regression testing, and the limits on the volume of data provisioned were impacting the team's ability to perform the load and performance testing required to simulate heavy transaction loads. In the end, they concluded there were too many problems with using production data alone for testing purposes. The following summarizes their rationale.

Production data is not controlled data
Without manual modification, test data copied from production can only test for conditions represented by a given data subset. It does not provide the QA team with the data needed to test edge-case conditions, the presence of invalid data values, or specific input value combinations that might uncover software defects. To maximize code coverage under all potential operating conditions, test data must be controlled to simulate data feeds that contain all of the data variations required by each test case and its assertions.
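The idea of controlled variation can be illustrated with a short sketch: enumerating every combination of a few test dimensions so that boundary and invalid values are guaranteed to appear in the generated feed. The field names and values below are illustrative assumptions, not the customer's actual schema.

```python
from itertools import product

# Illustrative test dimensions only; a real payment feed has many more fields.
card_types = ["credit", "debit", "prepaid"]
amounts = ["0.00", "0.01", "9999999.99", "-1.00"]   # boundary and invalid values
currencies = ["USD", "EUR", "XXX"]                  # "XXX" stands in for an invalid code

def controlled_records():
    """Yield one synthetic record per combination of test dimensions,
    so every edge case appears in the generated feed exactly once."""
    for card, amount, currency in product(card_types, amounts, currencies):
        yield {"card_type": card, "amount": amount, "currency": currency}

records = list(controlled_records())
# 3 card types x 4 amounts x 3 currencies = 36 controlled records.
```

A production subset, by contrast, contains only whatever combinations happened to occur in real traffic, which is why it cannot guarantee this kind of coverage.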
Production data is not secure data
Business and IT leaders at this financial services company were very concerned about data privacy. The risk of a data breach exposing sensitive customer credit information was too great, given the legal and financial consequences. The risk was further compounded by the fact that much of the testing was performed by offshore contract resources, limiting internal control over the handling of sensitive customer data.

Secure, high-volume production test data is not practical
Data masking is the conventional approach to mitigating the security risks of working with production data. However, masking all of the PII contained in the transaction data feeds used by payment processing systems is a monumental task. Transaction data feeds are complex, nested, fixed-file data structures that contain control codes, record types, accumulated transaction values, and calculations for reward points and cash-back incentives, along with real cardholder and merchant account numbers and credit information. Finding and masking the sensitive information in this complex data stream while preserving the referential integrity of the data values is both daunting and time-consuming.
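Why referential integrity makes masking hard is easier to see with a sketch. One common technique (an assumption here, not the customer's actual method) is deterministic masking: the same real card number must map to the same surrogate everywhere it appears, or records in different segment files stop lining up.

```python
import hashlib
import hmac

# Illustrative key for a test environment only; never a production secret.
SECRET = b"test-environment-only-key"

def mask_pan(pan: str) -> str:
    """Deterministically replace a card number: the same input always
    produces the same 16-digit surrogate, preserving referential
    integrity across every file that references it."""
    digest = hmac.new(SECRET, pan.encode(), hashlib.sha256).hexdigest()
    # Map 12 hex chars to digits and add a fixed prefix so the surrogate
    # still looks like a card number to downstream format checks.
    digits = "".join(str(int(c, 16) % 10) for c in digest[:12])
    return "9999" + digits
```

Even with a helper like this, the hard part the text describes remains: locating every PAN inside nested, fixed-width segment records, and leaving control codes, accumulators, and calculated fields consistent after substitution.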
About The Customer
The customer in this case study is a major financial services company in the United States that processes billions of debit and credit card transactions annually, representing trillions of dollars in purchases and payments. This company manages millions of cards issued to both business and consumer cardholders, ensuring the accurate, efficient, and secure processing of these payments is crucial to their operations. The company operates in a highly sophisticated environment where payment processing software must support a variety of vertical markets, including restaurants, hospitality, and e-commerce. The software also needs to handle a wide assortment of card categories, incentive and loyalty programs, credit histories, and spending limits for both consumer and business accounts. Given the complexity and volume of transactions, the company faces significant challenges in ensuring their payment processing applications are rigorously tested for defects, compliance with data interchange standards, and performance under heavy load conditions. The company is also highly concerned about data privacy and security, particularly in the context of quality assurance testing, which must be conducted without the use of any Personally Identifiable Information (PII).
The Solution
The team then evaluated the GenRocket TDG platform and the use of real-time synthetic test data to meet their needs. They presented their requirements to GenRocket, and within three weeks GenRocket provided a fully working proof of concept. First, GenRocket created a custom test data generator to recreate the “feature file” used to control test case conditions. This generator works in combination with custom test data receivers that format the data to match the company’s data interchange specification. A custom script was then created to implement an API integration between the company’s testing tools and the GenRocket TDG platform, along with test data scenarios containing instructions for generating test data in the volume and variety needed for comprehensive testing. GenRocket worked closely with Cognizant, one of its premier testing partners, to produce an operational test environment that was ready for immediate use by the testing team. Here is a summary of the steps taken to set up the new test data provisioning platform:
  • Cognizant and GenRocket used the financial company’s data model to create GenRocket domains and attributes that simulate the payment processing database.
  • The Cognizant team used GenRocket data generators to model the company’s business data for each GenRocket attribute.
  • GenRocket created a custom FeatureFileGen generator for reading “Feature File” data into GenRocket attributes.
  • The GenRocket team implemented custom data receivers to create formatted data.
  • Together, GenRocket and Cognizant created GenRocket test data scenarios that use the above components to consume “Feature File” data and produce the test data output.
  • Finally, the GenRocket team created a Groovy script that uses the GenRocket API to orchestrate the entire process.
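The overall flow of the steps above can be sketched as a simple pipeline. Each stage below is a placeholder stand-in, not the actual GenRocket API; the real solution drives the platform from a Groovy script through its API.

```python
# Skeletal view of the orchestration flow: feature file -> generation ->
# formatting/merging. Every function body is an illustrative stand-in.

def create_feature_file(rows: int) -> list:
    """Stand-in for FeatureFileCreatorScript: one entry per test condition."""
    return [{"row": i} for i in range(rows)]

def generate_test_data(feature_rows: list) -> list:
    """Stand-in for the scenarios that consume the feature file and
    generate one synthetic record per test condition."""
    return [f"record-{r['row']}" for r in feature_rows]

def format_and_merge(records: list) -> str:
    """Stand-in for the receivers that format and merge segment files
    into a single output feed."""
    return "\n".join(records)

feature = create_feature_file(3)
output = format_and_merge(generate_test_data(feature))
```

The point of the sketch is the ordering: test conditions are declared first, data generation is driven by them, and formatting happens last so the output matches the interchange specification.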
The new custom GenRocket components created for this solution are as follows:
  • FeatureFileCreatorScript: Generates a “Feature File” of 1 to 1,000,000 rows or more
  • FeatureFileGen: The GenRocket generator used to query columns in a “Feature File”
  • SegmentDataCreatorReceiver: Creates the various segment files that represent the many data elements used in a typical payment transaction process
  • SegmentMergeReceiver: Merges multiple segment files in the proper sequence and hierarchy to produce a consolidated payment transaction file
  • GenRocket API Script (300 lines): Integrates the test data generation process with test cases and ensures proper relationships of data in a dynamic data hierarchy
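The merge step is the easiest of these components to picture. A minimal sketch, loosely analogous to the SegmentMergeReceiver's role (the file names and segment order are assumptions, not the actual interchange layout), concatenates per-segment files in a fixed hierarchical order:

```python
from pathlib import Path

# Assumed segment hierarchy for illustration; the real interchange
# specification defines its own record types and ordering.
SEGMENT_ORDER = ["file_header", "batch_header", "detail",
                 "batch_trailer", "file_trailer"]

def merge_segments(segment_dir: Path, out_path: Path) -> None:
    """Concatenate segment files in the fixed hierarchical order,
    skipping any segment that was not generated."""
    with out_path.open("w") as out:
        for name in SEGMENT_ORDER:
            seg = segment_dir / f"{name}.txt"
            if seg.exists():
                out.write(seg.read_text())
```

The real receiver also has to interleave batches and nest detail records under the right headers, which is where the "dynamic data hierarchy" handled by the API script comes in.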
Operational Impact
  • The GenRocket solution allowed the QA team to generate test data in real-time, significantly reducing the time and effort required for test data provisioning.
  • The use of synthetic test data ensured that no Personally Identifiable Information (PII) was used during testing, addressing the company's data privacy and security concerns.
  • The custom components created by GenRocket provided the flexibility to simulate any data feed with 100% secure synthetic data, enabling comprehensive testing under various conditions.
Quantitative Benefit
  • The QA staff saved 320 man-hours per year, representing a significant cost and time savings for the organization.
  • The time to create 1 million rows of test data was reduced to less than 30 minutes.
  • The time to create 20 million rows of test data was reduced to less than 8 hours.
