Tokenization
Objective:
- Engineer a tactical solution to de-sensitize the organization's sensitive data
- Understand the different types of data sources within the organization and devise strategies appropriate to each
- Develop an engine and supporting tools to automate tokenization
- Maximize the efficiency of the end-to-end process
Approach:
- Studied several data sources across the organization (with a focus on transactional data)
- The solution comprised:
  - Modular software design with early and late failure choices and several safeguards for handling sensitive data
  - A one-way cipher-based algorithm that generates a unique non-sensitive token for each sensitive data point
  - A 24x7 service available to any authorized entity
  - Handling of large data loads with high performance and therefore minimal latency
  - Plugins supporting non-essential features on a case-by-case basis
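The one-way tokenization step above could be sketched as keyed hashing; this is a minimal illustration, not the actual algorithm used: HMAC-SHA256, the key handling, and the `tok_` prefix are all assumptions for the example.

```python
import hmac
import hashlib

# Illustrative key only; in practice a key would come from a vault/KMS.
SECRET_KEY = b"example-key-not-for-production"

def tokenize(sensitive_value: str) -> str:
    """Derive a deterministic, one-way token for a sensitive value.

    A keyed hash is one-way (the original value cannot be recovered
    from the token) and deterministic (the same input always maps to
    the same token), so the token can consistently stand in for the
    sensitive value across datasets.
    """
    digest = hmac.new(SECRET_KEY, sensitive_value.encode("utf-8"),
                      hashlib.sha256).hexdigest()
    # Prefix distinguishes tokens from raw data downstream.
    return "tok_" + digest

# Same input yields the same token; different inputs yield different tokens.
assert tokenize("123-45-6789") == tokenize("123-45-6789")
assert tokenize("123-45-6789") != tokenize("987-65-4321")
```

Determinism matters here so that the same sensitive value tokenizes identically across files and days, preserving joins and uniqueness without exposing the underlying data.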
Results:
- Delivered a tactical solution within a few months of development
- Minimal interface for simplicity, while remaining fully extensible for new features
- Compatible with different types of files
- Able to handle unforeseen idiosyncrasies in data on a daily basis
- Metadata-driven service, leading to wide adoption across the floor
- Automated workflows for better maintenance
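A metadata-driven service of the kind described above could declare per-field rules as data and have a generic engine apply them; the field names, rule schema, and helper below are hypothetical, shown only to illustrate the pattern.

```python
# Hypothetical metadata: which fields are sensitive and how to treat them.
# Adding a new file layout means adding rules, not changing engine code.
FIELD_RULES = {
    "account_number": "tokenize",
    "customer_name": "tokenize",
    "trade_date": "passthrough",
}

def process_record(record: dict, tokenize) -> dict:
    """Apply the metadata-declared rule to each field of a record.

    Unknown fields default to passthrough; `tokenize` is any callable
    mapping a sensitive value to its non-sensitive token.
    """
    out = {}
    for field, value in record.items():
        rule = FIELD_RULES.get(field, "passthrough")
        out[field] = tokenize(value) if rule == "tokenize" else value
    return out

# Example with a stand-in tokenizer.
masked = process_record(
    {"account_number": "A-1001", "trade_date": "2020-01-01"},
    lambda v: "tok_" + v,
)
assert masked == {"account_number": "tok_A-1001", "trade_date": "2020-01-01"}
```

Keeping the sensitivity rules in metadata rather than code is what lets many teams onboard their own files without modifying the engine.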