Tokenization

Tokenization protects sensitive data by replacing it with non-sensitive substitutes called tokens. The data's utility is preserved and the impact of data breaches is reduced, all the while the original data is never lost: it remains retrievable from a secure token vault.
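
The sketch below illustrates the idea with a minimal in-memory vault; the TokenVault class and its tokenize/detokenize methods are illustrative names, and a real deployment would back the vault with secured, persistent storage rather than a dictionary.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault mapping random tokens to original values."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so the same input always maps to one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it reveals nothing about the original value.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # The original data is never lost; it stays in the vault for retrieval.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # e.g. a card number
print(token)                    # safe to store or pass to downstream systems
print(vault.detokenize(token))  # recovers the original value on demand
```

Because the token is random rather than derived from the value, a breach of the systems holding tokens exposes nothing sensitive; only a compromise of the vault itself would.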