Tokenization is the process of anonymizing data without losing its analytical value.
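As a minimal sketch of the idea (this is an illustrative assumption, not a description of any specific project): each sensitive value is replaced by a stable token derived with a keyed hash, so equal inputs always map to equal tokens. The original values are hidden, but counts, joins, and group-bys still work on the tokenized data. `SECRET_KEY` and `tokenize` are hypothetical names introduced here for illustration.

```python
import hmac
import hashlib

# Hypothetical example: deterministic tokenization with a keyed hash (HMAC).
# The key is a placeholder and would be kept secret in practice.
SECRET_KEY = b"replace-with-a-secret-key"

def tokenize(value: str) -> str:
    """Return a stable, non-reversible token for a sensitive value."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

records = ["alice@example.com", "bob@example.com", "alice@example.com"]
tokens = [tokenize(r) for r in records]

# Analytical value is preserved: equal inputs yield equal tokens,
# so we can still count distinct users without seeing their emails.
assert tokens[0] == tokens[2]
assert tokens[0] != tokens[1]
```

Because the mapping is deterministic, tokenized datasets from different sources can still be joined on the token, which is what preserves their analytical value.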
Tokenization is applied to community projects for the common good. Occasionally we may find a sponsor for a Tokenization project. Our Tokenization projects leverage open technologies to achieve these goals.
End products may be made available in the Shop.
Contact Us to learn more.