Hybrid Attention-GAN Framework for Secure Data Encryption and Decryption: Leveraging Transformer-based Attention Mechanisms and Adversarial Learning
DOI:
https://doi.org/10.63278/jicrcr.vi.2671

Abstract
This study presents a new technique for data encryption and decryption that combines Generative Adversarial Networks (GANs) with the Transformer's attention mechanism. The proposed Hybrid Attention-GAN Framework leverages the data-transformation capabilities of GANs and the self-attention mechanism of Transformers to improve both the encryption and decryption processes. In the encryption phase, a Transformer encoder captures long-range dependencies within the plaintext; its output is then obfuscated by the GAN generator, which produces a ciphertext that resembles random noise. Adversarial training drives the encryption process to transform structured inputs into unrecognizable, noise-like patterns. In the decryption phase, a Transformer decoder trained with self-supervised learning reconstructs the plaintext from the ciphertext by predicting missing portions of the data. The approach offers several advantages, including stronger encryption security and higher decryption accuracy. Nonetheless, challenges such as training stability, computational overhead, and the lack of formal security guarantees remain to be addressed. The paper describes the framework's methodology in depth, analyzes its performance, and concludes with directions for future work to resolve these challenges and optimize the method for real-world applications.
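To make the architecture concrete, the sketch below lays out the three components described above in PyTorch. It is a minimal illustration under stated assumptions, not the authors' implementation: the module names (AttentionEncryptor, Discriminator, AttentionDecryptor), the byte-level tokenization, and all hyperparameters are assumed for the example, and the decryptor is sketched as a self-attention stack standing in for the paper's Transformer decoder.

```python
# Illustrative sketch only; component names, sizes, and tokenization are assumptions.
import torch
import torch.nn as nn

D_MODEL, N_HEADS, N_LAYERS, VOCAB = 128, 4, 2, 256  # assumed hyperparameters

class AttentionEncryptor(nn.Module):
    """Transformer encoder captures long-range dependencies in the plaintext,
    then a GAN-style generator head maps the representation to a noise-like ciphertext."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, N_HEADS, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, N_LAYERS)
        self.generator = nn.Sequential(  # generator head of the GAN
            nn.Linear(D_MODEL, D_MODEL), nn.ReLU(),
            nn.Linear(D_MODEL, D_MODEL), nn.Tanh())

    def forward(self, plaintext_ids):
        h = self.encoder(self.embed(plaintext_ids))
        return self.generator(h)  # ciphertext tensor, trained to resemble random noise

class Discriminator(nn.Module):
    """Adversary that tries to tell ciphertext from genuine random noise;
    its feedback pushes the generator toward statistically noise-like output."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(D_MODEL, D_MODEL), nn.LeakyReLU(0.2),
                                 nn.Linear(D_MODEL, 1))

    def forward(self, x):
        return self.net(x).mean(dim=1).squeeze(-1)  # one realness score per sequence

class AttentionDecryptor(nn.Module):
    """Self-attention stack (standing in for the Transformer decoder) that
    reconstructs plaintext tokens from the ciphertext."""
    def __init__(self):
        super().__init__()
        layer = nn.TransformerEncoderLayer(D_MODEL, N_HEADS, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, N_LAYERS)
        self.head = nn.Linear(D_MODEL, VOCAB)

    def forward(self, ciphertext):
        return self.head(self.decoder(ciphertext))  # logits over plaintext tokens

# Usage: encrypt a batch of byte sequences, then attempt reconstruction.
plaintext = torch.randint(0, VOCAB, (8, 32))  # 8 sequences of 32 byte tokens
enc, dec, disc = AttentionEncryptor(), AttentionDecryptor(), Discriminator()
ciphertext = enc(plaintext)
logits = dec(ciphertext)
recon_loss = nn.CrossEntropyLoss()(logits.reshape(-1, VOCAB), plaintext.reshape(-1))
adv_loss = nn.BCEWithLogitsLoss()(disc(ciphertext), torch.ones(8))  # generator tries to fool the adversary
```

In this sketch the adversarial term pulls ciphertexts toward the noise distribution while the reconstruction loss preserves decryptability; balancing these two objectives is one source of the training-stability challenge noted above.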