Sahil Nokhwal [1]
[1] Department of Computer Science, University of Memphis, Memphis, TN, USA
[2] Department of Electrical and Computer Engineering, University of Memphis, Memphis, TN, USA
[3] Department of Computer Science, University of Missouri, St. Louis, Missouri, USA
Secure Information Embedding in Images with Hybrid Firefly Algorithm
Abstract
Various methods have been proposed over time to secure access to sensitive information, such as the many cryptographic methods used to facilitate secure communication on the internet. However, other methods such as steganography, which may be more suitable when the act of transmitting sensitive information must itself remain secret, have received comparatively little attention. Many techniques commonly discussed for such scenarios suffer from low capacity and high distortion in the output signal. This research introduces a novel steganographic approach for concealing a confidential portable document format (PDF) document within a host image, employing a Hybrid Firefly algorithm (HFA) to select the pixel arrangement. This algorithm combines two widely used optimization algorithms to improve their performance. The suggested methodology uses the HFA to search for optimal pixel placements in the spatial domain, with two main goals: increasing the host image's capacity and reducing distortion. Moreover, the proposed approach aims to reduce the time required for the embedding procedure. The findings indicate a decrease in image distortion and an accelerated rate of convergence in the search process. The resulting embeddings are robust against steganalytic attacks, rendering the identification of the embedded data a formidable undertaking.
keywords:
Metaheuristic optimization, Image steganography, Data hiding, Machine Learning, Information security, Steganography

1 Introduction
The increasing adoption of the World Wide Web has led to the emergence of many individuals engaged in cybercriminal operations. These individuals aggressively search for and exploit vulnerabilities in vital communication infrastructure in order to hijack or disrupt it for sabotage or financial extortion. This has become an even more pressing issue with the rise of hacker groups incentivized and backed by nation-states, which can execute attacks at a scale that was not previously possible. One of the most common forms of such attacks is gaining access to secure networks, such as corporate or government networks, to steal the sensitive information routinely shared on them. If communications on such systems are not sufficiently secured, successful attacks can lead to major disruptions in day-to-day operations. The use of highly secure encryption in communications is therefore a critical part of any network.
To guarantee the secure transfer of data over a network, it is essential to deploy suitable security measures that effectively protect vital information from illegal interception. Numerous encryption techniques have been suggested over the years to achieve this objective, whereby the source system encrypts the data prior to transmission, and afterward, it is decrypted on the recipient’s system. This measure safeguards the data against unauthorized access and guarantees the integrity of the received data. Although such data encryption systems have been a standard in network communication for a long time, there are still some use cases where such systems can be improved.
Humans have a well-known psychological tendency to assign greater value to objects they cannot own or access, and hence to perceive a higher cost of access as justified even if the underlying object does not really have such high value. This scarcity mentality is regularly exploited by marketing agencies to sell luxury products at very high prices. The same principle, however, is also one of the main problems with encryption as a method of security. Any instance of encrypted communication visible to unauthorized parties can act as a temptation for some to try to gain access to the underlying information, even if the risk of legal action or the cost of actually breaking the encryption is high. The mere presence of a cipher motivates at least some people to try to break it and read its contents. Given enough technical skill and computational power, no encryption is 100% safe, so a safer method of security is to hide information in plain sight. Steganographic techniques do just that and ensure that sensitive data remains far from prying eyes.
Two important types of information concealment that have been covered in the past are watermarking and steganography. Each has certain benefits and drawbacks. The process of watermarking involves the encoding of transmissions with a discernible symbol, such as embedding a signature within signals. This serves the purpose of authorizing the owner of the signals and confirming their ownership. In general, a diminutive sign consisting of a range from one to several thousand bits is employed. The principal objective of steganography is to achieve clandestine communication, with the intention of concealing the presence of a message from an external observer. In contrast, cryptography does not endeavor to obscure the existence of covert communication; rather, its objective is to make a message indecipherable to an unauthorized entity. In order to achieve such outcomes, the desired information is embedded in a host signal (usually an audio or visual signal). The large size of host data allows reasonable bits of desired information to be hidden in between them without raising any suspicions. This methodology can be utilized across various domains, including the authentication of ownership, ensuring resistance against tampering, and facilitating the secure transport of confidential data.
Most cases where data security is critical involve some kind of text document, such as intelligence assessments, patent applications, or even emails and other private communications between journalists and their sources. Since the portable document format (PDF) is one of the most universally used document formats, it is important for any steganographic method to ensure that even large files of this kind can be secured with little delay and minimal distortion.
In this paper, we discuss the use of steganography to hide a PDF document in an image. This document may include any text data and may also include embedded images. To protect it against attacks by illicit actors, we wish to safeguard the file from public detection. To reduce visual distortion, we employ a meta-heuristic method, the Hybrid Firefly algorithm (HFA) developed by [1], to determine the best locations for concealing data. An exhaustive search would require comparing every combination of pixel positions, which might take an exponentially long time to compute; this process is expedited by the use of the HFA. This work encompasses the following contributions:
-
1.
This paper presents a unique steganographic approach that seeks to embed a confidential PDF file inside a specified host image.
-
2.
Extensive experiments have demonstrated minimal distortion when using different host images.
This study introduces a new approach to improving the output of steganography by using the Hybrid Firefly algorithm. The rest of this work is structured as follows. Section 2 lays out previous work in this field and the current state of steganographic techniques. The Firefly algorithm (FA) [2] and the Differential Evolution (DE) algorithm [3] are reviewed in Section 3.1 and Section 3.2, including their implementation, and Section 3.3 describes the Hybrid Firefly algorithm. Section 4 outlines the proposed methodology, in which the HFA is combined with LSB substitution and the selection of optimal pixels. Section 5 presents the experimental results, accompanied by a visual comparison with alternative methodologies. Section 6 presents a concise summary of the contributions demonstrated in the earlier sections.
2 Related work in context
Over the last twenty years, several steganographic applications have emerged, with many utilizing data concealing techniques based on the least-significant-bit (LSB) paradigm. These approaches entail identifying pixels that exhibit the necessary qualities inside a host medium; data is then integrated into the least significant bit of these pixels [4, 5, 6, 7, 8]. The authors of [9] proposed the use of a Genetic Algorithm (GA) to determine the most effective substitution matrix for hiding secret messages inside the important section of the cover image. Furthermore, a local pixel adjustment process (LPAP) was proposed as a means to enhance the quality of the steganographic image, and an effective strategy for k-LSB substitution was developed to address the problem presented by a high value of k.
The steganographic technique proposed by [10] employs the JPEG format and Particle Swarm Optimization (PSO). Although it operates in the transform domain, its optimization strategy also has potential applications in the spatial domain. [11] suggested the application of PSO for selecting optimal pixels for replacement in a grayscale cover image, suitable for embedding concealed grayscale image pixel data. While the efficacy of this strategy is recognized, it is noteworthy that search algorithms like Cuckoo Search (CS) have demonstrated superior results due to their improved exploration of the solution space, as evidenced by [12].
Various spatial-domain embedding approaches leverage metaheuristic algorithms for optimization, enhancing the efficiency of the embedding process. Recent steganography works have been proposed by [13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 18]. In the study by [26], a spatial-domain image steganography system utilizing the LSB procedure is presented. Genetic algorithms (GAs) are utilized to optimize the sequencing of phases in the embedding process, such as pixel analysis, pixel modification, toggling of hidden bits, and other related tasks. The Peak signal-to-noise ratio (PSNR) is used as the objective function to ensure the effective arrangement of message bits within the cover picture. Other neural network-based works are presented in [27, 28, 29, 30].
[31] propose a bidirectional embedding strategy that combines histogram flipping and GA approaches. The researchers analyze the use of histogram flipping for different embedding, focusing on enhancing embedding speed and reducing distortion. The goal function is determined by calculating the discrepancy between maximum and minimum distortion values.
[32, 33] leverage chaos maps to enhance the data concealment scheme based on GA. Chaotic maps, including logistic and Gaussian maps, introduce unpredictability to genetic variation. PSNR is used as a fitness measure.
Particle Swarm Optimization (PSO) is a commonly used optimization technique in the field of computational intelligence. The study conducted by [34] employs PSO to improve the effectiveness of pixel-value differencing. The optimization approach chooses the most favorable gray values for each pixel from a set of options provided by the modulus function. The primary objective function selects suitable pixels, whereas the secondary objective function calculates the best solution based on the results of the primary one.
Utilizing PSO, as demonstrated by [35], allows for a significant increase in the embedding capacity within the spatial region of pictures. The Particle Swarm Optimization (PSO) algorithm is utilized to determine the optimal locations for embedding, while optimization techniques are performed to determine the optimum initial implantation point and the most efficient path for scanning pixels.
The work undertaken by [36] utilizes the Artificial Bee Colony (ABC) approach to enhance the allocation of blocks for covert image embedding. This technology exhibits superior resistance to particular types of noise assaults in comparison to similar steganographic techniques.
The solution presented by [37] seeks to attain content anonymity for healthcare photographs through the application of medical information masking techniques. This technique utilizes Sudoku-based encryption to guarantee the confidentiality of healthcare photos and exploits the Queen Traversal pattern to identify specific pixels inside the image.
Metaheuristic methods provide substantial advantages in the domains of finance, operational research, and manufacturing. The study undertaken by [38] shows that these domains often include complex challenges that traditional methods struggle to address, particularly in situations with limited access to significant computational resources.
The encryption technique presented by [39] uses the Cipher Block Chaining method and is specially designed to process several images simultaneously. Furthermore, it is capable of being used in conjunction with concurrent computing approaches.
The work conducted by [40] employs the Adaptive Black Hole Algorithm to tackle the set-covering issue, resulting in many globally optimal solutions for different set-covering scenarios. [41] proposes a modification to the CS algorithm to enhance its search capabilities. To enhance visual fidelity, [42] uses PSO in combination with Optical Pixel Adjustment. [43] introduces the concept of permutable keys, utilizing it to propose the k-LSB replacement technique. Gene Expression Programming is introduced by [44] for determining the optimal key permutation for LSB replacement. [45] proposes a novel technique for steganography in grayscale images by employing graph coloring. An alternative approach is suggested by [46], introducing a technique for concealing text messages within 24-bit RGB color graphics, employing the Shuffled-Frog-Leaping algorithm. However, none of these works efficiently disguise a PDF file within an image.
3 Proposed architecture
The suggested model’s overall structure is founded upon the integration of three distinct algorithms, which together contribute to the attainment of the final model. These algorithms are:
-
1.
Firefly algorithm
-
2.
Differential Evolution algorithm
-
3.
Hybrid Firefly Algorithm
The following subsections provide a comprehensive analysis of the algorithms’ intricacies.
3.1 Firefly Algorithm
The FA, or Firefly Algorithm, is a meta-heuristic optimization technique developed by Xin-She Yang in 2008. It is inspired by the bioluminescent display demonstrated by tropical fireflies. The standard Firefly algorithm utilizes the observable communication patterns seen in tropical fireflies and integrates the idealized behavior linked with their flashing patterns. The construction of the mathematical model in FA is based on three idealized principles.
-
1.
Fireflies exhibit a unisex characteristic, whereby their attraction towards other fireflies is not contingent upon their respective sexes.
-
2.
The level of attractiveness exhibited by an individual is directly correlated with their level of brightness. Additionally, it can be inferred that attractiveness diminishes as the distance between individuals rises. Hence, in the case of two fireflies exhibiting intermittent luminescence, it can be observed that the firefly with lower luminosity will exhibit a tendency to travel towards the firefly with higher luminosity.
-
3.
The luminosity of a firefly is contingent upon the topography underlying the cost function. Hence, within the framework of a maximizing problem, there exists a direct correlation between the amount of light and the value of the fitness function.
The conventional firefly algorithm encompasses two crucial aspects that necessitate consideration: the formulation of the brightness and the variation of attractiveness. Initially, it is reasonable to argue that the luminosity of a firefly is contingent upon the cost function landscape. Next, we establish the measurement of light intensity variation and develop a model to quantify the corresponding change in attractiveness. It is well-established that the intensity of light decreases with the distance between the light source and the absorbing medium. In our computational model, we assume that the brightness changes exponentially and monotonically with both the distance $r$ and the light absorption parameter $\gamma$. The aforementioned relationship is expressed as:

$I(r) = I_0 e^{-\gamma r^2}$ (1)

The symbol $I_0$ represents the initial light intensity emitted from the source, specifically at the point where $r = 0$, while $\gamma$ denotes the coefficient that characterizes the absorption of light. Based on the aforementioned idealized principles, it can be inferred that the perceived attractiveness of a firefly is directly correlated with the intensity of its emitted light, denoted as $I$. The attractiveness of a firefly can therefore be defined in a manner analogous to the light intensity:

$\beta(r) = \beta_0 e^{-\gamma r^2}$ (2)

where $\beta_0$ represents the attractiveness at distance $r = 0$. The Euclidean distance is employed to compute the separation between two fireflies $i$ and $j$, located at positions $x_i$ and $x_j$ respectively:

$r_{ij} = \lVert x_i - x_j \rVert = \sqrt{\sum_{k=1}^{d} (x_{i,k} - x_{j,k})^2}$ (3)

where $d$ represents the total number of dimensions. The displacement of firefly $i$ towards the more appealing (brighter) firefly $j$ is ascertained by:

$x_i^{t+1} = x_i^{t} + \beta_0 e^{-\gamma r_{ij}^2}\,(x_j^{t} - x_i^{t}) + \alpha\,\epsilon_i^{t}$ (4)

The first term in the equation represents the present position of firefly $i$. The second term accounts for the attraction between firefly $i$ and firefly $j$. Finally, the last term introduces randomization through a vector $\epsilon_i$ of independent random variables, which may be sampled from various distributions, including the Uniform and the Gaussian distributions. The magnitude of the step size is determined by the scaling parameter $\alpha$ in the third term.
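For concreteness, the sketch below implements one pass of the movement rule in Eqs. (1)–(4) in Python. The function name firefly_move, the Gaussian randomization vector, and the default parameter values are illustrative assumptions rather than the exact settings used in our experiments.

```python
import numpy as np

def firefly_move(X, brightness, beta0=1.0, gamma=1.0, alpha=0.1, rng=None):
    """One pass of the firefly movement rule (Eq. 4).

    X          : (N, D) array of firefly positions.
    brightness : length-N array; higher values mean brighter fireflies.
    beta0      : attractiveness at distance zero (Eq. 2).
    gamma      : light absorption coefficient (Eq. 1).
    alpha      : step-size coefficient of the randomization term.
    """
    rng = np.random.default_rng() if rng is None else rng
    N, D = X.shape
    X_new = X.copy()
    for i in range(N):
        for j in range(N):
            if brightness[j] > brightness[i]:          # i is drawn towards the brighter firefly j
                r2 = np.sum((X_new[i] - X[j]) ** 2)    # squared Euclidean distance (Eq. 3)
                beta = beta0 * np.exp(-gamma * r2)     # attractiveness decays with distance (Eq. 2)
                eps = rng.normal(size=D)               # Gaussian randomization vector
                X_new[i] = X_new[i] + beta * (X[j] - X_new[i]) + alpha * eps
    return X_new
```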
3.2 Differential Evolution Algorithm
This algorithm finds an optimal solution to a problem by evolving from randomly initialized starting points, using a vectorized mutation operator combined with either exponential or binomial crossover. It was proposed by [3], and many variants have been proposed since. One of the most widely used variants is DE/rand/1/bin, the classic DE variant. In the context of a particular minimization problem with $D$ dimensions, a population consisting of $N$ independent solution vectors is utilized. The mutated vector is formally characterized as:

$v_i^{g} = x_{r_1}^{g} + F\,(x_{r_2}^{g} - x_{r_3}^{g})$ (5)

where $x_{r_1}$, $x_{r_2}$, and $x_{r_3}$ are three randomly selected, mutually distinct solutions from the population and $g$ is the generation index. The perturbation parameter $F$ adjusts the amplification of the difference vector $(x_{r_2}^{g} - x_{r_3}^{g})$.

Similarly, the binomial crossover operation is used to generate a new trial vector $u_i$ from the perturbed or mutated vector $v_i$ and the target vector $x_i$:

$u_{i,j}^{g} = \begin{cases} v_{i,j}^{g}, & \text{if } rand_j \le CR \text{ or } j = j_{rand} \\ x_{i,j}^{g}, & \text{otherwise} \end{cases}$ (6)

where $CR$ is the crossover constant and $j_{rand}$ is a randomly chosen vector index that ensures at least one mutated component is included in the trial vector.

The selection mechanism for this algorithm is as below:

$x_i^{g+1} = \begin{cases} u_i^{g}, & \text{if } f(u_i^{g}) \le f(x_i^{g}) \\ x_i^{g}, & \text{otherwise} \end{cases}$ (7)
This approach has a resemblance to previous algorithms in which a greedy acceptance strategy is used, wherein an update is deemed acceptable only if it leads to an improvement in the present target being pursued.
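A compact sketch of one DE/rand/1/bin generation, following Eqs. (5)–(7), is given below. The function name de_rand_1_bin and the default values of F and CR are illustrative assumptions, not the tuned values used later in the experiments.

```python
import numpy as np

def de_rand_1_bin(X, objective, F=0.5, CR=0.9, rng=None):
    """One generation of classic DE/rand/1/bin, minimizing `objective`.

    X  : (N, D) population of candidate solutions.
    F  : perturbation (scaling) parameter of the difference vector (Eq. 5).
    CR : crossover constant (Eq. 6).
    """
    rng = np.random.default_rng() if rng is None else rng
    N, D = X.shape
    X_next = X.copy()
    for i in range(N):
        # Mutation: v = x_r1 + F * (x_r2 - x_r3), with r1, r2, r3 distinct from i (Eq. 5).
        r1, r2, r3 = rng.choice([k for k in range(N) if k != i], size=3, replace=False)
        v = X[r1] + F * (X[r2] - X[r3])
        # Binomial crossover: copy each component from v with probability CR;
        # j_rand guarantees at least one mutated component in the trial vector (Eq. 6).
        j_rand = rng.integers(D)
        mask = rng.random(D) < CR
        mask[j_rand] = True
        u = np.where(mask, v, X[i])
        # Greedy selection: accept the trial vector only if it improves the target (Eq. 7).
        if objective(u) <= objective(X[i]):
            X_next[i] = u
    return X_next
```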
3.3 Hybrid Firefly Algorithm
Both the firefly method and differential evolution provide distinct benefits and demonstrate efficacy across a diverse array of optimization tasks. The hybrid Firefly method is derived from the FA and the DE algorithm. This hybrid approach aims to use the strengths of both algorithms, resulting in an improved optimization technique. The proposed approach integrates the attraction mechanism used in the FA with the diversification mechanism of DE algorithm. This integration aims to enhance the convergence rate while simultaneously preserving population diversity.
The use of an intensification or exploitation strategy involves directing the search towards a specific local location. This direction is based on either past knowledge or newly acquired information obtained during the search process, which suggests the potential presence of an optimal solution inside that particular region. Differential Evolution (DE) is capable of performing localized searches, particularly when converging towards local optimum solutions. Enhancing the accuracy and convergence rate of an algorithm’s solution may be achieved by effectively balancing the conflicting needs associated with these two factors. By integrating these two aspects, we may use their respective strengths to improve the efficacy of the hybrid algorithm in both exploitation and exploration.
It is vital to acknowledge that the consolidation and restructuring of individual location data takes place only after the first iteration of the concurrent FA and DE procedures. This stands in opposition to the generation of novel sites by random walks or alternative operators. The primary advantage of this technique is that it prioritizes the exploration of current locations inside advantageous regions obtained during the preceding iteration, as opposed to traversing less promising places within the search space.
The hybrid approach demonstrates linear time complexity while the population size remains relatively small. Consequently, it efficiently computes solutions in terms of computational cost, mostly attributed to the evaluation of objective function values.
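A minimal sketch of how the two operators may be interleaved per iteration is given below, reusing the firefly_move and de_rand_1_bin sketches from the previous subsections. The even split into the two groups, the clipping to the search bounds, and the fixed iteration budget are simplifying assumptions made for illustration, not the exact control logic of the full procedure.

```python
import numpy as np

def hybrid_firefly(objective, bounds, pop_size=40, iters=2000, rng=None):
    """Skeleton of the hybrid FA + DE loop: split, apply FA and DE, merge, repeat."""
    rng = np.random.default_rng() if rng is None else rng
    low, high = (np.asarray(b, dtype=float) for b in bounds)
    D = low.size
    X = rng.uniform(low, high, size=(pop_size, D))        # random initial population
    fit = np.array([objective(x) for x in X])
    best = X[np.argmin(fit)].copy()
    for _ in range(iters):
        # Randomly split the merged population into the two groups G1 (FA) and G2 (DE).
        perm = rng.permutation(pop_size)
        g1, g2 = perm[: pop_size // 2], perm[pop_size // 2:]
        X[g1] = firefly_move(X[g1], -fit[g1])             # FA attraction step (lower cost = brighter)
        X[g2] = de_rand_1_bin(X[g2], objective)           # DE mutation/crossover/selection step
        X = np.clip(X, low, high)                         # keep solutions inside the search space
        fit = np.array([objective(x) for x in X])
        if fit.min() < objective(best):                   # update the global best solution
            best = X[np.argmin(fit)].copy()
    return best
```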
4 Proposed Technique
The proposed approach has two unique elements: the first component includes the embedding of data from PDF documents with the cover picture, while the subsequent component involves the retrieval of concealed information via the steganographic method.
Parameters |
---|---
FA | initial attractiveness β0, light absorption coefficient γ, randomization α (Lévy flight)
DE | scaling coefficient F, crossover constant CR
4.1 Embedding document using the HFA algorithm
The primary aim of our proposed technique is to determine the optimal arrangement of pixels for the reason of incorporating PDF data into a picture, with the ultimate goal of minimizing any possible alteration in the cover photo. The proposed technique is supported by a structure of procedures that encompasses the subsequent steps:
-
1.
The first stage is reading the confidential PDF document intended for incorporation into the cover picture.
-
2.
The binary numbers derived from the PDF document will undergo encryption using the Blowfish encryption technique, as described by [47].
-
3.
To ensure the covert embedding of the encrypted information inside the cover picture, the 128-bit encrypted information is converted into a sequence of binary numbers.
-
4.
The pixels of the cover picture undergo a process of transformation whereby they are converted into their respective 8-bit counterparts.
-
5.
The objective function is derived from the multi-objective formulation that combines PSNR and SSIM. This metric serves as a measure of the fidelity between the steganographic picture and the underlying data. The calculation of the PSNR entails evaluating the ratio between the highest achievable magnitude of the steganographic signal and the strength of the noise that impairs the accuracy of its representation.
-
6.
Population initialization: The firefly population is initially formed through a random initialization process in two designated groups, denoted $G_1$ and $G_2$. Within the solution space, a set of fireflies $x_1, x_2, \dots, x_N$ is created randomly in each group, where $D$ signifies the total number of decision variables. Consequently, each firefly is represented by a vector $x_i = (x_{i,1}, x_{i,2}, \dots, x_{i,D})$ containing the values of its selection variables.
-
7.
Rank fireflies: The fitness of every firefly within the set is assessed based on its objective value. This procedure is then employed to classify the population into distinct clusters, which are organized in descending order of fitness and distributed in a round-robin fashion.
-
8.
Execute DE algorithm: Once the population of the corresponding group has been initialized and the parameters have been set, mutation and crossover operations are executed on its candidate solutions.
-
9.
Rank fitness: The fitness of each produced vector is assessed in a manner akin to the firefly method.
-
10.
Perform Search: An evaluation is carried out inside each group, wherein the fitness of each member is assessed. The global results are then updated by considering the fittest individuals from each group. The groups are then merged and subsequently re-partitioned into a new pair of groups $G_1$ and $G_2$ using a random allocation process, forming a new population. The aforementioned stages are iteratively executed until either the specified termination criteria are satisfied or an optimal solution is attained.
-
11.
Utilizing the HFA methodology, ascertain the optimal selection of pixels for the purpose of data hiding inside the cover photo.
-
12.
The 1-LSB method is used to discreetly embed the encrypted information bits into the host image, minimizing any noticeable effect on the image's fidelity; a minimal sketch of this step is given after this list.
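The sketch below illustrates this final 1-LSB embedding step. The helper name embed_bits_lsb is hypothetical; pixel_order stands for the pixel ordering produced by the HFA search in step 11, and bits for the encrypted PDF bit stream produced in step 3.

```python
import numpy as np

def embed_bits_lsb(cover, bits, pixel_order):
    """Write the secret bit stream into the least significant bits of selected pixels.

    cover       : uint8 image array (any shape); it is flattened internally.
    bits        : iterable of 0/1 values (the encrypted PDF bit stream).
    pixel_order : sequence of flat pixel indices chosen by the HFA search.
    """
    stego = cover.copy().ravel()
    for bit, idx in zip(bits, pixel_order):
        stego[idx] = (stego[idx] & 0xFE) | bit   # clear the LSB, then write the secret bit
    return stego.reshape(cover.shape)
```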
4.2 Extraction of embedded steganographic PDF information
-
1.
Reduce the dimensionality of a 3-channel steganographic color picture (m x n x 3) to a 2-dimensional image (m’ x n’).
-
2.
Equation 10 is used to create a specialized objective function that takes into account SSIM and PSNR values.
-
3.
Population initialization: The initial population is generated in accordance with Section 4.1.
-
4.
After configuring all parameters, the HFA algorithm is executed in order to minimize the cost function. The result of this procedure is the generation of an ideal arrangement of pixels that could be used for the retrieval of embedded data.
-
5.
The encrypted bits are read from the pixel positions given by the ordering derived in the preceding step.
-
6.
For decryption, the Blowfish algorithm is applied to the recovered binary data.
-
7.
Reconstruct the PDF file from the decrypted binary bits; a minimal sketch of the bit-recovery steps is given after this list.
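The sketch below covers the bit-recovery portion of this procedure: reading the LSBs at the HFA-selected positions and packing them into bytes ready for Blowfish decryption. The helper names extract_bits_lsb and bits_to_bytes, and the n_bits argument, are hypothetical.

```python
def extract_bits_lsb(stego, pixel_order, n_bits):
    """Read the embedded bit stream from the LSBs of the selected pixels."""
    flat = stego.ravel()
    return [int(flat[idx] & 1) for idx in pixel_order[:n_bits]]

def bits_to_bytes(bits):
    """Pack the recovered bits into bytes (trailing bits that do not fill a byte are dropped)."""
    usable = len(bits) - len(bits) % 8
    return bytes(
        int("".join(str(b) for b in bits[i:i + 8]), 2)
        for i in range(0, usable, 8)
    )
```

After Blowfish decryption of these bytes, the resulting plaintext is written to disk as the reconstructed PDF file.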
4.3 Evaluation Metrics
The Peak Signal-to-Noise Ratio (PSNR) is a numerical measure, expressed in decibels (dB), that enables the evaluation of the extent of degradation in the steganographic signal. The quality of a steganographic image rises in direct proportion to its PSNR value.
The SSIM, or Structural Similarity Index, is a quantitative metric utilized to assess the degree of resemblance between two given images. The comparison approach combines the structural characteristics and luminosity measurements of the images under scrutiny. The Structural Similarity Index (SSIM) is calculated by assessing the resemblance between two photographs based on their brightness, contrast, and general arrangement. The system produces a numerical result that is between -1 and 1. A score of 1 indicates a significant degree of resemblance between the two photos, whereas a score of -1 indicates a considerable degree of dissimilarity.
The Mean Squared Error (MSE) is a widely utilized statistical metric for quantifying the average squared discrepancy between the anticipated and actual values of a designated variable. Here, it is used to assess the quality of the steganographic image relative to the cover image, by calculating the mean of the squared differences between corresponding pixels of the original and steganographic images.
$MSE = \dfrac{1}{m \times n} \sum_{i=1}^{m} \sum_{j=1}^{n} \left( C(i,j) - S(i,j) \right)^2$ (8)

where $C(i,j)$ and $S(i,j)$ represent the $(i,j)$-th pixels of the cover and steganographic images respectively, and $m \times n$ is the number of pixels, which is the same for the host and the steganographic image.
The equation representing the PSNR may be stated as follows:
$PSNR = 10 \log_{10}\!\left(\dfrac{MAX^2}{MSE}\right)$ (9)
The variable $MAX$ denotes the highest pixel value contained in the cover image. Hence, a comprehensive fitness function may be defined as

$F(A, B) = \lambda \cdot PSNR(A, B) + (1 - \lambda) \cdot SSIM(A, B)$

In this context, $A$ and $B$ denote two images that are being subjected to a comparative analysis. In the context of our study, $A$ denotes the steganographic image, whereas $B$ represents the cover image. Since the cover image remains unchanged during the embedding process, the fitness function may be expressed as

$F(S) = \lambda \cdot PSNR(C, S) + (1 - \lambda) \cdot SSIM(C, S)$ (10)

Here the value of $\lambda$ is taken as 0.5.
The Structural Similarity Index (SSIM) value gives a consistent metric for evaluating the quality of a picture. Mathematically, it is defined as:

$SSIM(x, y) = \dfrac{(2\mu_x\mu_y)(2\sigma_{xy})}{(\mu_x^2 + \mu_y^2)(\sigma_x^2 + \sigma_y^2)}$

Here $x$ and $y$ denote windows taken from the cover and steganographic images, $\mu_x$ and $\mu_y$ their averages, $\sigma_x^2$ and $\sigma_y^2$ their variances, and $\sigma_{xy}$ their covariance.
This statistic considers the influence of the Human Visual System (HVS). The model distortions of the HVS consist of three primary elements: luminance distortion, loss of correlation, and contrast distortion. The mathematical models used to represent them are as follows:

$SSIM(x, y) = \dfrac{\sigma_{xy}}{\sigma_x \sigma_y} \cdot \dfrac{2\mu_x\mu_y}{\mu_x^2 + \mu_y^2} \cdot \dfrac{2\sigma_x\sigma_y}{\sigma_x^2 + \sigma_y^2}$ (11)

The first component is the correlation coefficient between $x$ and $y$ and spans the comprehensive range from -1 to 1. The second component, which compares the luminance of $x$ and $y$, has a dynamic range from 0 to 1, inclusive. The third component is a contrast similarity measure that also spans from 0 to 1. Hence, the overall value is constrained inside the interval [-1, 1]. It is essential to recognize that this holds only when the denominators are not equal to zero; when a denominator approaches zero, the expression becomes unstable. To address this concern, the SSIM is commonly computed with stabilizing constants as a means of quantifying the level of similarity between two pictures. The metric is derived by assessing the given equation:

$SSIM(x, y) = \dfrac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2 + \mu_y^2 + C_1)(\sigma_x^2 + \sigma_y^2 + C_2)}$ (12)

Here $x$ and $y$ are again windows of the original and steganographic images, $\mu_x$ and $\mu_y$ their mean values, $\sigma_x^2$ and $\sigma_y^2$ their variances, and $\sigma_{xy}$ their covariance; $C_1$ and $C_2$ are small constants included to stabilize the division.
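For reference, the metrics above can be computed directly from the cover and steganographic arrays. The sketch below follows Eqs. (8)–(10) and (12); the single-window (global) SSIM, the conventional stabilizing constants, and the weighted-sum fitness with λ = 0.5 are simplifying assumptions (in practice SSIM is averaged over local windows).

```python
import numpy as np

def mse(cover, stego):
    """Mean squared error between cover and steganographic images (Eq. 8)."""
    c, s = cover.astype(np.float64), stego.astype(np.float64)
    return np.mean((c - s) ** 2)

def psnr(cover, stego, max_val=255.0):
    """Peak signal-to-noise ratio in decibels (Eq. 9)."""
    err = mse(cover, stego)
    return float("inf") if err == 0 else 10.0 * np.log10(max_val ** 2 / err)

def ssim_global(cover, stego, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Single-window SSIM over the whole image (Eq. 12)."""
    x, y = cover.astype(np.float64), stego.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

def fitness(cover, stego, lam=0.5):
    """Weighted combination of PSNR and SSIM used as the search objective (Eq. 10)."""
    return lam * psnr(cover, stego) + (1 - lam) * ssim_global(cover, stego)
```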
5 Experimental setup & Results
In the conducted experiments, the starting population is produced by uniformly distributed random initialization within the specified ranges or limitations of the decision variables. The number of individuals is set to 40, whereas the dimensionality of the benchmark functions is 30. The halting condition for the algorithm is a maximum number of iterations, set at 2000. The studies were conducted with a total of 30 independent runs for every procedure and algorithm, using distinct initial settings for each run.
In the classic firefly technique, the initial attractiveness $\beta_0$ is set to a constant, and the light absorption coefficient $\gamma$ is scaled according to the mean distance $\bar{r}$ of the variables. The randomization parameter $\alpha$ starts from an initial unpredictability component $\alpha_0$ and is reduced as a function of the iteration index iter; the reduction occurs in a slow and monotonic manner.

The Lévy distribution is used to generate random numbers because it occasionally produces large jumps, decreasing the likelihood of becoming trapped in a local optimum. The differential evolution method employs fixed values for the scaling coefficient $F$ and the crossover constant $CR$. The findings are presented in the next section.
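As a brief implementation note on the Lévy-flight randomization mentioned above, one common way to draw Lévy-distributed step lengths is Mantegna's algorithm, sketched below. The function name levy_step and the exponent value beta = 1.5 are illustrative assumptions, since the exact sampling procedure is not detailed here.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(size, beta=1.5, rng=None):
    """Draw Lévy-distributed step lengths via Mantegna's algorithm.

    Occasional large jumps help the randomization term escape local optima.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, size)   # numerator samples
    v = rng.normal(0.0, 1.0, size)       # denominator samples
    return u / np.abs(v) ** (1 / beta)
```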
6 Conclusion
This work presents an innovative methodology for embedding a PDF document in a host image, with the objective of minimizing the degradation of image fidelity. The Hybrid Firefly algorithm is employed to obtain optimal results in terms of minimizing distortion and maximizing efficiency in the embedding process. This approach also successfully addresses the problem of image size inflation that arises from embedding a PDF file. In the specific context of concealing PDF data inside images without causing noticeable distortion, our methodology exhibits exceptional efficacy in comparison to current cutting-edge methodologies.
Conflict of Interest
The authors of this paper assert that they possess no conflicts of interest.
Data Availability
The dataset employed in this work has been acquired from publicly accessible sources and is available upon request from the primary author.
References
- \bibcommenthead
- Zhang et al. [2016] Zhang, L., Liu, L., Yang, X.-S., Dai, Y.: A novel hybrid firefly algorithm for global optimization. PLoS ONE 11(9), e0163230 (2016)
- Yang [2009] Yang, X.-S.: Firefly algorithms for multimodal optimization. In: International Symposium on Stochastic Algorithms, pp. 169–178 (2009). Springer
- Storn and Price [1997] Storn, R., Price, K.: Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of global optimization 11, 341–359 (1997)
- Adelson [1990] Adelson, E.H.: Digital signal encoding and decoding apparatus. Google Patents. US Patent 4,939,515 (1990)
- Yadav and Verma [2016] Yadav, V., Verma, N.: Secure multimedia data using digital watermarking: A review. International Journal of Engineering Research and General Science 4(1), 181–187 (2016)
- Van Schyndel et al. [1994] Van Schyndel, R.G., Tirkel, A.Z., Osborne, C.F.: A digital watermark. In: Proceedings of 1st International Conference on Image Processing, vol. 2, pp. 86–90 (1994). IEEE
- Chaudhary et al. [2012] Chaudhary, A., Vasavada, J., Raheja, J., Kumar, S., Sharma, M.: A hash based approach for secure keyless steganography in lossless rgb images. arXiv preprint arXiv:1211.5614 (2012)
- Melman and Evsutin [2023] Melman, A., Evsutin, O.: Comparative study of metaheuristic optimization algorithms for image steganography based on discrete fourier transform domain. Applied Soft Computing 132, 109847 (2023)
- Wang et al. [2001] Wang, R.-Z., Lin, C.-F., Lin, J.-C.: Image hiding by optimal lsb substitution and genetic algorithm. Pattern recognition 34(3), 671–683 (2001)
- Li and Wang [2007] Li, X., Wang, J.: A steganographic method based upon jpeg and particle swarm optimization algorithm. Information Sciences 177(15), 3099–3109 (2007)
- Bedi et al. [2011] Bedi, P., Bansal, R., Sehgal, P.: Using pso in image hiding scheme based on lsb substitution. In: Advances in Computing and Communications: First International Conference, ACC 2011, Kochi, India, July 22-24, 2011, Proceedings, Part III 1, pp. 259–268 (2011). Springer
- Nokhwal et al. [2023] Nokhwal, S., Pahune, S., Chaudhary, A.: Embau: A novel technique to embed audio data using shuffled frog leaping algorithm. In: Proceedings of the 2023 7th International Conference on Intelligent Systems, Metaheuristics & Swarm Intelligence, pp. 79–86 (2023)
- Alkhliwi [2023] Alkhliwi, S.: Huffman encoding with white tailed eagle algorithm-based image steganography technique. Engineering, Technology & Applied Science Research 13(2), 10453–10459 (2023)
- Rathika and Gayathri [2023] Rathika, S., Gayathri, R.: An ensemble of monarchy butterfly optimization based encryption techniques on image steganography for data hiding in thermal images. Multimedia Tools and Applications, 1–18 (2023)
- Kiran and Vidhya [2023] Kiran, G.V., Vidhya, K.: Novel multi-media steganography model using meta-heuristic and deep learning assisted adaptive lifting wavelet transform. Journal of Statistical Computation and Simulation, 1–30 (2023)
- Melman et al. [2023] Melman, A.S., Evsutin, O., et al.: Efficient and error-free information hiding in the hybrid domain of digital images using metaheuristic optimization. Computer research and modeling 15(1), 197–210 (2023)
- Fofanah and Kalokoh [2023] Fofanah, A.J., Kalokoh, I.: Watermarking of frequency and steganography for protection of medical images based on bacterial foraging optimization and genetic algorithm. British J Healthcare Med Res (2023)
- Yang et al. [2023] Yang, A., Bai, Y., Xue, T., Li, Y., Li, J.: A novel image steganography algorithm based on hybrid machine leaning and its application in cyberspace security. Future Generation Computer Systems 145, 293–302 (2023)
- Salim et al. [2023] Salim, A., Mohammed, K.A., Jasem, F.M., Sagheer, A.M.: Image steganography technique based on lorenz chaotic system and bloom filter. International Journal of Computing and Digital Systems 14(1), 1 (2023)
- Sargunam et al. [2023] Sargunam, B., et al.: An empirical study for image steganography and steganalysis: A challenging overview (2023)
- Hameed et al. [2023] Hameed, M.A., Abdel-Aleem, O.A., Hassaballah, M.: A secure data hiding approach based on least-significant-bit and nature-inspired optimization techniques. Journal of Ambient Intelligence and Humanized Computing 14(5), 4639–4657 (2023)
- Mahalakshmi et al. [2023] Mahalakshmi, G., Sarathambekai, S., Vairam, T.: Improving security using swarm intelligence based optimal pixel selection in image steganography-a study. In: 2023 International Conference on Intelligent Systems for Communication, IoT and Security (ICISCoIS), pp. 568–573 (2023). IEEE
- Bahaddad et al. [2023] Bahaddad, A.A., Almarhabi, K.A., Abdel-Khalek, S.: Image steganography technique based on bald eagle search optimal pixel selection with chaotic encryption. Alexandria Engineering Journal 75, 41–54 (2023)
- Sharma et al. [2023] Sharma, N., Chakraborty, C., Kumar, R.: Optimized multimedia data through computationally intelligent algorithms. Multimedia Systems 29(5), 2961–2977 (2023)
- Apau et al. [2023] Apau, R., Hayfron-Acquah, J.B., Asante, M., Twum, F.: A multilayered secure image steganography technique for resisting regular-singular steganalysis attacks using elliptic curve cryptography and genetic algorithm. In: International Conference on ICT for Sustainable Development, pp. 427–439 (2023). Springer
- Wazirali et al. [2019] Wazirali, R., Alasmary, W., Mahmoud, M.M., Alhindi, A.: An optimized steganography hiding capacity and imperceptibly using genetic algorithms. IEEE Access 7, 133496–133508 (2019)
- Nokhwal and Kumar [2023a] Nokhwal, S., Kumar, N.: Pbes: Pca based exemplar sampling algorithm for continual learning. In: 2023 2nd International Conference on Informatics (ICI) (2023). IEEE
- Nokhwal and Kumar [2023b] Nokhwal, S., Kumar, N.: Dss: A diverse sample selection method to preserve knowledge in class-incremental learning. In: 2023 10th International Conference on Soft Computing & Machine Intelligence (ISCMI) (2023). IEEE
- Nokhwal and Kumar [2023c] Nokhwal, S., Kumar, N.: Rtra: Rapid training of regularization-based approaches in continual learning. In: 2023 10th International Conference on Soft Computing & Machine Intelligence (ISCMI) (2023). IEEE
- Tanwer et al. [2020] Tanwer, A., Reel, P.S., Reel, S., Nokhwal, S., Nokhwal, S., Hussain, M., Bist, A.S.: System and method for camera based cloth fitting and recommendation. Google Patents. US Patent App. 16/448,094 (2020)
- Chen et al. [2021] Chen, Y., Zhou, L., Zhou, Y., Chen, Y., Hu, S., Dong, Z.: Multiple histograms shifting-based video data hiding using compression sensing. IEEE Access 10, 699–707 (2021)
- Doğan [2016] Doğan, Ş.: A new data hiding method based on chaos embedded genetic algorithm for color image. Artificial Intelligence Review 46, 129–143 (2016)
- Tong et al. [2023] Tong, H., Li, T., Xu, Y., Su, X., Qiao, G.: Chaotic coyote optimization algorithm for image encryption and steganography. Multimedia Tools and Applications, 1–27 (2023)
- Li and He [2018] Li, Z., He, Y.: Steganography with pixel-value differencing and modulus function based on pso. Journal of information security and applications 43, 47–52 (2018)
- Rustad et al. [2022] Rustad, S., De Rosal, I.M.S., Andono, P.N., Syukur, A., et al.: Optimization of cross diagonal pixel value differencing and modulus function steganography using edge area block patterns. Cybernetics and Information Technologies 22(2), 145–159 (2022)
- Mohsin et al. [2019] Mohsin, A.H., Zaidan, A., Zaidan, B., Albahri, O., Albahri, A., Alsalem, M., Mohammed, K., Nidhal, S., Jalood, N.S., Jasim, A.N., et al.: New method of image steganography based on particle swarm optimization algorithm in spatial domain for high embedding capacity. IEEE Access 7, 168994–169010 (2019)
- Bala Krishnan et al. [2022] Bala Krishnan, R., Rajesh Kumar, N., Raajan, N., Manikandan, G., Srinivasan, A., Narasimhan, D.: An approach for attaining content confidentiality on medical images through image encryption with steganography. Wireless Personal Communications, 1–17 (2022)
- Nipanikar et al. [2018] Nipanikar, S., Deepthi, V.H., Kulkarni, N.: A sparse representation based image steganography using particle swarm optimization and wavelet transform. Alexandria engineering journal 57(4), 2343–2356 (2018)
- Snasel et al. [2020] Snasel, V., Kromer, P., Safarik, J., Platos, J.: Jpeg steganography with particle swarm optimization accelerated by avx. Concurrency and Computation: Practice and Experience 32(8), 5448 (2020)
- Soto et al. [2018] Soto, R., Crawford, B., Olivares, R., Taramasco, C., Figueroa, I., Gómez, Á., Castro, C., Paredes, F., et al.: Adaptive black hole algorithm for solving the set covering problem. Mathematical Problems in Engineering 2018 (2018)
- Cheung et al. [2016] Cheung, N.J., Ding, X.-M., Shen, H.-B.: A nonhomogeneous cuckoo search algorithm based on quantum mechanism for real parameter optimization. IEEE transactions on cybernetics 47(2), 391–402 (2016)
- Gerami et al. [2012] Gerami, P., Ebrahim, S., Bashardoost, M.: Least significant bit image steganography using particle swarm optimization and optical pixel adjustment. International Journal of Computer Applications 55(2) (2012)
- Mohamed et al. [2011] Mohamed, M., Al-Afari, F., Bamatraf, M.A.: Data hiding by lsb substitution using genetic optimal key-permutation. Int. Arab. J. e Technol. 2(1), 11–17 (2011)
- Mohamed et al. [2018] Mohamed, M.H., Mofaddel, M.A., Abd El-Naser, T.Y.: Hiding information using secret sharing scheme based on gene expression programming (2018)
- Douiri and Elbernoussi [2017] Douiri, S.M., Elbernoussi, S.: A steganographic method using tabu search approach. In: 2017 Sixteenth Mexican International Conference on Artificial Intelligence (MICAI), pp. 30–33 (2017). IEEE
- Habibi Lashkari et al. [2011] Habibi Lashkari, A., Abdul Manaf, A., Masrom, M., Mohd Daud, S.: A survey on image steganography algorithms and evaluation. In: International Conference on Digital Information Processing and Communications, pp. 406–418 (2011). Springer
- Schneier [1993] Schneier, B.: Description of a new variable-length key, 64-bit block cipher (blowfish). In: International Workshop on Fast Software Encryption, pp. 191–204 (1993). Springer