Abstract
Around the world, it is growing harder to provide clean and safe drinking water. In wastewater treatment, sensors are employed and the Internet of Things (IoT) is used to transmit data. Chemical oxygen demand (COD), biochemical oxygen demand (BOD), total nitrogen (T-N), total suspended solids (TSS), and total phosphorus (T-P) all contribute to eutrophication, which must be avoided. The wastewater sector has lately made efforts to become carbon neutral; however, the environmental impact and the road to carbon neutrality have received very little attention, and the resulting challenges stem largely from poor prediction. This research proposes a deep learning modified neural network (DLMNN) with the Binary Spotted Hyena Optimizer (BSHO) for modeling and calculation to address this challenge. Efforts toward resource recovery, water reuse, and energy recovery all partially attain this objective. In contrast to previous modeling techniques, the DLMNN-BSHO model demonstrated outstanding accuracy during training and validation, as shown by its high coefficient of determination (R²) for both the training and testing stages. Recent developments and problems with nanomaterials made from sustainable carbon and graphene quantum dots, as well as their uses in the treatment and purification of wastewater, are also covered. The proposed DLMNN-BSHO model achieved 95.936% precision, 95.326% recall, 93.747% F-score, and 99.637% accuracy.
HIGHLIGHTS
Carbon neutrality in water treatment has received increasing attention.
A deep learning modified neural network (DLMNN) with the Binary Spotted Hyena Optimizer (BSHO) was used for modeling.
The proposed model DLMNN-BSHO achieved 95.936% precision, 95.326% recall, 93.747% F-score, and 99.637% accuracy.
INTRODUCTION
Intelligent models have developed to the point where they can represent complex processes in water treatment simulations, although it remains challenging to evaluate and forecast the behavior of complex ecological systems (Amoueyan 2017). Numerous factors influence environmental impacts, and their interactions can be complicated, making evaluation challenging for the environmental engineers who study them. It is difficult and demanding to operate industrial wastewater treatment systems with changing effluent quality and quantity (Elgallal et al. 2016). Industrial wastewater must nevertheless be treated despite these barriers.
Wastewater treatment is one environmental industry that has benefited from artificial intelligence (AI) techniques such as artificial neural networks (ANNs). Treatment of water is a complicated process. Thanks to recent advancements in intelligent approaches, they can now be used in complex modeling systems (Liu & Chung 2014). They can enhance performance prediction because of their accuracy, dependability, and suitability for engineering applications. A wastewater treatment facility's effectiveness is affected by various variables: biological oxygen demand (BOD), total suspended solids (TSS), and chemical oxygen demand (COD). Recent wastewater treatment plant (WWTP) assessments used these characteristics as model inputs (Courault et al. 2017).
Pollutants are removed from wastewater at WWTPs by the activated sludge process (ASP) (Nasrollahzadeh et al. 2020a). Nitrogen, phosphorus, and biochemical compounds are examples of contaminants. A nitrogen-based carbon-doped ammonia sensor is used to predict wastewater quality from total suspended solids (TSS), T-N, T-P, and biological oxygen demand (BOD). These measurements are combined with mathematical and physical models to improve wastewater treatment under varying weather conditions (Khan et al. 2020a), and deep learning or machine learning must be used to process them as indicators of influent wastewater (Pakzad et al. 2019). IoT devices can deliver data at a much higher level of detail. For the energy usage of a building, for instance, IoT devices can collect data at the level of individual rooms or pieces of equipment rather than depending on aggregated data. This granularity enables better insight into energy usage trends and areas for improvement or optimization. IoT devices can also monitor the functionality and state of infrastructure and machinery: by analyzing real-time data, forecasting models can determine the need for maintenance before an equipment failure. With this proactive strategy, energy waste from broken equipment is minimized, maintenance schedules are optimized, and the carbon impact of reactive repairs or replacements is decreased.
Organic contaminants and pharmaceutically active substances can be eliminated using nanomaterials through a variety of procedures, including adsorption, photocatalysis, advanced oxidation processes (AOPs), filtration, and others (Cai et al. 2018). Examples of emerging contaminants in wastewater discharge include pesticides, textile dyes, plasticizers, disinfection byproducts, PCBs, PAHs, PFOA and PFOS, endocrine-disrupting substances, pharmaceuticals, and personal care products. When used as adsorbents, carbonaceous nanoparticles can remove PFASs from water; adsorption of PFOS and PFOA involves ligand exchange and hydrophobic, electrostatic, and hydrogen-bonding interactions. Large-surface-area nanomaterials with high reactivities (Zhang et al. 2019) show great promise for eliminating these pollutants.
Developing ‘greener’ and more environmentally friendly procedures has received considerable attention; examples include using fewer dangerous solvents and fewer chemical reagents, precursors, and catalysts. Green chemistry and nanotechnology can address the challenges posed by emerging and significant contaminants and microorganisms, and new, suitable methods are needed to destroy hazardous contaminants and pollutants safely. The environmental sustainability of processes that lead to negative externalities, such as water treatment using hydrogen and renewable energy, can be improved by green nanotechnology. Reduced costs and increased efficacy in wastewater treatment are possible thanks to nanoscale filtration, pollutant adsorption on nanoparticles (NPs), and contaminant breakdown by nanocatalysts (Sajjadi et al. 2020). Both developing and developed countries need more potable fresh water because micropollutants contaminate water sources (Ali et al. 2017; Das et al. 2018), and current decontamination techniques (chlorination and ozonation) produce toxic byproducts and use chemicals excessively (Westerhoff et al. 2016).
Due to their distinctive advantages in selectivity, durability, and recyclability, and despite possible health hazards and higher manufacturing costs, nanomaterials are attractive alternatives to traditional wastewater treatment methods (Lu & Astruc 2020). More research is needed on nanomaterials' environmental effects, toxicity, removal kinetics, simulation, and the behavior of dangerous pollutants. Drugs, endocrine disruptors, pesticides, toxic organic dyes, personal care items, detergents, and other novel and resistant contaminants have been the primary focus of research (Mukherjee et al. 2020).
This research introduces a deep learning modified neural network (DLMNN) with the Binary Spotted Hyena Optimizer (BSHO) to model and compute solutions to this issue. The methods utilized within and outside wastewater treatment facilities to treat wastewater carbon-neutrally are then discussed. Efforts toward resource recovery, water reuse, and energy recovery all partly meet this goal. In contrast to previous modeling techniques, the recommended DLMNN-BSHO model demonstrated outstanding accuracy during training and validation, as evidenced by its high coefficient of determination (R²) for both the training and testing stages. Recent advancements and issues with nanomaterials made from sustainable carbon and graphene quantum dots, and their uses in the purification and treatment of wastewater, are also discussed.
PROPOSED SYSTEM
Multiple boundaries for carbon accounting of the wastewater system
(Figure: a sketch of multiple boundaries for wastewater treatment carbon accounting.)
Three boundaries are considered: the ecological system, human society, and the WWTP itself. Flows entering a boundary are treated as inputs for most of the facilities within Boundaries 1 and 2, because these resources are needed for operation. Indicators represent particular carbon emissions: consumption of energy, chemicals, and recycled product flows implies indirect carbon emissions, which can affect the overall carbon balance either positively or negatively.
The water line is the most critical component in inventories of carbon emissions produced by wastewater treatment facilities. This line covers the entire process, beginning with the intake of incoming sewage and ending with the effluent discharge after numerous physical, chemical, and biological treatment procedures, and the inventories monitor the system's CO2 emissions. Depending on regional effluent targets and regulations, further treatment may be optional: secondary biological treatment effluent may be released or applied to land as irrigation (Mainardis et al. 2022), while a tertiary treatment step, such as adding a membrane reactor, is necessary to reuse effluent in industrial production or as reclaimed water (Perumal et al. 2022).
Sludge, a crucial by-product of wastewater treatment, tends to accumulate contaminants. When assessing carbon emissions in the context of wastewater treatment, the importance of successfully treating and managing sludge should therefore be taken into consideration (Geetha et al. 2022). After thickening, sludge can be disposed of in several ways (Arias et al. 2021): conditioning and dewatering can decrease sludge volume, while anaerobic digestion (AD), composting, and pyrolysis are techniques that can recover the energy and resources in sludge. Residual sludge is disposed of in landfills, incinerators, or on farmland. The first circle represents CO2 emissions from WWTP sludge treatment and accounts for the pollution caused by sludge treatment and disposal. To achieve energy independence at the WWTP, the first circle considers the water line, the sludge line, and a variety of other energy conservation and usage strategies; solar panels, wind turbines, heat pumps, and other technologies have been installed.
WWTP infrastructure expansion to urban infrastructure
Water must be collected and routed through the sewer system before being processed in WWTPs; the existing sewer network and the WWTPs together form an urban wastewater system. Municipal facilities incorporate methodical management practices as more functions are added to sewage treatment. Co-digesting municipal waste, especially food waste, with sludge in WWTPs boosts biogas generation and decreases the amount of sludge transferred to landfills or incinerators. Some tertiary-treatment WWTPs also produce reclaimed water for municipal or industrial applications, and such facilities are found everywhere. Carbon-neutral wastewater treatment therefore affects the entire water and wastewater system and the wider urban infrastructure, reaching beyond the treatment plant itself.
Further expansion to human society and ecological systems
The third circle includes human society, water, soil, vegetation, and related ecological systems. Water bodies receive treated wastewater in addition to serving as water sources. The distribution of resources recovered during wastewater treatment may lower the demand for comparable industrial goods; agriculture and human society both share these resources. Sewage treatment may also vary depending on whether a region is urban or rural: decentralized treatment systems are used in rural areas because they are more environmentally friendly there than the centralized systems used in densely populated urban areas. Although both centralized and decentralized models have advantages, resource recovery through decentralization is increasingly popular.
Anomaly detection and sensor calibration
When a sensor malfunctions or the data contain irregularities, anomaly detection and sensor calibration methods should be applied. These methods can identify and correct erroneous sensor readings, making the forecasting model more accurate; performance improves when anomalous sensor data are detected and then removed or corrected, as illustrated in the sketch below.
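As a minimal illustration (not part of the original pipeline), the sketch below flags outlying readings with a rolling-median rule and applies a simple linear recalibration before the data reach the forecaster; the window size, threshold, and calibration constants are illustrative assumptions.

```python
import numpy as np

def detect_anomalies(readings, window=12, z_thresh=3.5):
    """Flag readings that deviate strongly from a rolling median.

    `window` and `z_thresh` are illustrative values, not taken from the paper.
    """
    readings = np.asarray(readings, dtype=float)
    flags = np.zeros(len(readings), dtype=bool)
    for i in range(len(readings)):
        local = readings[max(0, i - window):i + 1]
        med = np.median(local)
        mad = np.median(np.abs(local - med)) or 1e-9   # avoid division by zero
        # modified z-score based on the median absolute deviation
        flags[i] = abs(0.6745 * (readings[i] - med) / mad) > z_thresh
    return flags

def recalibrate(readings, slope=1.0, offset=0.0):
    """Two-point linear correction: corrected = slope * raw + offset."""
    return slope * np.asarray(readings, dtype=float) + offset

# Example: clean a short COD sensor trace before it is fed to the forecaster
raw = [410, 415, 409, 2500, 412, 418]          # 2500 is a spurious spike
mask = detect_anomalies(raw)
cleaned = recalibrate(np.where(mask, np.nan, raw))
```

In practice, the flagged values can be dropped, interpolated, or replaced before the forecasting model is trained.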
Robust system design
Although DLMNN-BSHO focuses primarily on the prediction model, a robust system design can help prevent outages. Backup and fallback systems can be set up to ensure the forecasting system keeps working despite short-term problems, for example through backup power sources, redundant data storage, and fault-tolerant infrastructure.
Classification using DLMNN classifier
In the DLMNN, an optimization algorithm tunes the weight values of the deep learning neural network (DLNN), improving on the backpropagation (BP) training needed to achieve the desired result; the network is therefore referred to as a deep learning modified neural network (DLMNN). DLMNN layers use a hidden activation layer, and the outputs of these layers are passed on to the next layer; they have a significant impact on the classifier's output. The following steps are involved in DLMNN classification:
- The weighted sum of the pre-processed inputs is first computed as $Pd_a = \sum_{i=1}^{n} in_i\, w_a$, where $w_1, w_2, w_3, \ldots$ denote the connection weights and $in_i$ the $n$ values of pre-processed data.
- The activation of each neuron is then taken as the exponential of this sum, $FA_a = e^{Pd_a}$. Other activation functions, such as the sigmoid or hyperbolic tangent, can also be used with the proposed DLMNN.
- For each DLMNN layer, repeat the preceding steps. By combining the weights of all the input signals, the value of the output-layer neurons is finally estimated as $V_a = \sum_j O_a\, w_j$, where $O_a$ indicates the value of the layer preceding the output layer, $w_j$ defines the hidden-layer weights, and $V_a$ signifies the output component in question.
- The error signal is computed as $E_s = T_i - V_a$, where $T_i$ specifies the desired output and $E_s$ indicates the error signal.
- The weights are then adjusted by the weight correction $WCA$, where $\lambda$ is the momentum term and $\delta_a$ is the error distributed throughout the network.
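To make these steps concrete, the following sketch implements a single forward pass with the exponential activation and a momentum-style weight correction; the layer sizes, learning rate, and momentum value are illustrative assumptions and not the authors' exact configuration.

```python
import numpy as np

def forward(in_vec, w_hidden, w_out):
    """One forward pass: weighted sum Pd_a, exponential activation FA_a, output V_a."""
    pd = in_vec @ w_hidden                     # Pd_a = sum_i in_i * w_i
    fa = np.exp(pd)                            # FA_a = exp(Pd_a); sigmoid/tanh also possible
    v = fa @ w_out                             # V_a = sum_j O_a * w_j
    return pd, fa, v

def momentum_update(w_out, fa, v, target, lam=0.9, lr=0.01, prev_corr=0.0):
    """Error signal E_s = T_i - V_a and a momentum-style weight correction WCA."""
    es = target - v                            # E_s
    wca = lr * es * fa + lam * prev_corr       # WCA with momentum term lambda
    return w_out + wca, wca

# Toy example with 4 pre-processed influent indicators (e.g. COD, BOD, TSS, T-N)
rng = np.random.default_rng(0)
x = rng.random(4)
w_h = rng.normal(scale=0.1, size=(4, 3))
w_o = rng.normal(scale=0.1, size=3)
_, fa, v = forward(x, w_h, w_o)
w_o, _ = momentum_update(w_o, fa, v, target=1.0)
```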
Algorithm 1 is mentioned below:
Algorithm 1: DLMNN
for i = 1 to n do
    Use the BSHO algorithm to obtain the connection vector V for neuron i.
    Use the BSHO algorithm to obtain the synaptic weights sw for neuron i.
    Use the BSHO algorithm to determine the bias bf of neuron i.
    Obtain the transfer function index tf of neuron i, which defines its individual connections.
end for
for k = n + 1 to DLMNN do
    Compute the output of neuron k.
end for
Compile the output and perform weight optimization with the DLMNN algorithm.
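The fragment below sketches how Algorithm 1 can be realised in code: each neuron's connection vector V, synaptic weights sw, bias bf, and transfer-function index tf are taken from the optimizer, and the neuron output is then computed. The parameter dictionaries and the set of candidate transfer functions are hypothetical, introduced only for illustration.

```python
import numpy as np

# Candidate transfer functions indexed by tf (an illustrative set)
TRANSFER = [np.tanh, lambda z: 1.0 / (1.0 + np.exp(-z)), np.exp]

def neuron_output(x, conn, sw, bf, tf):
    """Output of one neuron given connection vector, weights, bias, and transfer index."""
    z = np.dot(x * conn, sw) + bf              # only connected inputs contribute
    return TRANSFER[tf](z)

def network_output(x, params):
    """Evaluate a DLMNN whose per-neuron parameters were produced by BSHO."""
    return np.array([
        neuron_output(x, p["conn"], p["sw"], p["bf"], p["tf"])
        for p in params                        # one entry per neuron (Algorithm 1 loop)
    ])

# Hypothetical parameters for two neurons, as BSHO might return them
params = [
    {"conn": np.array([1, 0, 1]), "sw": np.array([0.4, 0.0, -0.2]), "bf": 0.1, "tf": 0},
    {"conn": np.array([1, 1, 1]), "sw": np.array([0.3, 0.5, 0.1]), "bf": -0.2, "tf": 1},
]
print(network_output(np.array([0.8, 0.1, 0.5]), params))
```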
The binary spotted hyena optimizer
The BSHO is adopted in this study. It simulates a discrete binary search space using the hunting behavior of spotted hyenas, which are renowned for hunting cooperatively: they locate their prey, then encircle and attack it. The BSHO visualizes the search space as a hypercube, restricting search-agent movement to the cube's corners; since each solution is binary, it can only take the values 0 or 1. Additionally, spotted hyenas are continually moving around, following the position of the best search agent as fresh information becomes available. The spotted hyenas' positions are dynamically adjusted using the hyperbolic tangent function, and the updated positions computed with this function lie between '0' and '1'.
Spotted hyena optimizer
The encircling behaviour of the spotted hyenas is modelled as
$$\vec{D}_h = \bigl|\vec{B}\cdot \vec{P}_p(x) - \vec{P}(x)\bigr|, \qquad \vec{P}(x+1) = \vec{P}_p(x) - \vec{E}\cdot \vec{D}_h,$$
where $\vec{P}_p$ is the position of the prey (the best search agent), $\vec{P}$ is the position of a spotted hyena, and the coefficient vectors are
$$\vec{B} = 2\,\vec{rd}_1, \qquad \vec{E} = 2\vec{h}\cdot \vec{rd}_2 - \vec{h}.$$
In this case, $\vec{h}$ is lowered linearly over the iterations from 5 to 0, which keeps the balance between exploration and exploitation, and $\vec{rd}_1$ and $\vec{rd}_2$ are random vectors in [0, 1]. To give spotted hyenas access to more areas close to where they currently are, the values of $\vec{B}$ and $\vec{E}$ are modified. Spotted hyenas update their positions in a haphazard pattern around the prey using Equations (11) and (12),
$$\vec{C}_h = \vec{P}_k + \vec{P}_{k+1} + \cdots + \vec{P}_{k+N}, \qquad \vec{P}(x+1) = \frac{\vec{C}_h}{N},$$
where $N$ is the number of spotted hyenas in the cluster and, depending on the position of the top search agent, all other search agents' positions are modified. Hyenas can use BSHO to update their positions and attack their prey.
The original SHO was created for continuous optimization problems and cannot directly handle discrete ones; the binary SHO (BSHO) addresses this issue. Since variables can only be 0 or 1, BSHO substitutes binary encoding for SHO's float-encoding scheme (Wei et al. 2020). The spotted hyena's position-updating mechanism can make local binary searches more effective. The hyperbolic tangent function is used in BSHO to track spotted hyenas using only '0' and '1' states, and this choice of function restricts each dimension of the search space to the range between 0 and 1. In contrast to previous binary metaheuristics, BSHO uses a cluster-formation technique, which is reflected in Algorithm 2 below.
Algorithm 2: Binary Spotted Hyena Optimization
Input: the spotted hyena population
Output: the best spotted hyena
1. Initialize the population by randomly seeding n hyenas.
2. Evaluate the fitness of each search agent.
3. while (p < Max iterations) do
4.     for each spotted hyena do
5.         Adjust the search agent's position according to Equation (16).
6.     end for
7.     Update the control parameters U, T, h, and N.
8.     Evaluate the fitness of each spotted hyena.
9.     If the new solution is superior to the old one, update the best spotted hyena.
10.    Update the cluster with respect to the best spotted hyena.
11.    p = p + 1
end while
Let RAND stand for a uniformly distributed random number in [0, 1], and let the spotted hyena's position be represented by the binary code $x_d^{\,s}$, where $d$ stands for the dimension and $s$ for the iteration. A dimension is set to $x_d^{\,s+1} = 1$ when the hyperbolic tangent of the continuous position update exceeds RAND, and to 0 otherwise.
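A compact sketch of this binary update inside a simplified BSHO loop is given below; the fitness function, population size, iteration count, and the omission of the cluster-averaging step are simplifying assumptions made for illustration only.

```python
import numpy as np

def binarize(x_cont, rng):
    """tanh-based transfer: set a dimension to 1 where |tanh(x)| exceeds a uniform RAND draw."""
    return (np.abs(np.tanh(x_cont)) > rng.random(x_cont.shape)).astype(int)

def bsho(fitness, dim, n_hyenas=20, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.integers(0, 2, size=(n_hyenas, dim))        # binary positions
    best = pos[np.argmin([fitness(p) for p in pos])].copy()
    for it in range(max_iter):
        h = 5.0 - it * (5.0 / max_iter)                    # h decreases linearly from 5 to 0
        for i in range(n_hyenas):
            b = 2.0 * rng.random(dim)                      # coefficient vector B
            e = 2.0 * h * rng.random(dim) - h              # coefficient vector E
            d_h = np.abs(b * best - pos[i])                # distance to the best hyena
            cont = best - e * d_h                          # continuous SHO update
            pos[i] = binarize(cont, rng)                   # map back to {0, 1}
            if fitness(pos[i]) < fitness(best):
                best = pos[i].copy()
    return best

# Toy usage: find a binary vector that minimises a dummy cost
best_mask = bsho(lambda p: np.sum((p - np.array([1, 0, 1, 1, 0])) ** 2), dim=5)
```

In the present context, such a binary vector could, for example, act as a mask selecting which DLMNN connections and weights are active.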
RESULT AND DISCUSSION
The proposed DLMNN-BSHO fusion, with its parametric values, is compared against artificial neural network (ANN), convolutional neural network (CNN), Bayesian K-nearest neighbours (BKNN), and extreme learning machine (ELM) baselines for predicting wastewater quality; data collection and analysis were implemented in Python (version 3.1). Evaluation metrics were used to report the classification accuracy, specificity, sensitivity, precision, and F-score for water quality.
Evaluation metrics
Precision, recall, F-score, accuracy, and root mean squared error (RMSE) are standard metrics. These metrics are generally calculated from the four outcomes of a positive/negative binary classification: true positives (TP) and true negatives (TN), which represent states that were correctly identified, and false positives (FP) and false negatives (FN), which signify states that were incorrectly identified. The following are the statistical validation and evaluation parameters for our proposed wastewater treatment architecture:
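For reference, these metrics follow their standard definitions in terms of TP, TN, FP, and FN, with $y_i$ and $\hat{y}_i$ denoting the observed and predicted values for the RMSE:

```latex
\mathrm{Precision} = \frac{TP}{TP+FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP+FN}, \qquad
F\text{-}\mathrm{score} = \frac{2\,\mathrm{Precision}\cdot\mathrm{Recall}}{\mathrm{Precision}+\mathrm{Recall}},

\mathrm{Accuracy} = \frac{TP+TN}{TP+TN+FP+FN}, \qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(y_i-\hat{y}_i\bigr)^2}.
```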
Precision
Precision analysis for the DLMNN-BSHO method using existing systems
| Number of data from dataset | ANN | CNN | BKNN | ELM | DLMNN-BSHO |
| --- | --- | --- | --- | --- | --- |
| 100 | 88.947 | 91.647 | 86.547 | 85.103 | 95.764 |
| 150 | 89.647 | 91.537 | 87.446 | 85.537 | 94.038 |
| 200 | 90.877 | 93.847 | 87.904 | 86.836 | 94.637 |
| 250 | 92.543 | 91.974 | 88.747 | 87.746 | 95.635 |
| 300 | 90.546 | 93.748 | 88.446 | 86.544 | 95.936 |
Recall
Analysis of DLMNN-BSHO approach recall using existing systems
| Number of data from dataset | ANN | CNN | BKNN | ELM | DLMNN-BSHO |
| --- | --- | --- | --- | --- | --- |
| 100 | 83.864 | 92.437 | 89.546 | 86.084 | 93.448 |
| 150 | 84.873 | 93.764 | 90.308 | 86.844 | 94.833 |
| 200 | 83.048 | 92.837 | 89.647 | 88.647 | 94.038 |
| 250 | 85.327 | 92.984 | 90.647 | 88.225 | 95.536 |
| 300 | 84.653 | 93.747 | 90.964 | 87.226 | 95.326 |
F-score
Analysis of F-scores for the DLMNN-BSHO approach using existing systems
| Number of data from dataset | ANN | CNN | BKNN | ELM | DLMNN-BSHO |
| --- | --- | --- | --- | --- | --- |
| 100 | 79.763 | 83.408 | 85.974 | 88.947 | 92.748 |
| 150 | 80.527 | 83.826 | 85.736 | 89.064 | 91.747 |
| 200 | 80.043 | 84.863 | 86.436 | 88.646 | 92.647 |
| 250 | 79.536 | 85.227 | 86.847 | 89.546 | 91.394 |
| 300 | 81.436 | 84.947 | 87.747 | 90.547 | 93.747 |
Accuracy
Accuracy analysis for DLMNN-BSHO method with existing systems
| Number of data from dataset | ANN | CNN | BKNN | ELM | DLMNN-BSHO |
| --- | --- | --- | --- | --- | --- |
| 100 | 96.436 | 94.567 | 88.735 | 93.038 | 98.574 |
| 150 | 96.753 | 94.903 | 88.393 | 93.674 | 98.054 |
| 200 | 97.843 | 93.943 | 89.158 | 94.275 | 98.325 |
| 250 | 97.325 | 94.363 | 89.335 | 93.827 | 99.745 |
| 300 | 97.735 | 95.546 | 91.563 | 94.907 | 99.637 |
Root mean squared error
Analysis of the RMSE for the DLMNN-BSHO approach with existing systems
| Number of data from dataset | ANN | CNN | BKNN | ELM | DLMNN-BSHO |
| --- | --- | --- | --- | --- | --- |
| 100 | 54.732 | 50.536 | 46.826 | 44.432 | 42.86 |
| 150 | 54.63 | 51.73 | 47.453 | 44.738 | 42.08 |
| 200 | 55.98 | 50.93 | 47.22 | 45.07 | 43.625 |
| 250 | 55.637 | 51.532 | 46.53 | 45.972 | 43.946 |
| 300 | 56.03 | 51.82 | 48.282 | 45.62 | 42.827 |
Training and testing validation
Training and testing validation analysis for DLMNN-BSHO technique with existing systems
| Epochs | Training validation | Testing validation |
| --- | --- | --- |
| 0 | 1.36 | 1.34 |
| 10 | 1.25 | 1.23 |
| 20 | 1.06 | 1.03 |
| 30 | 0.85 | 0.83 |
| 40 | 0.72 | 0.71 |
| 50 | 0.66 | 0.64 |
| 60 | 0.53 | 0.51 |
| 70 | 0.44 | 0.40 |
| 80 | 0.24 | 0.22 |
| 90 | 0.18 | 0.16 |
| 100 | 0.15 | 0.13 |
CONCLUSIONS
According to the predictions, activated carbon will adsorb unwanted influent indicators in water treatment plants. To address this issue, this paper uses DLMNN and BSHO for modeling and calculations. The techniques used within and outside wastewater treatment plants to achieve carbon-neutral wastewater treatment are then described; all resource recovery, water reuse, and energy recovery efforts contribute to this goal. Compared with previous modeling techniques, the recommended DLMNN-BSHO model demonstrated exceptional accuracy during training and validation, as evidenced by its high coefficient of determination (R²) for both the training and testing stages. Recent developments and issues with nanomaterials made from sustainable carbon and graphene quantum dots, and how they can be used to treat and purify wastewater, are also discussed. The model was evaluated using precision, recall, F-score, RMSE, training and testing validation, and accuracy, which reached 95.936%, 95.326%, 93.747%, 42.827, 0.15, and 99.637%, respectively. ANN, CNN, BKNN, and the extreme learning machine (ELM) are the existing systems used for comparison in this paper. The work could be expanded to optimize the wastewater treatment model's performance across the full range of analysis states, with further performance enhancement through swarm intelligence.
AUTHOR CONTRIBUTIONS
L. S. S. rendered support in data curation and drafting the article. H. A. conceptualized the study and the deep learning algorithm. A. H. A. developed the methodology, rendered support in formal analysis, and reviewed and edited the draft. V. R. A. developed the methodology and rendered support in formal analysis. All authors have read and agreed to this version of the manuscript.
DATA AVAILABILITY STATEMENT
All relevant data are included in the paper or its Supplementary Information.
CONFLICT OF INTEREST
The authors declare there is no conflict.