Models whose parameters were optimized by a genetic algorithm (GA) were developed to predict the longitudinal dispersion coefficient in natural channels. Based on existing equations in the literature, ten different linear and nonlinear models relating the dispersion coefficient to flow and channel characteristics were first constructed. The GA was then employed to find the optimal values of the parameters of each constructed model by minimizing the mean absolute error (the objective function). The GA used an 80% cross-over rate and a 4% mutation rate, and started each computation with a population of 100 chromosomes in the gene pool. While minimizing the objective function, the model parameters were constrained to the interval [−10, +10] at each iteration. The optimal parameter values were obtained using a calibration set of 54 of the 80 sets of measured data. The minimum error was obtained for the model consisting of a linear equation relating the dispersion coefficient to the flow discharge. This model was then tested satisfactorily against the remaining 26 measured validation data sets, performing better than the existing equations: it yielded minimum errors of MAE = 21.4 m²/s (mean absolute error) and RMSE = 28.5 m²/s (root mean square error) and a maximum accuracy rate of 81%.
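The optimization procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the data are synthetic (the paper's 80 field measurements are not reproduced here), the candidate model is the linear form D = a·Q + b singled out in the abstract, and the GA operators (tournament selection, arithmetic crossover, Gaussian mutation) are common choices assumed for illustration. Only the population size (100), cross-over rate (80%), mutation rate (4%), parameter bounds [−10, +10], and MAE objective come from the text.

```python
import random

random.seed(0)

# Hypothetical synthetic calibration data: discharge Q (m^3/s) vs.
# dispersion coefficient D (m^2/s), generated from a known linear
# relation to stand in for the paper's 54 calibration measurements.
TRUE_A, TRUE_B = 3.2, 5.0
DATA = [(q, TRUE_A * q + TRUE_B) for q in range(1, 55)]

POP_SIZE = 100          # 100 chromosomes in the gene pool (from the text)
CROSSOVER_RATE = 0.8    # 80% cross-over rate (from the text)
MUTATION_RATE = 0.04    # 4% mutation rate (from the text)
BOUNDS = (-10.0, 10.0)  # parameter constraint at each iteration (from the text)

def mae(params):
    """Objective function: mean absolute error of D = a*Q + b."""
    a, b = params
    return sum(abs(a * q + b - d) for q, d in DATA) / len(DATA)

def clamp(x):
    return max(BOUNDS[0], min(BOUNDS[1], x))

def random_chromosome():
    return [random.uniform(*BOUNDS) for _ in range(2)]  # genes: [a, b]

def select(pop):
    """Tournament selection: the fitter of two random chromosomes."""
    c1, c2 = random.sample(pop, 2)
    return c1 if mae(c1) < mae(c2) else c2

def crossover(p1, p2):
    """Arithmetic (blend) crossover, applied with probability 0.8."""
    if random.random() < CROSSOVER_RATE:
        w = random.random()
        return [clamp(w * x + (1 - w) * y) for x, y in zip(p1, p2)]
    return list(p1)

def mutate(chrom):
    """Gaussian perturbation of each gene with probability 0.04."""
    return [clamp(g + random.gauss(0.0, 1.0))
            if random.random() < MUTATION_RATE else g
            for g in chrom]

def run_ga(generations=200):
    pop = [random_chromosome() for _ in range(POP_SIZE)]
    best = min(pop, key=mae)
    for _ in range(generations):
        pop = [mutate(crossover(select(pop), select(pop)))
               for _ in range(POP_SIZE)]
        pop[0] = best                 # elitism: keep the best chromosome
        best = min(pop, key=mae)
    return best

best = run_ga()
print("best (a, b):", best, "MAE:", mae(best))
```

In the paper the same search is repeated for each of the ten candidate model forms, and the form with the lowest calibrated MAE (the linear discharge model) is then evaluated on the 26 held-out validation data sets.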
