To overcome the limits of classical jar tests in water treatment plants and offer substantial savings of time and money for operators, an artificial neural network technique is applied in this study to large databases from three treatment plants with different processes in order to build models that predict the optimal coagulant dose. Pre-modeling techniques, such as data scaling and careful choice of the training database, are used to guarantee models with the lowest errors. Two models are then selected, with the turbidity, conductivity, and pH of both raw and treated water as inputs. The first model, L45-MOD, is specific to raw water with turbidity below 45.5 NTU; otherwise the second model, ATP-MOD, is adopted. Compared to the actually injected coagulant doses and to previous models, the selected models perform well when tested on various databases: a correlation coefficient higher than 0.8, and a mean absolute error of 5.47 g/m³ for the first model and 5.69 g/m³ for the second. The strength of this study is that the models can be extrapolated and easily adopted by other treatment plants, whatever the process used.
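The two-model switching rule described above can be sketched as follows. This is a minimal illustration only: the dose functions are hypothetical placeholders standing in for the trained ANN models, and only the 45.5 NTU threshold, the model names, and the input variables (turbidity, conductivity, pH) come from the study.

```python
TURBIDITY_THRESHOLD_NTU = 45.5  # raw-water turbidity cutoff from the study


def l45_mod(turbidity, conductivity, ph):
    """Placeholder for L45-MOD (raw water below 45.5 NTU).

    Illustrative linear stand-in; the real model is a trained ANN.
    """
    return 0.4 * turbidity + 0.01 * conductivity + 1.5 * ph


def atp_mod(turbidity, conductivity, ph):
    """Placeholder for ATP-MOD (higher-turbidity raw water).

    Illustrative linear stand-in; the real model is a trained ANN.
    """
    return 0.25 * turbidity + 0.02 * conductivity + 2.0 * ph


def predict_coagulant_dose(turbidity, conductivity, ph):
    """Route a water sample to the model matching its raw-water turbidity,
    returning a coagulant dose in g/m^3."""
    if turbidity < TURBIDITY_THRESHOLD_NTU:
        return l45_mod(turbidity, conductivity, ph)
    return atp_mod(turbidity, conductivity, ph)
```

In practice each placeholder would be replaced by the corresponding trained network, but the routing logic by raw-water turbidity is the same.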