Adaptive water management solutions such as controlled drainage have attracted interest in the Nordic areas due to climate variability. It is not fully known how controlled drainage affects the seasonal field water balance or whether it can help prevent water scarcity during dry growing seasons (GSs). The objective was to simulate the effects of controlled drainage on field hydrology using a well-tested, process-based hydrological model. The FLUSH model was calibrated and validated for an experimental field. The model performance with non-local input data was moderate but acceptable for running the controlled drainage scenarios to test how the water management method responds to meteorological forcing. Simulation results showed that controlled drainage reduced drain discharge while increasing surface layer runoff and shallow groundwater outflow. Groundwater depths from the scenario simulations demonstrated that controlled drainage could keep the groundwater closer to the soil surface, but the effect diminished during dry conditions. Controlled drainage can be used to change the water flow pathways but has a secondary effect compared with the primary meteorological drivers. The field data set and FLUSH formed a novel computational platform for studying the impacts of different water management options on the whole water balance and the spatial variability of groundwater depths.
Controlled drainage affected all water flow pathways in the field.
During drier growing periods, the groundwater level dropped below the control level.
There was a risk of high drain discharge volume after control was released.
Meteorological conditions affected the hydrology more than control options.
In the Nordic-Baltic region, climate change is projected to increase precipitation outside the growing season as well as extreme events throughout the year (Øygarden et al. 2014), making it more demanding to design optimal water management systems for agricultural areas. Varying climate conditions in the Nordic areas have raised interest in adaptive water management methods that enable productive cultivation (e.g. Øygarden et al. 2014). In Finland, agricultural water management has focused on field drainage, but recent growing seasons have shown the need for water-saving methods such as controlled drainage. The idea of controlled drainage is to decrease the drainage depth from the soil surface when there is a need to save water, and also to reduce nutrient loads from the field to surface waters (e.g. Carstensen et al. 2016).
Based on the literature, the effects of controlled drainage are ambiguous. The water protection benefits of controlled drainage are mainly based on reducing subsurface drain discharge (e.g. Carstensen et al. 2016; Sunohara et al. 2016). Studies in different countries have reported variable reduction potentials of drain discharge (Table 1). Investigating the effects of controlled drainage on the whole field water balance can explain the mechanisms behind these differences and demonstrate how controlled drainage can be used efficiently for reducing nutrient loads to surface waters. For example, Wesström & Messing (2007) reported that controlled drainage reduced wintertime drain outflow by 60–95% compared with conventional subsurface drainage. During their 4-year monitoring period, they did not observe any surface runoff or monitor shallow groundwater outflow, but noticed that groundwater levels fluctuated between 20 and 100 cm below the soil surface in the controlled drainage plots during the winter months. The study did not report how precipitation was divided into water balance components. Rozemeijer et al. (2016) were among the first to consider the effects of controlled drainage on water flow routes other than solely drain discharge. They pointed out uncertainties in reducing nitrogen (N) and phosphorus (P) transport to surface waters when applying controlled drainage, because water flow through other pathways (groundwater outflow or surface runoff) increased at the same time.
Monitoring drain discharge alone may not reveal the hydrological connections or flow processes occurring in the field and affecting the functioning of controlled drainage. Österholm et al. (2015) reported higher drain discharge for controlled drainage compared with a conventional subsurface drainage plot. Another study at the same site hypothesized that there might be hydrological connections between the plots and inflow from the conventional plot to the controlled drainage plot, which resulted in greater N leaching from controlled drainage than from conventional drainage (Yli-Halla et al. 2020).
Multiple factors (control level, timing, and length of control) affect the functioning of controlled drainage, especially its drain discharge reduction potential, which is almost directly linked to reducing nutrient loads. Carstensen et al. (2016) noticed that the control level had a significant effect on the reduction of drain discharge: a lower control level produced only a 6–12% reduction, but a higher control level a 36–52% reduction. Wesström & Messing (2007) showed a 60–95% reduction of drain discharge when the control level was 20–40 cm below the soil surface. Saadat et al. (2018) noticed that the controlled drainage field plots produced more drain discharge than the conventional subsurface drainage plots during free drainage periods (i.e. inactive regulation), meaning that, overall, the conventional plots did not yield more discharge than the controlled plots.
The effects of controlled drainage on growing season groundwater depths have been investigated less than the discharge impacts. Wesström & Messing (2007) noticed that the groundwater level was not maintained at the control depth, especially during the growing season when the level was mostly at the drain depth (around 1.0 m). Growing season groundwater levels are most likely dominated by crop water uptake.
There have been multiple experimental field campaigns to study the effects of controlled drainage, but challenges arise when aiming to isolate the effects of controlled drainage from other factors such as meteorological conditions (e.g. Rozemeijer et al. 2016) or hydrological connections (Saadat et al. 2018). Mathematical models provide the means to test different water management scenarios against the same input data and to rule out other factors affecting the results. However, without verifying model performance against field data, the model outputs remain uncertain. By combining a field data set with a mathematical model, it is possible to create a computational platform for studying different water management scenarios in the present (e.g. Häggblom et al. 2019) and future climate (e.g. Salla et al. 2021). Scenario simulations should be conducted with data-based model applications to produce reliable estimates of the studied management method.
The FLUSH model (Warsta et al. 2013) has been developed and applied for research purposes in Nordic conditions (e.g. Turunen et al. 2013; Nousiainen et al. 2015; Häggblom et al. 2019), and it has been shown to simulate water flow processes and close the water balance in cultivated, subsurface drained fields. The model has been benchmarked at multiple research sites and has proven to be a powerful tool for understanding field hydrological processes and gaining new insights into processes that are difficult to measure or monitor (e.g. groundwater outflow and connectivity with the surrounding areas). Previous FLUSH applications have focused on subsurface drainage but have not considered regulating drain discharge and the groundwater table with a controlled drainage description.
To better understand the effects of controlled drainage on field hydrology and water flow pathways, a model application was implemented for an experimental field in central Finland. The FLUSH model was used to simulate field hydrology with and without controlled drainage. The aim was to quantify how controlled drainage affects the field water balance and groundwater storage under varying hydrological conditions, including wet and dry years. A specific goal was to assess the performance of controlled drainage across the seasons at the Nordic study site to understand how regulation periods affect field water fluxes. FLUSH was chosen for the modeling to form a holistic picture of the water balance owing to its flexibility, its inclusion of preferential flow and wintertime processes, and its previous applications in agricultural fields.
MATERIALS AND METHODS
Sievi experimental field
The experimental field is located in Sievi, northern Ostrobothnia. The field is flat (mean slope <0.2%) and surrounded by similar fields. The field was subsurface drained in June 2015 with two subsurface drainage installation methods and has been monitored since then. Originally, the field was divided into eight study plots, and its total area is 2.3 ha. Three subsurface drains were installed in each plot (15 m drain spacing); four plots were drained by the trenchless method and four by the trencher method (see details in Äijö et al. (2017) and Salo et al. (2019)). The drainage depth was 1.0 m. In 2019, part of the field (four original plots) was equipped with controlled drainage, and the other half (four plots) was left under conventional subsurface drainage (Figure 1(b)).
Before the controlled drainage experiment in 2019, drain discharge was recorded at a 10-min interval separately from the trencher and trenchless plots. Starting from summer 2019, drain discharge has been recorded separately from the controlled drainage area and the conventional drainage area. Depths to the groundwater level were observed manually (approximately twice a week) in each original study plot. Each original plot had seven observation tubes (PEH, polyethylene, 50 mm) at varying distances (0.2–7.5 m) from the subsurface drain (Äijö et al. 2017). The tubes were installed to a depth of 2.5 m and were perforated along the bottom 1.5 m.
The soil type varies from loam to loamy sand and is coarser in the middle part of the field (Figure 1(c)). Soil samples were collected in 2015 (16 locations and two depths) and again in 2019 (six locations and three depths). In 2019, undisturbed soil cores were taken at three locations from the topsoil (20–25 cm) and plow pan layer (35–40 cm) and at six locations from the subsoil (95–100 cm). Soil structure (Äijö et al. 2017) and water retention curves (Äijö et al. 2021) were analyzed from the 2015 and 2019 samples, respectively. During the 2015–2018 growing seasons, barley was grown in the field, and in 2019, the crop was autumn rye that was sown after the 2018 harvest.
Hourly meteorological data were gathered from the observation stations of the Finnish Meteorological Institute (Figure 1(a)). Precipitation (mm), air temperature (°C), wind speed (m/s), and relative humidity (%) were available from Ylivieska airfield (24 km from Sievi). Global radiation (W/m2) for calculating potential evapotranspiration was taken from Jyväskylä airfield (200 km from Sievi). Potential evapotranspiration was calculated with the Penman–Monteith equation based on the method of Allen et al. (1998). Missing meteorological values of the Ylivieska airfield station were replaced by data from the Pyhäjärvi Ojakylä meteorological station (60 km from the field).
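The Allen et al. (1998) FAO-56 Penman–Monteith calculation mentioned above can be sketched as follows. This is a minimal, daily-step illustration rather than the exact formulation used in the study; the default atmospheric pressure and zero ground heat flux are assumptions for the example.

```python
import math

def fao56_reference_et(t_mean, wind_2m, rel_humidity, net_radiation,
                       soil_heat_flux=0.0, pressure=101.3):
    """Daily FAO-56 Penman-Monteith reference ET (mm/day).

    t_mean         mean air temperature (degC)
    wind_2m        wind speed at 2 m (m/s)
    rel_humidity   relative humidity (%)
    net_radiation  net radiation (MJ m-2 day-1)
    soil_heat_flux ground heat flux (MJ m-2 day-1), ~0 at a daily step
    pressure       atmospheric pressure (kPa), assumed near sea level here
    """
    # Saturation vapour pressure (kPa) and the slope of its curve (kPa/degC)
    e_sat = 0.6108 * math.exp(17.27 * t_mean / (t_mean + 237.3))
    delta = 4098.0 * e_sat / (t_mean + 237.3) ** 2
    # Psychrometric constant (kPa/degC) and actual vapour pressure (kPa)
    gamma = 0.665e-3 * pressure
    e_act = rel_humidity / 100.0 * e_sat
    num = (0.408 * delta * (net_radiation - soil_heat_flux)
           + gamma * 900.0 / (t_mean + 273.0) * wind_2m * (e_sat - e_act))
    den = delta + gamma * (1.0 + 0.34 * wind_2m)
    return num / den
```

For a moderate summer day (20 °C, 2 m/s wind, 60% humidity, 15 MJ m⁻² day⁻¹ net radiation) this yields roughly 5 mm/day, a plausible magnitude for the growing season conditions described here.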
The FLUSH model (Warsta et al. 2013) is an open-source hydrological model that simulates water flow processes in two-dimensional (2D) surface and three-dimensional (3D) subsurface domains. The subsurface domain is based on the dual-permeability approach in which water flows slowly in soil matrix and fast in soil macropores.
Precipitation is first stored at the depression storage of the surface domain where it can infiltrate into the subsurface domain, either into the matrix or the macropore system. If the storage capacity of the surface domain and infiltration capacity of the subsurface domain are exceeded, overland runoff occurs. Water can be lost from the surface domain by infiltration, evaporation, and runoff to open ditches. In the subsurface domain, water flows in the matrix and macropore systems as well as between the two systems. Water can be lost from the subsurface domain via evapotranspiration, surface layer runoff to open ditches, discharge to subsurface drains, and groundwater outflow from beneath the groundwater level.
Water flow in both pore systems is calculated with the Richards (1931) equation. Water exchange between the pore systems occurs from higher pressure to lower pressure, and the flux is calculated according to Gerke & van Genuchten (1993). Unsaturated hydraulic conductivity is calculated with the van Genuchten (1980) water retention curve.
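The van Genuchten (1980) retention curve and the Mualem-type unsaturated conductivity it implies can be sketched as below. The parameter values in the usage example are illustrative, not the calibrated Sievi values, and FLUSH's internal formulation may differ in detail.

```python
def van_genuchten(h, theta_r, theta_s, alpha, n, k_sat, l=0.5):
    """Water content (m3/m3) and hydraulic conductivity for pressure head h (m).

    h < 0 in the unsaturated zone; alpha (1/m) and n (-) are the
    van Genuchten shape parameters, l the Mualem tortuosity exponent.
    """
    if h >= 0.0:
        # Saturated conditions: full water content and saturated conductivity
        return theta_s, k_sat
    m = 1.0 - 1.0 / n
    # Effective saturation (0-1) from the retention curve
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)
    theta = theta_r + (theta_s - theta_r) * se
    # Mualem-type unsaturated hydraulic conductivity
    k = k_sat * se ** l * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2
    return theta, k
```

Both the water content and the conductivity fall monotonically as the pressure head becomes more negative, which is what produces the strong contrast between near-saturated macropore flow and slow matrix flow in dual-permeability models.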
The numerical model uses finite volume methods to discretize the computational domains and solve the governing partial differential equations (PDEs). Time derivatives of the PDEs are solved with an implicit Euler method. The model is spatially distributed using MPI (Message Passing Interface) parallelization (Warsta 2011; Warsta et al. 2013).
To simulate the hydrology of the Sievi field, a 2D computational grid was parameterized based on the field data and literature (Table 2). The length and width of each cell in the computational grid were 2 m, and the cell depth varied from 0.02 to 0.5 m, being smallest in the upper layers. In total, there were 192 cells in the x-direction, 1 in the y-direction, and 32 in the z-direction. The computational domain was 384 m in the x-direction, 2 m in the y-direction, and 3.4 m in the z-direction.
The model simulations were run with a 1-h time step for the meteorological forcing. Calibration and validation simulations covered the period 2015–2018: the beginning of 2015 (1 Jan–20 Jun) was used as a model warm-up period, calibration used data from 2015 (21 Jun–31 Dec) and 2018 (1 Jan–31 Dec), and validation used data from 1 Jan 2016 to 31 Dec 2017. Scenario simulations were run for the period 1 Jan 2015–31 Dec 2019.
To consider soil moisture restrictions in root water uptake, the approach of Feddes et al. (1978) was used to restrict evapotranspiration. The restriction started at a critical soil moisture (−5 m pressure head) and increased linearly toward the wilting point (−150 m pressure head).
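The linear restriction between the critical head and the wilting point can be written as a Feddes-type reduction factor, sketched below with the heads given above (only the dry-end restriction described in the text is included):

```python
def feddes_reduction(h, h_crit=-5.0, h_wilt=-150.0):
    """Root water uptake reduction factor (0-1) for pressure head h (m).

    Uptake is unrestricted above the critical head (-5 m), decreases
    linearly to zero at the wilting point (-150 m), and is zero below it.
    """
    if h >= h_crit:
        return 1.0
    if h <= h_wilt:
        return 0.0
    return (h - h_wilt) / (h_crit - h_wilt)
```

Potential transpiration is multiplied by this factor cell by cell, so uptake shuts down gradually as the soil dries rather than at a single threshold.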
Groundwater outflow was simulated through secondary drainage below the subsurface drainpipes. Two additional deep drainpipes were set at a depth of 3.0 m at locations (Figure 1(c)) where the observed deep groundwater levels were lowest. The subsurface drainpipes were set at their actual locations (1.0 m depth and 15 m drain spacing) according to the field data. One topsoil layer runoff collector (375 m long) was set across the simulated 2D profile (Figure 1(c)) and another collector (2 m long) at the left border of the profile. Both collectors were 1.0 m wide and 0.3 m deep. Simulated groundwater observation tubes (28) were placed at the same locations as the actual tubes in the field.
No-flow boundary conditions were applied at the bottom and sides of the computational domain. A head- and state-dependent boundary was set on the uppermost cells of the subsurface domain. The initial groundwater depth was set to 1.0 m, and the initial snow water equivalent (SWE) was set to 0.03 m according to the measurements.
Model performance was evaluated with a modified version of the Nash–Sutcliffe efficiency (NSE) (Legates & McCabe 1999) and the mean absolute error (MAE). The modified version of the NSE was chosen to lower the impact of high values in the time series. For drain discharge, the performance values were calculated using every time step. Groundwater depth observations were made at different distances from the subsurface drainpipe (Salo et al. 2019). The comparison between the observations and simulation results was made by computing the minimum and maximum values over all the locations and comparing the range of the observed depths with the range of the simulated depths. The range of the simulated depths was formed by taking the minimum and maximum values from the matrix or macropore systems at each observation time. The observed and simulated ranges were compared by calculating the share of the observation times at which the simulated range (min–max) overlapped or fully covered the observed range (min–max).
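The performance criteria can be sketched as follows. The modified NSE is assumed here to be the absolute-deviation (j = 1) form of Legates & McCabe (1999), which dampens the influence of high values; the overlap share follows the min–max comparison described above.

```python
def modified_nse(obs, sim):
    """Legates & McCabe (1999) modified NSE with absolute deviations (j = 1)."""
    mean_obs = sum(obs) / len(obs)
    num = sum(abs(o - s) for o, s in zip(obs, sim))
    den = sum(abs(o - mean_obs) for o in obs)
    return 1.0 - num / den

def mae(obs, sim):
    """Mean absolute error in the units of the input series."""
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def range_overlap_share(obs_ranges, sim_ranges):
    """Share of observation times at which the simulated (min, max) depth
    range overlaps the observed (min, max) range."""
    hits = sum(1 for (o_lo, o_hi), (s_lo, s_hi) in zip(obs_ranges, sim_ranges)
               if s_lo <= o_hi and o_lo <= s_hi)
    return hits / len(obs_ranges)
```

A modified NSE of 1 indicates a perfect fit, 0 means the model is no better than the observed mean, and negative values mean it is worse; unlike the squared-error NSE, each residual contributes in proportion to its size.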
Three controlled drainage scenarios were selected to study the effects of controlled drainage on field water balance and groundwater levels. (1) A constant regulation scenario (Y), in which control was on from Jan 1st to Dec 31st, was selected to demonstrate the maximal effect of controlled drainage, as the drainage level was always at 0.6 m from the soil surface. (2) Growing season (GS) regulation, in which control was on from sowing to 7 days before harvest, was used to study the effect of controlled drainage on maintaining a higher groundwater level for crop water use. (3) Autumn regulation (A), in which control was on from Oct 1st to Nov 15th, and winter regulation (W), in which control was on from Oct 1st to Mar 31st, were used to study the effect of controlled drainage on the water balance during the dormant season, when evapotranspiration is low and most of the runoff and nutrient loading from the field to surface waters occurs.
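The regulation calendars above can be sketched as a simple schedule check. The sowing and harvest dates are year-specific inputs (they varied between the years), and the function below is an illustration of the fixed windows described in the text, not part of the FLUSH model itself.

```python
from datetime import date, timedelta

def control_active(day, scenario, sowing=None, harvest=None):
    """Return True if drainage control is on for the given day and scenario.

    'Y'  constant regulation (Jan 1 - Dec 31)
    'GS' growing season regulation (sowing to 7 days before harvest)
    'A'  autumn regulation (Oct 1 - Nov 15)
    'W'  winter regulation (Oct 1 - Mar 31, spanning the year change)
    """
    if scenario == "Y":
        return True
    if scenario == "GS":
        return sowing <= day <= harvest - timedelta(days=7)
    if scenario == "A":
        return date(day.year, 10, 1) <= day <= date(day.year, 11, 15)
    if scenario == "W":
        # The window crosses the year boundary, so either side qualifies
        return day >= date(day.year, 10, 1) or day <= date(day.year, 3, 31)
    raise ValueError(f"unknown scenario: {scenario}")
```

Combined schedules such as GS + A or GS + A + W are then simply the logical OR of the individual windows for a given day.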
The yearly controlled drainage periods were adjusted to the crop growing periods, which varied between the years. The growing periods (sowing to harvest) mainly extended from May–June to September (see Äijö et al. 2021). Control in 2019 differed from the other years: after the 2018 harvest (Aug 18th), autumn rye was sown (Aug 24th), and the simulated growing period lasted from autumn 2018 to autumn 2019 (harvested Sep 4th).
A separate investigation was made of how controlled drainage affected the growing season groundwater depths in the different soil sections of the field (Figure 1(c)).
The simulations were run on CSC's Puhti supercomputer. The calculations were spatially distributed over 16 equal subprocesses.
Calibration and validation
Both the calibration and the validation periods included one relatively wet and one relatively dry year. Annual precipitation was highest in 2015 (522 mm, June–Dec) and 2016 (637 mm), and lowest in 2017 (565 mm) and 2018 (498 mm). Figure 2(a) shows how the annual precipitation was divided into the simulated water balance components. The fast response in the drain discharge hydrographs (Figure 2(b)–2(c)), also seen in the measured hydrographs, was achieved through the macropore system. More than 99% of the annual drain discharge occurred through the macropore system.
Calibration and validation results in terms of the performance criteria were moderate (Figure 2 and Table 3), as local climate data were not available and the meteorological inputs for the model were taken from the nearest observation stations 24–200 km away from the field. The model was able to capture the drain discharge dynamics on occasion (Figure 2), but not all discharge episodes were reproduced with the distant input data. The NSEs for hourly drain discharge were partly affected by the winter conditions, as the timing of snowmelt differed between the simulated and observed data. If only snow-free times were considered, the NSE values were slightly higher: 0.40–0.41 for the calibration period and 0.26–0.27 for the validation period.
There were occasions in 2017 and 2018 demonstrating a mismatch between the precipitation input data and the observed drain discharge data (Figure 2(b)–2(c)). In autumn 2018 (Sep–Dec), the corrected total precipitation was over 200 mm, while the measured total drain discharge from the whole field was less than 10 mm; the simulated total drain discharge was 55 mm. In the other autumns, the measured total drain discharge was 26–45% of the total precipitation, but in autumn 2018, it was only 2% of the total autumn precipitation (Sep–Dec). The simulated total drain discharge ranged between 17 and 28% of the total precipitation. In August 2017, the monthly corrected precipitation input was 60 mm and resulted in 7 mm of simulated drain discharge, but no discharge was observed. The observed drain discharge began only 2 weeks later, at the beginning of September.
At the beginning of 2016, there were two simulated but not observed drain discharge peaks during midwinter melt events in February–March (Figure 2(b)–2(c)). The simulated SWE showed that a melt of 6 mm caused a drain discharge event of 6 mm. Snow cover was present in early 2016 according to nearby stations, but SWE was not measured at the field site.
Cumulative drain discharges for each year were simulated with varying success (Figure 2(b)–2(c)). The best fit was obtained for the calibration period (2015) and the worst for the validation period (2016 and 2017), when the difference between the simulated and observed discharge ranged between 50 and 100 mm/a. The coefficients of variation of the simulated and observed hourly drain discharges were close to each other (T0: 1.58 vs. 1.63 and T1: 1.81 vs. 1.98). The average hourly measured and simulated drain discharges were almost identical for both the T0 (0.019 mm/h) and T1 (0.018 mm/h) sections, while the high and low intensities were not captured by the model. Figure 3 shows the simulated and observed ranges of groundwater depths at each observation time. The simulated groundwater depth was dynamic and had a wider range of variation in the macropore system than in the matrix system. Mostly, the maximum simulated depth came from the macropore system and the minimum simulated depth from the matrix system. It is not clear whether the observed groundwater depths represented the groundwater depth in the matrix or the macropore system. The simulated depth range overlapped the observed range most of the time (67–76% for calibration and 78–92% for validation), and the observations were fully covered by the simulated range 21–43% of the time. The MAE values ranged between 0.38–0.50 m and 0.31–0.38 m for the upper part (minimum depth) and the lower part (maximum depth) of the ranges, respectively (Table 3). The largest differences between the observed and simulated groundwater depths occurred at the beginning of 2016 (i.e. the validation period) and at the end of 2018 (i.e. the calibration period). At the beginning of 2016, a short midwinter snowmelt raised the simulated groundwater level (Figure 3) and generated drain discharge (Figure 2(b) and 2(c)).
At the end of 2018, there was relatively normal autumn precipitation (>200 mm September–December) that raised the groundwater levels in the model but was not visible in the observations.
Effects of controlled drainage on water balance
Simulated water balances of the three control option scenarios were compared with a conventional subsurface drainage scenario (Figure 4) to analyze the seasonal (beginning of the year, growing season, and end of the year) effects of controlled drainage. All three control options increased topsoil layer runoff, groundwater outflow, and evapotranspiration. The constant regulation option (gray bars in Figure 4) reduced drain discharge compared with conventional subsurface drainage. The two other control options (GS + A and GS + A + W) increased drain discharge compared with conventional subsurface drainage outside the growing season (blue and orange bars in Figure 4). The increased drain discharge occurred after the regulation was stopped in autumn and in spring. The increased drain discharge originated from the soil water storage saved during the regulation period; therefore, at the beginning of the year, all outflow components (in GS + A and GS + A + W) increased compared with the conventional drainage scenario. The seasonal water balances demonstrate that the control options altered both the timing and the pathway of the water outflow, as the sum of the annual outflow components was of the same magnitude for the four tested scenarios (Supplementary Material 1).
The differences between the control options were more apparent in the beginning-of-year and end-of-year periods, when there were differences in the regulation times. There was more variation between the years than between the control options. During the wet years (2015–2016), all the control options had the highest reduction of drain discharge in the growing season. During the drier years (2017–2019), the highest reduction of drain discharge occurred at the end of the year.
Effects of controlled drainage on groundwater levels
Growing season groundwater depths showed that controlled drainage had only a minor effect on preventing dry soil conditions, as the control options could not stop the descent of the groundwater level (Figure 5). During the early growing season (May–June), the groundwater depth stayed mostly above the drain depth (1.0 m), but approximately 50% of the depths in the matrix (Figure 5(a)) were below the control level (0.6 m) for all but the constant regulation scenario (gray line in Figure 5). The groundwater depth probability in the macropore system for the early growing season (Figure 5(b)) showed that the groundwater depth was even deeper and below the control level most of the time for all the drainage scenarios. Deeper groundwater depths (below the 1.0 m drainage depth) were more probable during the late growing season (Jul–Aug) than during the early growing season. The constant regulation scenario had the highest probabilities of groundwater depths above the control level.
The soil section comparison of the groundwater depth probabilities showed that controlled drainage had more impact on the groundwater depth probabilities of soil section 2 and less on those of soil section 3. The difference between the soil sections was clearer during the early growing season, and the probability curves were more alike during the late growing season. The difference in groundwater probabilities between soil sections 2 and 3 was likely caused by the distinct differences in the topsoil layer (0–0.25 m) water retention curve properties (Table 2).
Figure 6 shows that elevating the drainage depth early enough is crucial for keeping the groundwater depth closer to the soil surface compared with conventional subsurface drainage. In relatively dry summers, when growing season precipitation (180–220 mm) was less than annual evapotranspiration (250–330 mm) (Figure 6(c)–6(e)), the groundwater depth descended rapidly even in the controlled drainage scenarios (blue and orange lines in Figure 6). Only the constant regulation scenario (gray line in Figure 6) was able to maintain the groundwater depth closer to the drainage depth (0.6–1.0 m).
A model that has been benchmarked at other research fields is an attractive choice for producing simulations and complementing analysis under sparse and imperfect observations. Addor & Melsen (2019) found that a model is usually selected by legacy rather than adequacy. FLUSH has been successfully tested at subsurface drained research fields in Finland (e.g. Turunen et al. 2013; Warsta et al. 2013; Nousiainen et al. 2015), but this study showed inconsistencies between the simulated and observed field hydrology. The quality of the model input data can cause differences between the simulated and observed outputs. Another possibility is that some processes in the field were not described by the model, for example, the effect of soil frost on groundwater depth during winter (Sheng et al. 2013) or the effect of soil drying on surface runoff (Rasa et al. 2007), as the model performed worse during winter and after drier periods. Besides having been tested at several fields, FLUSH is a flexible tool for research purposes, as field-specific characteristics can be described in the model parameterization. The special features of the Sievi application were the spatial differences in soil characteristics, the groundwater outflow routes, preferential flow, and, in the follow-up model application by Salla et al. (2021), an open ditch at the border of the field.
The overall model performance was acceptable for running controlled drainage simulations and assessing the effects of the control options on field water balance and groundwater depths. Model performance is typically assessed with the NSE, and a perfect fit between observed and simulated data is likely impossible to achieve when dealing with natural systems. Good-quality input data are a prerequisite for sufficient model performance. Seibert et al. (2018) suggested that model performance should be evaluated based on what is possible or can be expected for a certain data set. In this study, the best attainable model calibration was carried out with the available data. Beven (2019) noted that fit-for-purpose model evaluation can be used when the purpose is to understand how hydrological systems can function in different circumstances. In this study, the objective was to quantify how controlled drainage can influence field water balance and groundwater depths during hydrologically different years and seasons. The model replicated the hydrological behavior of the simulated field as a phenomenon, as the distributions of the simulated and observed hourly drain discharges were similar. The model of the flat Northern field could therefore be used to analyze the potential effects of controlled drainage under the given meteorological conditions. In a companion study (Salla et al. 2021), the created model was extended to produce scenarios of controlled drainage schemes under future climate conditions.
The model performance in Sievi was better for the snow-free than the snowy periods, which indicates either a limitation in the model or inconsistencies in the measurements. The mismatch in the winter drain flow results was clear, e.g., in early 2016. The winter processes in FLUSH have been tested successfully against field measurements by Turunen et al. (2015). Häggblom et al. (2019) noticed that the model performance was better without winter periods. Salo et al. (2019) observed deep groundwater depths during winter conditions in Sievi, which might have been caused by soil freezing (Sheng et al. 2013). The effect of soil freezing on water flow is not represented in the FLUSH model, and the simulation results here showed that the observed deep groundwater depths were not replicated by the model. The winter processes in FLUSH should be further tested and developed. The fact that local meteorological input data were not available and the input was compiled from the closest stations (distance >20 km) partly degraded the performance of the model in Sievi.
The investigation of individual years revealed that the controlled drainage option had a secondary effect on the water balance components compared with the variability in annual meteorological conditions. The water balance differences were greater between the years than between the control options. In general, controlled drainage had a higher impact on the water balance components during wet years, as also noticed by Saadat et al. (2018). Carstensen et al. (2016) and Wesström & Messing (2007) reported that controlled drainage had less influence during wet years than during dry years; both studies focused on observing drain discharge changes during wintertime. Looking at the seasonal effects in this study, the reduction of drain discharge at the end of the year was greater during the dry years (<600 mm precipitation) than during the wet years (>600 mm precipitation). The simulation results here showed that the effects of controlled drainage depended on the investigated time frame. To understand the future potential of controlled drainage, it is necessary to run long-term simulations with future climate predictions and assess the change in the near and far future (e.g. Salla et al. 2021).
The seasonal water balances revealed that controlled drainage did not always reduce drain discharge: discharge from controlled drainage was smaller than from conventional subsurface drainage when control was active but higher after control was released. Saadat et al. (2018) reported that controlled drainage yielded higher drain discharge than conventional drainage during free drainage periods (i.e. when groundwater-level regulation is off). Few studies have reported a greater drain discharge volume for controlled drainage than for conventional subsurface drainage. Based on the simulation results here, it can be argued that looking only at annual water balances, or at drain discharge reduction while regulation is on, can be misleading when assessing the overall potential of controlled drainage in regulating water outflow from the field. The seasonal water balance results in this study demonstrated that controlled drainage (i) altered the water outflow routes during the regulation periods and (ii) shifted the timing of the drain discharge events, as water stored during the regulation period generated more drain discharge after the regulation period. The increased drain discharge originated from the previously stored water and not from the other outflow components. Another factor to be considered is the hydrological connections between the field and the surrounding fields, which influence the performance of controlled drainage (Sunohara et al. 2014; Österholm et al. 2015; Virtanen et al. 2019).
It was clear from the scenario simulations that controlled drainage affected all water flow components, not only drain discharge. Rozemeijer et al. (2016) and Sunohara et al. (2014) also observed that the reduction of drain discharge was not the only effect on water flow routes. The simulation results of this study verify that groundwater flow and topsoil layer runoff increased as drain discharge decreased in the controlled drainage scenarios. The environmental benefits of controlled drainage are in most studies attributed to the reduction of drain discharge (e.g. Wesström & Messing 2007; Carstensen et al. 2016; Povilaitis et al. 2018; Saadat et al. 2018), but future assessments should consider whether the water withheld from drain discharge ends up in surface runoff or shallow groundwater flow, which can reach the same surface water bodies.
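A simple closure check makes the pathway-redistribution argument concrete. The component values here are illustrative placeholders, not FLUSH outputs, and the check assumes equal storage change and evapotranspiration between the two scenarios:

```python
# Hedged sketch of a water balance closure check between two scenarios.
# Component names and values are illustrative, not FLUSH model outputs.
conventional = {"drain": 300.0, "surface_runoff": 40.0, "gw_outflow": 60.0}
controlled   = {"drain": 250.0, "surface_runoff": 65.0, "gw_outflow": 85.0}

def pathway_shift(base, scenario):
    """Per-component change (mm); the changes should sum to ~0 when
    storage change and evapotranspiration are equal between scenarios."""
    return {k: scenario[k] - base[k] for k in base}

delta = pathway_shift(conventional, controlled)
print(delta)
# Drain discharge decreases while runoff and groundwater outflow increase;
# the near-zero sum shows the saved drain water reappears in other pathways.
assert abs(sum(delta.values())) < 1e-9
```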
Scenario simulation results showed that the controlled drainage options affected the growing season groundwater levels, but during the dry years the control level could not be maintained. Evapotranspiration likely dominated the groundwater depth behavior over the applied water management. Rozemeijer et al. (2016) concluded that the control needs to be activated early enough to prevent the groundwater level from falling. In this study, the regulation of the groundwater level was timed with the harvest. An earlier start of regulation needs careful consideration, as starting too early can weaken drainage conditions during field operations (e.g. Sunohara et al. 2016). Based on the simulation results, controlled drainage would, under present conditions, have only minor effects on preventing possible water scarcity, but a greater impact under future climate conditions (Salla et al. 2021).
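The dry growing season criterion used in the conclusions (May–Aug precipitation below evapotranspiration) can be screened with a few lines. The seasonal totals below are hypothetical values, not observations from the study site:

```python
# Rough screening of dry growing seasons (May-Aug) using the criterion from
# the conclusions: GS precipitation less than GS evapotranspiration.
# The totals (mm) are placeholders, not observations from the study site.
gs = {
    2015: {"precip": 260.0, "et": 240.0},
    2016: {"precip": 180.0, "et": 255.0},
    2017: {"precip": 300.0, "et": 230.0},
    2018: {"precip": 150.0, "et": 270.0},
    2019: {"precip": 245.0, "et": 250.0},
}

# Years flagged here are those where maintaining the control level is unlikely.
dry_gs = sorted(y for y, v in gs.items() if v["precip"] < v["et"])
print("growing seasons where the control level would likely be lost:", dry_gs)
```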
Controlled drainage performed differently between the field soil sections. Soil analyses showed coarser texture in the middle of the field and finer texture at the sides (Salo et al. 2019). According to the simulation results, controlled drainage had more impact in the middle of the field (soil section 3 in Figure 1(c)) than at the left side (soil section 2) during the early growing season. The differences in groundwater-level probabilities between the soil sections were likely influenced by the locations of the groundwater outflow routes at the sides of the field. This study did not consider the effects of surrounding areas on groundwater depths, but Rozemeijer et al. (2016) and Salla et al. (2021) found that controlled drainage alone could not maintain the groundwater level if the field was close to an open ditch or main drainage ditch that had a secondary drainage effect on the field. There is interest in studying main ditch water-level regulation as an additional water management option (e.g. Virtanen et al. 2019; Äijö et al. 2021). Future modeling studies should therefore focus on testing combined water management solutions to find suitable options for individual fields and for drainage areas joined together by main drainage ditch networks.
Simulating four water management options (conventional subsurface drainage and three controlled drainage scenarios) over a 5-year period enabled analysis of how controlled drainage affected water balances and groundwater levels under varying hydrological conditions. The following conclusions were drawn from the scenario simulations:
Controlled drainage influenced groundwater levels during growing seasons, but none of the studied control options maintained the levels in the drier years when growing season (May–Aug) precipitation was less than evapotranspiration.
Annual meteorological conditions affected the hydrology more than the control options.
The whole water balance should be considered when assessing the potential of controlled drainage, as it increased topsoil layer runoff and groundwater outflow but did not always reduce drain discharge.
The timing of controlled drainage strongly affected its impact on the water table.
FLUSH enabled the investigation of spatial variability of groundwater level changes in response to controlled drainage.
This study was done in the VesiHave research project led by the Field Drainage Research Association. VesiHave was funded by the Environmental Ministry of Finland, Drainage Foundation sr, Maa- ja vesitekniikan tuki ry, and Sven Hallin Research Foundation sr. We acknowledge CSC – IT Center for Science Ltd for providing generous computational resources for running the simulations. We acknowledge Markus and Sakari Sikkilä for providing the field data. Mika Tähtikarhu DSc (in Tech.) is acknowledged for his guidance in the analysis of groundwater depth simulations.
DATA AVAILABILITY STATEMENT
Data cannot be made publicly available; readers should contact the corresponding author for details.