Forest Management for the Flood Mitigation Function of Forests
Abstract:
A paired catchment experiment estimates the change in runoff caused by a change in forest condition by comparing runoff data from two or more adjacent catchments and evaluating how their relative relationship shifts between two periods, before and after the forest in one catchment changes. Using paired catchment experiments, the increase in maximum daily runoff due to forest degradation was estimated for three treatment catchments in Japan. In one catchment, slope failure occurred and 20% of the catchment area became bare, after which the maximum daily runoff increased by approximately 1.1-fold in proportion to rainfall volume. In the two other catchments, slope failure did not occur, and the maximum daily runoff increased by only 6–8 mm day−1, irrespective of the rainfall volume. Slope failure and the transition to bare land were therefore identified as causes of degradation of the flood mitigation function. The causes of slope failure were traced to inadequate forest management, such as clear-cutting in areas with a high risk of slope failure, simultaneous clear-cutting throughout a catchment, and delayed replanting after clear-cutting. Therefore, forest management strategies to preserve the flood mitigation function of forests could include avoiding logging in locations with a high risk of slope failure, limiting the amount of logging, and replanting promptly after logging.
1. Introduction
Slope failure prevention is important for maintaining the flood mitigation function of forests. Tamai [6] reviewed studies that used water movement process models to reproduce water movement in forested catchments; the review indicated that the flood mitigation function of forests originates from the forest soil. If forest soil is lost through slope failure, an increased runoff volume during floods can be anticipated. Furthermore, analyses of observational data have shown that daily runoff increases with various types of forest degradation [8]. Tada [5] demonstrated the importance of the reinforcement of slope stability by tree root systems for slope failure mitigation, and appropriate forest management strategies for slope failure mitigation have been proposed on the basis of this reinforcing effect [7]. These previous studies suggest that proper forest management would contribute to the flood mitigation function through the prevention of slope failures. However, the contributions of forest management to the flood mitigation function of forests have not been fully explored. Therefore, this report discusses the degradation of the flood mitigation function of forests due to slope failure and suggests appropriate forest management strategies for maintaining this function.
2. Methods and Site Description
The runoff volume characteristics of forest catchments vary with many factors, such as topography, geology, and meteorology, in addition to the forest conditions of vegetation and soil.
Among these factors, forest vegetation is relatively easy to modify artificially. Therefore, with the aim of improving the flood mitigation function by managing forest vegetation, attempts have been made to evaluate changes in this function caused by forest conditions alone. Paired catchment experiments (Fig. 1) were developed as a method for this purpose [2]. In a paired catchment experiment, runoff data from two or more adjacent catchments are compared to elucidate their characteristics. The influence of forest degradation can be evaluated from the relative change in runoff between two periods: a control period, during which the forests in the catchments are in a similar state, and a treatment period, during which the forest in the control catchment remains healthy while the forest in the treatment catchment has been degraded. Single catchment experiments, which compare the runoff volume from one catchment between a control period and a treatment period, cannot eliminate the effects of meteorological differences from year to year. Parallel catchment experiments, which compare the runoff volumes observed in the same water year from a control catchment and a treatment catchment, cannot eliminate the effects of differences in topography and geology between catchments. The paired catchment experiment excludes the effects of weather, topography, and geology, and evaluates only the effects of forest conditions. This report presents analyses of the maximum daily runoff during each water year, as an index of the flood mitigation function, and of the minimum daily runoff during each water year.
Previous studies (e.g., [2]) demonstrated that the annual runoff volume increases due to forest degradation. Thus, the control and treatment periods are determined from the fluctuations of the ratio Qrat, calculated using eqn (1):

Qrat = Qt / Qc    (1)

where Qt and Qc represent the annual runoff volumes from the treatment and control catchments, respectively.
Importantly, the annual runoff volume from each catchment varies with the condition of the forest; it also differs between catchments, even in the same water year, because of factors such as topography and geology. Therefore, the value of Qrat differs among catchment pairs, even during control periods when the forest conditions in the control and treatment catchments are similar. By comparing Qt and Qc observed in the same water year, the effects of meteorological fluctuations are excluded from Qrat, but the effects of differences in topography and geology between catchments remain. However, topography and geology do not change significantly over several decades, so the fluctuations of Qrat can be considered unaffected by changes in topography and geology. Therefore, fluctuations of Qrat represent the effects of changes in forest conditions alone. Accordingly, the control and treatment periods are determined from the fluctuations of Qrat rather than from its absolute values.
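To make the procedure concrete, the following sketch (not the authors' code) computes the annual ratio of eqn (1) from hypothetical annual runoff totals and flags years whose ratio exceeds the level observed while both forests were in a similar state; the runoff values, the threshold, and the variable names are illustrative assumptions.

```python
# Minimal sketch of eqn (1): Qrat = Qt / Qc per water year, with a simple
# flag for years that may belong to a treatment period. All numbers below
# are placeholders, not the observed series.

# annual runoff volume (mm) per water year: (treatment Qt, control Qc)
annual_runoff = {
    1958: (820.0, 845.0),
    1959: (910.0, 850.0),   # a disturbance occurs during this water year
    1960: (980.0, 860.0),
    1961: (955.0, 870.0),
}

threshold = 1.0  # level of Qrat seen during the control period (assumed)

for year, (q_t, q_c) in sorted(annual_runoff.items()):
    q_rat = q_t / q_c                                  # eqn (1)
    period = "treatment?" if q_rat > threshold else "control?"
    print(f"{year}: Qrat = {q_rat:.3f} -> {period}")
```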

This report describes cases in the Tatsunokuchi-yama and Kamabuchi watersheds in Japan, as indicated in Table 1 and Fig. 1. The runoff data used in this report were collected by the Forestry and Forest Products Research Institute.
Table 1. Outline of the paired catchment experiment cases

| Case | Experimental site | Control catchment (area) | Treatment catchment (area) | Treatment event | Control period | Treatment period |
|---|---|---|---|---|---|---|
| 1 | Tatsunokuchi-yama | Kitadani (17.274 ha) | Minamidani (22.611 ha) | Forest fire and withering of pine trees | 1937–1958, 1966–1977, 1998–2002 | 1960–1965, 1981–1996 |
| 2 | Kamabuchi | No. 1 (3.060 ha) | No. 2 (2.482 ha) | Clear-cut | 1939–1947, 1983–2005 | 1948–1982 |
| 3 | Kamabuchi | No. 1 (3.060 ha) | No. 3 (1.540 ha) | Clear-cut | 1961–1963, 1985–2005 | 1964–1984 |

Table 2. Forest management history of the Tatsunokuchi-yama watershed

| Year | Kitadani | Minamidani |
|---|---|---|
| 1937 | Observation start | Observation start |
| 1944–1947 | Clear-cut | Clear-cut |
| 1959 | | Forest fire (Sep.) |
| 1960 | | Pine replanted (Mar.) |
| 1978–1980 | | Withering of pine trees |
The Tatsunokuchi-yama watershed (Fig. 2b; located at 34°42' N, 133°58' E, with an elevation range of 45–257 m) consists of two catchments: Kitadani (17.27 ha) and Minamidani (22.61 ha). The average air temperature and annual precipitation (1971–2000) are 14.3°C and 1,217 mm, respectively. Snow coverage is rare throughout the year. The geology consists of the Paleozoic Chichibu Formation, and hard sandstone is the predominant lithology [8].
Table 2 presents the forest management history of this watershed. Both catchments were covered with natural Japanese red pine forests in 1937, when observation began. Since then, broad-leaved forest has grown in the Kitadani catchment. In contrast, the forest vegetation in the Minamidani catchment has been degraded twice, by a forest fire and by the withering of pine trees. In this report, the Kitadani and Minamidani catchments are designated as the control catchment and the treatment catchment, respectively, for the paired catchment experiment.
The Kamabuchi watershed (Fig. 2c; located at 38°56' N, 140°15' E, with an elevation of 162–252 m) consists of four catchments: No. 1 (3.06 ha), No. 2 (2.48 ha), No. 3 (1.54 ha), and No. 4. In this report, the observed runoff data from catchments No. 1, No. 2, and No. 3 are used. Table 3 lists the forest management histories in these catchments. The average annual air temperature and precipitation (1971–2000) are 11.1°C and 2,406 mm, respectively. The period from December to May of the subsequent year is the snow cover season. The geology consists of Tertiary units, with predominant tuff and shale tuff [8].
In 1939, when observation of catchments No. 1 and No. 2 began, all catchments were covered with mixed forests comprising both coniferous and broad-leaved trees. Since then, the forest in catchment No. 1 has been maintained continuously. In catchments No. 2 and No. 3, however, the forests were clear-cut. In particular, in catchment No. 2, forest soil was lost after clear-cutting due to avalanches and slope failures, and approximately 20% of the catchment area became bare [10]. For this report, catchment No. 1 serves as the control catchment for the paired catchment experiments, and catchments No. 2 and No. 3 serve as the treatment catchments.
In the Tatsunokuchi-yama watershed, runoff reaches its annual minimum around March. Therefore, the period of April through March of the subsequent year was set as one water year in the Tatsunokuchi-yama watershed. For example, the year 2000 in the Tatsunokuchi-yama watershed represents the calendar period from April 2000 through March 2001.
For the Kamabuchi watershed, daily runoff data for June–November (the no-snow period) [8] were used to remove the effects of snow accumulation and melting in the catchments. Therefore, for example, the year 2000 in the Kamabuchi watershed represents the period of June–November 2000. Furthermore, the Qt and Qc used in eqn (1) for this watershed denote the runoff volumes observed between June and November.
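A small sketch of the two water-year conventions described above, in case it helps to see them side by side; the function names and date handling are assumptions, not the authors' processing code.

```python
# Illustrative water-year handling for the two watersheds.
from datetime import date

def tatsunokuchi_water_year(d: date) -> int:
    """April of year Y through March of Y+1 belongs to water year Y."""
    return d.year if d.month >= 4 else d.year - 1

def in_kamabuchi_no_snow_period(d: date) -> bool:
    """Only June-November daily runoff is used for the Kamabuchi watershed."""
    return 6 <= d.month <= 11

print(tatsunokuchi_water_year(date(2001, 2, 15)))      # -> 2000
print(in_kamabuchi_no_snow_period(date(2000, 12, 1)))  # -> False
```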
Table 3. Forest management histories of the Kamabuchi catchments

| Year | No. 1 | No. 2 | No. 3 |
|---|---|---|---|
| 1939 | Observation start | Observation start | |
| 1947 | | Needle-leaved trees cut (Dec.) | |
| 1948 | | Broad-leaved trees cut (–summer) | |
| 1960 | | Cedar replanted | |
| 1961 | | | Observation start |
| 1964 | | | Clear-cut in lower 50% of area (Feb.–Mar.) |
| 1969 | | | Clear-cut in upper 50% of area (Dec.) |
| 1970 | | | Cedar replanted (spring) |
3. Results
Figure 3a–c shows the fluctuations of Qrat in the three treatment catchments. The Qrat values of the three treatment catchments differed from one another even in the observation start years, when the forest conditions were similar to those of each control catchment. Before the forest degradation caused by fire and clear-cutting, Qrat was less than 1.0 in the Minamidani and No. 3 catchments and less than 1.1 in catchment No. 2. Thus, the control and treatment periods were determined from the fluctuations of Qrat relative to the values of 1.0 in the Minamidani and No. 3 catchments and 1.1 in catchment No. 2. In all three treatment catchments, Qrat increased immediately after clear-cutting, forest fire, or the withering of pine trees; it then decreased gradually until returning to its previous level. The years required for Qrat to recover to the previous level (less than 1.0 in the Minamidani and No. 3 catchments, and less than 1.1 in catchment No. 2) were designated a treatment period; all other years were control periods.

In the Minamidani catchment, Qrat was generally less than 1.0 from 1937 to 1958, which was regarded as a control period. Qrat increased above 1.0 because of a forest fire that occurred in September 1959 and remained larger than 1.0 until 1965. Because the forest fire occurred within water year 1959, the data observed in 1959 were excluded from the analysis, and the period from 1960 to 1965 was regarded as a treatment period. Qrat was again less than 1.0 from 1966 to 1977; this period was regarded as a control period. Qrat became larger than 1.0 in 1981–1996, after the period of 1978–1980 during which pines died from wilt disease; the period 1981–1996 was regarded as a treatment period. Data observed in 1978–1980 were excluded from the analysis because the forest condition in the Minamidani catchment was still changing. After 1998, Qrat generally recovered to less than 1.0, and this period was regarded as a control period. The details of this case (case 1) are summarized in Table 1.

In catchment No. 2, Qrat was less than 1.1 from 1939 to 1947; this period was regarded as a control period. Qrat increased above 1.1 when clear-cutting was performed from December 1947 to early summer 1948, and it generally remained larger than 1.1 until 1982. Because the clear-cutting preceded water year 1948, the period from 1948 to 1982 was regarded as a treatment period. After 1983, Qrat recovered to less than 1.1; this period was regarded as a control period. This case is summarized as case 2 in Table 1.

In catchment No. 3, Qrat was less than 1.0 from 1961 to 1963, which was regarded as a control period. Qrat increased above 1.0 because of clear-cutting performed on the lower slope in February and March 1964, and it remained larger than 1.0 until 1984. Because the clear-cutting preceded water year 1964, the period from 1964 to 1984 was regarded as a treatment period. After 1985, Qrat recovered to values generally less than 1.0; this period was regarded as a control period. Case 3 is also summarized in Table 1.

Comparisons of the minimum daily runoff from the control and treatment catchments in cases 1–3 (Table 1) are presented in Fig. 4a–c, respectively. The linear regression lines for the Minamidani treatment catchment in the control and treatment periods are described by eqns (2) and (3), respectively; they are shown in Fig. 4a as a solid line and a dotted line, respectively:
where yc and yt represent the daily runoff from the treatment catchment in the control and treatment periods, respectively, while x represents the daily runoff from the control catchment. RC represents the regression coefficient.
The linear regression lines for treatment catchment No. 2 in the control and treatment periods are described by eqns (4) and (5), respectively (Fig. 4b):
The linear regression lines for treatment catchment No. 3 in the control and treatment periods are described by eqns (6) and (7), respectively (Fig. 4c).
The RC values of eqns (2)–(7) are small, within the range of 0.5200–0.7810. This means that the correlations between the minimum daily runoff from the treatment catchment and that from the control catchment are not strong in either the treatment or control periods for any treatment catchment. Thus, substantial differences in the distributions of data points between the treatment and control periods may not be identifiable statistically. However, Fig. 4a–c shows that, for all treatment catchments, the points representing the treatment period tend to lie clearly above the points representing the control period.

This finding suggests that runoff from the treatment catchment is greater during the treatment period than during the control period. Thus, there was a relative increase in minimum daily runoff with forest degradation.
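As an illustration of how the period-wise regressions and their RC values might be obtained, the following sketch fits a line to the treatment-catchment runoff (y) against the control-catchment runoff (x), separately for the control and treatment periods; RC is computed here as a Pearson correlation coefficient, which is one plausible reading of the RC values quoted above, and the data are placeholders rather than the observed series.

```python
# Sketch of the paired regression comparison with hypothetical data.
import numpy as np

def fit_line(x, y):
    """Return slope, intercept and correlation coefficient of y against x."""
    slope, intercept = np.polyfit(x, y, 1)
    rc = np.corrcoef(x, y)[0, 1]
    return slope, intercept, rc

# annual minimum daily runoff (mm/day): control catchment (x), treatment
# catchment (y), split by period -- placeholder values
x_ctrl = np.array([0.10, 0.15, 0.22, 0.30]); y_ctrl = np.array([0.08, 0.14, 0.20, 0.27])
x_trt  = np.array([0.12, 0.18, 0.25, 0.33]); y_trt  = np.array([0.20, 0.27, 0.35, 0.45])

for label, x, y in [("control", x_ctrl, y_ctrl), ("treatment", x_trt, y_trt)]:
    slope, intercept, rc = fit_line(x, y)
    print(f"{label}: slope={slope:.3f}, intercept={intercept:.3f}, RC={rc:.3f}")
```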
Comparisons of maximum daily runoff between the control and treatment catchments in cases 1–3 (Table 1) are presented in Fig. 5a–c, respectively. The linear regression lines for the Minamidani catchment in the control and treatment periods are expressed by eqns (8) and (9), respectively. They are illustrated in Fig. 5a as a solid line and a dotted line, respectively.
The linear regression lines for treatment catchment No. 2 in the control and treatment periods are expressed by eqns (10) and (11), respectively (Fig. 5b).
The linear regression lines for treatment catchment No. 3 in the control and treatment periods are expressed by eqns (12) and (13), respectively (Fig. 5c):

Figure 5a–c shows that the distributions of points representing the data in the treatment and control periods overlap in all cases. Due to this overlap, it is not visually obvious whether runoff from the treatment catchment was larger during the treatment period than during the control period, in contrast to the results for minimum daily runoff shown in Fig. 4a–c. However, the RC values of eqns (8)–(13) are in the range of 0.8784–0.9872, much larger than those of eqns (2)–(7). This indicates that the correlations between the maximum daily runoff from the treatment catchment and that from the control catchment are strong in both the treatment and control periods in all catchments; thus, eqns (8)–(13) are highly significant. Moreover, the linear regression lines for the treatment period lie above the lines for the control period in all treatment catchments (Fig. 5a–c). In summary, the maximum daily runoff from all three treatment catchments was larger in the treatment periods than in the control periods.
The maximum daily runoff volume varies with the rainfall volume. Points representing data from the treatment or control periods that lie in the upper-right region of Fig. 5a–c were produced by larger rainfall volumes than points plotted in the lower-left region. The separation between the two linear regression lines can be regarded as the average increase in runoff volume attributable to forest degradation. The linear regression lines reflect the characteristics of water movement in the catchment; thus, changes in water movement appear as changes in the linear regression lines. Therefore, the linear regression lines for maximum daily runoff were compared between the control and treatment periods to evaluate the degradation of the flood mitigation function in each treatment catchment.

Differing patterns were found among the three treatment catchments. First, consider catchment No. 3. In this catchment, the slopes of eqns (12) and (13) are 0.9308 and 0.9191, respectively, which are nearly equal. The intercept of eqn (13), calculated for the treatment period, is approximately 8 mm day−1 greater than the intercept of eqn (12) for the control period. Similarly, the slopes of eqns (8) and (9) for the Minamidani catchment are approximately equal, at 0.9665 and 0.9911, respectively. Equation (9) for the treatment period has an intercept approximately 6 mm day−1 greater than that of eqn (8) for the control period. The two linear regression lines are thus almost parallel in the Minamidani and No. 3 catchments. In catchment No. 2, by contrast, the slope increased from approximately 0.9153 in eqn (10) for the control period to approximately 1.0399 in eqn (11) for the treatment period. The difference between eqns (10) and (11) is greater in the upper-right region than in the lower-left region of Fig. 5b. The increase in volume (ΔQ) and the increase in ratio (R) can be calculated from eqns (10) and (11) and the observed maximum daily runoff (x) from the control catchment No. 1, as defined in eqns (14) and (15):

ΔQ = yt − yc    (14)

R = yt / yc    (15)

ΔQ increased in proportion to the increase in x (Fig. 6).
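Taking eqns (14) and (15) to be ΔQ = yt − yc and R = yt / yc, as reconstructed above, the following sketch evaluates them for catchment No. 2 using the slopes quoted from eqns (10) and (11); the intercepts and the x values are placeholders, not the published figures.

```python
# Sketch of eqns (14) and (15) for catchment No. 2.
import numpy as np

slope_c, intercept_c = 0.9153, 2.0   # eqn (10), control period (intercept assumed)
slope_t, intercept_t = 1.0399, 4.0   # eqn (11), treatment period (intercept assumed)

# maximum daily runoff from control catchment No. 1 (mm/day), placeholder values
x = np.array([20.0, 60.0, 100.0, 140.0])

y_c = slope_c * x + intercept_c      # regression estimate, control period
y_t = slope_t * x + intercept_t      # regression estimate, treatment period

dQ = y_t - y_c                       # eqn (14): increase in volume (mm/day)
R = y_t / y_c                        # eqn (15): increase in ratio

for xi, dqi, ri in zip(x, dQ, R):
    print(f"x={xi:6.1f}  dQ={dqi:6.2f}  R={ri:5.3f}")
```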
4. Discussion
Two main patterns were observed in the comparisons between the linear regression lines in each treatment catchment. One pattern described the Minamidani and No. 3 catchments, while the other pattern described catchment No. 2. The causes of this difference are discussed below to examine how an increase in maximum daily runoff corresponds to degradation of the flood mitigation function in each treatment catchment.
When the two linear regression lines are nearly parallel, as in the Minamidani and No. 3 catchments, the average increase in maximum daily runoff volume from the treatment catchment is caused by the increase in the intercept from eqns (8) and (12) to eqns (9) and (13), respectively; it can be regarded as constant regardless of the rainfall volume producing the maximum daily runoff. To explain this result, the reduction of canopy interception in the treatment catchment during the treatment period must be considered. Iida et al. [3] observed the time courses of rainfall volume outside and within a cedar forest during rainfall events and reported that the difference between these volumes reaches a peak during rainfall events and that the quantity of water that can be held in tree bark and foliage (7.2 mm) is approximately equal to the maximum interception quantity (6.7 mm). This finding suggests that canopy interception consists of rainwater stored in foliage and tree bark that evaporates after the rain stops. In the Minamidani and No. 3 catchments, as the degraded forest lost foliage and tree bark in the treatment period, the storage quantity was reduced by approximately 6–8 mm per rainfall event. Irrespective of the rainfall volume, the maximum daily runoff therefore increased by 6–8 mm day−1 due to the decrease in canopy interception. Based on the above discussion, the increase in runoff volume during the treatment period is presumably attributable to the degradation of forest vegetation alone, rather than of forest soil. When the slopes of the linear regression lines for the control and treatment periods are nearly equal, as in eqns (8) and (9) and eqns (12) and (13), the forest soil is maintained; the degradation of forest vegetation reduces canopy interception and increases runoff by 6–8 mm day−1, irrespective of the rainfall volume producing the maximum daily runoff, as shown in Fig. 5a and c.
As shown in Fig. 6, ΔQ increases in proportion to the increase in x. The value of x can be taken as a proxy for the rainfall volume producing the maximum daily runoff. Thus, the increased runoff volume from treatment catchment No. 2 during the treatment period includes a component proportional to rainfall volume. Regarding the mechanism underlying this relationship, a reduction in soil water-holding capacity due to slope failure is one possibility. Thick forest soils with well-developed pores promote efficient infiltration of rainwater, hold that water temporarily, and then release it slowly. Arimitsu et al. [1] compared two adjacent forest catchments and reported that the runoff volume from a catchment with thin, immature soil was markedly smaller during a drought and larger during a flood period than that from a catchment with thick, mature soil. In other words, the volume of direct runoff is larger and its outflow faster in a catchment with a low soil water-holding capacity than in one with a high water-holding capacity. When the water-holding capacity is reduced by forest soil loss associated with slope failure, the proportion of base flow, which is water that penetrates deep into the soil and is released slowly, is expected to decrease. With this reduction in base flow, the proportion of direct runoff is expected to increase, becoming proportional to rainfall volume. In catchment No. 2, bare areas were present due to avalanches and slope failure [10]. The base flow from such areas is presumably reduced by the loss of forest soil, leading to an increase in direct runoff. In this case, the maximum daily runoff increases in proportion to rainfall volume.
After using a model to simulate the movement of water in a forested catchment, Tani et al. [9] reported the runoff characteristics calculated under various states of forest soil and vegetation. By comparing calculations under healthy and degraded conditions of forest vegetation with the same soil conditions, the changes attributable to forest vegetation alone can be determined. No difference in the maximum hourly runoff was found between these calculations, whereas the minimum hourly runoff of 0.002 mm hour−1 under healthy vegetation became 0.04 mm hour−1 under degraded vegetation [6]. This finding agrees with the present results: the increase in runoff from the treatment catchment during the treatment period is clear for the minimum daily runoff (Fig. 4a and c) but less clear for the maximum daily runoff, where the point distributions for the treatment and control periods overlap (Fig. 5a and c). Changes attributable to both forest soil and vegetation can be evaluated by comparing calculations under healthy and degraded conditions of both forest soil and vegetation. In the healthy condition, the maximum and minimum hourly runoff values are 5.0 and 0.01 mm hour−1, respectively; in the degraded condition, they are 9.3 and 0.04 mm hour−1, respectively [6]. Thus, when both forest soil and vegetation are degraded, both the maximum and minimum hourly runoff increase. This finding agrees with the minimum runoff results for catchment No. 2 presented in Fig. 4b. For the maximum runoff, however, the changes between the control and treatment periods, shown in Figs. 5b and 6, are smaller than the difference between the model calculations mentioned above. According to the model calculations of Tani et al. [9], the maximum hourly runoff under degraded forest soil and vegetation was 9.3 mm hour−1, approximately 1.9-fold greater than the 5.0 mm hour−1 obtained under healthy forest soil and vegetation [6]. Although R increases with increasing rainfall volume, as shown in Fig. 6, the maximum increase is only 1.1-fold. A possible reason for the difference between the 1.9-fold increase reported in [9] and the 1.1-fold increase shown in Fig. 6 is the percentage of the catchment area in which forest soil has been lost. Tani et al. [9] performed the calculation for degraded forest soil under the assumption that no forest soil is present anywhere in the catchment. However, according to a map showing the status of catchment No. 2 in April 1978 [10], bare land occupied only approximately 20% of the catchment area. Changing the percentage of the catchment area without soil therefore strongly affects the degree of increase.
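A rough, purely illustrative area-weighting check of this point: if the roughly 1.9-fold increase from [9] applied only to the bare 20% of catchment No. 2 while runoff from the remaining area were unchanged, the catchment-average increase would come out much closer to the observed 1.1-fold. The uniform-mixing assumption below is an illustration, not a result from [9] or [10].

```python
# Back-of-envelope check (illustrative assumption, not a result of [9] or [10]):
# scale the 1.9-fold increase by the bare fraction of the catchment.
bare_fraction = 0.20      # share of catchment No. 2 that became bare [10]
bare_increase = 1.9       # increase over a fully bare catchment, after [9]
catchment_increase = bare_fraction * bare_increase + (1 - bare_fraction) * 1.0
print(f"approximate catchment-wide increase: {catchment_increase:.2f}-fold")  # ~1.18
```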
For the Minamidani and No. 3 catchments, the flood mitigation function of the forests was presumably not degraded by the forest fire, the withering of pine trees, or the clear-cutting; this inference follows because the linear regression lines are nearly parallel (Fig. 5a and c) and the increase in maximum daily runoff was only 6–8 mm day−1 in these catchments. It is likely that only the forest vegetation was degraded and that the forest soil was preserved.
In contrast, in catchment No. 2, the flood mitigation function of the forest was considered degraded by the clear-cutting, avalanches, and slope failure, because the slope of the linear regression line was larger in the treatment period than in the control period (Fig. 5b); the estimated maximum daily runoff increased in proportion to the rainfall volume producing it, owing to slope failure and the transition to bare land over 20% of the catchment area.
The above discussion suggests that forest soil loss associated with slope failures and avalanches causes degradation of the flood mitigation function.
Forest vegetation might be misunderstood as having no role in the flood mitigation function, because degradation of this function may not be identified in treatment catchments where only the forest vegetation is degraded while the forest soil is preserved, such as the Minamidani and No. 3 catchments in this report. Nevertheless, forest vegetation contributes to flood mitigation by preserving forest soil, because tree root systems reinforce slope stability. This effect is apparent in the degradation of the flood mitigation function caused by slope failure in catchment No. 2 (Figs. 5b and 6). Among the three treatment catchments assessed in this report, slope failure occurred after clear-cutting in catchment No. 2, whereas it did not occur after a forest fire and tree withering in the Minamidani catchment or after clear-cutting in catchment No. 3. The reasons for these different outcomes are discussed at the end of this report. Before that discussion, a forest management system to prevent slope failure [7] is briefly described.
Forest tree root systems grow while binding and fixing soil particles. Thus, tree roots reinforce the soil and stabilize slopes, thereby preventing slope failure. If a forest is clear-cut, this reinforcing effect decreases as the stump root systems decay, while the reinforcing effect of the planted trees' root systems increases as the trees grow. Even if replanting occurs immediately after clear-cutting, the combined reinforcing effect of the root systems of stumps and replanted trees remains reduced for decades after clear-cutting [5]. To maintain the reinforcing effect of tree root systems as fully as possible, the following proposals were presented in [7]. (a) Tree cutting should not be conducted in areas where the risk of slope failure is high. (b) When tree cutting is necessary in areas with a risk of slope failure, the cutting sites should be chosen in lower-risk areas or in areas distant from residential areas, to limit damage if slope failure occurs. (c) The degree and extent of tree cutting should be minimised in areas with a high risk of slope failure, so that more tree root systems remain after cutting. (d) To shorten the period during which the reinforcing effect of tree root systems is reduced, nursery trees should be replanted promptly after cutting.
In the Minamidani catchment, Japanese black pines were replanted in March 1960, almost immediately after the forest fire of September 1959 (Table 2). This treatment was similar to the approach in proposal (d). After the tree withering of 1978–1980, the area was left to recover naturally. Although all pine trees died, other trees survived, so the slope-reinforcing effect of the root systems of the remaining trees was sufficient. Consequently, an effect similar to that of proposal (c) was also achieved.
In catchment No. 2, trees throughout the catchment were clear-cut from December 1947 through summer 1948, and cedar trees were not replanted until more than 10 years later, in 1960 (Table 3). For some years after clear-cutting, the stumps stabilized the snow on the slopes, prevented avalanches, and preserved the soil through their root systems [10]. Subsequently, as the stumps and their root systems decayed, they became unable to resist the pressure of moving snow. In 1959, more than half of the catchment area experienced frequent avalanches, and some areas were transformed into bare land by slope failure. Clear-cutting the whole area of a catchment with a high risk of slope failure is counter to the approaches in proposals (a) and (c), and the delay in replanting after clear-cutting is counter to the approach in proposal (d).
In catchment No. 3, clear-cutting was conducted on 50% of the catchment area, on the lower portion of the slope, during February–March 1964. Cedar trees were replanted in the spring of 1970. Although the cedar replanting occurred approximately 6 years after the clear-cutting of the lower slope, it occurred immediately after the clear-cutting of the upper slope (Table 3). During 1964–1969, no trees were present on the lower slope, whereas trees remained on the upper slope; the pressure of moving snow was thus suppressed, allowing the stumps remaining on the lower slope to conserve the forest soil. Additionally, the root systems of the remaining trees reinforced the stability of the upper slope. Limiting the extent of clear-cutting to 50% of the area, on the lower slope, accords with proposal (c), and the replanting of cedars immediately after the clear-cutting of the upper 50% aligns with the approach in proposal (d).
5. Conclusions
The increase in maximum daily runoff volume and the associated degradation of the flood mitigation function of forests due to forest degradation were estimated using paired catchment experiments. In the two treatment catchments where the forest soil was preserved and only the forest vegetation was degraded, the flood mitigation function was judged not to be degraded, because the estimated increase was only 6–8 mm day−1, irrespective of the rainfall volume producing the maximum daily runoff. Based on [3], this small, rainfall-independent increase was attributed to the decrease in canopy interception that accompanies degradation of forest vegetation alone. In contrast, in the catchment where the forest soil was not preserved, the flood mitigation function was judged to be degraded, because the estimated maximum daily runoff increased by up to approximately 1.1-fold, in proportion to the rainfall volume producing the maximum daily runoff. Based on [1] and [9], this increase in proportion to rainfall volume was attributed to the increase in direct runoff that accompanies degradation of forest soil. Examination of the forest management histories of the three treatment catchments identified the following inappropriate management activities as causes of slope failure: clear-cutting in areas with a high risk of slope failure, simultaneous clear-cutting across the whole catchment area, and delayed replanting after clear-cutting. These findings imply that the prevention of slope failure is necessary to maintain the flood mitigation function of forests. Therefore, several forest management strategies that utilize the slope-reinforcing effect of tree root systems [5] are recommended for the flood mitigation function of forests: avoid logging in places with a high risk of slope failure, limit the amount of logging, and replant trees promptly after logging.
The data used to support the findings of this study are available from the corresponding author upon request.
The authors declare that they have no conflicts of interest.
