
Conversation

@glemieux (Contributor) commented Mar 23, 2020:

Description:

This PR updates the trim_canopy subroutine to resolve #383 (LAI oscillation issue). The update approaches the issue by circumventing the use of a trim increment. Currently, the subroutine compares the leaf_cost against the net_uptake for all leaf layers within a cohort and decreases or increases that cohort's canopy_trim value by a fixed, parameter-defined increment. This can result in trimming values that bang back and forth around some median value.
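For illustration only, a rough sketch of that existing increment logic (hypothetical names: inc stands for the fixed, parameter-defined increment and trim_limit for the lower bound on canopy_trim; this is not the exact FATES code):

    ! hedged sketch of the current per-layer check
    if (currentCohort%year_net_uptake(z) < currentCohort%leaf_cost) then
       currentCohort%canopy_trim = max(trim_limit, currentCohort%canopy_trim - inc)  ! layer does not pay for itself: trim harder
    else
       currentCohort%canopy_trim = currentCohort%canopy_trim + inc                   ! layer pays for itself: relax the trim
    end if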

@ckoven hypothesized that a better method would be to determine the 'optimum' cumulative LAI for the whole cohort based on the cumulative LAI versus the "net-net" uptake (the leaf cost subtracted from the net uptake per leaf layer), the optimum in this case being the LAI associated with zero net-net uptake. This optimum LAI can then be used to compute the fractional canopy_trim value.

The optimum cumulative LAI is computed by forming a least-squares linear fit over the last nll layers of a given cohort and using the dgels routine from LAPACK to solve it. If the minimum number of leaf layers is not present, trim_canopy reverts to the original trim-increment method.
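For readers not following the diff, here is a minimal, self-contained sketch of that idea with toy data and hypothetical variable names (nnu, lai_optimum, etc.) rather than the PR's actual code: fit the net-net uptake against cumulative LAI over the last nll layers with LAPACK's dgels and take the zero crossing of the fitted line as the optimum cumulative LAI.

    program optimum_lai_sketch
      ! Sketch only: toy numbers and illustrative names; the real code operates
      ! on the cohort's leaf-layer arrays inside trim_canopy. Link with LAPACK.
      implicit none
      integer,  parameter :: r8  = selected_real_kind(12)
      integer,  parameter :: nll = 4          ! number of deepest leaf layers used in the fit
      real(r8) :: X(nll,2), y(nll,1), work(100)
      real(r8) :: cumulative_lai(nll), nnu(nll), lai_optimum
      integer  :: k, info

      ! toy cumulative LAI and "net-net" uptake (net uptake minus leaf cost) per layer
      cumulative_lai = [2.0_r8, 3.0_r8, 4.0_r8, 5.0_r8]
      nnu            = [0.6_r8, 0.3_r8, 0.1_r8, -0.2_r8]

      ! assemble y = X*b, with b(1) the intercept and b(2) the slope
      do k = 1, nll
         X(k,1) = 1.0_r8
         X(k,2) = cumulative_lai(k)
         y(k,1) = nnu(k)
      end do

      ! least-squares solve; on exit dgels overwrites y(1:2,1) with [b(1), b(2)]
      call dgels('N', nll, 2, 1, X, nll, y, nll, work, size(work), info)

      ! the optimum cumulative LAI is the zero crossing of the fitted line
      if (info == 0 .and. y(2,1) < 0.0_r8) then
         lai_optimum = -y(1,1) / y(2,1)
         print *, 'optimum cumulative LAI ~', lai_optimum
      end if
    end program optimum_lai_sketch

The fractional canopy_trim would then presumably be derived from this optimum LAI (e.g. relative to the cohort's untrimmed LAI), capped below one as noted later in the thread.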

Note that issue #638 was generated in the course of testing this PR: the calculations of optimum_laimem cannot be evaluated in the context of the larger trim_canopy implementation.

Collaborators:

@ckoven @rgknox

Expectation of Answer Changes:

Yes. There will be changes to the per-cohort trimming, and as such LAI and related values will be different.

Checklist:

  • My change requires a change to the documentation.
  • I have updated the in-code documentation .AND. (the technical note .OR. the wiki) accordingly.
  • I have read the CONTRIBUTING document.
  • FATES PASS/FAIL regression tests were run
  • If answers were expected to change, evaluation was performed and provided

Test Results:

CTSM test hash-tag: 242d6d14

CTSM baseline hash-tag: 242d6d14

FATES baseline hash-tag: 320a8fc

Test Output:

Regression tests: All expected PASS

ACRE comparison for a 30-year BCI site run

@glemieux added the "status: Not Ready" and "type: bug fix" labels on Mar 23, 2020
@rgknox self-assigned this on Apr 22, 2020
@glemieux (Contributor, Author) commented Apr 27, 2020:

Note that the code partially addresses issue #377 by setting the optimum value to be less than one: https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR632-R635
The original code that increases the canopy_trim fraction could go above one if the increment is large enough. I will address this in a future PR.

@glemieux removed the "status: Not Ready" label on Apr 28, 2020
@glemieux requested a review from @rosiealice on April 28, 2020
@glemieux (Contributor, Author) commented Apr 29, 2020:

Regression tests on cheyenne all expected PASS:

/glade/u/home/glemieux/scratch/clmed-tests/PR623-canopytrim--C242d6d14-F76ec9103.fates.cheyenne.intel

/glade/u/home/glemieux/scratch/clmed-tests/PR623-canopytrim--C242d6d14-F76ec9103.fates.cheyenne.gnu

ERS_Ld30.f45_f45_mg37.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesSizeAgeMort FAILED on NPLANT_CACLS and NPLANT_CAPF, which is a known bug (#639).

SMS_Lm13.1x1_brazil.I2000Clm50FatesCruGs.cheyenne_intel.clm-FatesColdDef FAILED on the baseline comparison. The values appear consistent and expected, given that this is the longest run and should start showing differences in the emerging values resulting from the change to canopy_trim.

@glemieux (Contributor, Author) commented:

100-year BCI ACRE comparison plots. These runs were done with the standard FATES variable outputs; I can add more to the plots if necessary.

@rgknox (Contributor) commented Apr 30, 2020:

@glemieux @ckoven is it possible that bfr_per_bleaf is uninitialized in some cases? I'm looking to see if that is possible, but maybe you can rule that out for me:

https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR479

@rgknox (Contributor) commented Apr 30, 2020:

OK, I see that it is only used inside the same conditional where it is set. Never mind.

@rgknox (Contributor) commented Apr 30, 2020:

I think you had been bringing this point up before, @glemieux. I see here:
https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR511-R519

that we are kind of reinventing the wheel on calculating the actual SLA at the layer of interest. I think we discussed whether this PR should try to tidy this stuff up and build some functions that can be used consistently in different places, so there is no repeated code that could diverge. I still think it is fine to put this in another change-set, but I just wanted to acknowledge it here.

@rgknox (Contributor) commented Apr 30, 2020:

This is kind of a weird clause:

https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR579

It seems the intention here is really to filter out new plants. I think "isnew" might be more appropriate... although, I think this routine comes after the daily growth happens, so at that point no plants are new. So this clause only rejects plants that have not incremented their height growth yet. I still don't know why we do that.

if (nnu_clai_a(1,1) > 1) then

! Compute the optimum size of the work array
lwork = -1 ! Ask sgels to compute optimal number of entries for work
Review comment (Contributor):

Checked through the LAPACK dgels description. Verified that nnu_cla_b is not overwritten when work does not equal 0.
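For context, the lwork = -1 line in the hunk above is the standard two-call LAPACK workspace-query idiom; a fragment with illustrative names (not the PR's exact variables) looks like this:

    real(r8)              :: wkopt(1)
    real(r8), allocatable :: work(:)
    integer               :: lwork, info

    lwork = -1                                      ! query only: no solve is performed
    call dgels('N', m, n, nrhs, A, lda, B, ldb, wkopt, lwork, info)
    lwork = int(wkopt(1))                           ! optimal workspace size is returned in wkopt(1)
    allocate(work(lwork))
    call dgels('N', m, n, nrhs, A, lda, B, ldb, work, lwork, info)   ! actual least-squares solve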

Review comment (Contributor):

Could we add some documentation here describing the arguments that dgels is expecting, etc., like what we have for the LAPACK dgesv call here: https://github.com/NGEET/fates/blob/master/biogeophys/FatesPlantHydraulicsMod.F90#L4812?

Review comment (Contributor):

Actually, I see now that's what is going on in the code starting at line 400 above... sorry.

Review comment (Contributor):

I also like the idea of some comments in line: something that would explain which variables correspond to the matrix terms, i.e. Y = X * b, and how the X and Y terms are mapped into the arguments of dgels().

@glemieux (Contributor, Author) replied:

Roger that. I'll flesh out the comments in that part.
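For reference, the requested in-line documentation might take a shape along these lines (argument meanings summarized from the LAPACK dgels reference; variable names and dimensions are illustrative, not necessarily the PR's):

    ! We solve the overdetermined system  Y = X * b  in the least-squares sense:
    !   X (nll x 2) : column 1 = ones (intercept), column 2 = cumulative LAI per layer
    !   Y (nll x 1) : net-net uptake (net uptake minus leaf cost) per layer
    !   b (2 x 1)   : returned in Y(1:2,1) on exit; b(1) = intercept, b(2) = slope
    !
    ! call dgels(trans, m, n, nrhs, a, lda, b, ldb, work, lwork, info)
    !   trans = 'N' : use X as given (no transpose)
    !   m     = nll : number of observations (leaf layers in the fit)
    !   n     = 2   : number of fit coefficients (intercept and slope)
    !   nrhs  = 1   : single right-hand side
    !   a, lda      : the design matrix X and its leading dimension (>= m)
    !   b, ldb      : on entry Y, on exit the solution; ldb >= max(m, n)
    !   work, lwork : workspace and its size (obtained via the lwork = -1 query)
    !   info        : 0 on success, < 0 for an illegal argument, > 0 if X is rank deficient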

end if

! Check leaf cost against the yearly net uptake for that cohort leaf layer
if (currentCohort%year_net_uptake(z) < currentCohort%leaf_cost) then
Review comment (Contributor):

@glemieux, why do we allow wholesale trimming of the leaf layer in these conditions? I thought the idea was to let the regression identify the intersection point and just go with that value. It looks like the regression would also overwrite whatever was calculated here, so it seems unnecessary, right? Also, does the final solution adhere to the trim limit?

@glemieux (Contributor, Author) commented Apr 30, 2020:

Good question: we do this in case we have fewer than nll leaf layers to work with for a given cohort. In that case I decided to just let it stay with the old trimming methodology, barring some other rationale. There are two thoughts I had on this:

  1. I wasn't sure what we should do if a cohort with just one leaf layer comes into the routine, i.e. nv = 1. Presumably that would be a new cohort, so maybe we'd want to just leave it at the default starting trim fraction?
  2. Right now nll is somewhat arbitrarily hard-coded to 4 for all cohorts, regardless of the number of leaf layers, nv. I was thinking that there might be 'better' optimum intercepts if nll is larger for cohorts with more leaf layers (up to a point). That said, I could make the value of nll dependent on nv so that we'd never have to use the old method except for nv = 1. Alternatively, I could just set nll = 2 and call it a day.

@glemieux (Contributor, Author) commented Apr 30, 2020:

I think you had been bringing this point up before, @glemieux. I see here:
https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR511-R519

that we are kind of reinventing the wheel on calculating the actual SLA at the layer of interest. I think we discussed whether this PR should try to tidy this stuff up and build some functions that can be used consistently in different places, so there is no repeated code that could diverge. I still think it is fine to put this in another change-set, but I just wanted to acknowledge it here.

Roger that. I started working up a branch for another PR a couple weeks ago: https://github.com/glemieux/fates/blob/9e04bf7434b7c06fd5138dc2801755a6c7d85624/biogeochem/FatesAllometryMod.F90#L2213-L2278

@glemieux (Contributor, Author) commented Apr 30, 2020:

This is kind of a weird clause:

https://github.com/NGEET/fates/pull/623/files#diff-b2580903c042bbb143bb267a7b882e3fR579

Yeah, my assumption was that we wanted to make sure we weren't trimming new plants coming in. @rgknox, trim_canopy gets called at the end of ed_update_site; is that the routine called at the end of daily growth?

@rgknox (Contributor) commented May 7, 2020:

Checking if the cohort's height is greater than the minimum should achieve the same effect as filtering on the "%isnew" flag. The one situation where it would not, and this could be meaningful, is that this logic will also filter out plants that have not grown in size yet because they have insufficient carbon balance, yet these are the plants that would potentially benefit from optimizing their allocation. I think we should at least create an issue for this.

@glemieux changed the title from "Canopy trimming subroutine update" to "Canopy trimming subroutine optimization update" on May 7, 2020
@glemieux (Contributor, Author) commented May 7, 2020:

Checking if the cohort's height is greater than the minimum should achieve the same effect as filtering on the "%isnew" flag. The one situation where it would not, and this could be meaningful, is that this logic will also filter out plants that have not grown in size yet because they have insufficient carbon balance, yet these are the plants that would potentially benefit from optimizing their allocation. I think we should at least create an issue for this.

Created issue #645

@glemieux added the "status: Not Ready" label on May 15, 2020
@glemieux (Contributor, Author) commented Jun 3, 2020:

@rgknox @ckoven I have an alternate branch with the adjustments to the cumulative_lai calculation you mentioned in our weekly meeting here: https://github.com/glemieux/fates/blob/issue-383-linearfit-trimfraction-cohortcumulativelai/biogeochem/EDPhysiologyMod.F90#L501-L628

The results comparing the two versions are pretty much spot-on in the ACRE outputs: https://github.com/glemieux/fates-jupyter/blob/develop/leaf-flutter/acre-output/100year-cumulativelaitest_plots.pdf. Interestingly, the TSAI shows a larger difference than the TLAI does. A comparison against the baseline and the updated cumulative_lai branch is here: https://github.com/glemieux/fates-jupyter/blob/develop/leaf-flutter/acre-output/100year-base-cumulativelaitest_plots.pdf

@glemieux (Contributor, Author) commented:

@rgknox I've added some more comments at your suggestion. Would you give me your feedback when you get a chance?

@glemieux removed the "status: Not Ready" label on Jun 26, 2020
@rgknox (Contributor) commented Jul 2, 2020:

All PASS:
/glade/scratch/rgknox/clmed-tests/fates-sci.1.37.0_api.11.2.0-clm5.0.30-Ce33b4658-Fc1d1372f.fates.cheyenne.gnu
/glade/scratch/rgknox/clmed-tests/fates-sci.1.37.0_api.11.2.0-clm5.0.30-Ce33b4658-Fc1d1372f.fates.cheyenne.intel

@rgknox merged commit 154bd29 into NGEET:master on Jul 2, 2020
@glemieux deleted the issue-383-linearfit-trimfraction branch on March 21, 2022

Development

Successfully merging this pull request may close these issues:

  • SLA profile implemented, weird interaction with canopy trimming
