…hunks. This was in response to some exchanges with @sdeherto, who found that compilation with Intel sometimes fails. I recall a similar issue in the past with ed_state_vars.F90, and the solution was to make sub-routines smaller.
Description
This refactoring is an attempt to prevent compilation crashes. @sdeherto mentioned to me offline that the compilation was crashing at average_utils.f90 on some systems. From the error description, it seems the problem is that we reach some internal memory limit when compiling with ifort. I recall a similar error a long time ago in ed_state_vars.F90, and the solution back then was to reduce the size of sub-routines.

This pull request splits the very long sub-routines in average_utils.f90 by separating integration, normalisation, and flushing by hierarchical level (polygons, sites, patches, cohorts). I used a nested approach to preserve the same logic: the subroutine that integrates/normalises/flushes polygon-level variables calls the site-level counterpart, which then calls the patch-level equivalent, which in turn calls the cohort-level one.

In the process I found one minor bug in the integration of LAI, which I fixed.
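The nested approach can be sketched as below. This is an illustrative skeleton only: the subroutine names, derived-type names, and loop counters are hypothetical stand-ins, not the actual ED2 identifiers.

```fortran
! Hypothetical sketch of the nested call structure: each hierarchical
! level integrates only its own variables, then delegates to the next
! level down.  Names below are illustrative, not the real ED2 routines.
subroutine integrate_polygon_vars(cgrid)
   implicit none
   type(edtype), target :: cgrid       ! assumed polygon-level container
   integer              :: ipy

   polyloop: do ipy = 1, cgrid%npolygons
      ! ... integrate polygon-level averages here ...
      call integrate_site_vars(cgrid%polygon(ipy))
   end do polyloop
end subroutine integrate_polygon_vars

subroutine integrate_site_vars(cpoly)
   implicit none
   type(polygontype), target :: cpoly  ! assumed site-level container
   integer                   :: isi

   siteloop: do isi = 1, cpoly%nsites
      ! ... integrate site-level averages here ...
      call integrate_patch_vars(cpoly%site(isi))
   end do siteloop
end subroutine integrate_site_vars

! ... and likewise patch-level calls the cohort-level routine.
```

Because each routine now contains only one level's loop and variable updates, every compilation unit stays small, which is what avoids the internal limit ifort was hitting on the original monolithic sub-routines.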
Collaborators
@sdeherto, who pointed out the problem.
Types of changes
Changes in Settings, Input Files or Output Files
Expectation of Answer Changes:
Nearly bit-for-bit, because of the minor bug fix. This affects reporting only; no ecologically meaningful difference is expected.
Checklist:
Testing: