Moreover, stackinator didn't complain about it: it just went on and actually produced a uenv, but the `uenv_vars` were not fully set as expected. The problem is the duplicate `prepend_path` entry.
IMHO this should raise an error (or at least a warning) about this problem.
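For illustration, a duplicated key of the kind described could look like this (a hypothetical sketch — the exact recipe schema and paths are made up, only the duplicated `prepend_path` key is the point):

```yaml
# Hypothetical environment description with a duplicated mapping key.
# When parsed with PyYAML, the second `prepend_path` mapping silently
# replaces the first, so PATH is never prepended.
prepend_path:
  PATH:
    - /user-environment/bin
prepend_path:
  LD_LIBRARY_PATH:
    - /user-environment/lib
```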
YAML Spec
The PyYAML library, which stackinator uses for reading YAML files, is not fully compliant with the YAML spec, which states (starting from YAML 1.0):
A mapping is an unordered set of key/value node pairs, with the restriction that each of the keys is unique.
And, in partial defense of PyYAML, this section of the YAML spec adds:
This restriction has non-trivial implications [...] Since YAML mappings require key uniqueness, representations must include a mechanism for testing the equality of nodes. This is non-trivial since YAML presentations allow various ways to write a given scalar.
The way PyYAML currently handles this problem is by silently ignoring duplicates: the last occurrence of a key overwrites the earlier ones.
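The overwrite behaviour, and one way to reject duplicates instead, can be sketched as follows (a minimal sketch using PyYAML's loader API; `UniqueKeyLoader` is a name I made up, not something stackinator ships):

```python
import yaml  # PyYAML

# PyYAML silently keeps the last value for a duplicated key:
print(yaml.safe_load("a: 1\na: 2"))  # -> {'a': 2}


class UniqueKeyLoader(yaml.SafeLoader):
    """SafeLoader variant that raises on duplicate mapping keys."""

    def construct_mapping(self, node, deep=False):
        seen = set()
        for key_node, _ in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in seen:
                raise yaml.constructor.ConstructorError(
                    None, None,
                    f"found duplicate key {key!r}", key_node.start_mark)
            seen.add(key)
        return super().construct_mapping(node, deep=deep)


try:
    yaml.load("a: 1\na: 2", Loader=UniqueKeyLoader)
except yaml.constructor.ConstructorError as exc:
    print("rejected:", exc.problem)
```

A check like this could be dropped into the recipe-loading path without switching libraries.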
Solutions
As said, I think that the YAML spec should be enforced. I don't think there are (and IMHO there shouldn't be) other cases where duplicate entries are useful.
The solutions I see at the moment to enforce the YAML spec are:
- ruamel.yaml
- ruyaml
I'm not sure about the differences, but both self-describe as "derivated from PyYAML", in which "many of the bugs filed against PyYAML, but that were never acted upon, have been fixed".
I opened this issue to decide if/how we would like to proceed.