Working with an external XSD right now, and it has these simpleTypes defined:
<xs:simpleType name="String..1">
  <xs:restriction base="xs:string">
    <xs:maxLength value="1"/>
  </xs:restriction>
</xs:simpleType>
<xs:simpleType name="String1">
  <xs:restriction base="xs:string">
    <xs:length value="1"/>
  </xs:restriction>
</xs:simpleType>
Looks like xsd-parser is stripping the non-alphanumeric characters from the type names and arriving at String1 for both, causing a compilation error:
3 | ... } pub type String1Type = :: std :: string :: String ; pub type String20Type = :: std :: string :: String ; pub type String3Type = :: std :: string :: String ; pub type String35Type = :: std :: string :: String ; pub type String6Type = :: std :: string :: String ; pub type String60Type = :: std :: string :: String ; pub type String1Type = :: std :: string :: String ; pu...
| --------------------------------------------------- previous definition of the type `String1Type` here ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ `String1Type` redefined here
Is there a way to augment how it generates these? Even an ugly fix is fine: we're treating these generated bindings more like a -sys crate that we'll build higher-level abstractions over, so I just need the output to be correct and compilable.
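In case it's useful, here's the kind of workaround I could live with if the generator itself can't be configured: pre-process the XSD text to rename the colliding simpleType before handing it to xsd-parser, so `String..1` and `String1` no longer collapse to the same Rust identifier. This is just a sketch; the replacement name `StringMax1` is my own arbitrary choice, not anything from the schema or the crate.

```rust
// Hypothetical pre-processing step, applied to the XSD source before
// code generation. It renames both the simpleType definition and any
// type="String..1" references in one pass, since the quoted attribute
// value is the same string in both positions.
fn rename_colliding_type(xsd: &str) -> String {
    xsd.replace("\"String..1\"", "\"StringMax1\"")
}

fn main() {
    let xsd = r#"<xs:simpleType name="String..1">
  <xs:restriction base="xs:string">
    <xs:maxLength value="1"/>
  </xs:restriction>
</xs:simpleType>"#;

    let patched = rename_colliding_type(xsd);
    assert!(patched.contains(r#"name="StringMax1""#));
    println!("{patched}");
}
```

A build.rs step doing this rewrite before invoking the generator would keep it out of the checked-in schema, but a first-class renaming hook in xsd-parser would obviously be nicer.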