Some of the MS samples previously covered by ClamAV have
AlgorithmIdentifiers that omit the (required) NULL byte, and I
had changed the code to make this a hard requirement in some
places. Now we allow the omission in all cases.
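A tolerant check could look like the following sketch (this is not ClamAV's actual parser; `check_alg_id` and its calling convention are invented for illustration). The optional NULL parameters field (`05 00`) after the OID is accepted but no longer required:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Illustrative sketch, not ClamAV's real code: accept an
 * AlgorithmIdentifier whether or not the optional NULL parameters
 * field follows the OID. `buf`/`len` cover the SEQUENCE contents;
 * `oid`/`oidlen` is the expected OID TLV (tag, length, body). */
static int check_alg_id(const uint8_t *buf, size_t len,
                        const uint8_t *oid, size_t oidlen)
{
    if (len < oidlen || memcmp(buf, oid, oidlen) != 0)
        return 0;                              /* wrong algorithm */
    buf += oidlen;
    len -= oidlen;
    if (len == 0)
        return 1;                              /* NULL omitted: now tolerated */
    if (len == 2 && buf[0] == 0x05 && buf[1] == 0x00)
        return 1;                              /* explicit NULL present */
    return 0;                                  /* trailing garbage */
}
```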
Also, I simplified the countersignature parsing code so that
any valid RSA OID is supported in the digestEncryptionAlgorithm
field... This makes the code cleaner and should avoid any
future deviations from the specification (if SHA1RSA is an
acceptable value to pass, SHA256RSA probably is too).
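One way to accept the whole family in a single check (a sketch only; the helper name is invented) is to compare everything but the last arc of the OID, since sha1RSA (1.2.840.113549.1.1.5), sha256RSA (1.2.840.113549.1.1.11), and the other PKCS#1 signature algorithms share the 1.2.840.113549.1.1 prefix:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Sketch only: treat any OID 1.2.840.113549.1.1.x (PKCS#1) as an
 * acceptable digestEncryptionAlgorithm, so sha1RSA, sha256RSA, etc.
 * all pass without enumerating every variant. `oid_body` is the DER
 * OID content (no tag/length). */
static const uint8_t PKCS1_PREFIX[] =
    {0x2a, 0x86, 0x48, 0x86, 0xf7, 0x0d, 0x01, 0x01};

static int is_rsa_oid(const uint8_t *oid_body, size_t len)
{
    return len == sizeof(PKCS1_PREFIX) + 1 &&    /* prefix + one final arc */
           memcmp(oid_body, PKCS1_PREFIX, sizeof(PKCS1_PREFIX)) == 0;
}
```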
This commit adds back in support for whitelisting files based on
signatures from .cat files loaded in via a '-d' flag to clamscan.
This also ensures that a .crb blacklist rule match can't be
overruled by a signature in a .cat file.
This commit makes the following changes:
- --dumpcerts will print certificates even if they already exist
in any .crb files loaded
- --dumpcerts will print certificates only once
- Having a whitelist CRB rule on a leaf certificate should no longer
prevent signature verification from happening. NOTE, this doesn't
mean that you can have whitelist rules for leaf certificates and have
that result in a trusted signature - that doesn't work yet
- Determining whether a certificate is blacklisted now includes comparing
the public key data (modulus and exponent) in addition to the subject
and serial hashes
- If a blacklisted certificate is detected, the code will return
immediately instead of continuing on to parse the rest of the signature
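The stricter blacklist comparison described above could be sketched like this (the struct and field names are invented for illustration and are not ClamAV's real ones):

```c
#include <stddef.h>
#include <string.h>

/* Illustrative sketch: a certificate counts as a .crb blacklist match
 * only when the subject hash, serial hash, AND public key (modulus and
 * exponent) all agree, instead of the hashes alone. */
struct cert_id {
    unsigned char subject_sha1[20];
    unsigned char serial_sha1[20];
    const unsigned char *modulus;   size_t modulus_len;
    const unsigned char *exponent;  size_t exponent_len;
};

static int crb_blacklist_match(const struct cert_id *a, const struct cert_id *b)
{
    return memcmp(a->subject_sha1, b->subject_sha1, 20) == 0 &&
           memcmp(a->serial_sha1,  b->serial_sha1,  20) == 0 &&
           a->modulus_len == b->modulus_len &&
           memcmp(a->modulus, b->modulus, a->modulus_len) == 0 &&
           a->exponent_len == b->exponent_len &&
           memcmp(a->exponent, b->exponent, a->exponent_len) == 0;
}
```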
There are some Windows binaries that have certificates with version 1
TBSCertificate sections. This technically isn't allowed by the spec,
but the Windows API still seems to report these as being OK.
In an earlier commit, I mistakenly checked whether a nested signature had
been seen when determining whether a countersignature was present, instead
of checking that the countersignature itself had been seen.
We used to get a pointer to file data without locking and for some samples
this pointer would be invalidated by the time we used it. Now, we just
store the offset for the sections that should be hashed as part of the
Authenticode hash computation and get the file data pointer right before
it's needed.
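The pattern can be sketched as follows (the fetch callback stands in for whatever file-map accessor resolves an offset to a pointer; all names here are invented):

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch of the fix: keep only (offset, size) pairs for the regions to
 * hash, and resolve each pointer just before it is consumed, so a stale
 * pointer is never held across other file-map operations. */
struct hash_region { size_t offset; size_t size; };

static void hash_regions(const struct hash_region *regions, size_t n,
                         const uint8_t *(*fetch)(size_t off, size_t len),
                         void (*update)(const uint8_t *data, size_t len))
{
    for (size_t i = 0; i < n; i++) {
        const uint8_t *p = fetch(regions[i].offset, regions[i].size);
        if (p)                /* pointer obtained just-in-time, never cached */
            update(p, regions[i].size);
    }
}

/* Tiny demo backing the sketch: "hash" (sum) bytes from a static file. */
static const uint8_t demo_file[8] = {1, 2, 3, 4, 5, 6, 7, 8};
static size_t demo_total;

static const uint8_t *demo_fetch(size_t off, size_t len)
{
    return (off + len <= sizeof(demo_file)) ? demo_file + off : NULL;
}

static void demo_update(const uint8_t *data, size_t len)
{
    for (size_t i = 0; i < len; i++)
        demo_total += data[i];
}

static size_t demo_sum(void)
{
    struct hash_region r[2] = {{0, 3}, {5, 2}};
    demo_total = 0;
    hash_regions(r, 2, demo_fetch, demo_update);
    return demo_total;
}
```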
A more reliable way to calculate the authenticode hash appears to
be to hash the header (minus the checksum and security table) and
then just hash everything between the end of the header and the
start of the security section.
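As a sketch (the offsets are parameters rather than hard-coded PE field positions, and the function name is invented), the hashed bytes then become four contiguous ranges that skip the 4-byte checksum field, the 8-byte security data-directory entry, and everything from the security section onward:

```c
#include <stddef.h>

/* Illustrative sketch of the Authenticode hash coverage described above:
 * hash the header minus the checksum and security-table entry, then
 * everything between the end of the header and the security section. */
struct region { size_t off; size_t len; };

static size_t authenticode_regions(size_t checksum_off, size_t secdir_off,
                                   size_t secdir_len, size_t hdr_end,
                                   size_t sec_section_off,
                                   struct region out[4])
{
    out[0].off = 0;                          /* file start .. checksum */
    out[0].len = checksum_off;
    out[1].off = checksum_off + 4;           /* after 4-byte checksum */
    out[1].len = secdir_off - out[1].off;    /* .. security dir entry */
    out[2].off = secdir_off + secdir_len;    /* after that entry */
    out[2].len = hdr_end - out[2].off;       /* .. end of headers */
    out[3].off = hdr_end;                    /* end of headers */
    out[3].len = sec_section_off - hdr_end;  /* .. security section */
    return 4;
}
```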
The diff is confusing, but basically I moved the countersignature
verification code into its own function, and in asn1_parse_mscat
we now loop through the unauthenticatedAttributes to find the
counterSignature attribute (instead of assuming it's the first
attribute in the list).
We also now do time-validation in the case where an unauthAttrs
section exists but doesn't include a counterSignature
If no unauthenticatedAttributes sections exist, the code will now judge
validity based on whether the code signing certificate is valid at the
time of the scan.
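The attribute scan could be sketched like this (the attr struct is invented; the OID bytes are the DER content encoding of the counterSignature attribute, 1.2.840.113549.1.9.6):

```c
#include <stddef.h>
#include <string.h>

/* counterSignature OID 1.2.840.113549.1.9.6, DER content bytes. */
static const unsigned char OID_COUNTERSIG[] =
    {0x2a, 0x86, 0x48, 0x86, 0xf7, 0x0d, 0x01, 0x09, 0x06};

/* Illustrative attribute record; the real parser walks DER in place. */
struct attr { const unsigned char *oid; size_t oid_len; };

/* Scan every unauthenticatedAttribute for counterSignature instead of
 * assuming it comes first; -1 means none found, so validity falls back
 * to checking the signing certificate at scan time. */
static int find_countersignature(const struct attr *attrs, size_t n)
{
    for (size_t i = 0; i < n; i++)
        if (attrs[i].oid_len == sizeof(OID_COUNTERSIG) &&
            memcmp(attrs[i].oid, OID_COUNTERSIG, sizeof(OID_COUNTERSIG)) == 0)
            return (int)i;
    return -1;
}
```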
In my sample set of 2,000 signed binaries, there were 69 with x509
certificates included that didn't seem to comply with the spec. These
weren't in the actual certificate chain used to verify the binary,
though, and the Windows verification API had no problems with it, so
we shouldn't either. The specific errors varied:
- 54 - expected NULL following RSA OID - For some
binaries this was due to an old "DUMMY CERTIFICATE" included
for some reason.
- 8 - module has got an unsupported length (392) - Binaries from
one company include 392-bit RSA keys for some reason
- 7 - expected [0] version container in TBSCertificate - Some
really old certificates don't seem to include the version
number (maybe the RFC didn't include one at the time?)