- introduce a global max data scan limit (on a per-libclamav-entry basis)
if reached, skip the current file (the current archive member)
if the archive is solid, try to skip to a non-solid archive member, if any (applies to cab folders).
if skipping is not possible, or the end of the archive is reached, return CL_CLEAN (by default, or MaxSmthng if configured)
(for solid archives skipping still uses CPU, so we need to account for it towards the max limit. If the archive has no non-solid members,
we can't scan anything else from it, since once the limit is triggered, each further attempt would trigger it again)
- keep maxratio, but block only if the global max data limit is also reached (see the sketch after this list)
otherwise we skip the current file
account the skipped size (towards the global max) only if the archive is solid
upon blocking we return CL_CLEAN (by default, or MaxRatio if configured)
- recursion: keep as is (configurable)
- drop mail-specific limits
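
A minimal sketch of the combined skip/block decision described above; the names (limit_action, decide, global_max, etc.) are illustrative, not the actual libclamav API:

    enum limit_action { EXTRACT, SKIP_MEMBER, STOP_ARCHIVE };

    /* scanned: data accounted so far; global_max: the new global scan limit */
    static enum limit_action decide(unsigned long scanned, unsigned long global_max,
                                    unsigned long member_size, int ratio_exceeded)
    {
        int global_hit = (scanned + member_size > global_max);

        if (ratio_exceeded && global_hit)
            return STOP_ARCHIVE; /* block: return CL_CLEAN (or MaxRatio if configured) */
        if (ratio_exceeded || global_hit)
            return SKIP_MEMBER;  /* just skip; for solid archives the caller still
                                    counts the skipped size towards the global max */
        return EXTRACT;          /* caller adds member_size to 'scanned' and extracts */
    }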
------ accounting API ------------
introduce an API to handle limits, to avoid duplicating logic in each archive handler.
global limit logic: account for file size only if we are extracting it (limit not yet reached)
if (maxglobal + next_file_to_extract_size > maxglobalimit) { action(); }
else maxglobal += next_file_to_extract_size;
cli_limitscheck(cli_limits*, packed_size, unpacked_size)
- will also do global maxfile count accounting
- does maxratio accounting
- does the global size limit accounting
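
A rough sketch of how such a helper could look; only the cli_limitscheck name and its three parameters come from the notes above, the cli_limits fields and the skip/ok return convention are assumptions for illustration:

    struct cli_limits {
        unsigned long maxglobal;      /* data accounted so far (hypothetical field) */
        unsigned long maxgloballimit; /* global max data scan limit */
        unsigned int  files;          /* files seen so far */
        unsigned int  maxfiles;       /* global max file count */
        unsigned long maxratio;       /* max unpacked/packed ratio */
    };

    /* Returns 0 if the file may be extracted, nonzero if it has to be skipped. */
    int cli_limitscheck(struct cli_limits *l, unsigned long packed_size,
                        unsigned long unpacked_size)
    {
        /* global max file count accounting */
        if (++l->files > l->maxfiles)
            return 1;

        /* maxratio accounting */
        if (packed_size && unpacked_size / packed_size > l->maxratio)
            return 1;

        /* global size limit accounting: count the file only if we extract it */
        if (l->maxglobal + unpacked_size > l->maxgloballimit)
            return 1;
        l->maxglobal += unpacked_size;

        return 0;
    }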
introduce an API for recursion handling:
default action: skip if the limit is reached; configurable: block with recursionLimitReached.
we should block only if we also hit a size limit.
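
A similarly hedged sketch of the recursion check (function and parameter names are made up for illustration):

    /* Returns 0 to keep descending, 1 to skip this level (the default action),
     * 2 to block the scan with recursionLimitReached. */
    int cli_recursioncheck(unsigned int depth, unsigned int maxdepth,
                           int block_on_recursion, int size_limit_hit)
    {
        if (depth <= maxdepth)
            return 0;
        if (block_on_recursion && size_limit_hit)
            return 2;   /* block only if a size limit was also hit */
        return 1;       /* otherwise just skip */
    }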
-----defaults---------------
- increase default maxfiles (we now have a global max data scan size limit)
- increase recursion limits (we now have a global max data scan size limit)
git-svn-id: file:///var/lib/svn/clamav-devel/branches/newlimits@3548 77e5149b-7576-45b1-b177-96237e5ba77b
update dependencies to rebuild on hashtab.c change
support keys with common prefix by checking match length
update due to hashtab change
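
Presumably the fix amounts to also comparing key lengths during lookup, so keys that share a prefix no longer match each other; a standalone illustration (not the actual hashtab.c code):

    #include <string.h>

    /* Keys match only if they have the same length and identical bytes,
     * e.g. the lookup key "amp" no longer matches the stored key "ampersand". */
    static int key_matches(const char *stored, size_t stored_len,
                           const char *lookup, size_t lookup_len)
    {
        return stored_len == lookup_len && memcmp(stored, lookup, stored_len) == 0;
    }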
git-svn: trunk@3536
* use fewer entities; browsers don't support them all either.
* update to generate code for new entconv.
* no need for configure, just use a simple Makefile
(it is an internal tool)
libclamav/entconv.c, hashtab.c, htmlnorm.c:
* don't allocate memory for each entity_norm call.
* don't touch the length of the mmapped area (bb #785)
* update htmlnorm to use new entity_norm
git-svn: trunk@3515
entconv improvements for security and performance
Part I (bb #686, #386)
TODO:
* optimize entity_norm
* create testfiles for unicode encoding variants
* create a regression test
* check for memory leaks
git-svn: trunk@3511