use only cli_readline(); we don't need exact conversion
drop unused functions,
simplify encoding_norm_readline(), and rename it to encoding_normalize_toascii()
git-svn: trunk@3571
allow specifying the libbz2 location using --with-bz2-prefix;
by default the current prefix is also searched
locating gmp and bz2 works by default on OpenBSD (bb #301)
move local m4 macros to m4/
import lib-link.m4 and dependent files
quoting for autoconf macros (bb #452)
git-svn: trunk@3566
Markus Elfring <Markus.Elfring*web.de> in bb #452)
* use AC_CONFIG_HEADER, since AM_CONFIG_HEADER is obsolete
* put configure files into auxiliary directory
* fix main declaration in FD_SETSIZE test
* check for failure on fopen in FD_SETSIZE test
* move version from AM_INIT_AUTOMAKE to AC_INIT, old form was obsolete
* eliminate automake warnings, update Makefile.am
* rename .splitted to .split (requested by aCaB)
git-svn: trunk@3563
- introduce a global max data scan limit (on a per-libclamav-entry basis)
if reached, skip the current file (the current archive member)
if the archive is solid, try to skip to a non-solid archive member, if any (applies to cab folders).
if skipping is not possible, or the end of the archive is reached, return CL_CLEAN (by default, or MaxSmthng if configured)
(for solid archives skipping still uses CPU, so we need to account for it towards the max limit. If the archive doesn't have non-solid members,
we can't scan anything else from the archive, since once the limit is triggered, each further attempt would trigger it again)
- keep maxratio, but block only if the global max data limit is also reached
otherwise we skip the current file
account the skipped size (wrt the global max) only if the archive is solid
upon blocking we return CL_CLEAN (by default, or MaxRatio if configured)
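The skip/block policy above could be sketched as a single decision function; note that all names here (member_action, decide, the flag parameters) are hypothetical illustrations, not the actual implementation:

```c
/* Hypothetical sketch of the per-member decision described above.
 * solid:                member belongs to a solid archive (e.g. a cab folder)
 * has_nonsolid_members: archive still has non-solid members to try
 * global_limit_hit:     global max data scan limit already reached
 * ratio_exceeded:       maxratio exceeded for this member */
enum member_action { EXTRACT, SKIP_MEMBER, STOP_CLEAN };

static enum member_action decide(int global_limit_hit, int ratio_exceeded,
                                 int solid, int has_nonsolid_members)
{
    if (ratio_exceeded && global_limit_hit)
        return STOP_CLEAN;           /* block: CL_CLEAN (or MaxRatio)   */
    if (ratio_exceeded)
        return SKIP_MEMBER;          /* keep maxratio, but only skip    */
    if (global_limit_hit) {
        if (solid && !has_nonsolid_members)
            return STOP_CLEAN;       /* nothing else is scannable       */
        return SKIP_MEMBER;          /* try a non-solid member instead  */
    }
    return EXTRACT;                  /* under all limits: extract+scan  */
}
```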
- recursion: keep as is (configurable)
- drop mail specific limits
------ accounting API ------------
introduce an API to handle limits, to avoid duplicating logic in each archive handler.
global limit logic: account for the file size only if we are actually extracting it (limit not yet reached):
if (maxglobal + next_file_to_extract_size > maxglobalimit) { action(); }
else maxglobal += next_file_to_extract_size;
cli_limitscheck(cli_limits*, packed_size, unpacked_size)
- will also do global maxfile count accounting
- does maxratio accounting
- does the global size limit accounting
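A minimal sketch of what such a check could look like, combining the three accounting duties listed above; the struct fields and return codes are hypothetical placeholders, only the cli_limitscheck() name and its parameters come from the notes:

```c
#include <stddef.h>

/* Hypothetical return codes for illustration */
#define LIMITS_OK   0   /* extract and scan the member */
#define LIMITS_SKIP 1   /* skip this archive member    */

/* Hypothetical per-scan limits state; field names are illustrative */
struct cli_limits {
    size_t global_scanned;  /* data accounted so far        */
    size_t global_max;      /* global max data scan limit   */
    unsigned int files;     /* archive members seen         */
    unsigned int max_files; /* global max file count        */
    size_t max_ratio;       /* max unpacked/packed ratio    */
};

/* Sketch of cli_limitscheck(): account for a member before extracting it */
static int cli_limitscheck(struct cli_limits *l, size_t packed_size,
                           size_t unpacked_size)
{
    /* global max file count accounting */
    if (++l->files > l->max_files)
        return LIMITS_SKIP;

    /* maxratio accounting: block only together with the global size limit */
    if (l->max_ratio && packed_size &&
        unpacked_size / packed_size > l->max_ratio &&
        l->global_scanned + unpacked_size > l->global_max)
        return LIMITS_SKIP;

    /* global size limit: account only if we will actually extract */
    if (l->global_scanned + unpacked_size > l->global_max)
        return LIMITS_SKIP;

    l->global_scanned += unpacked_size;
    return LIMITS_OK;
}
```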
introduce API for recursion handling:
default action: skip when the limit is reached; configurable: block with recursionLimitReached.
we should block only if we also hit a size limit.
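The recursion rule above (skip by default, block only together with a size limit) could be sketched as follows; everything except the recursionLimitReached idea is a hypothetical illustration:

```c
/* Hypothetical recursion-limit state; names are illustrative */
enum rec_action { REC_SKIP, REC_BLOCK };

struct rec_limits {
    unsigned int depth;      /* current recursion depth              */
    unsigned int max_depth;  /* configured recursion limit           */
    enum rec_action action;  /* default: skip; configurable: block   */
    int size_limit_hit;      /* set when a size limit was also hit   */
};

/* Returns 1 when recursing further is allowed, 0 when the member must
 * be skipped, -1 when the scan should stop (recursionLimitReached) */
static int rec_check(const struct rec_limits *r)
{
    if (r->depth < r->max_depth)
        return 1;
    /* block only if a size limit was also hit */
    if (r->action == REC_BLOCK && r->size_limit_hit)
        return -1;
    return 0; /* default action: skip */
}
```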
-----defaults---------------
- increase default maxfiles (we now have a global max data scan size limit)
- increase recursion limits (we now have a global max data scan size limit)
git-svn-id: file:///var/lib/svn/clamav-devel/branches/newlimits@3548 77e5149b-7576-45b1-b177-96237e5ba77b
update dependencies to rebuild on hashtab.c change
support keys with common prefix by checking match length
update due to hashtab change
git-svn: trunk@3536