Recently a system used by one of my customers was misused to spread trojans/malware – maybe it was just an infected client, maybe it was intentional, it doesn't really matter. Since then I want to be sure that the public data directories accessible on the web are at least scanned regularly. ClamAV is a great tool for this and can also be used to detect PUAs (Potentially Unwanted Applications) and malware. The problem: that particular server holds many thousands of files, and a full scan takes nearly half a day. So I needed a way to speed this up by scanning only new or modified files.
You can do that simply by generating a list of new/modified files with find and piping them via xargs into a clamscan command:

find /var/www -mtime -2 -type f -print0 | xargs -0 -r clamscan -io --detect-pua=yes --move=/safe-directory --log=/var/log/clamscan.log

The -r (--no-run-if-empty) option of GNU xargs makes sure clamscan is not started at all when find returns no files – otherwise clamscan would fall back to scanning the current directory.
That's basically it – set it up as a cron job every night, and only files created or modified within the last two days are scanned.
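As a sketch, such a nightly cron job could look like the following – the file path /etc/cron.d/clamscan-recent and the 02:30 schedule are just assumptions, adjust them to your setup:

```shell
# /etc/cron.d/clamscan-recent (hypothetical path) – run nightly at 02:30 as root.
# Scans only files under /var/www created/modified in the last 2 days;
# -r keeps xargs from starting clamscan when find matches nothing.
30 2 * * * root find /var/www -mtime -2 -type f -print0 | xargs -0 -r clamscan -io --detect-pua=yes --move=/safe-directory --log=/var/log/clamscan.log
```

Note that files in /etc/cron.d expect the extra user field ("root" here); in a personal crontab edited with crontab -e that field is omitted.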
For comparison, this was the summary of the old full scan:
----------- SCAN SUMMARY -----------
Scanned directories: 162494
Scanned files: 675697
Data scanned: 231365.06 MB
Data read: 241158.82 MB (ratio 0.96:1)
Time: 32030.824 sec (533 m 50 s)
And this is the new, incremental one:
----------- SCAN SUMMARY -----------
Scanned directories: 0
Scanned files: 446
Data scanned: 98.45 MB
Data read: 124.91 MB (ratio 0.79:1)
Time: 121.423 sec (2 m 1 s)