Increasing file size does not work with daemon + Clamscan "INSTREAM: Size limit reached, (requested: 65536, max: 0)" #131

@philly-vanilly

Description

I am using the google-cloud-sdk/slim docker image with this config (note the non-default size limits):

RUN apt-get install clamav-daemon -y && \
    npm ci && \
    curl -sSL https://sdk.cloud.google.com/ | bash && \
    echo "StreamMaxLength 40M" >> /etc/clamav/clamd.conf && \
    echo "MaxFileSize 40M" >> /etc/clamav/clamd.conf && \
    echo "MaxScanSize 40M" >> /etc/clamav/clamd.conf && \
    mkdir /unscanned_files
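Note that clamd only reads clamd.conf at startup, so the daemon has to be (re)started after these lines for the limits to take effect. For sanity-checking the values themselves, the clamd.conf size suffixes can be expanded by hand (K = KiB, M = MiB); this little helper is just illustrative arithmetic, not part of clamd or clamscan:

```javascript
// Illustrative helper (not part of clamd): expand clamd.conf size
// suffixes (K = KiB, M = MiB) into bytes to sanity-check the limits.
function clamdSizeToBytes(value) {
  const match = /^(\d+)([KM]?)$/.exec(value);
  if (!match) throw new Error(`unrecognized size: ${value}`);
  const n = Number(match[1]);
  if (match[2] === 'K') return n * 1024;
  if (match[2] === 'M') return n * 1024 * 1024;
  return n;
}

console.log(clamdSizeToBytes('40M')); // 41943040 bytes, comfortably above a 30 MB file
```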

and

  try {
    if (clamscan === null) {
      clamscan = await (new NodeClam().init({
        removeInfected: false,
        quarantineInfected: false,
        scanLog: null,
        debugMode: true,
        fileList: null,
        scanRecursively: true, // If true, deep scan folders recursively
        clamdscan: {
          host: XXX,
          port: XXX,
          timeout: 60000,
          localFallback: false,
          path: null,
          multiscan: true,
          bypassTest: false,
          configFile: '/etc/clamav/clamd.conf'
        },
        preference: 'clamdscan'
      }));
    }
  } catch (err) {
    console.error('ClamAV init failed:', err);
  }

But every scan attempt on a ~30 MB file ends in the error from the title. I have tried multiple Node.js clients (clamscan, clamdjs) and get the same error, so I believe the problem lies with the daemon itself. What is surprising is that the error says "max: 0": 0 should mean either no limit at all or a literal limit of 0, but if the limit really were 0 I would not be able to scan smaller files (below 25 MB) either, yet those work. Other ClamAV users have reported the same: Cisco-Talos/clamav#1210. I would have thought a slightly larger file should be no problem; others seem to use ClamAV on GB-sized files, although perhaps not from Node.js.
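One way to rule out the Node clients entirely is to speak clamd's INSTREAM protocol directly over a socket: the command is `zINSTREAM\0`, followed by chunks that are each prefixed with a 4-byte big-endian length, terminated by a zero-length chunk. A minimal framing sketch (the socket write itself is left out; the host/port would be the same XXX values as in the config above):

```javascript
// Build the raw INSTREAM frames clamd expects: "zINSTREAM\0", then
// <4-byte big-endian length><data> chunks, ended by a zero-length chunk.
// Writing the result to a net.Socket connected to clamd reproduces the
// daemon's behavior without any client library in between.
function buildInstreamFrames(data, chunkSize = 64 * 1024) {
  const frames = [Buffer.from('zINSTREAM\0')];
  for (let offset = 0; offset < data.length; offset += chunkSize) {
    const chunk = data.subarray(offset, offset + chunkSize);
    const header = Buffer.alloc(4);
    header.writeUInt32BE(chunk.length, 0);
    frames.push(header, chunk);
  }
  frames.push(Buffer.alloc(4)); // zero-length chunk terminates the stream
  return Buffer.concat(frames);
}
```

If clamd rejects even this raw stream with its size-limit reply, the limit really is enforced server-side, which would suggest the clamd.conf edited in the image is not the one the running daemon actually loaded.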

The error seems to be present in other (Python) integrations as well: Cisco-Talos/clamav#942

The only upvoted hint I could find was for the C# client (https://stackoverflow.com/questions/39371037/how-change-limit-file-size-of-clamd-service-for-nclam), where nClam has a client-side option "MaxStreamSize = 52428800" (50 MiB) in addition to the clamd configuration. Is clamscan perhaps missing such a buffering option?

How can I proceed here? Has anyone got clamscan to work with even slightly bigger files?
