
1.4.6 Release Notes

November 16, 2021

The 1.4.6 release mitigates important memory allocation issues.

In specific scenarios, the scan aggressively indexes large files, driving up memory consumption. When the server becomes overloaded, scans can be forced into a perpetual reset loop.

This release addresses the issue in two ways.

Set Maximum File Size for Index and Classify

Admins can now preemptively exclude large files (which can trigger the memory issue) from indexing and classification, ensuring that scans complete in a timely fashion.

To set the Maximum File Size for Index and Classify, navigate to Policies > Settings, under “Tuning”.
There are two controls:

  1. Maximum File Size for Index and Classify, entered as an integer.
  2. Unit (KB, MB, GB, or TB).

By default, the Maximum File Size value is set to zero. A zero value means that there is no limit.
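As a rough sketch of how the two controls combine into a single threshold, the configured value and unit can be reduced to a byte count, with zero treated as "no limit". This is illustrative only; the function and constant names are assumptions, not the product's actual code.

```python
from typing import Optional

# Illustrative sketch (not product code): converting the configured
# "Maximum File Size" value and unit into a byte threshold.
UNIT_BYTES = {"KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

def max_size_bytes(value: int, unit: str) -> Optional[int]:
    """Return the threshold in bytes, or None when there is no limit."""
    if value == 0:
        return None  # a zero value means no limit is enforced
    return value * UNIT_BYTES[unit]
```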

If a file is larger than the maximum setting, indexing and classification of the file content will be skipped. The file will still be scanned, and all metadata about the file will be captured.
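The skip behavior described above can be sketched as follows. This is a hypothetical illustration under stated assumptions: the helper names and the returned record are invented for the example and do not reflect the product's internals.

```python
import os
from typing import Optional

def scan_file(path: str, max_size_bytes: Optional[int]) -> dict:
    """Illustrative sketch: metadata is always captured, but content
    indexing/classification is skipped when the file exceeds the limit."""
    size = os.path.getsize(path)
    record = {"name": os.path.basename(path), "size": size}
    if max_size_bytes is not None and size > max_size_bytes:
        record["skipped"] = True   # content too large: skip index and classify
    else:
        record["skipped"] = False  # content would be indexed and classified here
    return record
```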

The first 10 skipped files will display in the Task panel, and all skipped files will be posted in the general log with the following details:

  • File Name
  • File Size
  • Configured Max File Size

Memory Allocation

To reduce memory consumption, memory allocation now increases in a linear pattern, 1 GB at a time.

This throttles top-line performance but provides a more predictable memory allocation, suitable for a wider array of server resources.
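To see why fixed 1 GB steps are more predictable, compare linear growth with the common doubling strategy: linear growth never overshoots the required size by more than one step, while doubling can nearly double the peak allocation. This comparison is a sketch under assumed strategies, not the product's actual allocator.

```python
GB = 1024**3

def linear_alloc(needed: int, step: int = GB) -> int:
    """Grow in fixed-size steps; overshoot is bounded by one step."""
    allocated = step
    while allocated < needed:
        allocated += step
    return allocated

def doubling_alloc(needed: int, start: int = GB) -> int:
    """Grow by doubling; can overshoot the requirement by almost 2x."""
    allocated = start
    while allocated < needed:
        allocated *= 2
    return allocated
```

For a 5 GB requirement, the linear strategy stops at exactly 5 GB, while doubling jumps from 4 GB to 8 GB.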
