

You want to collect additional data that can’t be collected using the standard scanning capabilities of the Snow agent.


Previous Solution:

With Snow Inventory version 3 and version 5, Snow offered the possibility to run PowerShell scripts in the Windows client/agent to capture additional data.

These scripts were executed exclusively by the Windows agent or client. Platforms like Linux, Unix or macOS were not covered by this PowerShell solution.


New Solution:

Snow Inventory 6 enables additional data to be captured independently of the platform it is executed on. This new capability is called “Snow Dynamic Inventory”.

How does it work?

Since this new functionality is only available in Snow Inventory version 6 and higher, you’ll need to upgrade to this version to unlock its potential. Besides that, you also need Snow Agent version 6 running on each of the platforms in your IT estate.


Step 1

Create your own script to capture your specific data. The script language does not matter; use the scripting language you know best.

Guidelines for your own scripts:

  • The script name cannot be longer than 100 characters.
  • No spaces in the filename.
  • The script itself can be placed anywhere.


Output guidelines:

  • The data that you want to collect has to be in valid JSON format and Base64 encoded.
  • Output the result to the designated output folder, with the script name as the destination folder.




Output path per OS:

  • Linux: /var/tmp/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/
  • macOS: /Users/Shared/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/
  • Windows: %ProgramData%\SnowSoftware\Inventory\Agent\script-output\ + ScriptName\
  • Unix: /var/run/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/



Step 2

Create and maintain a log file in the output folder named ScriptName.log

Your script handles the lifecycle of the log file, i.e. deletion and/or overwriting.

Step 3

Create a metadata file along with the script output called metadata.json


The following tags must be included:

  • ScriptName: used by the server to separate JSON blobs in the database table
  • ScriptRunUTCTimeStamp: the time in UTC when the script ran
  • DaysToRetain: used by the server to determine how many days the data should be kept in the database before being cleaned out by the Garbage Collector


The server parses this information and uses it when storing the data in the database.


Example of metadata.json file
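A minimal sketch of what the metadata file could look like, assuming a script named "osrelease" (the field names follow the mandatory tags listed above; the values are illustrative):

```json
{
  "ScriptName": "osrelease",
  "ScriptRunUTCTimeStamp": "2019-06-01T02:30:00Z",
  "DaysToRetain": 30
}
```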



Step 4

Schedule your own script to run!
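On Linux or Unix, scheduling could be a simple cron entry like the sketch below (the script path is an assumption for illustration; on Windows you would use Task Scheduler instead):

```
# Hypothetical crontab entry: run the collection script daily at 02:30
30 2 * * * /opt/snow-scripts/osrelease.sh
```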


Steps 1–3 are all handled inside your main script.

All output data is Base64 encoded.

After the default Snow Agent scan runs, all output files are collected and then deleted.


You can find your collected data in the Snow Inventory database:



Example for Shell Script:
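As a minimal sketch of what such a shell script could look like on Linux, covering Steps 1–3 (the script name "osrelease", the collected fields, and naming the encoded output file after the script are illustrative assumptions, not part of the official solution):

```shell
#!/bin/sh
# Hypothetical Dynamic Inventory script: collects basic OS information.
# The output folder layout follows the guidelines in this post;
# adjust OUTPUT_BASE for your platform (the Linux path is shown here).
SCRIPT_NAME="osrelease"
OUTPUT_BASE="${OUTPUT_BASE:-/var/tmp/SnowSoftware/Inventory/Agent/script-output}"
OUT_DIR="$OUTPUT_BASE/$SCRIPT_NAME"
mkdir -p "$OUT_DIR"

# Step 1: capture the data as valid JSON and Base64-encode it.
# Writing the encoded result to a file named after the script is an
# assumption for illustration.
JSON_DATA=$(printf '{"nodename":"%s","kernel":"%s"}' "$(uname -n)" "$(uname -r)")
printf '%s' "$JSON_DATA" | base64 > "$OUT_DIR/$SCRIPT_NAME"

# Step 2: create and maintain the log file (ScriptName.log).
# Overwriting it on every run keeps the lifecycle handling simple.
echo "$(date -u) $SCRIPT_NAME ran" > "$OUT_DIR/$SCRIPT_NAME.log"

# Step 3: write metadata.json with the mandatory tags.
cat > "$OUT_DIR/metadata.json" <<EOF
{
  "ScriptName": "$SCRIPT_NAME",
  "ScriptRunUTCTimeStamp": "$(date -u +%Y-%m-%dT%H:%M:%SZ)",
  "DaysToRetain": 30
}
EOF
```

After the script runs, the agent picks up the folder contents during its next scan, as described above.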

Dear SnowGlobe users,

Our Knowledge Management team has activated a brand new page where you can track all the Knowledge Base Articles you've subscribed to.


Make sure you are logged in to the Support portal and then click here:  Notifications and Subscriptions

On this page you can see all your subscribed KB Articles at a glance; you can also unsubscribe from them or edit the notification settings.


To unsubscribe from a KB Article, click on "Subscribed":



To edit the notifications, click on "Notification Preferences":




Additional features will come later.

This mini-portal is very useful and we hope you like it!

New report: Files per computer


We’ve all been in the situation where we need to understand why a certain application is triggered in SLM. To figure out what triggered an application, you could either edit a computer to check this at machine level, or edit an application to see what triggered it on all computers. This was only available to users with edit permissions at computer/application level.


Providing a list with the combination of computers and applications therefore often resulted in creating custom reports for customers. This is why we’ve introduced a new standard report in SLM 9.2 that brings this functionality: "Files per computer". This report can be found in the “Standard reports” group.

This report makes finding relevant information, such as the path of a file, much easier, which in turn makes validating found software a lot simpler. Unassigned software still needs to be reviewed in the SMACC, as that information is not present in the “Files per computer” report; however, the report does show the “hidden” applications that are used for recognition by SRS.


Please always make sure to add search criteria before hitting “Show report”, since there is a lot of data to be loaded. If you forget to add search criteria, only the first 5 machines will be shown by default, to prevent unnecessary load.


Let’s use Java as an example! Java is currently a hot topic due to the changes in licensing that Oracle implemented starting January 2019. If you make use of Oracle Java versions, you need to identify the specific version and whether it was part of an installed third-party application, to determine if it needs to be licensed with Oracle. Therefore it’s good to have an overview of machines with Java and the executable path in one single report.

Using this “Files per computer” report, I’m now able to find all machines that have java.exe installed, and I can easily see from the path information whether it was part of a third-party application.