General Licensing Forums



In the past, we had no way to decrypt our inventory files (*.snowpack).

This is useful for customers with specific security requirements.


Previous Solution:

By default, our Snow Inventory Agent uses a standard key to encrypt (*.snowpack) files.

This key is hard coded in both Snow Inventory Agent for encryption and Snow Inventory Server for decryption.

All customers had to contact Snow Support to decrypt their own inventory files.

This was very time consuming.


New Solution:

Snow now makes it possible to create your own encryption keys.


How does it work?


With the tool AESKEYGEN, you can create your own custom crypto keys for a certain agent or group of computers. The tool creates a key that needs to be copied to a folder on the Inventory server as well as to a folder on the computers to be inventoried. The file is named <fingerprint>.cryptkey.


Run aeskeygen.exe <Path> to create your own crypto key; the resulting key file and its fingerprint are shown in the output.
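To illustrate the idea, here is a minimal Python sketch of what such key generation might look like. This is purely hypothetical: the actual file format and fingerprint scheme of aeskeygen.exe are not documented here, so the SHA-1 fingerprint and raw-bytes layout below are assumptions for illustration only.

```python
import hashlib
import secrets
from pathlib import Path

def create_crypto_key(target_dir: str) -> Path:
    """Generate a random AES-128 key and write it to <fingerprint>.cryptkey.

    Hypothetical sketch: the real aeskeygen.exe key format and
    fingerprint derivation are assumptions, not Snow's implementation.
    """
    key = secrets.token_bytes(16)                 # AES-128 uses a 16-byte key
    fingerprint = hashlib.sha1(key).hexdigest()   # assumed fingerprint scheme
    path = Path(target_dir) / f"{fingerprint}.cryptkey"
    path.write_bytes(key)
    return path
```

The resulting file would then be copied to the key folder on the Inventory Server and to the computers to be inventoried, as described above.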


Use this setting in the configuration file of the Inventory Server (snowserver.config) to specify the folder where the crypto keys are located:


To specify the fingerprint of the crypto key to use for snowpack encryption, and the folder where the key is located, use these settings in the configuration file of the agent (snowagent.config).


Deploy your Agent with your created crypto key.


With the tool SNOWPACK-UTIL, you can decrypt your generated (*.snowpack) files.

Use it with the following options:


The unpack command will decrypt the file and unpack the content to a sub-folder of the current folder. To unpack the content, use the following syntax:


The pack command will generate a new snowpack file based on the content of a specified folder and encrypt it with the fingerprint of your custom key.

To generate a file, use the following syntax:



  • Both tools can be ordered from Snow Support.
  • The encryption level is AES-128.
  • The tools cannot handle (*.inv) files.
  • The snowpack-util cannot decrypt files encrypted with the default key.
  • The standard Snow encryption and decryption always works.

Another vote, another stalemate. There is still no definitive conclusion on Brexit or exact guidance on how it will impact technology licenses. And it is clear that the uncertainty and the inability to prepare have already had an impact - while businesses are losing contracts or moving operations out of the UK to avoid the limbo, very few workers have heard anything from their IT departments, or indeed anyone else in the organization.


But asset managers and CIOs can lay the groundwork now to help them deal with whatever conclusion is eventually reached. Read the latest post from Victoria Barber, our resident Technology Guardian and former Gartner analyst, for her thoughts and recommendations on how to prepare for this unprecedented challenge.



What does Brexit mean for technology assets? 


How is Brexit impacting you and your business? What have you done to navigate this situation from a licensing perspective? Please share your thoughts and questions in the comments below. 


Snow is implementing more and more microservices in Snow License Manager and Snow Inventory.

We started with the Oracle Middleware Scanner, which is used by the Snow Oracle Service.

This service transports the delivered JSON Blob from Snow Inventory Server to Snow License Manager.


The Snow Integration Manager 5.15 will also partially work with this new function. There is also a microservice planned. Today, the new method for collecting data will only store JSON blob data in the Snow Inventory Database.


Previous Solution:

Previously, the data transfer from Snow Inventory Server to Snow License Manager depended exclusively on the DUJ. The data transfer could be time-critical, as it was run only once a day.


New Solution:

Dynamic Inventory allows any kind of data to be sent from Snow Agents / Snow Integration Manager and to be stored in the Snow Inventory Database. It collects and sends data as JSON blobs.


How does it work?

With dynamic inventory, we add more elements to the snowpack; these are not bound to any specific schema as long as they are JSON formatted.


All JSON blob data will be stored in the Snow Inventory Database




Dynamic inventory data always comes in pairs:

  • one with the metadata
  • one with the JSON data



The metadata is important for understanding the source, the scan date and DaysToRetain, which specifies how long the data should be stored before it is cleaned out.
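As a minimal sketch of how DaysToRetain could drive cleanup (an illustration of the retention rule, not Snow's actual cleanup code):

```python
from datetime import date, timedelta

def is_expired(scan_date: date, days_to_retain: int, today: date) -> bool:
    """A dynamic-inventory record expires DaysToRetain days after its scan date."""
    return today > scan_date + timedelta(days=days_to_retain)
```

A cleanup job would simply delete every record for which this returns True.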


You can query the JSON data in SQL Server Management Studio.

The '$' sign refers to the top level of your JSON data blob, and the timestamp is used to distinguish the data records.



The column "type" is important:

Type 0    null value
Type 1    string value
Type 2    double-precision floating-point value
Type 3    BooleanTrue or BooleanFalse value
Type 4    array value
Type 5    object value
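The same classification can be sketched in application code. The Python function below maps a parsed JSON value to the type codes listed above (note that booleans must be checked before numbers, since Python treats bool as a subtype of int):

```python
def json_type(value) -> int:
    """Map a parsed JSON value to the type codes used in the inventory table."""
    if value is None:
        return 0   # null
    if isinstance(value, bool):          # must come before the number check
        return 3   # BooleanTrue / BooleanFalse
    if isinstance(value, str):
        return 1   # string
    if isinstance(value, (int, float)):
        return 2   # double-precision floating point
    if isinstance(value, list):
        return 4   # array
    if isinstance(value, dict):
        return 5   # object
    raise ValueError(f"not a JSON value: {value!r}")
```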


When you have an array value in your JSON data, you can drill down.

We have two entries in our array, so we can use

$.KeyProtector[0] for the first entry of the array



And $.KeyProtector[1] for the second entry of the array
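The same drill-down can be sketched in application code. The helper below resolves simple paths of the $.Key[0].SubKey form against a parsed JSON document; the BitLocker-style blob it is run against is a hypothetical example shaped like the one discussed here.

```python
import json
import re

def get_path(doc, path: str):
    """Resolve a simple JSON path like '$.KeyProtector[0].Type'.

    Minimal sketch: supports only dotted keys and [n] array indexing.
    """
    current = doc
    for key, index in re.findall(r"([^.\[\]]+)|\[(\d+)\]", path):
        if key == "$":
            continue                      # '$' is the top level of the blob
        current = current[int(index)] if index else current[key]
    return current

# Hypothetical blob with a two-entry array, as in the example above
blob = json.loads('{"KeyProtector": [{"Type": "Tpm"}, {"Type": "RecoveryPassword"}]}')
```

For instance, get_path(blob, "$.KeyProtector[1].Type") walks into the second array entry.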




If further arrays occur in nested form, we can recognize this by Type 4 in the type column.


You can also query one special entry such as the RecoveryPassword.




From this point we need a process to transfer some of our data to… maybe a custom field in Snow License Manager or other systems.



This must be done, because we do not have our own service like the Snow Oracle Service.



The OPENJSON function is available only under compatibility level 130 or higher. If your database compatibility level is lower than 130, SQL Server can't find and run the OPENJSON function. Other JSON functions are available at all compatibility levels.

Step 4: Reconciliation

The goal of reconciliation is primarily to harmonize contract, purchase and entitlement information with normalized inventory data to establish an ELP (Effective License Position) - the balance of licenses purchased to licenses consumed. ELP forms the basis of compliance, risk-reduction, audit defense, contract (re)negotiations, license "true ups" and optimizing software spend. Given the variety of SAM stakeholders, to accomplish these tasks, reconciliation merges normalized asset data with related information from other, often external, sources.

Reconciliation - The questions you need to ask yourself

  • How/where do you maintain entitlement data for key publishers?
  • What are the sources of entitlement data?
  • How much time do you spend weekly/monthly/yearly on reconciling entitlements to inventory?
  • What processes are triggered when a shortfall or over-license situation is discovered?
  • What processes do you have which are fed by reconciled data?

Reconciliation holds together Contract, Purchase and Entitlement Information with normalized inventory data to establish an effective license position, which is the balance of purchased vs. consumed software. In practice, reconciliation requires you to add license information into a Software Asset Management (SAM) solution and then assign it to users, machines or organizational units, depending on the metric of the application.

License information is extensive, including the number of licenses purchased, license cost, additional use rights, maintenance and support contracts and who or what the license is assigned to. The accuracy of this process forms the basis of compliance, risk reduction, audit defense, contract negotiations, license true-ups and optimizing software spend. Due to the various stakeholders within the Software Asset Management process, to accomplish the previous tasks, reconciliation combines normalized inventory data with related information from other data sources. These sources can include procurement data, license and entitlement details and information about users and organizational structures from Active Directory.

Reconciled data is very useful for planning, modelling and dependency mapping, which are critical for Software Asset Management and License Optimization. Reconciliation is also very useful for other IT groups such as Service and Support, who may be developing CMDBs which contain configuration data or ticketing information. It is always highly recommended that any organization regularly performs reconciliation tasks to understand their current compliance position. This proactive approach will help minimize the risk of exposure in the event of a vendor audit.
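At its core, the ELP calculation described above is a per-application balance of entitlements against consumption. A minimal sketch (illustrative only; real reconciliation must also handle metrics, use rights and upgrade paths):

```python
from collections import Counter

def effective_license_position(entitlements: dict, installations: list) -> dict:
    """Balance licenses purchased against licenses consumed, per application.

    entitlements:  {application: number of licenses purchased}
    installations: one list entry per discovered installation
    Positive result = surplus (over-licensed), negative = shortfall.
    """
    consumed = Counter(installations)
    apps = set(entitlements) | set(consumed)
    return {app: entitlements.get(app, 0) - consumed[app] for app in apps}
```

A shortfall (negative balance) would trigger a true-up or harvesting process; a surplus flags over-licensing and potential savings.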

The Snow Way

Within Snow License Manager companies can manage all crucial licensing information within their estate. Once licensing information is entered into Snow License Manager, it will calculate a compliance position based on the normalized inventory data provided before. The result will be a compliance position for all software applications across the estate, giving involved stakeholders action steps to either reduce over-licensing or to prevent audit fines in case of under-licensing.



Step 3: Normalization

Enterprises often have multiple discovery and inventory solutions. Normalization is the consolidation of discovered inventory datasets to remove duplicated or conflicting information.



Normalization - The questions you need to ask yourself

  • How do you normalize your inventory data today? What inventory data is included?
  • Do you maintain your own catalog? How is it updated?
  • To what level is data normalized (publisher, title, edition, version, release)?
  • What other tools (i.e. ITSM, CMDB) do you populate with normalized data? 
  • What processes do you have which are fed by normalized data?

Data can be extracted from many different sources which means it will not be consistent. The process of Normalization presents it in a friendly and easily recognizable format. It removes duplicates to present just one source of truth about each software asset.

The primary benefit of a normalization process is an accurate organized inventory across different datasets. For example: One dataset recorded an executable file name as ‘application.exe’, whereas the other dataset recorded it as ‘application’. Even though it is only one licensed application, it has two unique descriptions, but should not be counted as two separate installations.

Many approaches to normalization classify and categorize inventory automatically, using databases of vendor, product and service names to standardize naming conventions of discovered inventory. Another good example of normalization is bundle identification. The inventory process might have discovered multiple applications including Microsoft Word, Microsoft Outlook and Microsoft PowerPoint installed on a laptop. Normalization determines that these are not three independently licensable applications, but a single suite.

Some normalization solutions may also provide default metric information, group applications into application families, add upgrade and downgrade paths, release dates and end of service information.
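The 'application.exe' vs. 'application' example above boils down to mapping variant spellings onto one canonical name before counting installations. A minimal sketch of that step (real normalization additionally uses a vendor/product catalog, as described above):

```python
def normalize_name(raw: str) -> str:
    """Collapse variant spellings ('Application.EXE', 'application') to one form."""
    name = raw.strip().lower()
    if name.endswith(".exe"):
        name = name[:-4]
    return name

def normalize_inventory(records: list) -> set:
    """Deduplicate (device, application) pairs across inventory sources."""
    return {(device, normalize_name(app)) for device, app in records}
```

Two datasets reporting the same installation under different names now contribute a single record instead of two.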



The Snow Way

Snow Software addresses these needs with its offering of the Data Intelligence Service (DIS), formerly known as the Software Recognition Service (SRS). It is available as a subscription service and provides our customers with up-to-date information for more than 500,000 applications from 80,000 vendors. If you would like to know more about our DIS/SRS, visit this blog post.



Step 2: Inventory

The purpose of inventory is to capture platform configuration information and to extract a complete list of all its software.

Inventory - The questions you need to ask yourself

  • How do you compile discovery data?
  • What reporting tools do you have?
  • What confidence level do you have in the coverage and accuracy of your Inventory data?
  • If you're using SCCM, do you have software inventory enabled? What about metering? 
  • Do you review your inventory data? How often? 
  • What processes do you have which are fed by inventory data?


The goal of the inventory stage is to identify all the software installed or executed on every platform in your network - and how multiplatform technology can help you present this information back in one single place. The identification of these software installations is foundational to any software asset management solution.

Understanding the various ways software publishers license their software is critical in developing an effective SAM practice. A complete inventory solution should help identify the various metrics that drive license requirements. Inventory of hardware configuration is also important to SAM, as many licenses define specific hardware configurations such as CPU & core. Some inventory technologies also gather detailed software usage metrics, which can inform business decisions about deployed software and are often required to optimize software spending.

Usage is also critical to identify software that isn't installed on user devices, such as web applications, Software as a Service (SaaS) applications or Virtual Desktop Infrastructure (VDI). This inventory stage helps customers to have a clear view of all their software investments across their network, removing risks associated with IT blind spots and preventing unnecessary IT spend.
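The hardware-configuration side of inventory can be illustrated with a few lines of standard-library Python. This is only a sketch of the kind of facts an agent gathers; a real inventory agent collects far more (installed software, CPU/core counts, usage metering, and so on):

```python
import platform
import sys

def basic_inventory() -> dict:
    """Collect a few hardware/OS facts of the kind an inventory agent gathers."""
    return {
        "hostname": platform.node(),
        "os": platform.system(),            # e.g. Windows, Linux, Darwin
        "os_release": platform.release(),
        "architecture": platform.machine(),
        "python": sys.version.split()[0],
    }
```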


Inventory using Snow


Of course it is possible to take inventory using our tools. We provide the Snow Inventory Agent for several operating systems such as Windows, Linux, Unix and macOS. We also provide a solution to scan Oracle databases with our Snow Inventory Oracle Scanner.


If you use an alternative solution to inventory your environment, our Snow Integration Manager may be your choice - by offering various Snow Connectors for third-party inventory sources, we are able to feed the retrieved data into Snow License Manager.


The following third party integration connectors are available out-of-the-box:


  • BMC Atrium Discovery and Dependency Mapping (ADDM)
  • BMC Discovery
  • Dell KACE
  • Discovery data from file
  • FrontRange Discovery
  • HEAT Discovery
  • HP Discovery and Dependency Mapping Inventory (DDMI)
  • HP Server Automation
  • HP Universal Discovery
  • IBM BigFix Inventory
  • IBM License Metric Tool
  • IBM Tivoli Asset Discovery for Distributed (TAD4D)
  • iQuate iQSonar
  • Ivanti Endpoint Manager
  • LANDesk
  • Microsoft Intune
  • Microsoft Intune (SCCM Hybrid)
  • Microsoft System Center Configuration Manager (SCCM)
  • Miss Marple
  • MobileIron
  • ServiceNow Discovery
  • Snow XML
  • Symantec Altiris (7.x - 8.x)
  • VMware
  • AirWatch

Note that this list might be updated in the future.




Step 1: Discovery

Discovery is the act of interrogating TCP/IP networks to identify network-attached physical and virtualized platforms upon which software executes.

Discover all platforms across the network, with minimal impact, and categorize them into those which have the potential to run enterprise software (e.g. Oracle) and those which do not.

This is to ascertain potential financial exposure from licensing requirements.


Discovery - The questions you need to ask yourself

  • What tools do you use to monitor your environment?
  • Are there platforms for which you don’t have discovery (Mac, virtual, cloud, UNIX/Linux)?
  • What percentage of your estate is covered by discovery tools?
  • What processes do you have which are fed by discovery data?


Discovery is the process of finding and identifying all platforms on which software resides. This first step is essential in developing a complete Software Asset Management strategy.
As the use of technology has evolved, so has the requirement to inventory new technologies - often virtual or off-network, like Software as a Service (SaaS), Infrastructure as a Service (IaaS) and mobile devices. Organizations will see more of their technology spend go to cloud services and mobile assets. This is especially important as many cloud and mobile technologies are onboarded with little or no involvement of IT, which results in a lack of visibility, making it increasingly difficult for SAM and security teams to fulfill their roles. It is critical to have an internal solution that provides appropriate discovery capabilities for the different technologies used within the environment. Essential for effective SAM is the ability to discover every asset in the estate that consumes software. A good inventory solution provides complete asset discovery as well as revealing the blind spots in the IT network. This means computers, servers, mobile devices, tablets, and all connected network devices like routers, printers and firewalls. Many customers have invested in capabilities like SCCM. It is critical that you have 100% visibility into the entire estate.


The Snow Way

With Snow Inventory, Computers and devices can be discovered using LDAP lookups in an Active Directory, or by using the following technologies for network discovery on specific IP address ranges:

  • SNMP (SNMPv1)
  • SSH
  • WinRPC/WMI
  • ICMP (“ping”)
  • TCP/IP fingerprinting
  • DNS lookup
  • NIC manufacturer lookup.

When TCP/IP fingerprinting is enabled, discovery will attempt to identify the type of OS installed on the device. For details about the discovery criteria and the columns included in each discovery view, see the following tables.
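Before probing with any of the technologies listed above (ICMP, SNMP, SSH, and so on), a discovery scan first enumerates the candidate addresses in the configured IP ranges. A minimal sketch of that enumeration step, using a documentation-only address range:

```python
import ipaddress

def discovery_targets(cidr: str) -> list:
    """Enumerate the host addresses a network discovery scan would probe.

    Excludes the network and broadcast addresses of the range.
    """
    return [str(host) for host in ipaddress.ip_network(cidr).hosts()]
```

Each address returned would then be probed (e.g. with an ICMP "ping"), and responders fingerprinted further.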


It is also possible to configure the Snow Inventory server to perform an Active Directory discovery using LDAP or secure LDAP for both devices and users. Multiple LDAP paths can be configured (e.g. for different subnetworks).


You can then check the result in the built-in report in Snow License Manager named "Discovered assets that are not inventoried". In the column "Source" you will be able to see the origin of the discovered asset.




DINROS - Create your SAM roadmap

The purpose of the framework DINROS is to understand current SAM tool capabilities, using that baseline to plan how and where to make improvements within six critical SAM activities.


DINROS - The six SAM activities


  1. Discovery

    Discovery is the act of interrogating TCP/IP networks to identify network-attached physical and virtualized platforms upon which software executes. Discover all platforms across the network, with minimal impact, and categorize them into those which have the potential to run enterprise software (e.g. Oracle) and those which do not.

    This is to ascertain potential financial exposure from licensing requirements.

  2. Inventory

    The purpose of inventory is to capture platform configuration information and to extract a complete list of all its software.

  3. Normalization
    Enterprises often have multiple discovery and inventory solutions. Normalization is the consolidation of discovered inventory datasets to remove duplicated or conflicting information.

  4. Reconciliation
    The goal of reconciliation is primarily to harmonize contract, purchase and entitlement information with normalized inventory data to establish an ELP - the balance of licenses purchased to licenses consumed. ELP forms the basis of compliance, risk-reduction, audit defense, contract (re)negotiations, license "true ups" and optimizing software spend. Given the variety of SAM stakeholders, to accomplish these tasks, reconciliation merges normalized asset data with related information from other, often external, sources.

  5. Optimization
    The goals of license optimization are to optimize software spending, keep track of changing software license structures and better manage vendors. Optimizing license position means being able to reduce the number, type and expense of licenses needed and in use, as appropriate.

  6. Sharing

    SAM tools both consume and produce data and information, which they must share to be useful. Gathering technical, financial and contractual data in a central system of record for IT assets enables I&O leaders to manage vendors effectively, and software assets from requisition through retirement. Implicit is the need for an integrated workflow and data persistence, driving the need for a centralized information repository in which to store and make available the results of activities. 




Are you new to Snow? Be sure to check out the landing page:

Welcome to Snow – Where to start and things to know

Step 6: Share

Gathering technical, financial and contractual data in a central system of record for IT assets is good. But to be useful, the consumed and produced data and information of the SAM tool needs to be shared among all responsible stakeholders within an organization.


Share - The questions you need to ask yourself

  • What kind of reports do you share with your responsible colleagues?
  • Do you schedule reports on a regular basis?
  • Is the produced information in the reports fitting the target audience?
  • Is all necessary information shared so that stakeholders can work with it effectively?


The last step, share, is when you collect the data that has been gathered in steps 1 to 5 of DINROS into customizable reports and present them to the appropriate stakeholders within the business. SAM solutions consume and produce valuable data which should be shared within the organization. Gathering and maintaining financial, compliance and contractual information in a SAM solution for IT assets enables stakeholders to manage software assets from requisition through retirement.

Reporting individualization tools such as interactive dashboards, data import and export, and advanced integration connectors form the backbone of a strong SAM implementation. Stakeholders need access to SAM data and require solid reporting capabilities that can produce ELP and compliance reports. Some external toolsets will also need access to SAM data. These toolsets may reside within the IT operations group - for example, passing normalized inventory data to CMDBs and associated service catalogues - in enterprise resource planning (ERP) for software procurement data, or in other areas of the business such as security.

SAM solutions also need to access external data, such as imported software vendor entitlement data. Sharing reported data and SAM data helps to drive efficiency in the customer environment and creates alignment between IT and business within the organization. It also enables multiple stakeholders to access different views, which enables them to be more productive and track what they need to know.


The Snow Way

With Snow License Manager it is easy to create your own reports or use existing ones, which can then be shared among the different responsible stakeholders. Reports can be scheduled for a defined time, so that they are sent automatically by the system with current information - also serving as a reminder.

Of course, stakeholders can get direct access to Snow License Manager to view your shared data directly - in the Snow Management and Configuration Tool, access to reports and the ability to share them can easily be granted.



Step 5 - Optimization

The optimization step is important in your software estate to reduce risk, ensure compliance and save significant cost for your organization. The focus of license optimization is to optimize software spending, keep track of changing software license structures and better manage vendors.



Optimization - The questions you need to ask yourself

  • Do you optimize your software spending?
  • Do you know if you are over- or underlicensed?
  • Do you keep track of changed software license structures?
  • Do you reallocate existing licenses instead of buying new ones?


Optimizing your license position means being able to reduce the number, type and expense of licenses needed and in use, as appropriate. Some approaches identify in-use software licenses and recommend license harvesting and reallocation, effectively reducing the number of new licenses required. One of the ways to create cost savings lies in effective software license optimization. Proactively managing software license optimization can help your organization reduce the likelihood and cost of a software audit, as well as delivering bottom-line financial benefits.


Most organizations have some level of underlicensing in their environment - they're using more software than they're entitled to. However, organizations can check or filter the allocated licenses against the ones they have already purchased. By managing software licenses more effectively, organizations can dramatically cut their software licensing spend while still improving service delivery to their users. Regular evaluations of a customer's license position can not only identify potential savings but also reduce risk. Automated reporting and threshold-based alerting can predict future compliance issues by notifying administrators when software is being used incorrectly or when approaching the limit of existing entitlements. Valuable SAM tools provide, for example, potential software cost saving reports. These reports detail software compliance and potential cost savings based on application usage. Other key reports can include unused applications per computer and reports indicating substandard computers based on criteria that the user specifies. This level of detail allows SAM stakeholders to be more proactive and prevent disruptive and expensive audit and compliance issues. Some SAM platforms offer automated workflow engines.


Each solution provides organizations with the capability to automate and integrate a diverse range of business processes, leading to increased value and effectiveness of your SAM program, your cloud strategy and device management. These platforms facilitate communication between the SAM platform and other business systems, enabling automation of processes such as software request, license harvesting, cloud positioning and device enrollment.

Often these solutions extend the capabilities of the SAM solution into existing organizational processes. By facilitating the transformation of massive numbers of unconnected manual tasks into automated processes the automated workflow engines deliver benefit not only to software asset managers but to all other stakeholders including HR, procurement, finance and business.

This type of solution drives optimization in several ways into the organization: Users are more likely to contribute to optimize in the use of organizational resources if they are aware of their service consumption and own the tool to control it. License Management becomes a continuously optimized process in which users can access the resources they need, when they need them, and unused resources can automatically be appropriated elsewhere.

These automated processes can also help organizations drive cost savings by automating manual processes, including approvals, and preventing some of the common resource-related issues such as overuse of software licenses.


The Snow Way

Within Snow License Manager, starting on the dashboard, under- or overlicensing is visible within seconds. On the detailed compliance page - for the application itself, the application family or the manufacturer - you can easily see your status.

Naturally, the optimization step is an ongoing task, supported by adjustable scheduled reports which help you always stay up to date.





How to implement your own local cloud rules.



It was always possible to detect the usage of websites using web metering, but customers had to enter their own patterns. This older version did not show the full numbers when, for example, there was a proxy connection, due to the redirecting of the proxy server.

Since then, there is a new way of collecting website usage. This is now done via browser plugins from the browser developers, and a central catalogue is managed by Snow.

New patterns are now added daily. Customers can also have links listed, but the links need to be universal; local patterns may not be suitable for all customers (for example, having the local canteen menu of a company listed).

The Cloud Rule Creator is the brainchild of our Brazilian Snow colleague gabriel.carvalho.



If “Cloud App Metering” has been activated, cloud rules will be sent to all defined computers.

Cloud rules for commercial portals are continuously being added by the Data Intelligence Service. Customers can also submit links, if they are for example commercial offerings.

These cloud rules are a collection of web links and can be viewed in the “webmetering.rules” file in the Snow Inventory Agent directory.

The browser plugins compare visited web links against these cloud rules to find any usage. Once a web link has been called up by a user, this information is sent to the Snow servers and processed, resulting in the web application showing up in Snow License Manager.

These findings are identified by the suffix “(Cloud)” under applications in the Snow License Manager.



There is a tool, which is not standard out of the box, but developed by a Snow employee to create your own local web rules.

This tool can be used, if the web links required cannot be added to the global catalogue. These are often internal web links and/or non-commercial offerings.

The “Cloud Rule Creator” can be installed anywhere but needs access to the Snow SQL database.

The overview shows how many items you have customized yourself.

 The first step is always to add a new application and then the rules. In this example we are adding the Snow Globe website.

After saving all the information, the following row of data has been generated in the “webmetering.rules” file:

{"friendlyName":"Snow Globe","pattern":"^(?i)(http|https)://(.+[.]|)community\\.snowsoftware\\.com/.*(\\?.*|)$","domain":"^(?i)(http|https)://(.+[.]|)community\\.snowsoftware\\.com"}


When this web site has been called, the metering information will be sent to the Snow servers and later shown on the Snow License Manager.
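To see how such a rule matches visited URLs, here is a small Python sketch using the pattern from the generated rule above. Note one translation detail: the (?i) inline flag in the rule is expressed via re.IGNORECASE, since Python only accepts global flags at the start of a pattern. This only illustrates the matching principle; it is not the browser plugin's actual implementation.

```python
import re

# Pattern from the generated "Snow Globe" rule, with (?i) moved to a flag
PATTERN = re.compile(
    r"^(http|https)://(.+[.]|)community\.snowsoftware\.com/.*(\?.*|)$",
    re.IGNORECASE,
)

def is_metered(url: str) -> bool:
    """Check whether a visited URL matches the 'Snow Globe' cloud rule."""
    return PATTERN.match(url) is not None
```

Any visit to a community.snowsoftware.com page (with or without a subdomain prefix) would be reported; unrelated sites are ignored.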



This tool is a great way to add your own non-commercial and/or local web sites that you want to discover. It is important to add the address of the website after a successful login, because only then is usage correctly detected. Generic landing pages are not recommended entries.



This tool is provided as is; it is not covered by the general support and update processes of Snow, so the implementation needs to be planned.

It is strongly advisable to implement/support this tool under the close supervision of a Snow Consultant.


If you would like to know more about this, please reach out to your local Account Manager.


What is CefSharp?

Posted by mark.lillywhite Employee Sep 16, 2019

I've noticed recently when analysing Snow data that CefSharp shows up quite regularly. Initial investigation of this rather interesting application suggests that it's an embedded web browser allowing a parent application to have web capability.

I didn't find that very useful though, as it's not something a user would use as such - and it's certainly not installed standalone. So where is it coming from?

CEF stands for Chromium Embedded Framework, and it's quite popular.

Digging into the program files (or the new 'Files per Computer' report, as of 9.3.3) shows most are part of various vendor bundles including Microsoft (Power BI Desktop and Personal Data Gateway) as well as Intuit, GitHub, EDrawings, Altium, Adobe and more...

There are also many versions of CefSharp with v73 being a recent version.

My question: there are many versions of CefSharp - do they need tracking, or are they embedded enough that there isn't a potential security issue?

Snow has always had two places to store the cost of an application and they are traditionally used for different reasons. The behaviour of this functionality recently changed – this article explains the current functionality and the recent change.


  1. Application Cost

    This is the cost to be used in any Risk, or Cost saving reports and the financial info tab for a computer (used in cost per business area / TCO type reports). It is entered by editing the application, and is easy to bulk upload prices for many applications as desired.

  2. License Cost

    This is the purchase price of the license divided by the number of actual licenses – in the example below the cost is £10 per installation.

  3. Financial Info

    This tab shows the cost of the hardware and the application costs of any software on that hardware (see below).

    These values can be reported in the 'Potential Cost Savings' report, and also the 'Cost of Unused Applications per Computer' report.

    It is important to note that in versions of SLM between 8 and 9.1, if there was a license cost AND an application cost for an application, the license cost would always win. This had one unfortunate side effect: if the license cost was zero, it would override any application cost, showing both risks and savings as zero (if you had a license entered).

    As of SLM 9.2 there has been a change:

    If the license cost is zero, Snow uses the application cost. If the license cost has a value, Snow will use that value and ignore the application cost. In the screenshot below the application cost is £140 and the license cost is zero.

    In the screenshot below license cost per install is £10 and application cost is still £140

This means that in the Family view, the Risk is shown at the license value – £10.

Whereas without a license cost, the Risk is shown using the application cost only.

Note: Snow will average the license cost per installation if there are many licenses with different prices.
In the example below we have two licenses at different prices (£300 & £500),

meaning the License average is £400 per install in this case.
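The precedence and averaging rules above can be sketched as follows. This is a minimal illustration using hypothetical figures; the actual calculation happens inside SLM, not in a script:

```shell
#!/bin/sh
# Sketch of the SLM 9.2 cost-selection rule described above (hypothetical values).

application_cost=140      # £ per installation (Application Cost on the application)
license_costs="300 500"   # £ price of each license, one installation covered by each

# Average the license cost per installation across all licenses: (300 + 500) / 2
total=0; count=0
for c in $license_costs; do
  total=$((total + c)); count=$((count + 1))
done
license_cost_per_install=$((total / count))

# SLM 9.2 rule: a non-zero license cost wins; a zero license cost
# falls back to the application cost instead of overriding it.
if [ "$license_cost_per_install" -gt 0 ]; then
  effective_cost=$license_cost_per_install
else
  effective_cost=$application_cost
fi
echo "Effective cost per install: £$effective_cost"   # prints £400 here
```

With the two example licenses above, the average is £400 per install, so the £140 application cost is ignored; set `license_costs` to "0" and the script falls back to £140.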


You want to collect additional data that can’t be collected using the standard scanning capabilities of the Snow agent.


Previous Solution:

With Snow Inventory version 3 and version 5, Snow offered the possibility in the Windows client/agent to run PowerShell scripts to capture additional data.

These scripts were executed exclusively by our agent or client. Platforms like Linux, Unix or Mac were not covered by this PowerShell solution.


New Solution:

Snow Inventory 6 enables additional data to be captured independently of the platform it is executed on. This new capability is called "Snow Dynamic Inventory".

How does it work?

Since this new functionality is only available in Snow Inventory version 6 and higher, you'll need to upgrade to this version to unlock its potential. Besides that, you also need Snow Agent version 6 running on each of the platforms in your IT estate.


Step 1

Create your own script to capture your specific data. The script language does not matter; use the scripting language you know best.

Guidelines for your own scripts:

  • The script name cannot be longer than 100 characters.
  • No spaces in the filename.
  • The script itself can be placed anywhere.


Naming Examples:



  • The data that you want to collect has to be in valid JSON format and Base64 encoded.
  • Output the result to the designated output folder, using the script name as the destination folder name.




OS        Output path

Linux     /var/tmp/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/

macOS     /Users/Shared/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/

Windows   %ProgramData%\SnowSoftware\Inventory\Agent\script-output\ + ScriptName\

Unix      /var/run/SnowSoftware/Inventory/Agent/script-output/ + ScriptName/




Step 2

Create and maintain a log file in the output folder, named ScriptName.log

The script itself must handle the lifecycle of the log file, i.e. deletion and/or overwriting.

Step 3

Create a metadata file along with the script output called metadata.json


The following tags must be included


  • ScriptName – used by the server to separate JSON blobs in the database table
  • ScriptRunUTCTimeStamp – the time in UTC when the script ran
  • DaysToRetain – used by the server to determine how many days the data should be kept in the database before being cleaned out by the Garbage Collector


The server parses this information and uses it when storing the data in the database.


Example of metadata.json file
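The original example image is not reproduced here; a metadata.json containing the three required tags might look like this (the timestamp format and retention value are assumptions for illustration, not taken from Snow documentation):

```json
{
  "ScriptName": "get-host-info.sh",
  "ScriptRunUTCTimeStamp": "2019-09-16T12:00:00Z",
  "DaysToRetain": 30
}
```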



Step 4

Schedule your own script to run!


Steps 1–3 are included in the main script.

All output data is Base64 encoded.

After your regular Snow Agent scan runs, all output files are collected and then deleted.


You can find your collected data in the Snow Inventory database:



Example for Shell Script:
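Since the original example is not reproduced here, the following is a minimal sketch of what such a script could look like. The script name, the collected fields, and the DaysToRetain value are hypothetical; the output path is the one listed for Linux in the table above:

```shell
#!/bin/sh
# Hypothetical Dynamic Inventory script covering Steps 1-3 above:
# produce Base64-encoded JSON output, a log file, and metadata.json.

SCRIPT_NAME="get-host-info.sh"   # assumed name: under 100 characters, no spaces
OUT_DIR="/var/tmp/SnowSoftware/Inventory/Agent/script-output/$SCRIPT_NAME"
mkdir -p "$OUT_DIR"

# Step 1: collect data as valid JSON and Base64 encode it,
# writing it to the output folder under the script's name.
JSON="{\"hostname\":\"$(hostname)\",\"kernel\":\"$(uname -r)\"}"
printf '%s' "$JSON" | base64 > "$OUT_DIR/$SCRIPT_NAME"

# Step 2: maintain the log file ourselves - here we simply
# overwrite it on every run so it never grows unbounded.
echo "$(date -u +%Y-%m-%dT%H:%M:%SZ) run completed" > "$OUT_DIR/$SCRIPT_NAME.log"

# Step 3: write metadata.json alongside the output.
cat > "$OUT_DIR/metadata.json" <<EOF
{
  "ScriptName": "$SCRIPT_NAME",
  "ScriptRunUTCTimeStamp": "$(date -u +%Y-%m-%dT%H:%M:%SZ)",
  "DaysToRetain": 30
}
EOF
```

Step 4 is then a matter of scheduling this script with cron (or an equivalent scheduler) so the output is present when the agent scan runs.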

Dear SnowGlobe users,

our Knowledge Management team has activated a brand new page where you can track all the Knowledge Base articles you've subscribed to.


Make sure you are logged in to the Support portal, then click here:  Notifications and Subscriptions

On this page you can see all the KB articles at a glance; you can also unsubscribe from them or edit the notification settings.


To unsubscribe from a KB Article, click on "Subscribed":



To edit the notifications, click on "Notification Preferences":




Additional features will come later.

This mini-portal is very useful and we hope you like it!