
This article relates to a real-world case – the customer’s name has been omitted.

The issues raised all relate to the UNIX Snow Agent. The customer needed Oracle scanning, so the SIOS.jar process was running alongside the standard snowagent scan. Before we begin, we must remember that:


  • We strongly recommend using Java Runtime Environment (JRE) version 1.8. Version 1.6 is the oldest supported, but performance is best on 1.8 or above.
  • NFS (Network File System) mounts are commonplace on UNIX boxes – essentially, these are remote directories, much like network drives on Windows.


Mounting Issues and Troublesome Runtimes

The customer came to us reporting that the UNIX agent was taking too long to complete both the standard scan and the Oracle scan. Why was this?

Two main reasons: the UNIX boxes in question were using JRE 1.6, and there was also a large number of NFS mounts that the agent was trying to scan every time. This could result in many gigabytes of extra data being scanned, across the network, unintentionally.

The issue with the NFS mounts was easily resolved by looking at the snowagent.config file and adding exclusions for those mounts:
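The screenshot of the exclusions is not included in this export. As a rough sketch only – the element names below are illustrative, not the agent's actual schema, so check the Snow Inventory Agent Configuration Guide for the exact syntax – the idea is to exclude the NFS mount points from the file system scan in snowagent.config:

```xml
<!-- Illustrative only: exclude NFS mount points from the file system scan.
     Element names and paths here are hypothetical; consult the Agent
     Configuration Guide for the exact schema of your agent version. -->
<FileSystemScan>
  <ExcludePath>/mnt/nfs_share1</ExcludePath>
  <ExcludePath>/net/archive</ExcludePath>
</FileSystemScan>
```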



Next came the issue of the Java runtime – the system JRE was version 1.6, which is supported, but it is recommended to use 1.8 – which is precisely what we did.

The problem was that upgrading the system JRE on all the UNIX boxes was not desirable. The solution, therefore, was to specify a separate JRE for the agent to use.

First, the version 1.8 JRE was placed in the agent directory - /opt/snow.



Next, the snowagent.config file is modified:



The value highlighted here can be modified to point to a specific Java Runtime.
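The referenced screenshot is missing from this export. As an illustrative sketch – the element name below is hypothetical, so verify it against the Agent Configuration Guide – the configuration was pointed at the 1.8 JRE placed under /opt/snow:

```xml
<!-- Hypothetical element name: point the agent at a specific JRE rather
     than the system-wide Java installation. -->
<JavaPath>/opt/snow/jre/bin/java</JavaPath>
```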

Problem solved – almost. The crontab also needs to be changed. The crontab is how UNIX/Linux devices schedule processes. By default, it looks like this:



The crontab line as shown above was changed:
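The before and after screenshots are not included in this export. A plausible sketch of the change, assuming the agent's cron entry was prefixed with a JAVA_HOME pointing at the bundled JRE – the exact edit in the original screenshot is not shown, so both the variable and the path are assumptions:

```shell
# Default entry installed by the agent package (scan at 21:00):
default='0 21 * * * nice -n 10 /opt/snow/snowagent>/dev/null 2>&1'

# Hypothetical modified entry: prefix the command with JAVA_HOME so the
# agent (and the SIOS Oracle scan) picks up the 1.8 JRE under /opt/snow.
modified='0 21 * * * JAVA_HOME=/opt/snow/jre nice -n 10 /opt/snow/snowagent>/dev/null 2>&1'
echo "$modified"
```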




This troubleshooting and resolution method provided a fix for this customer’s particular “pain points” without the need for any downtime or major changes to their platform.

Sluggish Snow License Manager tables, reports taking too long to load? This article aims to provide a starting point when diagnosing possible performance issues, particularly for larger environments.


If you have the following issues, you may need to look at your SQL Server platform:


  • Snow License Manager IIS web app usage generally slow and sluggish
  • Reports taking too long to generate
  • SLM website hangs
  • .SNOWPACK agent scan files in \Incoming\Data\Error folder with SQL timeout errors


Performance Anxiety

This article aims to provide advice on where to improve performance in the following areas:


  •  Speed of Inventory Agent .SNOWPACK file processing
  • Accessibility and usability of the Snow License Manager web application
  • Time taken for the overnight Data Update Job to run
  • Other general performance issues


Snow Software do not have any best practice guidelines for monitoring performance and usage. This task must be carried out by the customer. The advice offered in this document is largely based on previous engagements with similarly sized customers as well as the application of widely recognised industry best practices.



Dressed to Compress

SQL Server 2008-2014 Enterprise, or SQL Server 2016 Standard SP1, is recommended to allow Data Compression – this will reduce database size. Compression may increase CPU usage as data is compressed and decompressed, but it also reduces I/O, so overall performance should not suffer and the decrease in database size should offset this overhead.



A recommended disk configuration for large environments (over 100k seats) is as follows:

C: OS Drive

D: SnowInventory Database Data File, High-Speed Disk

E: SnowLicenseManager Database Data File, High-Speed Disk

F: TempDB Data File, High-Speed Disk

G: Log Files, High-Speed Disk

A configuration such as the above has been shown to give good performance for such large environments. The drives above should be separate physical disks and not separately partitioned volumes.


Monitoring - What's Going On?

To aid in identifying where the performance issue may lie, customers should investigate the SQL Server platform. As a start, the monitoring counters common to all Windows Servers can be used.




Third-party monitoring tools may also be used to get further analytical information. Some suggestions on areas to monitor could be:

  • Processor(_Total)\% Processor Time – A basic indicator that the server is running within accepted parameters. Values in the 20-40% range would be considered acceptable, but sustained spikes of over 80% could be a concern.
  • Memory\Available MBytes – Tracking the available memory can highlight whether the amount of memory installed on the server is the issue. It can also help establish whether other processes are using memory that could be used by SQL Server.
  • Paging File(_Total)\% Usage – If a lack of memory is causing issues, it could be that the page file is being used. Reading and writing to disk instead of memory will cause noticeable performance decreases.
  • PhysicalDisk(_Total)\Avg. Disk sec/Read and Avg. Disk sec/Write – These two counters show how fast the I/O subsystem can respond to data requests. Latency values of more than 20ms may be an issue and should not be expected if SSDs are used.
  • System\Processor Queue Length – This counter reports the number of threads queued for the processor. If it is consistently above zero, there are more requests than the processor cores can handle, which will have an impact on performance.
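The CPU thresholds above can be wired into a simple monitoring check. A minimal sketch – the threshold values come from the bullet list, not from any Snow tooling, and the function name is purely illustrative:

```shell
# Classify a sampled "% Processor Time" value against the rough thresholds
# described above: up to 40% acceptable, sustained >80% a concern.
classify_cpu() {
  if [ "$1" -gt 80 ]; then
    echo "concern"
  elif [ "$1" -le 40 ]; then
    echo "acceptable"
  else
    echo "watch"
  fi
}

classify_cpu 35   # acceptable
classify_cpu 85   # concern
```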





Many of our customers now have large environments of 80-150k seats. For this amount of data and users, they would see improved performance by carrying out some or all of the following:

  • Ensure a 16-core processor is used
  • Ensure RAM size is 128GB
  • Ensure storage volumes are split across enough physical disks and that the disks are high performance SSDs
  • Migrate to SQL Server 2014 Enterprise or SQL Server 2016 Standard SP1 in order to enable SQL Data Compression
  • Monitor SQL server performance for an extended length of time to ascertain the times and extent of the performance issues


Enemy: Unknown

Your organisation has Snow Inventory and Snow License Manager up and running. Agents are rolled out to all devices on your estate and you are starting to make use of all the data. One fact remains: your network is huge. Multiple VLANs, multiple regions connected by MPLS, secured networks and more – how can you be sure that you can see every network device? After all, SAM is only as effective as the data you put in. In this article, we will discuss an all-too-often overlooked functionality that Snow Inventory provides – Discovery.


The Gateway Drug

Snow Inventory offers scalability through the use of Snow Inventory Gateways. We can install as many Inventory Gateways as are required – this is included within your Snow Inventory license. These Gateway Service instances can then be used to feed discovery data from a network back to the Inventory Master Server.



Gateway Server instances can then be managed from within the Inventory SMaCC console on the Master Server:


Double-clicking into a Gateway will allow you to configure Network Discovery:



Discovery Methods

Now it’s time to look at the different types of discovery we can use…


Active Directory

Using an LDAP query, we can identify machines across any number of domains. The data gathered can then be cross-referenced by Snow License Manager to identify any computers in the domain or domains and give an output of the machines that are not inventoried (i.e. there is no Snow Inventory Agent installed on the machine).

Any domains that do not have a Trust Relationship to the domain where the Master Server resides will require a Gateway Server within that domain.


SNMP (Simple Network Management Protocol)

Not all network devices can be fully inventoried, but you can still discover them. Who knows what devices you may have out there sitting in frame rooms? SNMP, or Simple Network Management Protocol, is usually used for remote management of simple devices – uninterruptible power supplies (UPS), routers, switches, printers and other such devices may not even be running a full operating system but still have network connectivity so that they can report back basic information to your IT team – IP address, MAC address, serial number, firmware version etc. Snow Inventory can use this to discover and report on such devices.


DNS Lookup

Domain Name System lookup – DNS assigns a name to an IP address. Inventory can use this to attach hostnames to IP addresses to further identify devices.


TCP/IP Fingerprinting

TCP/IP Fingerprinting is used to try and determine what OS is behind the IP address that has been discovered. This can particularly help identify elusive Linux and Unix machines, as well as Windows machines if WinRPC is unable to identify them.



WinRPC (Windows Remote Procedure Call)

This protocol is used solely for Windows remote management and is another tool that Inventory can use to identify a Windows machine on the network. Port 135 must be open on the target machine for it to be scanned via WinRPC.


SSH (Secure Socket Shell)

The SSH protocol is most often used to remotely manage Unix devices – for example, when using a tool like PuTTY – and is also used (usually via port 22) to secure copy (SCP) files to a Unix-based machine. Using this protocol, Inventory can identify Unix machines on the network.


Making Use of Discovery Data

Once Inventory has discovered at least two of the following – an IP address, a MAC address and a hostname – the device is considered discovered and will show up in Discovered Devices reports within Snow Inventory.
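The "two of three" rule above can be sketched as a simple check. Illustrative only – this is not how Inventory stores or evaluates its discovery data:

```shell
# A device counts as "discovered" once at least two of (IP address,
# MAC address, hostname) are known. Empty arguments mean "unknown".
is_discovered() {
  count=0
  for attr in "$1" "$2" "$3"; do
    [ -n "$attr" ] && count=$((count + 1))
  done
  [ "$count" -ge 2 ] && echo "discovered" || echo "not yet"
}

is_discovered "10.0.0.5" "00:1a:2b:3c:4d:5e" ""   # discovered
is_discovered "10.0.0.9" "" ""                     # not yet
```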


Within Discovery, there are a number of default views:



AD and SIM Computers – All computers that have been found by the Active Directory discovery or any SIM Connectors.


Reachable Network Devices – Any devices picked up by the SNMP protocol, i.e. switches, printers etc.


Reachable Unknown Devices – These devices have been discovered but there is not enough information to determine much more than the IP address and MAC.


Reachable Computers – These devices have been discovered by WinRPC/WMI, TCP/IP Fingerprinting or Active Directory, with enough information to determine the operating system.


Reachable Computers with Snow Inventory Client 3.x for Windows – This is useful for identifying any Windows machines that are still using the old Inventory Client. These machines can then be targeted for Inventory Agent deployment.


This document describes how to install and run the Snow Inventory Agent for Linux. For more detail, please refer to the full user guide - SIAL510_UserGuide_LinuxAgent.pdf

The Snow Inventory Agent for Linux is part of the Snow Inventory solution and is used for inventory of Linux computers. The agent scans the computer and saves the collected data to a compressed and encrypted file, which is sent to a Snow Inventory server (Master Server or Service Gateway).

For detailed information on the configuration of the agent, refer to the document Snow Inventory Agent Configuration Guide.


This version of the Snow Inventory Agent can only be used in a Snow Inventory Server 5.x environment. Supported operating systems are listed in the document System Requirements for all Snow Products.

Java Runtime Environment

To be able to inventory Oracle database products by using the Snow Inventory Oracle Scanner (SIOS), the target computer is required to have Java Runtime Environment 6.0 (1.6) or later installed. Due to a defect in Java, Java Runtime Environment 1.7.0_7 must not be used.


Please note that in the example terminal commands, anything in [brackets] is not part of the command - for example [return] means to hit the return or enter key.


The Snow Inventory Agent for Linux can be installed using prepared packages or using copies of the binary files.

Installation Packages

Installation packages are prepared by and ordered from Snow Support. The current configuration file needs to be provided before any RPM or DEB package can be prepared. If no configuration file exists, certain information is needed in order to create one.

Required information:

  • Address to the Snow Inventory Server, including port number
  • Site name



  • Name of the configuration file
  • Cron scan time (default is “daily”)
  • If the Snow Inventory Oracle Scanner should be included
  • If the previous versions of the Snow Inventory Client for Linux should be removed


The installation package can be copied to any folder, but preferably not the /root folder. The install command must be run from the folder where the package is located, while the uninstall command can be run from any folder.

During the installation of the agent, an entry will be automatically added to the crontab to run the agent. The command looks like this: nice -n 10 /opt/snow/snowagent
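The nice -n 10 prefix lowers the agent's CPU scheduling priority so a scan does not compete with production workloads. The effect can be seen with any command (this assumes the parent shell runs at the default niceness of 0):

```shell
# Run a command at niceness 10 and print the niceness it actually received.
nice -n 10 sh -c 'ps -o ni= -p $$' | tr -d ' '
```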

Prepared RPM Package

This section describes how to install and uninstall an RPM package from a terminal session. Use sudo or run the commands as root.


To install, execute the rpm command with the argument -i, for example: sudo rpm -i snowagent_5.1-1_i386.rpm


To uninstall, execute an rpm command with the -e argument – you do not have to run this from within any particular folder: sudo rpm -e snowagent

Prepared DEB Package

This section describes how to install and uninstall a DEB package from a terminal session. Use sudo or run the commands as root.


Execute the dpkg command with the argument -i, for example:

sudo dpkg -i snowagent_5.1-1_i386.deb


To uninstall, execute the dpkg command with the argument -P, for example: sudo dpkg -P snowagent


This section describes how to manually run a scan and then check the logs. Some light knowledge of Unix commands is advantageous but not completely necessary. Once the steps above have been followed for your type of Linux OS (Red Hat or Debian), you can follow the steps below to complete and verify the installation.

  1. Verify the contents of /opt/snow using the following commands from the terminal. Anything in [square brackets] is an instruction or information and is not part of a command:
     cd /opt/snow [Return]
     You should now be in the directory the agent is installed into. Verify this with:
     ls [Return]
     This should output the contents of /opt/snow and should look something like this:
     data [this is a directory] snowagent snowagent.config snowcron
  2. The files of the Snow Inventory Agent are one executable file called snowagent and one configuration file called snowagent.config. These two files are, by default, located in the /opt/snow directory. To see a command-line summary of the executable, use the following command from a terminal window:
     sudo /opt/snow/snowagent -? [sudo is needed if you are not running as superuser]


  3. You will notice that one of the commands is scan. Run snowagent with the scan argument:
     sudo /opt/snow/snowagent scan [Return]
  4. You can now verify that the scan took place by looking at the /opt/snow/data folder. If you are already in the /opt/snow folder, you can simply do:
     cd data [Return]
     Or you can point to the full path:
     cd /opt/snow/data [Return]
     Use ls to list the contents of /data:
     ls [Return]
     This should list something like the following:
     result-000001512674563-1153644345674295837.snowpack [this is the .snowpack scan file] snowagent.lock snowagent.log [this is the log file]
     You can read the snowagent.log file if you wish:
     more snowagent.log [Return]
     [A successful scan will have an Info log entry stating that the agent has finished building the snowpack]
  5. Before sending the data to Snow Inventory, you might want to verify your snowagent.config file. To do this, navigate back to /opt/snow, either by using cd .. if you are in /opt/snow/data or by using cd /opt/snow. To read the snowagent.config file:
     more snowagent.config [Return]
     This will display the .config file. Use the return key to read through the document. You want to verify that the <SiteName> and <Address> entries under <Endpoint> are correct, as well as any other configuration items you may have specified when creating the agent. When you are finished reading, press Q.
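As a quicker alternative to paging through the file with more, the endpoint settings can be pulled out with grep. The sample file below is created only so the example is self-contained – on a real system you would grep /opt/snow/snowagent.config directly, and the address and site name shown are placeholders:

```shell
# Create a sample config for illustration only (placeholder values).
cat > /tmp/snowagent.config <<'EOF'
<Configuration>
  <Server>
    <Endpoint>
      <Address>https://inventory.example.com:443</Address>
      <SiteName>EXAMPLE-SITE</SiteName>
    </Endpoint>
  </Server>
</Configuration>
EOF

# Pull out just the endpoint settings to verify them at a glance.
grep -E '<(SiteName|Address)>' /tmp/snowagent.config
```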


  6. To send your generated .snowpack file, follow the same steps as the scan, except use the send argument:
     sudo /opt/snow/snowagent send [Return]
     You can now navigate back to /opt/snow/data and use ls to list the contents. The generated .snowpack file should now be gone, as it has been sent to the Inventory server. Again, you can use more snowagent.log to verify that it has been sent.



To configure the scheduling, crontab is used with the argument -e. Run this at root level (you may need to run it from /root):

sudo crontab -e [Return]

You may be asked which editor you want to use; usually Nano is used, so we will use that for this example. Once the crontab has opened, you will see the default line inserted by the install package:

0 21 * * * nice -n 10 /opt/snow/snowagent>/dev/null 2>&1

This translates to the scan running at 21:00 – 9pm on a 24-hour clock. Notice that the minutes are put ahead of the hours. Use single digits where applicable, so 5 minutes should be written as 5 and not 05. For example, to set the scan time to 10am, you would modify the line to read:

0 10 * * * nice -n 10 /opt/snow/snowagent>/dev/null 2>&1

Once the changes have been made, press CTRL-X to exit and press Y to save changes. The scheduling has now been set.

This article serves as a high-level “cheat sheet” of common questions frequently asked by Snow Software customers. The purpose is to provide a starting point for understanding the new ServiceNow 3.1 connector and how it works.


This document should be used in conjunction with the most up to date ServiceNow Connector documentation.


It is important to remember that while Snow Software consultants do have a good level of ServiceNow understanding, we are not expected to provide ServiceNow administration-level guidance or expertise.


“What does it do?!”

Usually the person asking this will want a high-level answer – our ServiceNow Connector actually comprises two separate connectors available to download from the ServiceNow Store: the Snow Software Catalog Integration, which populates the ServiceNow Product Catalog with hardware and software models, and the Snow Software CMDB Integration, which populates the ServiceNow CMDB with information on hardware and software assets as well as users. Additional information can be shown in real time, but this functionality requires a Mid Server (see below) or for Snow License Manager to be externally facing.


“How often will ServiceNow be populated with this data?”

This depends on how the SIM connector is configured and requires discussions with the customer’s technical infrastructure contact – typically, it is run outside of working hours on a nightly basis, ensuring it is not run when the SLM Data Update Job is running (21:00 by default).


“What information will ServiceNow receive from SLM? What are the field mappings etc?”

Refer to document SSN31_TechnicalReference_ReplicatedAssetInformation.pdf – the ServiceNow administrator should review this document.


“Who do we need to involve during the implementation process?”

Several people should be involved in any engagements regarding the ServiceNow Connector: a technical infrastructure resource to help install the SIM connector and give you access to the SLM SMaCC to create the API user; the customer's ServiceNow administrator, whether in-house or out-sourced; plus any other stakeholders for both ServiceNow and Snow.


“What connectivity is required from our end?”

Typically, we will use the SIM on the Inventory server – this server will need to be able to contact their Snow License Manager 8 webpage as well as the ServiceNow instance webpage.


“What should we have in place prior to the implementation?”

The Catalog Integration and CMDB Integration ServiceNow applications are both required for implementation – it is important to notify customers that these must be requested through the ServiceNow Store, but that we approve every request manually – they must allow up to two working days for this request to come through. This must be highlighted to customers to avoid nasty surprises on the day of implementation!


Additionally, we require at least SIM v5.8 to be installed, usually on their Inventory server – we can carry out this upgrade on the day of the implementation, but they must be notified in-case they have to raise a change control request on their side to approve this change to their environment.


Finally, SLM must be on the latest version.


“What credentials/user accounts will we need to provide?”

As per SSN31_UserGuide_InstallConfig.pdf page 3, we require a Snow License Manager API User account (which we can help them create from the SMaCC) and a ServiceNow account with the two roles detailed in the pre-requisites.


“Will integrating Snow and ServiceNow overwrite any of our existing data on either SLM or ServiceNow?”

The ServiceNow CMDB aggregation is a one-way flow of data from the SLM REST API to ServiceNow. Any data from other data sources flowing into ServiceNow will not be overwritten, and no data within Snow License Manager will be changed.


“Why do we need a Mid Server/What is a Mid Server?”

A ServiceNow Mid Server is an on-premise Linux or Windows service that ServiceNow provides. It facilitates connectivity between ServiceNow and a third-party source. The ServiceNow Connector has the optional ability to provide real-time compliance information from within ServiceNow and this can be used for real-time workflows between ServiceNow and SLM. Where a customer requires this functionality, they will need a Mid Server within their internal network only if their SLM webpage is not externally facing – if it is externally facing, then ServiceNow can interface with SLM itself, without the need of the Mid Server.


It is important to note that it is not Snow’s responsibility to configure or maintain the ServiceNow mid-server.


“What are the success criteria for the implementation – how can we verify the implementation was a success?”

During the implementation, the Implementation Consultant will ensure that the SIM is set to show all details within the logs. The Catalog Connector will be run first, and the SIM log will be monitored alongside the Transform History log within ServiceNow. Once this is complete, the same will be carried out for the CMDB Connector, which must run after the Catalog Connector. Once both have run, information from Snow License Manager should be populated within the Computer asset tables in ServiceNow, and the Software and Hardware Models should also be reviewed.


“How can we keep the ServiceNow Connector up to date?”

From within ServiceNow, you can check the Catalog Integration and CMDB Integration applications are up to date from within the System Applications > Applications > Updates section of ServiceNow.


Snow Inventory 6 – How Secure Is My Data?


For as long as Information Technology has been around, security has been a concern – and for good reason. Data security should be a top priority within your organisation. Have you ever considered how secure the data is as it is gathered by the Inventory Agent, shuttled to the Inventory server to be written to the database and, after the Data Update Job has run, presented in Snow License Manager?

In this article, we will take a brief look at the measures the Snow Inventory Agent and the Snow Inventory solution itself takes to ensure your data is secured.


The .SNOWPACK file – it’s not just a clever name: it really is an archive file that contains the result of the scan. It will typically contain:

  • json – Unique per machine (ID TAG)
  • xml – This contains all inventory content
  • config – A copy of the Agent configuration file
  • log – log file
  • Swidtags

The agent takes all of the above, compresses it and packs it into the .SNOWPACK file. It is then encrypted – even the file name – with 128-bit AES encryption. The same encryption applies to any credentials the Agent may be configured to use within the snowagent.config file.
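To illustrate the concept only – this is not Snow's actual key handling or file format, and the passphrase is obviously a placeholder – 128-bit AES encryption and decryption of a file can be demonstrated with openssl:

```shell
# Create a sample file standing in for scan results.
echo 'inventory scan data' > /tmp/result.xml

# Encrypt it with AES-128 (CBC mode, key derived from a passphrase).
openssl enc -aes-128-cbc -pbkdf2 -salt \
  -in /tmp/result.xml -out /tmp/result.enc -pass pass:examplekey

# Without the key, /tmp/result.enc is unreadable; with it, the data
# round-trips intact.
openssl enc -d -aes-128-cbc -pbkdf2 \
  -in /tmp/result.enc -pass pass:examplekey
```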

The encryption key for this is a hard-coded part of the Snow Inventory product itself. Where absolute top-level security is necessary, you can even request your own encryption key using an app we can provide.

For complete data security, Snow recommends having the Agent send via HTTPS, usually via port 443. This will require an appropriate SSL certificate to be installed on the Inventory server hosting the landing page. This way, you can apply as much additional encryption as you require.

Even if HTTPS is not used and the data is sent using standard HTTP protocol, the SNOWPACK file itself is always encrypted with 128-bit encryption, protecting the data from being intercepted and opened.

Your organisation may require data to be anonymised – in which case, the Snow Inventory 5 Agent can anonymise user details and IP addresses.


Protecting the Back-End

Access to the Snow Inventory Console is facilitated through the Snow Management and Configuration Centre (aptly named ‘SMACC’). The SMACC uses SQL authentication to establish a connection to the database and this goes for Snow License Manager too.

The configuration for this is buried within the file system in Windows Server – C:\ProgramData\SnowSoftware. Anybody opening this config XML file will find a string of ASCII characters. This string contains the SQL Server instance (or alias) on which the database is hosted and the credentials for the SQL account used – all of it, of course, encrypted and unreadable.

For Snow Inventory 5, the snowserver.config found in C:\Program Files\Snow Software\Snow Inventory\Server on the Inventory Master Server contains an encrypted string also – this contains the credentials and SQL server information necessary to establish a connection to the SnowInventory database.


SIC of Security Concerns?

The Snow Integration Connectors, or SIC, use the Snow Integration Manager (or SIM) to facilitate connections. The data gathered by the SIM is sent via .INV files. These are protected slightly differently from .SNOWPACK files, with 256-bit Rijndael encryption. Again, the encryption key is baked into the SIM product, but unlike with Inventory 5, custom encryption keys are not supported.

In addition to the data files themselves, the credentials and details necessary for each connector to work are also secured. Examples of these may be vSphere logon credentials for the Virtual Management Option, Office 365 Connector credentials or ServiceNow Connector credentials. Such details are stored in the registry in the form of encrypted keys.



Correct installation and configuration of Snow License Manager and Snow Inventory by a Certified Snow Implementation Consultant, keeping the products up to date, and following best practices with regard to SSL certificates will ensure that your Snow platform is robust and secure.